• KillingTimeItself@lemmy.dbzer0.com · 4 days ago

    i have no problem with ai porn assuming it’s not based on any real identities; if it does use a real person’s identity, i think that should be considered identity theft or impersonation or something.

    Outside of that, it’s more complicated, but i don’t think it’s a net negative. People will still thrive in the porn industry; it’s been around for as long as it’s been possible, and i don’t see why it wouldn’t continue.

    • ubergeek@lemmy.today · 4 days ago

      i have no problem with ai porn assuming it’s not based on any real identities

      With any model in use, currently, that is impossible to meet. All models are trained on real images.

      • KillingTimeItself@lemmy.dbzer0.com · 3 days ago

        With any model in use, currently, that is impossible to meet. All models are trained on real images.

        yes but if i go to thispersondoesnotexist.com and generate a random person, is that going to resemble the likeness of any given real person closely enough to perceptibly be them?

        You are literally using the schizo argument right now. “If an artist creates a piece depicting no specific person, but his understanding of persons is based inherently on the facial structures of other people that he knows and recognizes, therefore he must be stealing their likeness”

          • ubergeek@lemmy.today · 3 days ago

          No, the problem is a lack of consent of the person being used.

          And now, being used to generate depictions of rape and CSAM.

            • KillingTimeItself@lemmy.dbzer0.com · edited · 13 hours ago

            yeah but like, legally, is this even a valid argument? Sure, there’s technically probably like 0.0001% of any given person in any AI-generated image, but i don’t think that gives anyone explicit rights to that portion.

            That’s like arguing that a photographer who captured you in a random photo in public that became super famous now owes you royalties for being in that image, even though you are literally just a random fucking person.

            You can argue about consent all you want, but at the end of the day, if you’re posting images of yourself online, you are consenting to other people looking at them, at a minimum. Arguably you’re implicitly consenting to other people being able to use those images, because you can’t stop them from doing so, short of copyright, which isn’t very strict in most cases.

            And now, being used to generate depictions of rape and CSAM.

            i don’t see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim; otherwise it’s no different than me editing an image of someone to make it look like they got shot in the face. Is that shitty? Sure. But i don’t know of any laws that prevent you from doing that, unless it’s explicitly tied to something like blackmail, extortion, or harassment.

            The fundamental problem here is that you’re in an extremely uphill position to even begin the argument of “well it’s trained on people so therefore it uses the likeness of those people”

            Does a facial structure recognition model use the likeness of other people, even though it can detect any person that meets the requirements established by its training data? There is no suitable method to break down where a person’s likeness begins and where it ends. it’s simply an impossible task.

    • Dr. Moose@lemmy.world · 4 days ago

      Identity theft only makes sense for businesses. I can sketch a naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?

      • KillingTimeItself@lemmy.dbzer0.com · 4 days ago

        revenge porn, simple as. Creating fake revenge porn of real people is still to some degree revenge porn, and i would argue stealing someone’s identity/impersonation.

        To be clear, your example is a sketch of johnny depp; i’m talking about a video of a person that resembles the likeness of another person, where the entire video is manufactured. Those are fundamentally two different things.

          • KillingTimeItself@lemmy.dbzer0.com · 4 days ago

            sort of. There are arguments that private ownership of these videos is also weird and shitty, however i think impersonation and identity theft are going to be the two most broadly applicable areas of relevant law here. Otherwise i can see issues cropping up.

            Other people do not have any inherent rights to your likeness; you should not simply be able to pretend to be someone else. That’s considered identity theft/fraud when we do it with legally identifying papers, and it’s a similar case here i think.

            • Dr. Moose@lemmy.world · 4 days ago

              But the thing is, it’s not a relevant law here at all, as nothing is being distributed and no one is being harmed. Would you say the same thing if AI weren’t involved? Sure it can be creepy and weird and whatnot, but it’s not inherently harmful, or at least it’s not obvious how it would be.

              • KillingTimeItself@lemmy.dbzer0.com · 12 hours ago

                the only perceivable reason to create these videos is either private consumption, in which case, who gives a fuck, or public distribution; otherwise you wouldn’t create them. And you’d have to be a bit of a weird breed to create AI porn of specific people for private consumption.

                If AI isn’t involved, the same general principles would apply, except it might include more people now.

                • Dr. Moose@lemmy.world · 12 hours ago

                  I’ve been thinking about this more, and I think one interesting argument here is “toxic culture growth”. As in, even if the thing is not distributed, it might grow undesired toxic cultures through indirect exposure (like social media or forum discussions) even without direct sharing.

                  I think policing that gets slippery to the point of government mind control, but maybe there’s something valuable to research here either way.