• Sans_Seraph@lemmy.world · 3 days ago

    What I find incredible is that they named this thing WanX, which can alternatively be pronounced Wanks. Nominative Determinism at its finest

  • Aggravationstation@feddit.uk · 5 days ago

    The year is 1997. A young boy is about to watch a porn video for the first time on a grainy VHS tape. An older version of himself from the far-off year of 2025 appears.

    Me: “You know, in the future, you’ll make your own porn videos.”

    90s me: “Wow, you mean I’ll get to have lots of hot sex!?!?”

    Me: “Ha! No. So Nvidia will release this system called CUDA…”

    • turnip@lemm.ee · 4 days ago

      Then another company called DeepSeek will release a system called low-level programming that replaces CUDA.

    • CodexArcanum@lemmy.dbzer0.com · 5 days ago

      I thought this was going to go Watchmen for a moment. Like…

      It is 1997, I am a young boy, I am jerking off to a grainy porno playing over stolen Cinemax.

      It is 2007, I am in my dorm, I am jerking off to a forum thread full of hi-res porno.

      It is 2027, I am jerking off to an AI porno stream that mutates to my desires in real time. I am about to nut so hard that it shatters my perception of time.

  • Ulrich@feddit.org · 4 days ago

    Oh my God! That’s disgusting! AI porn online!? Where!? Where do they post those!? There’s so many of them, though. Which one?

  • MoonlightFox@lemmy.world · 5 days ago

    First off, I am sex positive, pro porn, pro sex work, and don’t believe sex work should be shameful; there is nothing wrong with buying intimacy from a willing seller.

    That said, the current state of the industry and the conditions for many professionals raise serious ethical issues, coercion being the biggest one.

    I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand it might be trained on other people’s work and take people’s jobs.

    I think another major point to consider going forward is whether it is problematic if people can generate all sorts of illegal stuff. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several things being legal, but I can’t logically argue for making them illegal without a victim.

    • KillingTimeItself@lemmy.dbzer0.com · 4 days ago

      I have no problem with AI porn assuming it’s not based on any real identities; I think that should be considered identity theft or impersonation or something.

      Outside of that, it’s more complicated, but I don’t think it’s a net negative. People will still thrive in the porn industry; porn has been around for as long as it’s been possible, and I don’t see why that wouldn’t continue.

      • ubergeek@lemmy.today · 4 days ago

        I have no problem with AI porn assuming it’s not based on any real identities

        With any model currently in use, that is impossible to meet: all models are trained on real images.

        • KillingTimeItself@lemmy.dbzer0.com · 3 days ago

          With any model currently in use, that is impossible to meet: all models are trained on real images.

          Yes, but if I go to thispersondoesnotexist.com and generate a random person, is that going to resemble the likeness of any given real person closely enough to perceptibly be them?

          You are literally using the schizo argument right now: “If an artist creates a piece depicting no specific person, but his understanding of persons is inherently based on the facial structures of other people that he knows and recognizes, then he must be stealing their likeness.”

          • ubergeek@lemmy.today · 3 days ago

            No, the problem is a lack of consent from the person being used.

            And now, it is being used to generate depictions of rape and CSAM.

            • KillingTimeItself@lemmy.dbzer0.com · 13 hours ago

              Yeah, but legally, is this even a valid argument? Sure, there is technically probably something like 0.0001% of the average person in any given AI-generated image. I don’t think that gives anyone explicit rights to that portion, however.

              That’s like arguing that a photographer who captured you in a random photo in public that became super famous is now required to pay you royalties for being in that image, even though you are literally just a random fucking person.

              You can argue about consent all you want, but at the end of the day, if you’re posting images of yourself online, you are consenting to other people looking at them, at a minimum. Arguably you are implicitly consenting to other people being able to use those images, because you can’t stop them from doing so, except through copyright, which isn’t very strict in most cases.

              And now, being used to generate depictions of rape and CSAM.

              I don’t see how this is even relevant unless the person in question is a minor, a victim, or becoming a victim; otherwise it’s no different from me editing an image of someone to make it look like they got shot in the face. Is that shitty? Sure. But I don’t know of any laws that prevent you from doing that, unless it explicitly involves something like blackmail, extortion, or harassment.

              The fundamental problem here is that you’re in an extremely uphill position to even begin the argument of “well, it’s trained on people, so therefore it uses the likeness of those people.”

              Does a facial recognition model use the likeness of other people, even though it can detect any person who meets the requirements established by its training data? There is no suitable method to determine at what point a person’s likeness begins and at what point it ends. It’s simply an impossible task.

      • Dr. Moose@lemmy.world · 4 days ago

        Identity theft only makes sense for businesses. I can sketch a naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?

        • KillingTimeItself@lemmy.dbzer0.com · 4 days ago

          Revenge porn, simple as that. Creating fake revenge porn of real people is still, to some degree, revenge porn, and I would argue it is stealing someone’s identity/impersonation.

          To be clear, your example is a sketch of Johnny Depp; I’m talking about a manufactured video of a person that resembles the likeness of another person. Those are, fundamentally, two different things.

            • KillingTimeItself@lemmy.dbzer0.com · 4 days ago

              Sort of. There are arguments that private ownership of these videos is also weird and shitty; however, I think impersonation and identity theft are going to be the two most broadly applicable instances of relevant law here. Otherwise I can see issues cropping up.

              Other people do not have any inherent rights to your likeness; you should not simply be able to pretend to be someone else. That’s considered identity theft/fraud when we do it with legally identifying papers, and it’s a similar case here, I think.

              • Dr. Moose@lemmy.world · 4 days ago

                But the thing is, it’s not a relevant law here at all, as nothing is being distributed and no one is being harmed. Would you say the same thing if AI were not involved? Sure, it can be creepy and weird and whatnot, but it’s not inherently harmful, or at least it’s not obvious how it would be.

                • KillingTimeItself@lemmy.dbzer0.com · 12 hours ago

                  The only perceivable reason to create these videos is either private consumption, in which case, who gives a fuck, or public distribution; otherwise you wouldn’t create them. And you’d have to be a bit of a weird breed to create AI porn of specific people for private consumption.

                  If AI isn’t involved, the same general principles would apply, except they might include more people now.

      • MoonlightFox@lemmy.world · 5 days ago

        It can generate combinations of things that it is not trained on, so there isn’t necessarily a victim. But of course there might be something in there; I won’t deny that.

        However, does the act of generating something create a new victim unless it uses someone’s likeness and is shared? Or is there something ethical here that I am missing?

        (Yes, all current AI is basically collective piracy of everyone’s IP, but besides that.)

        • surewhynotlem@lemmy.world · 4 days ago

          Watching videos of rape doesn’t create a new victim. But we consider it additional abuse of an existing victim.

          So take that video and modify it a bit. Color-correct it or something. That’s still abuse, right?

          So the question is, at what point in modifying the video does it become not abuse? When you can’t recognize the person? But I think simply blurring the face wouldn’t suffice. So when?

          That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?

          I can’t make that call. And because I can’t make that call, I can’t support the concept.

          • Petter1@lemm.ee · 4 days ago

            With this logic, any output of any image-generation AI is abuse… I mean, we can be 100% sure that there is CP in the training data (it would be a very big surprise if not), and all output is the result of all the training data, as far as I understand the statistical behaviour of image-generation AI.

              • Petter1@lemm.ee · 1 day ago

                😆 As if this has something to do with that.

                But to your argument: it is perfectly possible to tune capitalism using laws to make it very social.

                I mean, every “actually existing communist country” is at its core still a capitalist system; how would you argue against that?

              • Petter1@lemm.ee · 1 day ago

                Well, AI is by design not able to curate its training data, but the companies training the models would in theory be able to. It is just not feasible to sanitise such a huge stack of data.

    • TheGrandNagus@lemmy.world · 4 days ago

      I think another major point to consider going forward is whether it is problematic if people can generate all sorts of illegal stuff. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several things being legal, but I can’t logically argue for making them illegal without a victim.

      I’ve been thinking about this recently too, and I have similar feelings.

      I’m just gonna come out and say it without beating around the bush: what is the law’s position on AI-generated child porn?

      More importantly, what should it be?

      It goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it wasn’t?

      If we’re basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour (i.e. increasing demand), or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour (i.e. lowering demand).

      And to know that, we’d need extensive and extremely controversial studies. Beyond that, even in the event that allowing this stuff to be generated is an overall positive (and I don’t know whether it would or wouldn’t be), will many politicians actually call for this stuff to be allowed? Seems like the kind of thing that could ruin a political career. Nobody’s touching that with a ten-foot pole.

      • michaelmrose@lemmy.world · 5 days ago

        Let’s play devil’s advocate. You find Bob the pedophile with pictures depicting horrible things. Two things are true.

        1. Although you can’t necessarily help Bob, you can lock him up, preventing him from doing harm, and permanently brand him as a dangerous person, making it less likely for actual children to be harmed.

        2. Bob can’t claim that actual depictions of abuse are AI generated and force you to find the unknown victim before you can lock him and his confederates up. If the law doesn’t distinguish between simulated and actual abuse, then in both cases Bob just goes to jail.

        A third factor is that this technology, and the inherent lack of privacy on the internet, could potentially pinpoint numerous unknown pedophiles who can, even if they haven’t done any harm yet, be profitably prosecuted, to society’s ultimate benefit, so long as you value innocent kids more than perverts.

        • shalafi@lemmy.world · 4 days ago

          Am I reading this right? You’re for prosecuting people who have broken no laws?

          I’ll add this: I have sexual fantasies (not involving children) that would be repugnant to me IRL. Should I be in jail for having those fantasies, even though I would never act on them?

          This sounds like some Minority Report hellscape society.

          • michaelmrose@lemmy.world · 4 days ago

            Am I reading this right? You’re for prosecuting people who have broken no laws?

            No, I’m for making it against the law to simulate pedophilic material, as the net effect is fewer abused kids than if such images were legal. Notably, you are free to fantasize about whatever you like; it’s the actual creation and sharing of images that would be illegal. Far from being a Minority Report hellscape, it’s literally how things already are in many places.

            • Petter1@lemm.ee · 4 days ago

              Lol, how can you say that so confidently? How would you know that with less AI CP you get fewer abused kids? And what is the logic behind that?

              Demand doesn’t really drop when something is made illegal (the same goes for drugs). The only thing you reduce is supply, which just makes the now-illegal thing more valuable (this attracts shady money grabbers who hate regulation, don’t care about law enforcement, and will therefore do illegal things for money), and you have to pay a shitton of government money to maintain all the prisons.

              • michaelmrose@lemmy.world · 4 days ago

                Basically every pedo in prison is one who isn’t abusing kids. Every pedo on a list is one who won’t be left alone with a young family member. Reducing AI CP doesn’t, by itself, actually do anything.

                • AwesomeLowlander@sh.itjust.works · 4 days ago

                  Wrong. Every pedo in prison is one WHO HAS ALREADY ABUSED A CHILD, whether directly or indirectly. There is an argument to be made, and some studies that show, that dealing with Minor Attracted People before they cross the line can be effective. Unfortunately, to do this we need to be able to have a logical and civil conversation about the topic, and the current political climate does not allow for that conversation to be had. The consequence is that preventable crimes are not being prevented, and more children are suffering for it in the long run.

  • Jo Miran@lemmy.ml · 5 days ago

    I am going to make this statement openly on the Internet. Feel free to make AI generated porn of me as long as it involves adults. Nobody is going to believe that a video of me getting railed by a pink wolf furry is real. Everyone knows I’m not that lucky.

    • Technus@lemmy.zip · 5 days ago

      Fortunately, most of my family is so tech illiterate that even if a real video got out, I could just tell them it’s a deepfake and they’d probably believe me.

  • TachyonTele@lemm.ee · 4 days ago

    Who are the girls in the picture? We can do this, team. Left to right, starting at the top.

    1. Gwen Stacy

    2. ??

    3. Bayonetta

    4. The Little Mermaid

    5. ??

    6. ??

    7. Jinx

    8. ??

    9. Rei

    10. Rei

    11. lol Rei

    12. Aerith

      • brb@sh.itjust.works · 3 days ago

        How would one go about installing this? Asking because I don’t want to accidentally install it on my system

        • Sturgist@lemmy.ca · 2 days ago

          So, I definitely didn’t click! But having clicked on GitHub links in the past I can surmise that there’s a step by step install guide and also one for model acquisition. Just be sure not to click the link, and definitely do not follow what I assume is a very well written and easily understood step by step install guide.