• Allero@lemmy.today · 3 days ago

    I’m afraid Europol is shooting themselves in the foot here.

    What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.

    Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real material, and as ongoing investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some outlet is needed anyway, escalation becomes easier, and that's dangerous.

    As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

    • raptir@lemmy.zip · 2 days ago

      What would stop someone from creating a tool that tagged real images as AI generated?

      Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.

      • Allero@lemmy.today · 2 days ago

        Some form of digital signatures for allowed services?

        Sure, it would limit the choice of where content can legally be generated, but it should work.
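        The signature idea above can be sketched roughly as follows. Everything here is hypothetical: `SERVICE_KEY`, `sign_output`, and `verify_output` are invented names, and a real provenance scheme (e.g. something C2PA-style) would use public-key signatures rather than a shared-key HMAC, which stands in here only to keep the example stdlib-only:

```python
import hmac
import hashlib

# Hypothetical sketch: a sanctioned generation service signs each output
# it produces, so anyone can later check whether a file really came from
# that service. HMAC with a secret key is a simplified stand-in for a
# real public-key signature scheme.

SERVICE_KEY = b"demo-key-held-by-the-service"  # assumption: secret held by the service

def sign_output(image_bytes: bytes) -> str:
    """Tag generated content with a provenance signature over its exact bytes."""
    return hmac.new(SERVICE_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_output(image_bytes: bytes, tag: str) -> bool:
    """Check that the tag matches these exact bytes (constant-time compare)."""
    return hmac.compare_digest(sign_output(image_bytes), tag)

generated = b"...generated image bytes..."
tag = sign_output(generated)
print(verify_output(generated, tag))           # True: the genuine file verifies
print(verify_output(b"some real photo", tag))  # False: swapped-in bytes do not
```

        Note this only proves a file came from the signing service unmodified; it says nothing about files with no tag at all, which is where the thread's forgery concern comes in.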

        • raptir@lemmy.zip · 2 days ago

          I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.

            • raptir@lemmy.zip · 22 hours ago

              …and then we’re back at “someone can take that model and tag real images to appear AI-generated.”

              You would need a closed-source model run server-side in order to prevent that.

    • Fungah@lemmy.world · 3 days ago

      I haven’t read any of this research because, like, the only feelings I have about pedophiles are outright contempt and a small amount of pity for the whole fucking destructive evilness of it all, but I’ve been told having access to drawings and images and whatnot makes people more likely to act on their impulses.

      And like. I don’t think images of CSAM in any form, no matter how far removed they are from real people, actually contribute anything worthwhile at all to the world, so like. I dunno.

      Really couldn’t give two squirts of piss about anything that makes a pedophile’s life harder. Human garbage.

    • turnip@sh.itjust.works · 3 days ago

      You can download the models and compile them yourself; a ban would be about as effective as the US government’s attempts at banning encryption.