Europol has supported authorities from 19 countries in a large-scale hit against child sexual exploitation that has led to 25 arrests worldwide. The suspects were part of a criminal group whose members were engaged in the distribution of images of minors fully generated by artificial intelligence (AI).
I’m afraid Europol is shooting themselves in the foot here.
What should be done is better ways to mark and identify AI-generated content, not a blanket ban and criminalization.
Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet available. Otherwise they may just turn to the real material, and as ongoing investigations suggest, there’s no shortage of supply or demand on that front. If everything is illegal, and some outlet is needed anyway, escalation becomes easier, and that’s dangerous.
As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.
What would stop someone from creating a tool that tagged real images as AI generated?
Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.
Some form of digital signatures for allowed services?
Sure, it will limit the choice of where to legally generate content, but it should work.
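The signature idea above could look something like this minimal sketch. Everything here is made up for illustration, and HMAC with a pre-registered key stands in for what would realistically be an asymmetric signature (Ed25519) or a C2PA-style content credential embedded in the file:

```python
import hmac
import hashlib

# Hypothetical registry of approved generation services and their keys.
# In a real scheme each service would hold a private key and the verifier
# would only need the public half; a shared secret is just a stand-in.
REGISTERED_KEYS = {"service-a": b"demo-secret-key"}

def sign_image(service_id: str, image_bytes: bytes) -> bytes:
    """Service attaches a tag proving the image came from an allowed generator."""
    key = REGISTERED_KEYS[service_id]
    return hmac.new(key, image_bytes, hashlib.sha256).digest()

def verify_image(service_id: str, image_bytes: bytes, tag: bytes) -> bool:
    """Verifier checks the tag against the registry of allowed services."""
    key = REGISTERED_KEYS.get(service_id)
    if key is None:
        return False  # not an allowed service
    expected = hmac.new(key, image_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

img = b"...generated image bytes..."
tag = sign_image("service-a", img)
assert verify_image("service-a", img, tag)            # untampered: accepted
assert not verify_image("service-a", img + b"x", tag) # modified: rejected
```

Note the limitation the thread raises: the signature only proves *which key* signed the bytes, not *how the image was made*. Anyone holding a registered key could sign real photographs too, so the scheme is only as trustworthy as the services allowed into the registry.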
I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.
Open-source models exist and can be forked.
…and then we’re back at “someone can take that model and tag real images to appear AI-generated.”
You would need a closed-source model run server-side in order to prevent that.
Yep, essentially. But that’s for the hyperrealistic one.
I haven’t read any of this research because, like, the only feelings I have about pedophiles are outright contempt and a small amount of pity for the whole fucking destructive evilness of it all, but I’ve been told having access to drawings and images and whatnot makes people more likely to act on their impulses.
And like. I don’t think images of CSAM in any form, no matter how far removed they are from real people, actually contribute anything worthwhile at all to the world, so like. I dunno.
Really couldn’t give two squirts of piss about anything that makes a pedophile’s life harder. Human garbage.
You can download the models and compile them yourself; banning that will be about as effective as the US government was at banning encryption.
https://www.theverge.com/policy/621848/uk-killing-encryption-e2e-apple-adp-privacy
I hope they don’t have access to a cloud computing provider somewhere; otherwise this is going to be a tough thing to enforce without a Great Firewall bigger than China’s.
It will be hilarious to see them attempt it though.