AI-generated child sex abuse images pose challenges for federal prosecutors - Child sex abuse cases mark some of the first times that prosecutors are trying to apply existing U.S. laws to alleged crimes involving AI.

[Image: putilov_denis/Adobe Stock]

U.S. federal prosecutors are stepping up their pursuit of suspects who use artificial intelligence tools to manipulate or create child sex abuse images, as law enforcement fears the technology could spur a flood of illicit material.

The U.S. Justice Department has brought two criminal cases this year against defendants accused of using generative AI systems, which create text or images in response to user prompts, to produce explicit images of children.

“There’s more to come,” said James Silver, the chief of the Justice Department’s Computer Crime and Intellectual Property Section, predicting further similar cases.

“What we’re concerned about is the normalization of this,” Silver said in an interview. “AI makes it easier to generate these kinds of images, and the more that are out there, the more normalized this becomes. That’s something that we really want to stymie and get in front of.”

The rise of generative AI has sparked concerns at the Justice Department that the rapidly advancing technology will be used to carry out cyberattacks, boost the sophistication of cryptocurrency scammers, and undermine election security.

Child sex abuse cases mark some of the first times that prosecutors are trying to apply existing U.S. laws to alleged crimes involving AI, and even successful convictions could face appeals as courts weigh how the new technology may alter the legal landscape around child exploitation.

Prosecutors and child safety advocates say generative AI systems can allow offenders to morph and sexualize ordinary photos of children and warn that a proliferation of AI-produced material will make it harder for law enforcement to identify and locate real victims of abuse.

The National Center for Missing and Exploited Children, a nonprofit group that collects tips about online child exploitation, receives an average of about 450 reports each month related to generative AI, according to Yiota Souras, the group’s chief legal officer.

That’s a fraction of the average of 3 million monthly reports of overall online child exploitation the group received last year.

Untested ground

Cases involving AI-generated sex abuse imagery are likely to tread new legal ground, particularly when an identifiable child is not depicted.

Silver said in those instances, prosecutors can charge obscenity offenses when child pornography laws do not apply.

Prosecutors indicted Steven Anderegg, a software engineer from Wisconsin, in May on charges including transferring obscene material. Anderegg is accused of using Stable Diffusion, a popular text-to-image AI model, to generate images of young children engaged in sexually explicit conduct and sharing some of those images with a 15-year-old boy, according to court documents.

Anderegg has pleaded not guilty and is seeking to dismiss the charges by arguing that they violate his rights under the U.S. Constitution, court documents show.
He has been released from custody while awaiting trial. His attorney was not available for comment.

Stability AI, the maker of Stable Diffusion, said the case involved a version of the AI model that was released before the company took over the development of Stable Diffusion. The company said it has made investments to prevent “the misuse of AI for the production of harmful content.”

Federal prosecutors also charged a U.S. Army soldier with child pornography offenses in part for allegedly using AI chatbots to morph innocent photos of children he knew to generate violent sexual abuse imagery, court documents show.

The defendant, Seth Herrera, pleaded not guilty and has been ordered held in jail to await trial. Herrera’s lawyer did not respond to a request for comment.

Legal experts said that while sexually explicit depictions of actual children are covered under child pornography laws, the landscape around obscenity and purely AI-generated imagery is less clear.

The U.S. Supreme Court in 2002 struck down as unconstitutional a federal law that criminalized any depiction, including computer-generated imagery, appearing to show minors engaged in sexual activity.

“These prosecutions will be hard if the government is relying on the moral repulsiveness alone to carry the day,” said Jane Bambauer, a law professor at the University of Florida who studies AI and its impact on privacy and law enforcement.

Federal prosecutors have secured convictions in recent years against defendants who possessed sexually explicit images of children that also qualified as obscene under the law.

Advocates are also focusing on preventing AI systems from generating abusive material.

Two nonprofit advocacy groups, Thorn and All Tech Is Human, secured commitments in April from some of the largest players in AI including Alphabet’s Google, Amazon.com, Facebook and Instagram parent Meta Platforms, OpenAI and Stability AI to avoid training their models on child sex abuse imagery and to monitor their platforms to prevent its creation and spread.

“I don’t want to paint this as a future problem, because it’s not. It’s happening now,” said Rebecca Portnoff, Thorn’s director of data science.

“As far as whether it’s a future problem that will get completely out of control, I still have hope that we can act in this window of opportunity to prevent that.”

It actually doesn't, as the law (as it is written, how it is enforced is another subject entirely) says that, in the case that it is indistinguishable from an actual child, it is to be treated as such.

Also, I now wonder who will be the mouthbreather of the month who will get banned for defending the legality of this
 
It actually doesn't, as the law (as it is written, how it is enforced is another subject) says that, in the case that it is indistinguishable from an actual child, it is to be treated as such.

Also, I now wonder who will be the mouthbreather of the month who will get banned for defending the legality of this
they will have an anime pfp
because they think the japanese totally have the same (false) view that it's just drawings and aren't a weird people that prostitute middle schoolers
 
they will have an anime pfp
because they think the japanese totally have the same (false) view that it's just drawings and aren't a weird people that prostitute middle schoolers
The funny thing is that, even among the respected people of the industry, lolicon (and in particular their brand of "based Japan you can have sex with 13yos there!") is a bit of a tacky subject. Hideaki Anno (the director of NGE) even made a whole movie criticizing the practice back in the late 90s
 
The funny thing is that, even among the respected people of the industry, lolicon (and in particular their brand of "based Japan you can have sex with 13yos there!") is a bit of a tacky subject. Hideaki Anno (the director of NGE) made a whole movie about it back in the late 90s
Didn't they change that a few years back so 18 is the age of consent like every other Western Country because of the Olympics or something?
 
If ethical CP were an indisputable end point for pedos and would give them a lifetime outlet to never offend for real, it could be argued that they should be allowed this shit. But a lot of people agree that it isn't and won't be. There really won't be a good solution until we start reopening the lunatic asylums or having dedicated prisons for these recidivist sex offenders to rot in until they die.
I do think drawings of shit like lolicon are a different debate though.
 
If ethical CP were an indisputable end point for pedos and would give them a lifetime outlet to never offend for real, it could be argued that they should be allowed this shit. But a lot of people agree that it isn't and won't be. There really won't be a good solution until we start reopening the lunatic asylums or having dedicated prisons for these recidivist sex offenders to rot in until they die.
I do think drawings of shit like lolicon are a different debate though.
Pornography only feeds desires; it doesn't sate them.
it's the same debate
 
Didn't they change that a few years back so 18 is the age of consent like every other Western Country because of the Olympics or something?
They have had it at 16* on a prefectural level for quite a while, but only recently did the national age get raised to 16 as well (basically, before the national AoC was raised, a prefecture could theoretically lower it to 13, now they can't)

*And more than just being legal or illegal they have a system where the gravity of the penalty (or if there's a penalty at all) depends on the age of both parties, as well as their sex
[Image: Screenshot_20241018_024221_YouTube.jpg]
This chart explains it
 
Didn't they change that a few years back so 18 is the age of consent like every other Western Country because of the Olympics or something?
Plenty of Western Countries have a lower age of consent.

13-16 is the general baseline across Europe. There's the odd one with exceptions and caveats, like the UK, where you can consent to sex at 16 but aren't allowed to send or receive sexually suggestive messages until 18, but that's rare.
 
I keep getting spammed with political ads about this lately, and it's framed as an anti-bullying thing rather than an anti-pedo thing, which threw me off a little bit. Usually when the idea of AI-generated child porn is brought up it's always pedo shit INSTANTLY. But no, these ads specifically do that kind of weird pretentious fear mongering and lobby money begging the Sandy Hook people do, minus actual corpses of children being involved.

I think everyone's just fully aware the people in charge of the world want only megacorps and governments to have access to image generation and "ai" shit in general. Something always had a specific stink about it to me when algorithm shit started being called "ai" and treated as if it was new a few years ago. I could give it a pass for stuff like chatbots, given they're meant to actually simulate intelligence, but those aren't new either, and now every fucking service has one of them because "ooooh it's ai and NEW!" I love the goofy shit people have been able to make that pumps out either wacky text or images, but marketing it as more than just a chatbot or some schizo computer collage machine really should have never been done. The conspiracy theory speculator in me, however, would say this was 100% by design to set us on the path of eventually pulling away the people's access to this stuff and making institutions overly dependent on stuff falsely marketed as actual singularity-scary-level artificial intelligence.
 
It actually doesn't, as the law (as it is written, how it is enforced is another subject entirely) says that, in the case that it is indistinguishable from an actual child, it is to be treated as such.

Also, I now wonder who will be the mouthbreather of the month who will get banned for defending the legality of this
Most likely you know who with the boykisser avatar.
Pornography only feeds desires; it doesn't sate them.
it's the same debate
Same reason you don’t keep giving a crack addict crack.
 
It actually doesn't, as the law (as it is written, how it is enforced is another subject entirely) says that, in the case that it is indistinguishable from an actual child, it is to be treated as such.

Also, I now wonder who will be the mouthbreather of the month who will get banned for defending the legality of this
>AI CP
>become legal


How about "WOODCHIPPER TIME!"?
 
If ethical CP were an indisputable end point for pedos and would give them a lifetime outlet to never offend for real, it could be argued that they should be allowed this shit. But a lot of people agree that it isn't and won't be. There really won't be a good solution until we start reopening the lunatic asylums or having dedicated prisons for these recidivist sex offenders to rot in until they die.
I do think drawings of shit like lolicon are a different debate though.
I've been sober for a couple decades. Tried drinking non-alcoholic beer a couple times, but it always led me straight back to alcoholic beer. Pedophilia, like other fetishes, behaves very much like addiction -- obsessive desires, inability to set boundaries, etc. I can't imagine that somebody who has strong sexual desires for children is going to be satisfied by loli drawings, but I can imagine that person overcoming the compulsion through abstinence and taking steps to separate himself from problematic situations.

I don't think every loli-enjoyer is an active child molester, but I'd have no problems with a rule that loli is cause to check all their electronic devices. I'm guessing the percentage of lolicons who would get caught with CP would be much, much higher than you'd get from a random search. And I'm guessing there'd be a near 1:1 correlation between people possessing photorealistic AI child porn and people possessing child porn made with real children.
 
If ethical CP were an indisputable end point for pedos and would give them a lifetime outlet to never offend for real, it could be argued that they should be allowed this shit. But a lot of people agree that it isn't and won't be. There really won't be a good solution until we start reopening the lunatic asylums or having dedicated prisons for these recidivist sex offenders to rot in until they die.
I do think drawings of shit like lolicon are a different debate though.

Odd question: Wouldn't AI CP hurt the actual CP market? Seriously. You can make anything you want with a few clicks of a button, why pay someone? You'd think AI would have put a hurt on OnlyFans, but as it turns out, people are really bad with AI, and the dudes fucking with OnlyFans girls are putting clothing on them.
Like the pedos who really want the real thing would be searching harder to find it, which in turn should make it easier to track them. Then track down the guys who advertise "totally real no ai" and hang them.
 
Riddle me this… AI child porn ban is basically a moral crime, no? A thought crime of sorts.

There’s no victim that got abused, after all.

The “victim” is society saying: We’re better off without people gooning to virtual kids.

So why not outlaw trooning out or gender surgeries?
 
Riddle me this… AI child porn ban is basically a moral crime, no? A thought crime of sorts.

There’s no victim that got abused, after all.

The “victim” is society saying: We’re better off without people gooning to virtual kids.

So why not outlaw trooning out or gender surgeries?
You can outlaw both.
 