Law AI-generated child sex abuse images pose challenges for federal prosecutors - Child sex abuse cases mark some of the first times that prosecutors are trying to apply existing U.S. laws to alleged crimes involving AI.

[Image: putilov_denis/Adobe Stock]

U.S. federal prosecutors are stepping up their pursuit of suspects who use artificial intelligence tools to manipulate or create child sex abuse images, as law enforcement fears the technology could spur a flood of illicit material.

The U.S. Justice Department has brought two criminal cases this year against defendants accused of using generative AI systems, which create text or images in response to user prompts, to produce explicit images of children.

“There’s more to come,” said James Silver, the chief of the Justice Department’s Computer Crime and Intellectual Property Section, predicting further similar cases.

“What we’re concerned about is the normalization of this,” Silver said in an interview. “AI makes it easier to generate these kinds of images, and the more that are out there, the more normalized this becomes. That’s something that we really want to stymie and get in front of.”

The rise of generative AI has sparked concerns at the Justice Department that the rapidly advancing technology will be used to carry out cyberattacks, boost the sophistication of cryptocurrency scammers, and undermine election security.

Child sex abuse cases mark some of the first times that prosecutors are trying to apply existing U.S. laws to alleged crimes involving AI, and even successful convictions could face appeals as courts weigh how the new technology may alter the legal landscape around child exploitation.

Prosecutors and child safety advocates say generative AI systems can allow offenders to morph and sexualize ordinary photos of children and warn that a proliferation of AI-produced material will make it harder for law enforcement to identify and locate real victims of abuse.

The National Center for Missing and Exploited Children, a nonprofit group that collects tips about online child exploitation, receives an average of about 450 reports each month related to generative AI, according to Yiota Souras, the group’s chief legal officer.

That’s a fraction of the average of 3 million monthly reports of overall online child exploitation the group received last year.

Untested ground

Cases involving AI-generated sex abuse imagery are likely to tread new legal ground, particularly when an identifiable child is not depicted.

Silver said in those instances, prosecutors can charge obscenity offenses when child pornography laws do not apply.

Prosecutors indicted Steven Anderegg, a software engineer from Wisconsin, in May on charges including transferring obscene material. Anderegg is accused of using Stable Diffusion, a popular text-to-image AI model, to generate images of young children engaged in sexually explicit conduct and sharing some of those images with a 15-year-old boy, according to court documents.

Anderegg has pleaded not guilty and is seeking to dismiss the charges by arguing that they violate his rights under the U.S. Constitution, court documents show.
He has been released from custody while awaiting trial. His attorney was not available for comment.

Stability AI, the maker of Stable Diffusion, said the case involved a version of the AI model that was released before the company took over the development of Stable Diffusion. The company said it has made investments to prevent “the misuse of AI for the production of harmful content.”

Federal prosecutors also charged a U.S. Army soldier with child pornography offenses in part for allegedly using AI chatbots to morph innocent photos of children he knew to generate violent sexual abuse imagery, court documents show.

The defendant, Seth Herrera, pleaded not guilty and has been ordered held in jail to await trial. Herrera’s lawyer did not respond to a request for comment.

Legal experts said that while sexually explicit depictions of actual children are covered under child pornography laws, the landscape around obscenity and purely AI-generated imagery is less clear.

The U.S. Supreme Court in 2002 struck down as unconstitutional a federal law that criminalized any depiction, including computer-generated imagery, appearing to show minors engaged in sexual activity.

“These prosecutions will be hard if the government is relying on the moral repulsiveness alone to carry the day,” said Jane Bambauer, a law professor at the University of Florida who studies AI and its impact on privacy and law enforcement.

Federal prosecutors have secured convictions in recent years against defendants who possessed sexually explicit images of children that also qualified as obscene under the law.

Advocates are also focusing on preventing AI systems from generating abusive material.

Two nonprofit advocacy groups, Thorn and All Tech Is Human, secured commitments in April from some of the largest players in AI, including Alphabet’s Google, Amazon.com, Facebook and Instagram parent Meta Platforms, OpenAI, and Stability AI, to avoid training their models on child sex abuse imagery and to monitor their platforms to prevent its creation and spread.

“I don’t want to paint this as a future problem, because it’s not. It’s happening now,” said Rebecca Portnoff, Thorn’s director of data science.

“As far as whether it’s a future problem that will get completely out of control, I still have hope that we can act in this window of opportunity to prevent that.”

Article Link

Archive
 
Doubt it. It's not like regular porn has stopped being produced. If anything, it might hurt law enforcement if AI images become indistinguishable from those of real victims.
I can’t even begin to imagine how it would affect “the market”.

(For one thing, wouldn’t you need some kind of certificate? Otherwise, how do you tell AI from non-AI CP? So we’re basically having the government involved in producing/certifying CP, which is some real dystopian shit.)

Wouldn’t it also popularize it? Regular groomers getting caught up with it, like what happened with tranny shit?

Idk, the whole “But real children won’t get hurt, there’ll be no incentive to produce it!” argument is alluring, but I doubt it would be as simple as that.

Also: There’s a particular kind of CP collector who doesn’t even goon to kids, but just collects it. Autists regularly get busted with hundreds of thousands or millions of images, and can’t explain why they do it, because they’re neither particularly sexual nor turned on by children. I doubt AICP would make a difference for those weirdos.
 
Riddle me this… an AI child porn ban basically criminalizes a moral crime, no? A thought crime of sorts.

There’s no victim that got abused, after all.

The “victim” is society saying: We’re better off without people gooning to virtual kids.

So why not outlaw trooning out or gender surgeries?

AI images work off of existing images compiled into a neural network. If there's AI CP, it's compiled from existing images. There is a victim in it.
 
I can’t even begin to imagine how it would affect “the market”.

(For one thing, wouldn’t you need some kind of certificate? Otherwise, how do you tell AI from non-AI CP? So we’re basically having the government involved in producing/certifying CP, which is some real dystopian shit.)

The government already does verify CP. It was all over the news a few years ago: they actively allow it to circulate so they can slam-dunk cases.

Also: There’s a particular kind of CP collector who doesn’t even goon to kids, but just collects it. Autists regularly get busted with hundreds of thousands or millions of images, and can’t explain why they do it, because they’re neither particularly sexual nor turned on by children. I doubt AICP would make a difference for those weirdos.

As a collector of drawing models and a once-massive folder of newspaper comic strips, I can attest to this. You just start clicking save on every image you see that fits a criterion (mine was dark humor and Calvin and Hobbes, even though I have all the fucking books). It's an addiction that only developed in the internet era. It's like scrapbooking but a million times easier. You don't feel any way towards it, just an inclination to click "Save as".
Took me a long time to get over it, but now I'm kicking myself because I can't fucking find the comic strip on the fucking internet anymore, what the fuck.
 
AI images work off of existing images compiled into a neural network. If there's AI CP, it's compiled from existing images. There is a victim in it.
Those images are not necessarily child porn, though. They may be normal, completely innocent images of children combined with porn of adults. AI image generators are perfectly capable of doing that.

You can still argue that creates a victim, but let's be clear what we're talking about.
 
I'll take these people seriously when they go after all of the Discord child predators and the Reddit child predators. It literally takes me 5 minutes to find child predators on those websites, but the FBI turns a blind eye to those people.

And it's not me supporting these types of people; it is an inanimate object, but you're still a sick freak if you're making this.
But I'll trust the FBI to actually go after pedophiles after they spent several years covering for Jeffrey Epstein.
 
It isn't really that hard... are these disgusting people ones we would be better off without? If yes, then hang them.

I don't get why the law has degrees; it's basic-bitch common sense.
 
It actually doesn't, as the law (as it is written; how it is enforced is another subject entirely) says that, in the case that an image is indistinguishable from an actual child, it is to be treated as such.

Also, I now wonder who will be the mouthbreather of the month who will get banned for defending the legality of this.
They probably already got banned from gooning in that DM
 
There’s a particular kind of CP collector who doesn’t even goon to kids, but just collects it. Autists regularly get busted with hundreds of thousands or millions of images, and can’t explain why they do it, because they’re neither particularly sexual nor turned on by children.
Probably the same reason people collect gore and crime/war stuff. It's the shock-content aspect, an uncensored look at the worst of humanity.
 
I'll take these people seriously when they go after all of the Discord child predators and the Reddit child predators. It literally takes me 5 minutes to find child predators on those websites, but the FBI turns a blind eye to those people.

And it's not me supporting these types of people; it is an inanimate object, but you're still a sick freak if you're making this.
But I'll trust the FBI to actually go after pedophiles after they spent several years covering for Jeffrey Epstein.
The fact they haven't v&'d someone and then taken over their internet persona for subterfuge tells me they don't take this seriously.
They did worse for someone making fun of their cult of the year.
 
The fact they haven't v&'d someone and then taken over their internet persona for subterfuge tells me they don't take this seriously.
They did worse for someone making fun of their cult of the year.
Of course they don't take it seriously. Giving bathtub estrogen to six-year-olds is multiple felonies: drug trafficking, creating an illicit substance, distributing an unlicensed substance, endangerment of a minor.
 
“These prosecutions will be hard if the government is relying on the moral repulsiveness alone to carry the day,” said Jane Bambauer, a law professor at the University of Florida who studies AI and its impact on privacy and law enforcement.
No, they won't be.

The Miller test is "put that shit in front of a jury and ask them if it's obscene" (this is an oversimplification of the law but let's not pretend that behind closed doors this isn't exactly what happens) and if the defendant is creating something that looks remarkably like CP using AI, even if no CP proper was used to generate it, 12 people are going to say "uh, ew, where's the woodchipper?" and return a guilty on obscene images of a child (counts 1 - 43852).

What she means is "prosecutors will have to present a case to a judge that the images should not be covered by 1A, perhaps present a pre-trial appeal of the judge's decision, or god forbid, they make it to a jury," which, yes, is harder than just securing guilty pleas without any significant lawyering on a strict liability crime like in the case of "verified" CP, but let's not pretend this isn't pretty smooth sailing all things considered: the defendant still has images that look like CP.

The instant a judge tells the defense attorney the images his client generated can be put in front of a jury for Millering that's gonna change from "not guilty" to "let's cut a deal" really fucking fast.

I get that prosecutors don't like obscenity because there's a chance of it being a losing argument, but you just have to not be retarded with what you're charging...
 
If it is so photorealistic as to be impossible to tell from the real stuff, then just treat it as real.

Consider it in the same way as a gun being pointed at you. If it's a toy water gun, obviously fake with large plastic bubbles and designs, you would never consider it a threat. But if it is a realistic prop that is indistinguishable from a real gun apart from the fact it has no internals, then you would feel threatened because you can't tell.

Treating it as real saves people the trouble of pedos insisting they don't have to go to prison because "akshually they are all AI generated" and makes it so they will be scared of even generating it in the first place.

Any other proposed solution seems to me like a trojan horse to try and control AI.
 
That's just fearmongering for normies. "AI" can't even make anime girls that don't look uncanny/don't all look the same, but suddenly I am supposed to believe "AI" (mind you, most likely privately hosted outdated versions) can generate photorealistic images of children being raped. With nothing but some glowie's word to back it up.
 