Law: AI-generated child sex abuse images pose challenges for federal prosecutors - Child sex abuse cases mark some of the first times that prosecutors are trying to apply existing U.S. laws to alleged crimes involving AI.

[Image: putilov_denis/Adobe Stock]

U.S. federal prosecutors are stepping up their pursuit of suspects who use artificial intelligence tools to manipulate or create child sex abuse images, as law enforcement fears the technology could spur a flood of illicit material.

The U.S. Justice Department has brought two criminal cases this year against defendants accused of using generative AI systems, which create text or images in response to user prompts, to produce explicit images of children.

“There’s more to come,” said James Silver, the chief of the Justice Department’s Computer Crime and Intellectual Property Section, predicting further similar cases.

“What we’re concerned about is the normalization of this,” Silver said in an interview. “AI makes it easier to generate these kinds of images, and the more that are out there, the more normalized this becomes. That’s something that we really want to stymie and get in front of.”

The rise of generative AI has sparked concerns at the Justice Department that the rapidly advancing technology will be used to carry out cyberattacks, boost the sophistication of cryptocurrency scammers, and undermine election security.

Child sex abuse cases mark some of the first times that prosecutors are trying to apply existing U.S. laws to alleged crimes involving AI, and even successful convictions could face appeals as courts weigh how the new technology may alter the legal landscape around child exploitation.

Prosecutors and child safety advocates say generative AI systems can allow offenders to morph and sexualize ordinary photos of children and warn that a proliferation of AI-produced material will make it harder for law enforcement to identify and locate real victims of abuse.

The National Center for Missing and Exploited Children, a nonprofit group that collects tips about online child exploitation, receives an average of about 450 reports each month related to generative AI, according to Yiota Souras, the group’s chief legal officer.

That’s a fraction of the average of 3 million monthly reports of overall online child exploitation the group received last year.

Untested ground

Cases involving AI-generated sex abuse imagery are likely to tread new legal ground, particularly when an identifiable child is not depicted.

Silver said in those instances, prosecutors can charge obscenity offenses when child pornography laws do not apply.

Prosecutors indicted Steven Anderegg, a software engineer from Wisconsin, in May on charges including transferring obscene material. Anderegg is accused of using Stable Diffusion, a popular text-to-image AI model, to generate images of young children engaged in sexually explicit conduct and sharing some of those images with a 15-year-old boy, according to court documents.

Anderegg has pleaded not guilty and is seeking to dismiss the charges by arguing that they violate his rights under the U.S. Constitution, court documents show.
He has been released from custody while awaiting trial. His attorney was not available for comment.

Stability AI, the maker of Stable Diffusion, said the case involved a version of the AI model that was released before the company took over the development of Stable Diffusion. The company said it has made investments to prevent “the misuse of AI for the production of harmful content.”

Federal prosecutors also charged a U.S. Army soldier with child pornography offenses in part for allegedly using AI chatbots to morph innocent photos of children he knew to generate violent sexual abuse imagery, court documents show.

The defendant, Seth Herrera, pleaded not guilty and has been ordered held in jail to await trial. Herrera’s lawyer did not respond to a request for comment.

Legal experts said that while sexually explicit depictions of actual children are covered under child pornography laws, the landscape around obscenity and purely AI-generated imagery is less clear.

The U.S. Supreme Court in 2002 struck down as unconstitutional a federal law that criminalized any depiction, including computer-generated imagery, appearing to show minors engaged in sexual activity.

“These prosecutions will be hard if the government is relying on the moral repulsiveness alone to carry the day,” said Jane Bambauer, a law professor at the University of Florida who studies AI and its impact on privacy and law enforcement.

Federal prosecutors have secured convictions in recent years against defendants who possessed sexually explicit images of children that also qualified as obscene under the law.

Advocates are also focusing on preventing AI systems from generating abusive material.

Two nonprofit advocacy groups, Thorn and All Tech Is Human, secured commitments in April from some of the largest players in AI including Alphabet’s Google, Amazon.com, Facebook and Instagram parent Meta Platforms, OpenAI and Stability AI to avoid training their models on child sex abuse imagery and to monitor their platforms to prevent its creation and spread.

“I don’t want to paint this as a future problem, because it’s not. It’s happening now,” said Rebecca Portnoff, Thorn’s director of data science.

“As far as whether it’s a future problem that will get completely out of control, I still have hope that we can act in this window of opportunity to prevent that.”

Article Link

Archive
 
Well, what about other kinds of illegal porn? Like, could some sick degenerate pervert make images of a dog fucking the corpse of a woman with Down syndrome whose limbs have also been sawed off? Would somebody need an expensive computer to do something like that? What app would you need to do that kind of thing? Disgusting freaks could be doing that right now, if it's easy. Is it easy? They should be shot, btw.
 
Even when the fictional porn is indistinguishable from reality? This isn't about jerking it to 2000 year old loli demons
You could say the same for basically all violent movies or horror movies. It looks pretty damn real to me, yet it isn't. And the same argument holds - fictional depictions or whatever do not cause people to do them IRL. Those who do are a small, tiny, statistically irrelevant minority of people who have mental problems. Violent movies or games don't cause school shootings, for instance; to suggest that is disingenuous and an old grift. This fake outrage over fake porn is exactly the same. It's just spiritual soccer moms clutching pearls over nothing.
 
You could say the same for basically all violent movies or horror movies.
Realistic AI-generated child porn has the potential to waste time that could have gone into investigating real child porn to track down pedophiles, while violence in movies and video games can be instantly identified as fake by looking for matching footage.
 
Realistic AI-generated child porn has the potential to waste time that could have gone into investigating real child porn to track down pedophiles, while violence in movies and video games can be instantly identified as fake by looking for matching footage.
I'm pretty sure a machine can easily detect that a video or pic is AI and not real. Even a seemingly perfect AI video would only be indistinguishable to the human eye, not a PC.
 
I'm pretty sure a machine can easily detect that a video or pic is AI and not real. Even a seemingly perfect AI video would only be indistinguishable to the human eye, not a PC.
I don't think that's right. There's a reason AI-generated content often has hidden watermarks beyond what a human can see. Computers are really bad at distinguishing. Humans can find certain "uncanny valley" tells, but computers usually can't.
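To add to that: the hidden-watermark thing isn't the computer "detecting AI", it's just data deliberately hidden in the pixels by the generator. Here's a minimal sketch of the idea, assuming the open-source invisible-watermark package (the same one Stable Diffusion's reference scripts use to tag their outputs); input.png and output.png are placeholder filenames for the example:

```python
# pip install invisible-watermark opencv-python
import cv2
from imwatermark import WatermarkEncoder, WatermarkDecoder

# Embed a short byte payload into the image's frequency domain.
# The change is invisible to the eye but survives in the pixel data.
bgr = cv2.imread('input.png')
encoder = WatermarkEncoder()
encoder.set_watermark('bytes', b'mark')     # 4 bytes = 32 bits of payload
marked = encoder.encode(bgr, 'dwtDct')      # DWT+DCT embedding method
cv2.imwrite('output.png', marked)

# A detector that knows the method and payload length can read it back,
# even though a human looking at the image sees nothing unusual.
decoder = WatermarkDecoder('bytes', 32)     # expect a 32-bit payload
payload = decoder.decode(cv2.imread('output.png'), 'dwtDct')
print(payload.decode('utf-8'))              # -> 'mark'
```

The obvious catch is that this only flags an image as AI-made if the generator bothered to embed a mark in the first place, and cropping or heavy re-encoding can destroy it. Classifiers that try to spot unmarked fakes from pixels alone are far less reliable, which is my point above.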
 
How about we just throw totalitarians into wells and progress as a civilization beyond inanities?
How about we just throw liberals into wells and progress as a civilization beyond degeneracy?

All art should and must be free, uncensored, perennially untouchable.
Why, because you say so? Not only should we ban porn but also blasphemy and agitprop; in fact, those are probably even more important to ban. Society is like a body, and you must keep it healthy. Do you just eat dog shit off the ground? Do you ignore a broken leg?

It's amazing to me that freaks like you can still exist in the 21st century when your whole worldview is some Inquisition/Dark Ages shit that should've been filtered out on a genetic level by evolution itself, or at the very least by all the pressures of war and famine across the ages.
Why is it that everyone who calls me a freak is a libtard who openly defends pedophiles and hates religion? Weird.

I will not mince words here: people like you or Old Stumpy, known as @Toji Suzuhara, appear like Homo erectus to me. It is unfathomable to me that these primitive and backward mindsets can even exist after all the events of the last 80+ years, and especially the last 30 or so, where the Internet allowed for free information sharing. Meaning, I'm forced to conclude that a portion of humanity is either totally unsalvageable on a genetic level or has been utterly oversocialized and/or poisoned by ideology and religion to the extent that all reason has left them.
I also notice that everyone who questions my ability to reason has a superiority complex, you're another hubristic retard huffing your own farts as you defend degeneracy.

My best assumption is that you're all engaging in a form of virtue signalling to each other as a kind of display.
Was every Christian virtue signaling throughout all of history, right down to Jesus' apostles too? Isn't it more likely you're just immoral?

Like monkeys or Aboriginals. Your mindsets are amazingly archaic.
Ironically, it's monkeys who just do whatever they want, like you libtards. You want us to be more animalistic.

This loops back into how kiwifarms somehow manages to attract these relics of the mind which is genuinely funny. This place often feels like some sort of retirement home for defeated ideas and obsolete concepts.
Such as "men can't become women", "sexualization of children should be discouraged", and "black people factually commit a disproportionate amount of violent crime"? Sorry for living in reality and having basic moral decency.

I know they collect gay or sissy porn, they know I know, everyone knows everything, but I'm still expected to pretend they totally do not.
"I'm a degenerate so you must be too, it's inconceivable that a heterosexual man is exclusively interested in adult human females."

The Internet did away with the "morality" delusion, and the best and only thing to do is to thoroughly enjoy ourselves and find ways to do so without harming anyone, and in this we succeeded.
Why would you want to not harm anyone, that's just a morality delusion.

Their "morality" and religions and philosophies are neither solutions (because there is no problem) nor beneficial. They are simple unnecessary at best and obstacles to overcome at worst.
:neckbeard:

I'm not even going to stoop so low as to antagonize the defeated too much, so I'll just post this cute pic. This defeated all your gods, moralities and philosophies. We're finally free.
You posted a clothed anime girl, not a topic related image, which proves you know society is intolerant of your "freedom", so ironically you've both tacitly conceded and proved you're not actually "free". Well, well, well... Not to antagonize the defeated too much, so I'll just post this funny pic of me laughing at you :story:

And for your own good, though I suspect you're of a reprobate mind, I'll tell you that you're really being quite foolish with such blasphemy. Nothing defeats God, some day you're going to realize that, unfortunately it'll probably be too late by then. Cooming should not be your god, there's more meaning to life than you stimulating yourself like a monkey.

All alt-right guys are into black men fucking white women, all anti-gay folks are secretly hardcore faggots, and so on and on.
Lmao. You are so autistic. So everyone against rape and incest actually wants to rape their moms? And does that logic apply to everything, or magically exclusively to sex topics? I mean, am I actually pro-abortion too, since I'm against it?

Your mind doesn't work correctly. I'm not trying to insult you, I am dead serious. Your logic and will are defective; only through introspection and seeking wisdom can you possibly ever even notice this, but I'm afraid your ego is like a millstone around your neck.
 
progress as a civilization beyond degeneracy
It's very ironic how progress before "political correctness" became a thing -- what SJWs call "white supremacy" -- was real progress. Seems all the "progress" under "progressives" has amounted to is BS like ignoring the clear differences between the sexes (except, of course, with "transgender"), "transgender" itself, strained race relations, ever-increasing crime, and of course ever-increasing sexual immorality.
 
Even when the fictional porn is indistinguishable from reality? This isn't about jerking it to 2000 year old loli demons
Chomos don't get off to seeing a 10-year-old girl's breasts.

Chomos get off to the victimization of a 10-year-old girl having a picture taken of her.

It's why chomos are always congregating online, swapping CSAM. The mere act of sharing it victimizes the child, and that's what they get off to. So AI-generated whatever will never be a deterrent to chomos.
 
I'm not legally minded; however, I can't help but be reminded of Al Capone getting nailed on tax evasion rather than his actual rackets. Just federally charge them with obscenity and increase the penalty for it if convicted. IDK if that is how current CSAM cases are handled, but I don't see why we couldn't or shouldn't just do that when it comes to AI-generated CSAM.
 
This is why you can't create any cool shit with DALL-E or Stable Diffusion anymore. I know everyone hates AI and I don't blame them, but a lot of neat/funny stuff was being generated until the degenerate pedofags showed up and abused it to the point that now anything that could possibly be interpreted as a euphemism for something risque has been hard banned.
 
Also, they probably found all of this evidence on the FBI's own computers, because the FBI is now filled with wife beaters and child molesters. It seems like every time an FBI agent entraps someone, three months later that agent gets arrested for being either a ****** or a child molester, so it's not surprising.
 
That's just fear mongering for normies. "AI" can't even make anime girls that don't look uncanny/don't all look the same, but suddenly I am supposed to believe "AI" (mind you, most likely privately hosted outdated versions) can generate photorealistic images of children being raped. With nothing but some glowie's word to back it up.
Child. You are delusional, and wrong. This report (Jul 2023) by Stanford regarding the fediverse specifically mentions realistically styled CG-CSAM on p. 7, and this report (Jul 2024) by the IWF goes into detail about the current state of AI-generated CSAM. See p. 16 specifically, where they feature quotes from a Tor forum where users are discussing AI-generated CSAM. Last year's report (Oct 2023) goes into brief detail on p. 22.

Direct quotes sourced from the IWF's reports:
“They look very real, like you’ve taken photos of them.”
“I doubt anyone would suspect these aren’t actual photographs of an actual girl.”

>inb4 "muh cherrypicking, what about the other quoted posts" / "these would be privately hosted outdated versions"
Much of the most realistic AI CSAM found in investigations for this report used fine-tuned CSAM models. There are CSAM models that are well-known among AI CSAM communities – reputed for enabling realistic generation of certain CSAM scenarios, children, or child characteristics. These models are updated – new releases made – by technical experts in the community. (IWF, 2023, p. 22)
In this snapshot, an even lower proportion of non-criminal images assessed – just 8% – were determined to be not realistic enough to assess as a pseudo-photograph (whether that image depicted a child or an adult). These are those images marked ‘Non-criminal non-photographic’ (NPI) in the table above. This provides some support for claims of increasing realism of AI-generated images over the last six months. (IWF, 2024, p. 23)
(previous reply is just over a month old but at least this is an effortpost)
 