Man in Florida arrested for creating child porn through AI prompts

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and growing ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while he was working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the TV station captured the arrest, producing dramatic footage of law enforcement leading the uniformed McCorkle out of the theater in handcuffs.

The investigation kicked off after the Indian River County Sheriff's Office received tips that McCorkle was prompting an AI image generator to make child sexual imagery and distributing it via the social media app Kik.

McCorkle's arrest was part of a county operation taking down people possessing child pornography, but the generative AI wrinkle in this particular arrest shows how technology is generating new avenues for crime and child abuse.
 
Legally it's going to be hard to get a conviction, and/or it will be appealed, because SCOTUS will have little choice if they issue a ruling impartially, logically, and based on prior decisions.


This is a criminal case so it absolutely does matter; who was the victim here? As far as we know there was no single human being, child or otherwise, harmed by his action in creating this imagery.

Morally and ethically this is utterly reprehensible, of course. It is vile and disgusting and should be illegal, but that would require an act of Congress, and we know how useless they are. If Congress passed a law which specified the creation and distribution of AI generated child pornography models based on human beings was against the law and carried the same penalties as child porn using IRL children then I think that would satisfy everyone (except free speech absolutists).
 
If Congress passed a law which specified the creation and distribution of AI generated child pornography models based on human beings was against the law and carried the same penalties as child porn using IRL children then I think that would satisfy everyone (except free speech absolutists).
How would this apply if the generated content displays people who look young, but there's no way to tell whether they're supposed to be 22, 20, 17 or younger?

So a model that is trained on adults, but generates ambiguous looking individuals in terms of age.
 
This is a criminal case so it absolutely does matter; who was the victim here? As far as we know there was no single human being, child or otherwise, harmed by his action in creating this imagery.

Morally and ethically this is utterly reprehensible, of course. It is vile and disgusting and should be illegal, but that would require an act of Congress, and we know how useless they are. If Congress passed a law which specified the creation and distribution of AI generated child pornography models based on human beings was against the law and carried the same penalties as child porn using IRL children then I think that would satisfy everyone (except free speech absolutists).
Go read the laws on CP: they don't ever require a victim, and no real child needs to be involved.
Legally it's going to be hard to get a conviction
No, it's not.
 
Legally it's going to be hard to get a conviction and/or it will be appealed becasue SCOTUS will have little choice if they issue a ruling impartially, logically and based on prior decisions.


This is a criminal case so it absolutely does matter; who was the victim here? As far as we know there was no single human being, child or otherwise, harmed by his action in creating this imagery.

Morally and ethically this is utterly reprehensible, of course. It is vile and disgusting and should be illegal, but that would require an act of Congress, and we know how useless they are. If Congress passed a law which specified the creation and distribution of AI generated child pornography models based on human beings was against the law and carried the same penalties as child porn using IRL children then I think that would satisfy everyone (except free speech absolutists).
Federal law criminalizes realistic, lifelike pornography of children, regardless of how it was made. They will not have a difficult time prosecuting McCorkle.
 
How would this apply if the generated content displays people who look young, but there's no way to tell whether they're supposed to be 22, 20, 17 or younger?

So a model that is trained on adults, but generates ambiguous looking individuals in terms of age.
They would run it through the Miller test and see whether it passes or fails. Unless the images are found to have been made using a real underage kid as their basis (like with nudify AI, which uses AI to create naked versions of the people depicted in a photo); in that case, it's going to get categorized as child pornography 10 times out of 10.
 
Alert Number: I-032924-PSA

March 29, 2024
Child Sexual Abuse Material Created by Generative AI and Similar Online Tools is Illegal
The FBI is warning the public that child sexual abuse material (CSAM) created with content manipulation technologies, to include generative artificial intelligence (AI), is illegal. Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of any CSAM, including realistic computer-generated images.

Background
Individuals have been known to use content manipulation technologies and services to create sexually explicit photos and videos that appear true-to-life. One such technology is generative AI, which can create content — including text, images, audio, or video — with prompts by a user. Generative AI models create responses using sophisticated machine learning algorithms and statistical models that are trained often on open-source information, such as text and images from the internet. Generative AI models learn patterns and relationships from massive amounts of data, which enables them to generate new content that may be similar, but not identical, to the underlying training data. Recent advances in generative AI have led to expansive research and development as well as widespread accessibility, and now even the least technical users can generate realistic artwork, images, and videos — including CSAM — from text prompts.

Examples
Recent cases involving individuals having altered images into CSAM include a child psychiatrist and a convicted sex offender:

In November 2023, a child psychiatrist in Charlotte, North Carolina, was sentenced to 40 years in prison, followed by 30 years of supervised release, for sexual exploitation of a minor and using AI to create CSAM images of minors. Regarding the use of AI, the evidence showed the psychiatrist used a web-based AI application to alter images of actual, clothed minors into CSAM.
In November 2023, a federal jury convicted a Pittsburgh, Pennsylvania registered sex offender of possessing modified CSAM of child celebrities. The Pittsburgh man possessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts.
There are also incidents of teenagers using AI technology to create CSAM by altering ordinary clothed pictures of their classmates to make them appear nude.

Recommendations
For more information on altered images, see the FBI June 2023 PSA titled "Malicious Actors Manipulating Photos and Videos to Create Explicit Content and Sextortion Schemes" at https://www.ic3.gov/Media/Y2023/PSA230605.
If you are aware of CSAM production, including AI generated material, please report it to the following:
National Center for Missing and Exploited Children [1-800-THE-LOST or www.cybertipline.org]
FBI Internet Crime Complaint Center [www.ic3.gov]
Source
 
They would run it through the Miller test and see whether it passes or fails. Unless the images are found to have been made using a real underage kid as their basis (like with nudify AI, which uses AI to create naked versions of the people depicted in a photo); in that case, it's going to get categorized as child pornography 10 times out of 10.
I'm not familiar with that test or how it's run.

So it says that it's the subjective criteria of the community (amongst other things) that will determine it. Who are they, then? How are they selected? Is it wise to leave that decision to them? What the hell are the standards of the community if they differ from one place to another?
 
No surprise at all. Once again, technology moves faster than legislators. Going to see much, much more of this in the future, expect exponential increases.

The crew drew lots for this one, I stayed out. Ana, please warm up the helicopter, you're flying left seat. Ken, please fly right seat. Milton and Bob, please assist Mike, the crew chief, after the rest of us have completed the special processing.
 

@Suel Forrester I know that you're really eager to defend your CP collection but combine your posts you fucking spastic
I'm not familiar with that test or how it's run.

So it says that it's the subjective criteria of the community (amongst other things) that will determine it. Who are they, then? How are they selected? Is it wise to leave that decision to them? What the hell are the standards of the community if they differ from one place to another?
Where does your neck stop and your head begin? Just because you maybe can't say for certain does not invalidate the fact that you have a head and a neck that we as humans can intuitively identify and consider separately from one another. How do you factually know that 18 is an appropriate age to vote? We don't; it's an arbitrary line we drew because we recognize the need for a line, even if you might disagree with the precise delineation.
 
Federal law criminalizes realistic, lifelike pornography of children, regardless of how it was made. They will not have a difficult time prosecuting McCorkle.
That's never been tested at the SCOTUS level, but this will be, because of the ramifications this has for various other applications of AI: for example, its use in hiring and in media, not to mention companies stating they intend to replace workers with AI.
This will end up in front of SCOTUS when someone claims their AI-generated child porn is art, much like the photographers who were charged decades ago over images of naked kids at the beach.

It's new tech, but of course no legislation that will pass muster with SCOTUS has been enacted yet because Congress is useless.
 
I'm not familiar with that test or how it's run.

So it says that it's the subjective criteria of the community (amongst other things) that will determine it. Who are they, then? How are they selected? Is it wise to leave that decision to them? What the hell are the standards of the community if they differ from one place to another?
I suppose they would just pick a group of average joes from the local area, show them the pictures in court or something like that, and see whether they find them obscene or not. The standards of the community are the standards of the local community, and by their nature they will differ from place to place. But in cases such as these, they will most likely go with New York v. Ferber, which says that CP is always obscene. That could be appealed using Ashcroft v. Free Speech Coalition, which says that if the people depicted only appear to be minors, but actually aren't, the material can't automatically be considered obscene (which means we are back to square one and the Miller test).
 
This will end up in front of SCOTUS when someone claims their AI-generated child porn is art
Cert denied.
much like photographers were caught up in getting charged for images of naked kids at the beach decades ago.
There's a test for that. https://en.wikipedia.org/wiki/Dost_test
It's new tech, but of course no legislation that will pass muster with SCOTUS has been enacted yet because Congress is useless.
You don't need a new law or ruling for every advancement in tech; that's why the First Amendment works with computer tech.
 
I didn't say anything about lolicons or pedos.

The dude is a pedo, but... I find it odd that Kiwi Farms people would be advocating imprisonment for something victimless, like words. If this guy can be arrested, so can you. The pixel police won't make distinctions.

I also dislike the inconsistency of pedophile hate
Bro, folks here wanna give people the rope for much less
 