Law: Man in Florida arrested for creating child porn through AI prompts

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and growing ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led the uniform-wearing McCorkle from the theater in handcuffs.

The investigation kicked off after the Indian River County Sheriff's Office received tips that McCorkle was prompting an AI image generator to make child sexual imagery and distributing it via the social media app Kik.

McCorkle's arrest was part of a county operation taking down people possessing child pornography, but the generative AI wrinkle in this particular arrest shows how technology is generating new avenues for crime and child abuse.
 
Bro. An ai image is a cartoon. It's computer generated. It's not real. It's pixels creating a picture of something that doesn't exist

Jesus christ no wonder they have you faggots wearing masks
Jannies, this inmate has lost his pink triangle somewhere between the train station and the gas chamber.
 
I didn't say anything about lolicons or pedos.

The dude is a pedo, but... I find it odd that Kiwi Farms people would be advocating imprisonment for something victimless, like words. If this guy can be arrested, so can you. The pixel police won't make distinctions.
Agreed. As disgusting as photorealistic paedo shit is, I fear that it being made illegal is the thin end of the wedge.
 
An AI image, by its very nature, needs pre-existing data to be created. Now, what kind of data would an AI model need to create nude images of children?
Is it possession if it's already in the model though? They say he was arrested for creating the images, but they also mentioned this:
McCorkle's arrest was part of a county operation taking down people possessing child pornography, but the generative AI wrinkle in this particular arrest shows how technology is generating new avenues for crime and child abuse.
So it's not clear if he got busted for having child porn on his computer or just for making it via AI. AFAIK 'artistically generated' child porn is in a grey zone in the USA.
 
I wanted to see a mugshot. Fingers crossed for a soy beard balding redditor.
 
If I asked AI to create an image of a unicorn with a machine gun horn, it isn't trained on any such real thing either.
The images of a unicorn in said scenario are real. They depict something that doesn't exist, but the images themselves do exist. The model didn't create a unicorn ex nihilo.

Pictures of children, which people voluntarily post on the internet by the millions.
Yes, and thus, it is using pictures of children and putting the children depicted in it in sexual situations, ergo, it is child pornography. This has always been illegal, if you create an image that's indistinguishable from a real child in a sexual context, even if it is a drawing or an edited photo, you are creating child pornography, no ifs and buts.

Agreed. As disgusting as photorealistic paedo shit is, I fear that it being made illegal is the thin end of the wedge.
Those images, due to their photorealistic nature, fail the Miller test by default. It is already illegal.
 
Is it possession if it's already in the model though? They say he was arrested for creating the images, but they also mentioned this:

So it's not clear if he got busted for having child porn on his computer or just for making it via AI. AFAIK 'artistically generated' child porn is in a grey zone in the USA.
They also mention "deepfakes of real children" being involved in other cases. If I had to guess, the assumption in this case is that the training data was based on actual CSAM, which is where the crime originates. Makes sense to me.
 
The images of a unicorn in said scenario are real. They depict something that doesn't exist, but the images themselves do exist. The model didn't create a unicorn ex nihilo.
But they don't exist.
Yes, and thus, it is using pictures of children and putting the children depicted in it in sexual situations, ergo, it is child pornography. This has always been illegal, if you create an image that's indistinguishable from a real child in a sexual context, even if it is a drawing or an edited photo, you are creating child pornography, no ifs and buts

Pornography is a photograph or video of a sexual act. You can make the argument for drawings too, I suppose.

But computer generated images are none of this

It's not putting any child in any situation at all, there is no child

I just hope you keep this train of thought when you go to prison for antisemitism?

Every time
 
But computer generated images are none of this
If the computer-generated images are based on a model trained on sexual abuse imagery, then it's not much different from these freaks tracing over that material; in either case there's possession. There are too many degrees of separation given how vaguely the article explains it, but it's understandable if what I'm assuming happened actually happened.
 
An AI image, by its very nature, needs pre-existing data to be created. Now, what kind of data would an AI model need to create nude images of children?
I hate to have to explain this for the 20th time, but that's not how current-gen AI works. AI has gotten smart enough that you can train it on pictures of fully clothed, normal children so it learns what the concept of "child" looks like. Then you can train it on pictures of naked adults so it learns what the concept of "naked human" looks like. Then you can generate naked children by combining the two concepts.

That's why it's so difficult to legislate this problem away. Unless you create a model that doesn't know what children are, or doesn't know what naked humans are, you can't stop someone from making images. And even if you could, with open-source tools you could train a model at home very easily on a mid-range computer. And because there's no guaranteed way to force neural nets to behave a certain way, it would be almost impossible to enforce this technologically.

This has always been illegal, if you create an image that's indistinguishable from a real child in a sexual context, even if it is a drawing or an edited photo, you are creating child pornography, no ifs and buts
Ultimately it comes down to this: making child porn through any means or method is illegal. Whether it was created through intense artistic talent or by taking real pictures, if it looks anything like real-life child porn, there should be no reason for prosecutors to have to wade through red tape to prove it's 100% real.
 
But they don't exist.
That's something which has no bearing on how the law is applied.

Pornography is a photograph or video of a sexual act. You can make the argument for drawings too, I suppose.

But computer generated images are none of this

It's not putting any child in any situation at all, there is no child

I just hope you keep this train of thought when you go to prison for antisemitism?
Our guy here thought the same thing, look how it turned out for him
 
Bro. An ai image is a cartoon. It's computer generated. It's not real. It's pixels creating a picture of something that doesn't exist

Jesus christ no wonder they have you faggots wearing masks
Pornography is a photograph or video of a sexual act. You can make the argument for drawings too, I suppose.

But computer generated images are none of this

It's not putting any child in any situation at all, there is no child

I just hope you keep this train of thought when you go to prison for antisemitism?
2/10 bait. Using the same logic that lolicons use gives your ruse away. You also have very weird opinions in general. Do you use AI to generate images of children?

I believe it is wrong to look at images of naked children and I am glad this case is well underway. Hopefully something can be done to combat this in the future.
 
The same people who will arrest you for thinking about pedo shit are the same who will and want to arrest you for thinking about "racist" shit
Basically this. They always use bad stuff happening to children to clamp down on shit that threatens TPTB, e.g. school shootings being used to justify gun grabbing. Maybe I'm just jaded, but if they really cared about children they'd deal with the rampant child trafficking that open borders facilitate, not import millions of rape happy shitskins, not cover up troons raping girls etc. instead of arresting this guy for what is essentially a victimless crime.
 
If the computer-generated images are based on a model trained on sexual abuse imagery, then it's not much different from these freaks tracing over that material; in either case there's possession. There are too many degrees of separation given how vaguely the article explains it, but it's understandable if what I'm assuming happened actually happened.
Assuming something is gay, especially when the government is open about their attempts to prosecute thoughts

2/10 bait. Using the same logic that lolicons use gives your ruse away. You also have very weird opinions in general. Do you use AI to generate images of children?
I have no "weird opinions". This unfounded accusation is 2/10 trolling and is very gay and makes it nigh impossible to trust your posts, judgment, etc

Ai is satanic shit, I don't use it for anything. It's a deception and a lie. I don't participate in such things
I believe it is wrong to look at images of naked children and I am glad this case is well underway. Hopefully something can be done to combat this in the future.
I don't care about what is right or wrong, I care about if there is a victim or not.

That's something which has no bearing on how the law is applied.
That's a separate discussion
Our guy here thought the same thing, look how it turned out for him
Arrested for a crime with no victim

Basically this. They always use bad stuff happening to children to clamp down on shit that threatens TPTB, e.g. school shootings being used to justify gun grabbing. Maybe I'm just jaded, but if they really cared about children they'd deal with the rampant child trafficking that open borders facilitate, not import millions of rape happy shitskins, not cover up troons raping girls etc. instead of arresting this guy for what is essentially a victimless crime.
The president fucked his granddaughter and we're going after a guy who had pixels on his computer

Creating AI CP normalizes CP. The content can be used to groom real kids. (See? Other kids are doing this!)

Imagine if AI CP was legal and could be found all over social media for unsupervised kids to see and idealize. That’s a bad bad world. Burn it and keep it illegal.
CP IS legal, just for politicians and such
 