Man in Florida arrested for creating child porn through AI prompts

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.

Phillip Michael McCorkle was arrested last week while he was working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the TV station captured the arrest, which made for dramatic video footage as law enforcement led the uniform-wearing McCorkle out of the theater in handcuffs.

The investigation kicked off after the Indian River County Sheriff's Office received tips that McCorkle was prompting an AI image generator to make child sexual imagery and distributing it via the social media app Kik.

McCorkle's arrest was part of a county operation taking down people possessing child pornography, but the generative AI wrinkle in this particular arrest shows how technology is generating new avenues for crime and child abuse.
 
I knew that this would happen the second that AI got good enough at creating human models. Woodchipper this pedo at once.

P.S. Also, this isn't the whole article, add what's left, OP
The manufacture of CSAM is evil, and the use of pictures of real children is a grave offense that must be punished. This man is a predator who either has already attacked a child or will someday. And what makes you say these AI images are cartoons? Why make that assumption? Cartoons aren't mentioned anywhere.
 
Bro. An AI image is a cartoon. It's computer generated. It's not real. It's pixels creating a picture of something that doesn't exist.

Jesus christ no wonder they have you faggots wearing masks
Here comes the lolicon "I'm totally not a pedo!" defense force these sorts of things always drag out.
 
Bro. An AI image is a cartoon. It's computer generated. It's not real. It's pixels creating a picture of something that doesn't exist.

Jesus christ no wonder they have you faggots wearing masks
An AI image, by its very nature, needs pre-existing data to be created. Now, what kind of data would an AI model need to create nude images of children?
 
Here comes the lolicon "I'm totally not a pedo!" defense force these sorts of things always drag out.
I didn't say anything about lolicons or pedos.

The dude is a pedo, but... I find it odd that Kiwi Farms people would be advocating imprisonment for something victimless, like words. If this guy can be arrested, so can you. The pixel police won't make distinctions.

I also dislike the inconsistency of pedophile hate
 
I'll post the whole article:

Man Arrested for Creating Child Porn Using AI
Sharon Adarlo
Thu, August 22, 2024 at 6:30 PM GMT-5


A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.

Phillip Michael McCorkle was arrested last week while he was working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the TV station captured the arrest, which made for dramatic video footage as law enforcement led the uniform-wearing McCorkle out of the theater in handcuffs.

The investigation kicked off after the Indian River County Sheriff's Office received tips that McCorkle was prompting an AI image generator to make child sexual imagery and distributing it via the social media app Kik.

McCorkle's arrest was part of a county operation taking down people possessing child pornography, but the generative AI wrinkle in this particular arrest shows how technology is generating new avenues for crime and child abuse.

What Safeguards
The increasing tide of AI-generated child sexual abuse imagery has prompted federal, state and local lawmakers to push legislation to make this type of porn illegal, but it's not clear how effectively it can be stopped.

Last year, the National Center for Missing & Exploited Children received 4,700 reports of AI-generated child porn, with some criminals even using generative AI to make deepfakes of real children to extort them.

A 2023 study from Stanford University also revealed that hundreds of child sex abuse images were found in widely-used generative AI image data sets.

Put it all together, and you've got a seemingly uncontrollable problem.

"The content that we’ve seen, we believe is actually being generated using open source software, which has been downloaded and run locally on people’s computers and then modified," Internet Watch Foundation chief technology officer Dan Sexton told The Guardian last year. "And that is a much harder problem to fix."
 
An AI image, by its very nature, needs pre-existing data to be created. Now, what kind of data would an AI model need to create nude images of children?
Pictures of children, which people voluntarily post on the internet by the millions.

Fwiw, that's not really how AI works, but even still, it's a depiction of something that isn't real either way.

If I asked AI to create an image of a unicorn with a machine gun horn... the model isn't trained on any such real thing either.

It's a simulation of something that doesn't exist or events that never happened.

Guys, you're falling for this shit, it's embarrassing lol
 