Man in Florida arrested for creating child porn through AI prompts

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.

Phillip Michael McCorkle was arrested last week while he was working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led the uniform-wearing McCorkle out of the theater in handcuffs.

The investigation kicked off after the Indian River County Sheriff's Office received tips that McCorkle was prompting an AI image generator to make child sexual imagery and distributing it via the social media app Kik.

McCorkle's arrest was part of a county operation taking down people possessing child pornography, but the generative AI wrinkle in this particular arrest shows how technology is generating new avenues for crime and child abuse.
 
I really, really don't think you're correct on this one. Suel was not only defending AI-generated CP, but he was also refusing to acknowledge the aforementioned illegal aspects of it while being a tremendous douchebag about it.
I’m a retard and don’t know the law, but this makes sense. FWIW, loli porn should also be considered illegal.
 
It's possible that these AI models are combining clothed children and nude adults.
AI can generalize really well. To the point where even if you trained a model with no images of human children at all, but you included baby animals like puppies and kittens, it might learn that "baby" means "smaller version of the real thing, with a big forehead and big eyes and cute," and still be able to intuit what human children should look like.

The idea of making AI which is incapable of generating CSAM is probably a pipe dream, similar to trying to make a version of Photoshop where it's impossible to draw CSAM, or a version of Notepad where it's impossible to type CSAM.
 
I don't know the exact law off the top of my head, and I could be misremembering, but I believe that it is indeed illegal in the US to make pornography - drawn, 3D generated, or otherwise - of actual, existing minors. For example, Shadman got in serious trouble for drawing porn of Keemstar's daughter.
It varies by jurisdiction. Some places don't prosecute it, and in some places that have tried, courts have been swayed by "it's just pixels and it's not real" arguments (since the primary legal theory behind the prosecution of CSAM is that it cannot be created, distributed or possessed without the presence of an actual victim and commission of another crime, and ergo a drawing or computer rendering, which never had any original source "counterpart," cannot have the same legal theory applied to it). I might be remembering incorrectly but I don't think this question has ever reached the Supreme Court either.

It comes down to the age-old question -- can you imprison someone for creating an offensive work from scratch if it covers a specific "protected" subject? If yes, which subjects? Who gets to decide that? What specific requirements must be met to run afoul of the law? Which law? Who sets those standards?

Prosecuting CSAM under the legal principle that it cannot exist without a victim has made it much easier to prosecute, but the tradeoff is that stuff like drawings are in a legal grey area specifically because of that principle.

I'm not being pedantic to be contrarian or support/condone that kind of shit; I raise these questions to highlight how fucking hard it is to nail these perverts to the wall with a bulletproof legal process. It's not nearly as easy as it sounds, which sucks, because it's obviously something that deserves far more attention than it's getting.

ETA: To help further illustrate the point, consider anti-bestiality laws. We've finally reached a point where (I think) all states have specific laws that punish sexual abuse of animals as its own kind of crime, versus the more generic "animal abuse" statutes on the books in most places. Why was it so hard to get every state on board? How do those laws stand up to scrutiny and challenge? Remember, we've similarly abolished anti-sodomy laws, so "sexual deviance" in itself isn't criminalized anymore, so why do bestiality laws stick even though animal abuse laws already exist?

And just to be a smartass, since dolphins are known to be a little bit ... "rapey" ... if a woman gets a free struggle cuddle from a dolphin while she's out for a swim, has she broken bestiality laws? Did they think of that possibility when they wrote those laws? Or does it get the CSAM treatment -- the event cannot have happened without activity that violates the law, so the law has been violated?
 
I really, really don't think you're correct on this one. Suel was not only defending AI-generated CP, but he was also refusing to acknowledge the aforementioned illegal aspects of it while being a tremendous douchebag about it.
I won't deny he was being a huge retard about it, but he wasn't defending AI-generated CP; he was stating it was a pointless endeavour to go after those AI-generating CP when there are worse people distributing and creating real CP out there, still doing it. He was making good points on why we are so focused on those who look at and distribute it without ever making headway on the source, this AI-generated story being an example. What did the KF tards in this thread do? Call him a pedophile.

Watching, distributing, creating, etc. child pornography should be illegal and punishable by death.

so then ban AI, don't put your kids pictures on the internet, etc.

we create an environment that offers up children as prey on a consistent basis. Arresting guys that imagine fucking kids will do absolutely nothing to fix the problem

What are your solutions to actually stop the pedo epidemic?

what kids? "the kids want it" is an insane thing to post, and im honestly pretty disgusted.

im not a pedo sympathizer.
 
I’m a retard and don’t know the law, but this makes sense. FWIW, loli porn should also be considered illegal.
It is, but only sort of. This content technically falls under obscenity laws, but those laws aren't particularly strong and aren't worth the time of law enforcement or DAs unless the case in question is egregious.
It varies by jurisdiction. Some places don't prosecute it, and in some places that have tried, courts have been swayed by "it's just pixels and it's not real" arguments (since the primary legal theory behind the prosecution of CSAM is that it cannot be created, distributed or possessed without the presence of an actual victim and commission of another crime, and ergo a drawing or computer rendering, which never had any original source "counterpart," cannot have the same legal theory applied to it). I might be remembering incorrectly but I don't think this question has ever reached the Supreme Court either.

It comes down to the age-old question -- can you imprison someone for creating an offensive work from scratch if it covers a specific "protected" subject? If yes, which subjects? Who gets to decide that? What specific requirements must be met to run afoul of the law? Which law? Who sets those standards?

Prosecuting CSAM under the legal principle that it cannot exist without a victim has made it much easier to prosecute, but the tradeoff is that stuff like drawings are in a legal grey area specifically because of that principle.

I'm not being pedantic to be contrarian or support/condone that kind of shit; I raise these questions to highlight how fucking hard it is to nail these perverts to the wall with a bulletproof legal process. It's not nearly as easy as it sounds, which sucks, because it's obviously something that deserves far more attention than it's getting.
Thank you very much for your input. My understanding is that the Child Pornography Prevention Act of 1996 had some laws banning stuff like lolicon. Then SCOTUS struck those down in Ashcroft as overbroad because its wording would prohibit productions of Romeo & Juliet (among other "works of art") and kicked the lolicon can back to Congress. Then the PROTECT Act came in 2003 in direct response to Ashcroft, but the laws are not very stout and are hard to successfully prosecute.
 
Can someone please explain to me how this shit is (rightfully) illegal but drawn pedo shit that Japan makes isn't? Is it some nonsense first amendment argument or has no one gotten the balls to actually arrest these people?

Personally I think if you consume anything involving minors (drawn, real or otherwise) or what a reasonable person would consider a minor (looking at you, "but she's 10,000 years old") for the purpose of sexual exploitation/sexual use (including shit like Cuties) you should be worked to death/shot. I'm just wondering why the line was drawn where it was, or why we don't have a line at all for certain things.


Btw: Kill all pedos, their supporters and those who shield them.

Also I'm not for loosening the laws, I want them to be strengthened. Anime drawn shit and that crap on Netflix and what trannies make should be illegal
The Supreme Court made an exception for images deemed artistically, culturally, or historically significant. So potentially even the genuine article may be considered lawful according to the Bench's interpretation. That is why.
 
Keyword: a blend of everything in the dataset. An AI is capable of picking apart subjects and mixing concepts. That's where almost all its utility comes from. If an AI could only generate things it has seen before then it would be pretty fucking pointless. "Real content" could mean anything.

Use of generative AI to satiate pedophiles is a disgusting use case, but the point being made here seems to be whether or not this legally counts as distribution of illicit material based on the fact it was generated and not photographed. If it is not trained off of child pornography then the law does not have an explicit rule for this.

Law enforcement understandably agrees that it should count as photography since it is convincing enough. But the law itself is not written with AI generated images in mind and it's something that will need to be challenged at the judicial level before we actually know.

TL;DR: Suel's point was never to justify AI pornography of children; the point was that an arrest like this is legally questionable because the law treats artwork and photography differently and we don't know how this all works yet.
As was said again and again across the thread: if this is indistinguishable from a real child, it fails Miller; if it fails Miller, it is obscene; if it is obscene, it is not protected by the 1A and can be prosecuted. No matter the origin of the picture, the degree of resemblance it has to a real child is what matters. I could (with enough skill and a degenerate enough mind) draw a photorealistic image of a child in a sexual situation just using my imagination; if someone discovers that image, the feds are knocking down my door and arresting me for possession of child pornography, simple as that.
 
Fuck it, I feel like putting my reaction score to the negatives. You all are retarded niggers and @Suel Forrester didn't deserve his ban. This makes me disappointed in KF and disgusted by the tactics used here. It is no different than libtards on Twitter. For shame.

The man was a retard who didn't fucking know how anything worked but made good points anyhow, and instead of arguing his points at face value, you retarded faggots decided to call him a pedophile despite him claiming the contrary many times. I fully expected to read this thread and see the most retarded takes ever from this user but it was everyone else with them instead.

Should AI CP be legal? Fuck no, it is disgusting as shit, and I know this. I'm no pedophile, lolicon enjoyer, whatever fucking terms they slap on to cover what it really is, but the guy talked about things that I think should be discussed and you all slandered him with such. So let this be known here: Give me a double barrel and I'll happily blow the brains on any pedophile ever. Now do you think we should really be focusing our efforts on getting faggots who AI generate CP, or should we go after the sources of actual CP?
I think AI anything should be illegal depending on how realistic the thing you're making is. By that I mean, if it's realistic enough to make someone think it's real, then it should be treated as real. If I said "I have a bomb planted outside of your house" and AI'd an image of your house and a bomb in a photorealistic manner, me saying "Oh, it was just AI!" shouldn't bail me out of trouble.

Same if I AI'd dead bodies to intimidate you, or if I had realistic depictions of minors in sexual scenarios. If they look exactly like the real thing, enough to where people would assume it's the real deal, then it just being AI shouldn't get me off the hook. I will admit the argument of "If no kid was harmed, why should I be treated like a kid was harmed?" is valid, but I also believe AI creates pictures by taking images, separating those images, and then rearranging them into a new image. So I could argue children were harmed.

And that's not even to mention possible AI models that are actually trained on child pornography. Of course, if you're using that, then you're basically handling CP and all arguments go out of the window, you know? Now, am I passively saying "if the lolicon isn't realistic, you shouldn't be arrested for it"? Maybe. Maybe people shouldn't be arrested for clear fiction, even if I do think you might be a pedo for liking loli, but I really think we should focus on how realistic the depictions of children are here. And can we nuke Kik from orbit already? Seems like it's nothing but pedos.
 
Fuck it, I feel like putting my reaction score to the negatives. You all are retarded niggers and @Suel Forrester didn't deserve his ban. This makes me disappointed in KF and disgusted by the tactics used here. It is no different than libtards on Twitter. For shame.

The man was a retard who didn't fucking know how anything worked but made good points anyhow, and instead of arguing his points at face value, you retarded faggots decided to call him a pedophile despite him claiming the contrary many times. I fully expected to read this thread and see the most retarded takes ever from this user but it was everyone else with them instead.

Should AI CP be legal? Fuck no, it is disgusting as shit, and I know this. I'm no pedophile, lolicon enjoyer, whatever fucking terms they slap on to cover what it really is, but the guy talked about things that I think should be discussed and you all slandered him with such. So let this be known here: Give me a double barrel and I'll happily blow the brains on any pedophile ever. Now do you think we should really be focusing our efforts on getting faggots who AI generate CP, or should we go after the sources of actual CP?
New copypasta landed.
 
I won't deny he was being a huge retard about it, but he wasn't defending AI-generated CP; he was stating it was a pointless endeavour to go after those AI-generating CP when there are worse people distributing and creating real CP out there, still doing it.
That’s not how it came across. Perhaps he’s autistic and has Down syndrome. But the main reason he shouldn’t be allowed within a mile of a school was his dismissal of the fact that image generation relies on existing images and videos. It’s not just pixels. He’s either a retard or a coping pedo.

Either way he fell for classic journalist bait. The chances of this Florida guy, or anyone, who generates these types of images not having seen, produced, or distributed other actual CP is near 0. My bet is the person arrested has other illegal images as well. But a pedophile being arrested for CP won’t generate the same amount of clicks as a poor valid redditorino being thrown in jail “for just some pixels.” So of course they’ll focus on image generation for the click bait.
 
This bit from the article really tells me everything I need to know:
"The content that we’ve seen, we believe is actually being generated using open source software, which has been downloaded and run locally on people’s computers and then modified," Internet Watch Foundation chief technology officer Dan Sexton told The Guardian last year. "And that is a much harder problem to fix."
Oh yes, we should outlaw open source software, to protect the children of course. Only responsible AI corporations can be trusted with this kind of technology. It's so fucking transparent.
 
Bro. An ai image is a cartoon. It's computer generated. It's not real. It's pixels creating a picture of something that doesn't exist

Jesus christ no wonder they have you faggots wearing masks
Except that AI-generated images are supposed to be something meant to resemble real life. And as has been said, AI companies and mechanisms train on and with real things.

This could easily be like Shadman.
An AI image, by its very nature, needs pre-existing data to be created. Now, what kind of data would an AI model need to create nude images of children?
Ain’t that the hard question Columbo would ask to nail somebody to the wall.
 
I think AI anything should be illegal depending on how realistic the thing you're making is. By that I mean, if it's realistic enough to make someone think it's real, then it should be treated as real. If I said "I have a bomb planted outside of your house" and AI'd an image of your house and a bomb in a photorealistic manner, me saying "Oh, it was just AI!" shouldn't bail me out of trouble.
I'm not sure how this could be enforced, because of how subjective it is. Like with deepfakes, or AI-enhanced images or whatever, some of them seem obviously fake to me, but then you see retards on Facebook seemingly taken in by them. Personally, I kind of wish that we could put the AI genie back in the bottle, but it's probably impossible now. What I fear, as I alluded to upthread, is that publicly available AI image generation will either be banned or severely limited/controlled, while tptb have access to the sota models that can be used to generate ever more effective propaganda. Anyway, I guess that we can only wait and see where this all goes.
 
I'm not sure how this could be enforced, because of how subjective it is. Like with deepfakes, or AI-enhanced images or whatever, some of them seem obviously fake to me, but then you see retards on Facebook seemingly taken in by them. Personally, I kind of wish that we could put the AI genie back in the bottle, but it's probably impossible now. What I fear, as I alluded to upthread, is that publicly available AI image generation will either be banned or severely limited/controlled, while tptb have access to the sota models that can be used to generate ever more effective propaganda. Anyway, I guess that we can only wait and see where this all goes.
Firstly, I think there's clear evidence that most people just aren't familiar with AI. I think the average person going "But it looked like a guy planting a bomb outside of my house; how was I supposed to know it was fake?" would be a compelling argument for most. The only exception is if a person is clearly knowledgeable about AI and then used that argument. But some grandmom? Some average dude that works a 9 to 5?

I don't think we need to put that AI genie back in the bottle; for AI CP, it could be reported like regular CP. If a person's making threats using AI, we can report that. Some people are smart enough to make finding them difficult, but there are CP and other illegal rings that are caught and shut down every day, so I assume it would work out just like that. Catch an idiot who made it easy, make him flip, catch more guys. I still think companies will use this event as a tool to get regulations, but that's more of them using trickery IMO, using events to make whatever regulations they want happen.
 