Man in Florida arrested for creating child porn through AI prompts

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger of generative AI's growing use for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led the uniform-wearing McCorkle out of the theater in handcuffs.

The investigation kicked off after the Indian River County Sheriff's Office received tips that McCorkle was prompting an AI image generator to make child sexual imagery and distributing it via the social media app Kik.

McCorkle's arrest was part of a county operation targeting people possessing child pornography, but the generative AI wrinkle in this particular arrest shows how the technology is opening new avenues for crime and child abuse.
 
Right, but it's short-sighted. Would they be so hyped about a guy getting arrested for an AI story of a guy killing blacks?
I can pick and choose.
Same way I can stand for the 2A without wanting everyone to have access to nuclear devices.
Your sophistry will not avail you, nonce
 
Guaranfuckingtee this guy has an FA account. Keep an eye on him
 
Really, the part that gets me is that he is SO ANIMATED about this topic that he couldn't spit those posts out fast enough, double- and triple-posting multiple times.
What's that about?
I looked at his other posts; he definitely has a very remarkable interest in little kids.
Absolute degenerate who thought he would find camaraderie and friendship here, of all places. :_(
 
My $0.02, this entire thing is AI generated, and the perp doesn't even exist. Why? To incite precisely this kind of debate and convince people that more government regulation of emerging technologies is a good thing "to protect children." And as always, "think of the children" is the wedge used to further erode civil liberties due to it being a highly emotional (and effective) tactic.

As to the technology itself, I see two possibilities:
  1. Nonces will use realistic AI cheeze pizza to entice/groom real children (as has been pointed out here).
  2. It could provide a "safe" outlet to keep nonces from offending, and tag them as being in need of close scrutiny and monitoring to prevent them from ever having unsupervised contact with a child. This is predicated on the assumption (as others have suggested) that AI has advanced to the point where it works by merging two separate, accepted datasets rather than just photoshopping real cheeze pizza, and can't be tied to any real individual's likeness.
Possible solutions:
  • Anyone found using this to generate AI cheeze pizza is irrevocably labeled a nonce, and treated as such.
  • Un-muddy the law and say once and for all that any sexualized depictions of children in any form constitute cheeze pizza.
I'm not a legal expert, but it seems there's quite a bit of wiggle room in the way things stand with regard to "possession" (in the legal sense) of illicit material. The argument is that possession drives demand, which drives abuse. But if the algorithm is coming up with something that's an amalgam of two disparate sources (rather than editing existing illegal material), then no abuse is actually happening, so what the person possesses is merely "obscene material", and standards for obscenity WRT fictional material vary depending on the community standards in place. The second solution above would be a way to solve that without requiring extra legislation of new technologies, or using emotional hot-buttons as a hammer to stifle more mundane things down the line simply because those in charge at the time don't like it.
 
Unfortunately, I think they're probably going to run into trouble in court. CP is illegal precisely because real children were harmed in making it, and this is random bits of miscellaneous data that an AI cobbled together to create an image.

They might have issues answering questions like "who's the victim?" or "who was harmed?"
Unlikely, as this guy also distributed the images, so the obscenity charges, at the very least, will likely stick. And if it fails Miller, this guy is gonna get his ass handed to him in prison.

The images are most likely indistinguishable from those of a real child, which would make them fail Miller, which would make them child pornography, which would land this chomo in prison as a distributor of child pornography.
 
You are 100% correct about the distribution aspect of this case.

It is highly likely that his AI-generated content will fail to pass the Miller test. However, never underestimate the power of a plea bargain.

According to the article, he is only being hit with obscenity charges right now, but the sheriff is trying to elevate them. Looks like he's only getting slapped under local statutes, though with a good lawyer it could get appealed up the food chain to test the federal statutes already on the books.
 
If this guy can be arrested, so can you. The pixel police won't make distinctions
He got arrested for making AI child porn. That's something I can never be arrested for, since I am not a pedo and would never do that.
I also dislike the inconsistency of pedophile hate
MAP rights, right? Neck now, don't wait.
 
I think it should be illegal. I also think this:
I fear that it being made illegal is the thin end of the wedge.
is a valid concern. I also think our government is too retarded to find an appropriate limiting principle, but I'll propose one anyway (shouting into the void where nobody will ever hear me).

AI CP should be illegal because real CP is evidence of a crime - a particularly heinous crime. If you took that photograph, you were committing a crime. With AI (especially as it gets better and better), you can now conceal your crime by creating ten million similar photos and storing them on the same HD.

By contrast, other images (even disgusting ones) are not necessarily depictions of a crime, and can’t be used to camouflage real crimes.

That's the distinction I'd like to see made … but I'm quite certain it won't be.
 
The manufacture of CSAM is evil, and the use of pictures of real children is a grave offense that must be punished. This man is a predator who either has already attacked a child or will someday. And what makes you say these AI images are cartoons? Why make that assumption? Cartoons aren't mentioned anywhere.

It's not a discussion. This guy has been arrested and, if the pre-existing law is followed, will be convicted and put on the sex offender registry. No ifs or buts.


Whataboutism. They should both be put to death.

How would this play out if the generated content depicts people who look young, but there's no way to tell whether they're supposed to be 22, 20, 17, or younger?

So, a model that is trained on adults but generates individuals of ambiguous age.

Federal law criminalizes realistic, lifelike pornography of children, regardless of how it was made. They will not have a difficult time prosecuting McCorkle.

These are all interesting views, and I actually agree. All pedos, CP apologists, and the like, to the wood-chipper. But the guy you're arguing with does bring up some good points, although he's going about them in the wrong way, on the wrong subject. AI legislation, at least in the US, is WAY BEHIND the pace the technology is capable of, and we know the technology is rapidly advancing every day. So I can kind of see some of the point he's making. If AI output is, by current law, unable to be copyrighted, especially when it comes to AI-generated songs or photos, can it be criminalized?

Yeah, they got Generic Pedophilic Slobby White Guy #3,457 and arrested him on these charges, but there is something to be said about the almost insidious nature of AI in general. Forget the CSAM angle: what if it was instead a music producer who put out an entire AI-generated album trained on, say, Taylor Swift, released for sale under a completely different AI-generated artist profile and make-up, and prompted to only generate songs in the style of Madonna with a hint of polka-jazz fusion? Would "Tay-Tay" be able to sue for damages? Or, since the entire thing would be the creation of a computer, and as of now can't be copyrighted, would the entire AI system be subject to an investigation to discover its algorithms and learning models? Or could it be argued that the theoretical music producer is in fact creating their own "art", and therefore CAN legally copyright their computer-generated abomination?

Back to the subject at hand: I did a literal two-minute search for an online image generator, used the prompt "two kids playing at the beach", and created this abomination...

Does that now make me a CP producer? Are the cyberpolice going to backtrace my IP? Will the consequences never be the same? There is something to be said about the legal limits of AI legislation now and in the future. And there are indeed some "thought crime" aspects that have the potential to be truly horrifying.

But I will say, if the pre-existing obscenity and CP laws hold up in the AI-generated world, then I fully expect the authorities to go after other degenerates like "Sophie" Labelle, who was caught tracing photos of actual kids for his "cub diaperfur" drawings. Likewise, the US authorities and INTERPOL should go after every faggot anime producer that has sexual depictions of little girls who "are really 500-year-old vampires".
 
Policing cartoons is a good use of resources; fuck eating and gas for the car.
Pictures of children, which people voluntarily post on the internet by the millions.

Fwiw, that's not really how AI works, but even still, it's a depiction of something that isn't real either way.

If I asked AI to create an image of a unicorn with a machine-gun horn... that isn't trained on any such real thing either.

It's a simulation of something that doesn't exist or events that never happened.

Guys, you're falling for this shit, it's embarrassing lol

I would normally agree, but in this sort of case I think there is pretty valid logic to treating it as real, unlike, say, Chinese cartoons or Blender models.

AI-generated stuff is coming very close to looking real. With enough time and effort put into carefully wording and building your prompts, you can create seamless pictures indistinguishable from the real stuff.

Doing so with children to create pretend CSAM presents a problem: because it looks so real, it makes people think an actual child was hurt. It diverts investigative resources and supplies the demand of pedos who look for it. And once caught, the pedophile will claim "it's not real" and try to get an expensive and lengthy examination to prove or disprove its authenticity.

Treating it as real CP not only discourages them from making anything photorealistic, it also ensures that no pedo will ever try to get out of a sure conviction by claiming that "akshually, those 16 TB of child porn are all AI-generated and you can't arrest me for that".
 
Agreed. As disgusting as photorealistic paedo shit is, I fear that it being made illegal is the thin end of the wedge.
Not raping children is not the thin end of the wedge. It's actually, like, the one thing people care about more than anything else, including murder. You used the UK spelling of paedo; your laws are so unbelievably fucked that the last thing you need to be worrying about is whether legislation against AI CSAM would encroach on free speech.

Someone can go to a school's website, pick a child who's been posted, and have AI porn of them in seconds. Children have already committed suicide over this. Little kids unironically cyber-raped. Allowing this has been proven ad nauseam to create the demand which continues the torture of real IRL kids. These people are fucking pathological; they don't even compare to general rapists. Shut the fuck up.
 