Why do we put so much stock in loli/underaged characters as a direct link to being a pedo?

Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material? Anyone who is developing and distributing such a thing should be in jail.
It's not a guarantee, just highly likely with current models. It's not unthinkable one of these systems could combine the two concepts of a child and adult sex acts into original realistic CP, but it would probably give pretty unreliable results unless the model was specifically trained on CP.
I write short stories about gay guys having sex in the ass and jerk myself off when I'm done. But I'm not gay because they're just words on a page, not actual gay men! They're not real sodomites! I am 100% straight.
If you only knew how many men actually try to use that very logic to prove why they aren't gay.
 
Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material? Anyone who is developing and distributing such a thing should be in jail.
The uncensored models can absolutely create child porn; edgy fuckers were trying that in the AI art threads on 4chan. But that's just the reality of scraping the entire internet. It's gross, but it's not like someone is individually marking each photo as "good" or "bad".

At the end of the day, I made this thread because of the sheer disconnect between what my friend who likes loli showed me and, as I said a few pages back, the survivor stories I had to listen to in college. It just seems crazy to compare the two. The real life stories had an actual victim, someone who was clearly still fucked up about it ten years later. Loli is literally an anime girl who looks like a teenager being fucked. Obviously they're related, but to act like it's the same thing is wild.

And then I look at lolcow threads and, mixed among actual awful things they've done or really gross shit, it'll just say "oh, and he likes loli too." A predictor? Sure, but it seems completely different from all the other stuff. I don't know if I was wrong to take it as "the loli part is just as bad as the fucked shit they did," and if I was, then I apologize for wasting time.
 
Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material?
To completely prevent its generation, it wouldn't be enough to make sure the image set used for training is clear of the content. You'd need to remove the concepts of nudity and children entirely from the neural network handling the diffusion model's natural language processing, and you'd probably need to remove two or three degrees of separation on top of that to prevent similar terms/concepts being used in place of the singular terms.

Even then, it's hard to say it would totally prevent it. I imagine there's been internal testing at corporations on how best to cut it off at the source, but it's still "black box"-levels of processing, and training new systems when you're dealing with hundreds of billions of parameters is costly with no guarantees.

And then, even with all that done, all it takes is someone feeding in images as fine-tuning material through a LoRA and you're effectively back to square one.
 
The other big problem is how can you prove a model didn't use existing CP as part of its dataset to generate said content?
In fact, I think certain models were proven to have done this.

Plus, even if one may argue loli isn't obscene, there's no way a reasonable person wouldn't find lifelike AI art to be obscene, regardless of whether it generates anything based on illegal content or not. There's no excuse for that shit whatsoever, any veneer of nuance is gone.

I'm glad someone understands this. Sane people don't get sexual enjoyment from violent media, and if they did, that would be the reddest of red flags and a major cause for concern.
I agree. And if they get a sick thrill from a particularly realistically violent game, even if it's not a sexual thrill, that could be a red flag too. It'd have to be pretty bad though, like even much worse than the edgelord mass shooter game Hatred, which is the "worst" that comes to mind.


By banning, do you mean outlawing under existing definitions of explicit material or preventing its generation in the first place? The former is possible but likewise increases the workload on CP investigators greatly due to volume, but the latter may be next to impossible in the long run. Online services already attempt to do so by virtue of secondary checks for certain content and keywords, but locally run models don't.
I really don't know what would be the best way to handle it, but I guess they could just treat it like real CP in the case of distribution, considering it would be a form of obstruction of justice. That might scare people into not spreading it around at least.
 
Agreed about the nuance, it might as well be the real stuff. At that point it's just "could this be used by a genuine pedophile so he doesn't hurt kids," distributed in a very controlled manner to pedos who say "I need help." I'm not sympathizing with them, I'm trying to do harm reduction. Pedophilia is a mental disease. "They can't help themselves" is not a fucking excuse, so please don't say that's what I'm saying, but if we could do anything other than hope to God we catch them to stop kids getting hurt, that would be great.

And in a perfect world we could magically find every last pedophile, lock them up forever, or hell, shoot them. But since we don't have that...
 
We could just execute them instead. It's already been proven many times over that giving pedos CP does not prevent them from molesting a child and instead desensitizes them further and makes it more likely to actually harm a child you retard nigger.
 
I haven't seen those studies. If that doesn't work, then absolutely.
 
I read the first two pages and I don't feel like reading the other 16 pages.
But I just wanted to drop my two cents.

I'm not a doctor or psychologist, but I would say the rush you get from violent video games isn't the same as a sexual rush.
This feels like the "traps aren't gay" argument all over again.

We know sexualizing the lil' ones is absolutely and objectively wrong. Loli is a depiction of CP. Even if it's a 'victimless crime', I would argue that you probably shouldn't feed that arousal in the first place. Most people who see or hear something that even comes close to sexualizing a minor will have a built-in automatic rejection and disgust response, sort of like with incest. It's normal and good to have that rejection response. If you don't have it, you should seek help.

Feeding it via a 'loophole' isn't healthy and is possibly opening dangerous doors.
Not everyone will go too far, but there are people who get addicted and when you feed an addiction too much, you need more and heavier doses. Sometimes it goes into the extreme. We here at the farms have seen this countless times; where a victimless crime leads into an actual crime.
Especially in this day and age where being an absolute degenerate coomer is pushed to be 'normalized'.
Sexualizing kids and their behavior is a slippery slope and you absolutely can condition your brain into associating things sexually. I've seen some shit on xitter and how disgusting the lolicons get.

We have oxygen thieves like the Giggly goon clown as a shining example.
 
#347 basically sums up why I made this post.
 
If the AI is good enough it could create confusion and more workload for investigators, wasting their time, effort, and resources on determining if something is real or not, unlike with hentai. At the very least such AI art should be banned for practical reasons.
That's by far the strongest argument against a subset of loli.

But the cat's already out of the bag. You already have uncensored image generators that can run on just a home computer, and the underlying technology is out in public. You'd have to require 1984-style government malware installed on every computer to stop it.

Will these AI generators lead to an explosion of real CP? I don't think so; if anything, AI would destroy the market for CP. Will it increase the workload for LEO? Again, I'm unsure of the mechanics of this. How exactly would that happen? The vast majority of child abuse is caught by tipsters, stings, and old-fashioned police work as far as I'm aware. The image-scanning algorithms that people fear AI will interfere with are mostly good for eliminating the images themselves, not really for preventing abuse. And it's not clear that AI will even interfere with them, i.e. AI detection is converging past the point where there's an incentive to defeat it unless you're some superspy agency, fake images won't prevent a real person from being identified in a real image through your Minority Report facial database, etc.

Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material? Anyone who is developing and distributing such a thing should be in jail.

You can AI-generate all sorts of fantasy shit with no obvious 1:1 stock-image origin, so probably not.
 
And in a perfect world where we could find every last pedophile magically, lock them up forever, or hell, shoot them. But since we don't have that...
We're already on the cusp of this being possible, since ML models can accurately use people's brainwaves to predict what they're thinking about and output matching imagery. Matching in terms of theme and content, not a 1:1 reproduction, since there's a computer-vision model in between. But it's precise enough to determine if it's child sexual abuse material.

Hook someone up to this for 24 hours and you'll be able to pretty accurately determine whether someone is a pedophile or not, just via intrusive thoughts.

But before you cum in your pants and invite our society to advance into the next stage of hell on earth, just take a minute to think about whether you yourself will be able to stand upright in the winds that will blow then.
 
I doubt it will go that far. Biometrics at airports are still a huge issue. I can't even imagine liberals co-signing brainscans.
Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material? Anyone who is developing and distributing such a thing should be in jail.
They don't just go "and this image... and this image... whoops, that was CP"; they train on millions and millions of images, so it's not exactly possible to guarantee no CP gets in there.
Also, a smart enough image-generation model with all the safety rails off could absolutely produce CP if you prompt it, because it will do what you ask. It's smart enough to associate nudity, shapes, etc. Still very gross.
 
In fact, I think certain models were proven to have done this.

Plus, even if one may argue loli isn't obscene, there's no way a reasonable person wouldn't find lifelike AI art to be obscene, regardless of whether it generates anything based on illegal content or not. There's no excuse for that shit whatsoever, any veneer of nuance is gone.

Well, you'd have to wipe every nude image of children from existence, from medical photographs to the creepy art photographers who do underage nudes. It seems like an impossible task. And all for what? You'd have wiped out pedo-adjacent images for about a day before pedos somewhere start churning out more.

All these hypotheticals make it seem like time has frozen and you suddenly are the Emperor of Earth.
 
I'm not a doctor or psychologist. But I would say a rush you get in violent video games isn't the same from a sexual rush.

Approaching this from the angle of "are these two feelings of rush equivalent" is just shorthand for "okay Pandora, we can criminalize thoughts and feelings but NOT FROM MY VIDYA," which in my opinion is pretty shortsighted and naïve.

The only sane foundation from which you can approach criminalizing image generation AI models and their users is from the fact of the sexual exploitation material coming originally from actual victims. Hopefully then a reasonable argument can be made that distributing the model is somehow tantamount or equivalent to possessing and distributing the original material.

Making the legality of, e.g., a video game hinge on whether the player is experiencing feelings of enjoyment in a category that is acceptable to you is an insane basis for whether it should be illegal.
 
I'm late to the thread, and I'm sure this has been said in different ways, but you can't really compare the enjoyment most people get from violent video games to the sexual gratification of loli.

Like, if you need the stimulation of seeing someone being mutilated to cum, you are fucked up. That's the only way the "violent video games are the same as porn" argument can work.

Does viewing more porn make you hornier and thereby a bigger degenerate? Yes. That's how that part of the brain works.

The part of the brain that enjoys violence is a whole 'nother part altogether.
 
You're missing the point of the argument.

People constantly say shit like "if you look at loli you're a pedophile, pedophiles molest children, therefore loli must be destroyed." It's simply not true that all pedophiles molest children, but there's an astounding number of room-temperature-IQ people (even on Kiwifarms) who steadfastly believe this without any evidence to support it. There's no point in getting rid of loli/shota art since it wouldn't protect children, the same as getting rid of violent video games would not decrease the amount of violence in the world.

It's already been proven many times over that giving pedos CP does not prevent them from molesting a child and instead desensitizes them further and makes it more likely to actually harm a child you retard nigger.
No it doesn't, fuckhead. There is zero proof of this. Prove your claims.

Lolicon/shotacon art is free expression and needs to be protected as such. It can be considered immoral, but there is NO measurable or proven harm from its existence, since the act of drawing or consuming said art does not harm a child. No, you do not go from looking at lolicon to molesting children, that's not how it works.
We do not live in a world where we ban or destroy things just because something is icky and harms your personal sensibilities, only Redditors and ResetEra users want that. At the end of the day, the only justification for banning this garbage is your personal disgust. And that's not cutting it.
Kiwifarmers talking about taking a shit on free expression when the values of free expression are what let this forum exist is fucking hilarious. With these freedoms, we have to live with the fact that they will be used for bad things or things we don't agree with. Fucked-up art is included in those things. You can choose not to consume it, but scrambling for some justification to get rid of it is pathetic.
 
Lolicon/shotacon art is free expression and needs to be protected as such. It can be considered immoral, but there is NO measurable or proven harm from its existence, since the act of drawing or consuming said art does not harm a child. No, you do not go from looking at lolicon to molesting children, that's not how it works.
I'll be frank and say child porn and drawings of children being fucked should be excluded from freedom of expression, because they are the most immoral things out there.
 
@Lightsaber Dildo seems to be a nonce.
I'm not, I'm defending art, even art I don't like. You're too stupid to have this conversation, go back to the man hate thread and be a retard there instead.
I'll be frank and say child porn and drawings of children being fucked should be excluded from freedom of expression, because they are the most immoral things out there.
I completely understand where you're coming from and I agree with this shit being highly immoral, and any consumers of it should be rightly questioned and shamed. But they are just drawings, bro. The drawings aren't the problem. That's literally all there is to it.
 