I write short stories about gay guys having sex in the ass and jerk myself off when I'm done. But I'm not gay because they're just words on a page, not actual gay men! They're not real sodomites! I am 100% straight.
"Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material? Anyone who is developing and distributing such a thing should be in jail."

It's not a guarantee, just highly likely with current models. It's not unthinkable that one of these systems could combine the two concepts of a child and adult sex acts into original, realistic CP, but it would probably give pretty unreliable results unless the model was specifically trained on CP.
"I write short stories about gay guys having sex in the ass and jerk myself off when I'm done. But I'm not gay because they're just words on a page, not actual gay men! They're not real sodomites! I am 100% straight."

If you only knew how many men actually try to use that very logic to prove why they aren't gay.
"Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material? Anyone who is developing and distributing such a thing should be in jail."

The uncensored models can absolutely create child porn; edgy fuckers were trying that in the AI art threads on 4chan. But that's just the reality of scraping the entire internet. It's gross, but it's not like someone is individually marking each photo as 'good' or 'bad'.
"Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material?"

To completely prevent its generation, it wouldn't be enough to make sure the image set used for training is clear of the content. You'd need to remove the concepts of nudity and children completely from the natural language side of the diffusion model. You'd probably also need to remove something like two or three degrees of separation, to stop related terms and concepts from being used in place of the blocked ones.
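A minimal sketch of that "degrees of separation" idea, assuming a filter built on text embeddings. The terms, toy vectors, hop count, and 0.85 threshold below are all stand-ins, not any real model's values:

```python
import numpy as np

# Toy embedding table; a real filter would query a learned text encoder.
EMBEDDINGS = {
    "termA": np.array([1.0, 0.1, 0.0]),
    "termB": np.array([0.9, 0.2, 0.1]),
    "termC": np.array([0.8, 0.3, 0.0]),
    "termD": np.array([0.0, 1.0, 0.2]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def expand_blocklist(seeds, hops=2, threshold=0.85):
    """Breadth-first walk: each hop blocks every term similar to the current
    frontier, so synonyms-of-synonyms get caught too."""
    blocked, frontier = set(seeds), set(seeds)
    for _ in range(hops):
        nxt = set()
        for term in frontier:
            for other, vec in EMBEDDINGS.items():
                if other not in blocked and cosine(EMBEDDINGS[term], vec) >= threshold:
                    nxt.add(other)
        blocked |= nxt
        frontier = nxt
    return blocked

print(expand_blocklist({"termA"}))  # termB and termC join within one hop; termD never does
```

Every extra hop widens the net, which is the post's point: stopping at two or three degrees is an arbitrary trade-off between coverage and collateral blocking.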
"The other big problem is how can you prove a model didn't use existing CP as part of its dataset to generate said content?"

In fact, I think certain models were proven to have done this.
"I'm glad someone understands this. Sane people don't get sexual enjoyment from violent media, and if they did, that would be the reddest of red flags and major cause for concern."

I agree. And if they get a sick thrill from a particularly realistically violent game, even if it's not a sexual thrill, that could be a red flag too. It'd have to be pretty bad, though, much worse even than the edgelord mass-shooter game Hatred, which is the "worst" that comes to mind.
"By banning, do you mean outlawing under existing definitions of explicit material, or preventing its generation in the first place? The former is possible, but likewise greatly increases the workload on CP investigators due to volume; the latter may be next to impossible in the long run. Online services already attempt it with secondary checks for certain content and keywords, but locally run models don't."

I really don't know what would be the best way to handle it, but I guess they could just treat it like real CP in the case of distribution, considering it would be a form of obstruction of justice. That might scare people into not spreading it around, at least.
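The "secondary checks" the quoted post mentions can be as crude as a token blocklist applied before the request ever reaches the model. A minimal sketch of such a prompt gate; the terms are placeholders:

```python
import re

# Placeholder entries; a real service curates and expands this list continuously.
BLOCKED_TERMS = {"blockedterm1", "blockedterm2"}

def prompt_allowed(prompt: str) -> bool:
    """Reject a generation request if its prompt contains any blocked token."""
    tokens = set(re.findall(r"[a-z']+", prompt.lower()))
    return tokens.isdisjoint(BLOCKED_TERMS)

# Gate the request server-side, before it reaches the diffusion model.
if not prompt_allowed("user prompt text here"):
    raise ValueError("prompt rejected by content filter")
```

A plain blocklist is trivial to dodge with synonyms, which is what the degrees-of-separation expansion sketched earlier tries to patch; and since the gate lives in the service rather than in the weights, a locally run model never passes through it at all.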
"In fact, I think certain models were proven to have done this.

"Plus, even if one may argue loli isn't obscene, there's no way a reasonable person wouldn't find lifelike AI art to be obscene, regardless of whether it generates anything based on illegal content or not. There's no excuse for that shit whatsoever; any veneer of nuance is gone."

Agreed about the nuance; it might as well be the real stuff. At that point it's just "could this be used by a genuine pedophile so he doesn't hurt kids", distributed in a very controlled manner to pedos who say "I need help". I'm not sympathizing with them, I'm trying to do harm reduction. Pedophilia is a mental disease. "They can't help themselves" is not a fucking excuse, so please don't say that's what I'm saying, but anything we could do beyond hoping to god we catch them before kids get hurt would be great.
"Agreed about the nuance; it might as well be the real stuff. At that point it's just 'could this be used by a genuine pedophile so he doesn't hurt kids', distributed in a very controlled manner to pedos who say 'I need help'. I'm not sympathizing with them, I'm trying to do harm reduction. Pedophilia is a mental disease. 'They can't help themselves' is not a fucking excuse, so please don't say that's what I'm saying, but anything we could do beyond hoping to god we catch them before kids get hurt would be great."

We could just execute them instead. It's already been proven many times over that giving pedos CP does not prevent them from molesting a child and instead desensitizes them further and makes it more likely to actually harm a child, you retard nigger.
And in a perfect world we could magically find every last pedophile, lock them up forever, or hell, shoot them. But since we don't have that...
"We could just execute them instead. It's already been proven many times over that giving pedos CP does not prevent them from molesting a child and instead desensitizes them further and makes it more likely to actually harm a child, you retard nigger."

I haven't seen those studies. If that doesn't work, then absolutely.
"I read the first two pages and I don't feel like reading the other 16 pages. But I just wanted to drop my two cents.

"I'm not a doctor or a psychologist, but I would say the rush you get from violent video games isn't the same as a sexual rush. This feels like the 'traps aren't gay' argument all over again.

"We know sexualizing the lil' ones is absolutely and objectively wrong. Loli is a depiction of CP. Even if it's a 'victimless crime', I would argue that you probably shouldn't feed that arousal in the first place. Most people who see or hear something that even comes close to sexualizing a minor will have a built-in, automatic rejection and disgust response. Sort of like with incest. It's normal and good to have that rejection response. If you don't have it, you should seek help.

"Feeding it via a 'loophole' isn't healthy and possibly opens dangerous doors. Not everyone will go too far, but there are people who get addicted, and when you feed an addiction too much, you need more and heavier doses. Sometimes it goes to the extreme. We here at the farms have seen this countless times, where a 'victimless crime' leads to an actual crime. Especially in this day and age, where being an absolute degenerate coomer is pushed to be 'normalized'.

"Sexualizing kids and their behavior is a slippery slope, and you absolutely can condition your brain into associating things sexually. I've seen some shit on xitter and how disgusting the lolicons get. We have oxygen thieves like the Giggly goon clown as a shining example."

#347 basically sums up why I made this post.
"If the AI is good enough it could create confusion and more workload for investigators, wasting their time, effort, and resources on determining whether something is real or not, unlike with hentai. At the very least such AI art should be banned for practical reasons."

That's by far the strongest argument against a subset of loli.
Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material? Anyone who is developing and distributing such a thing should be in jail.
"And in a perfect world we could magically find every last pedophile, lock them up forever, or hell, shoot them. But since we don't have that..."

We're already on the cusp of this being possible, since ML models can accurately use people's brainwaves to predict what they're thinking about and output matching imagery. Matching in terms of theme and content, not a 1:1 reproduction, since there's a compviz model in between. But it's precise enough to determine whether it's child sexual abuse material.
"We're already on the cusp of this being possible, since ML models can accurately use people's brainwaves to predict what they're thinking about and output matching imagery. Matching in terms of theme and content, not a 1:1 reproduction, since there's a compviz model in between. But it's precise enough to determine whether it's child sexual abuse material."

I doubt it will go that far. Biometrics at airports are still a huge issue. I can't even imagine liberals co-signing brain scans.
Hook someone up to this for 24 hours and you'll be able to pretty accurately determine whether someone is a pedophile or not, just via intrusive thoughts.
But before you cum in your pants and invite our society to advance into the next stage of hell on earth, just take a minute to think about whether you yourself will be able to stand upright in the winds that will blow then.
"Isn't it basically guaranteed that an AI that can generate realistic CP has been trained on such material? Anyone who is developing and distributing such a thing should be in jail."

They don't just go "and this image... and this image... whoops, it was CP"; they train on millions and millions of images, so it's not exactly possible to guarantee no CP gets in there.
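What stands in for someone individually marking each photo is automated screening: hash every candidate image and drop anything matching a maintained list of known-bad hashes. A minimal sketch; the file layout and names are hypothetical, and the hash lists themselves come from external vetting bodies, not the dataset builder:

```python
import hashlib
from pathlib import Path

def load_bad_hashes(list_file: str) -> set[str]:
    """One hex digest per line; the list is supplied by an external vetting body."""
    return {ln.strip() for ln in Path(list_file).read_text().splitlines() if ln.strip()}

def screen_dataset(image_dir: str, bad_hashes: set[str]) -> list[Path]:
    """Keep only images whose SHA-256 digest is absent from the known-bad list."""
    kept = []
    for img in sorted(Path(image_dir).iterdir()):
        if hashlib.sha256(img.read_bytes()).hexdigest() not in bad_hashes:
            kept.append(img)
    return kept
```

Exact digests only catch exact copies; re-encoded or cropped images need perceptual hashing, and genuinely novel material slips through entirely, which is exactly the scale problem the post describes.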
"I'm late to the thread, and I'm sure this has been said in different ways, but you can't really compare violent video game enjoyment, the way most people do, to the sexual gratification of loli.

"Like, if you need the stimulation of seeing someone being mutilated to cum, you are fucked up. That's the only way the 'violent video games are the same as porn' argument can work.

"Does viewing more porn make you hornier and thereby a bigger degenerate? Yes. That's how that part of the brain works. The part of the brain that enjoys violence is a whole other part altogether."

You're missing the point of the argument.
"It's already been proven many times over that giving pedos CP does not prevent them from molesting a child and instead desensitizes them further and makes it more likely to actually harm a child, you retard nigger."

No it doesn't, fuckhead. There is zero proof of this. Prove your claims.
"Lolicon/shotacon art is free expression and needs to be protected as such. It can be considered immoral, but there is NO measurable or proven harm from its existence, since the act of drawing or consuming said art does not harm a child. No, you do not go from looking at lolicon to molesting children; that's not how it works."

I'll be frank and say child porn and drawings of children being fucked should be excluded from freedom of expression, because they are the most immoral things out there.
"@Lightsaber Dildo seems to be a nonce."

I'm not, I'm defending art, even art I don't like. You're too stupid to have this conversation; go back to the man hate thread and be a retard there instead.
"I'll be frank and say child porn and drawings of children being fucked should be excluded from freedom of expression, because they are the most immoral things out there."

I completely understand where you're coming from, and I agree with this shit being highly immoral, and any consumers of it should be rightly questioned and shamed. But they are just drawings, bro. The drawings aren't the problem. That's literally all there is to it.