AI Art Seething General

That said, this isn't something I feel is legally defensible in most countries (and possibly not even in the U.S.), considering what the model has to be trained on.
Regardless, "TLDR: he's used real images of kids and edited them, then shared them" is a retarded way to describe what an image generator actually does. But I wouldn't put it past a pedo to train an AI on actual CP.
 
It'll be interesting to see where that goes. I'm calling it now that this will be used as a justification to infringe on privacy again: 'We can't rely on the traditional methods of monitoring because the AI bypasses hashes, so you must let us install spyware on every PC.'
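For context on the "bypasses hashes" point: conventional known-image databases match on exact cryptographic hashes, and a freshly generated file (or any re-encoded copy) produces a completely different digest, so it never matches. A minimal sketch with SHA-256; the byte strings below are arbitrary stand-ins for file contents:

```python
import hashlib

# Two "files" differing by a single byte stand in for an original
# image and a regenerated/re-encoded copy of it.
original = b"...image bytes..."
altered = b"...image byteS..."

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(altered).hexdigest()

# The digests are completely unrelated, so an exact-hash database
# lookup on one will never find the other.
print(h1 == h2)  # False
```

This is why detection systems moved toward perceptual hashes in the first place, and why exact-match lists are useless against generated output.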
 
Regardless, "TLDR: he's used real images of kids and edited them, then shared them" is a retarded way to describe what an image generator actually does. But I wouldn't put it past a pedo to train an AI on actual CP.
Or he could have taken actual images of clothed children and told the AI "remove clothes". There are tons of such AIs.
 
Or he could have taken actual images of clothed children and told the AI "remove clothes". There are tons of such AIs.
That's basically just a tool built into already existing generators, a feature called image to image. I don't think that's what is going on here if this is just a reddit repost of the case discussed on here a while back.
 
considering what the model has to be trained on
Even if it isn't (which is possible), the output is still illegal unless it's so absolutely shit-tier that nobody could mistake it for real, and even then it could still be illegal depending on the exact laws of the country/state/city. If it looks so real that it might as well be real, it's basically the same thing for all intents and purposes. These AI CP freaks don't have a single leg to stand on. I can't wait until there are legal clarifications released around this shit to stop anyone from trying to excuse it, and I hope they do it soon before pedos start making terabytes of legal* AI CP and hiding the real stuff in those collections to try to "prove" they didn't know the real ones were there, or other such idiocy.

*not actually legal but they're retarded and will say it is
 
Or he could have taken actual images of clothed children and told the AI "remove clothes". There are tons of such AIs.
Yeah and if it's trained on anything but naked children it will probably put tits on them and the like.
*not actually legal but they're retarded and will say it is
It might or might not be legal, and might or might not be "CSAM" in the legal sense. I think a lot of it would violate obscenity laws in the U.S. if it had no artistic purpose and was solely intended as porn (and used as such by whoever possessed it). Even if it somehow did pass legal muster, it would probably be good enough for probable cause to see if the person creating it had in fact used illegal material to generate it.
 
It might or might not be legal, and might or might not be "CSAM" in the legal sense.
See the problem isn't so much the whole "well it's not real" thing, but the "there is absolutely no way to prove it isn't real" thing. You could just as easily throw real CP through an AI to generate a near-identical (read: visually identical to the human eye) image that would clock as "AI generated" to pretty much every tool that can detect that sort of thing, so you simply cannot treat any of it like it isn't real or else you're basically legalizing CP so long as you pass it through a filter. Since it's already ridiculously hard to ID the kids in CP a lot of the time for a variety of reasons, it's not like you can really rely on that either. Awful as it is, a lot of those kids are dead well before the images surface, and the bodies don't get found.

Basically, unless the courts are flat-out retarded and bungle this on a level worse than anything I've ever seen, there's zero chance it won't be ruled as legally identical to CP in sane countries.
Yeah and if it's trained on anything but naked children it will probably put tits on them and the like.
You can get around this without using illegal material, hypothetically, but if anyone has a model out that does it it's certainly not public. I know the kind of data you'd need, and it's "legal", but I don't even really want to say it because it frankly disgusts me. Let's just say people are probably crawling facebook for training data at this moment.
 
Honestly, I think the law doesn't matter here. Whether they use nothing, SFW images of children, or preexisting CP, I hope anyone generating it gets shipped over to the Middle East and stoned. If it was anime shit then there is at least some degree of separation, but with realistic AI shit there's just nothing like that.

there's zero chance it won't be ruled as legally identical to CP in sane countries.
Let me just bump the original screenshot that started this discussion, of someone being sent to jail for two decades over this. I don't think it can technically be called CSAM just because there's no A in there, but honestly that's just getting into the same argument as the MAP/paedo/hebephile type shit: it doesn't matter, because no one should ever leave you alone around a child regardless of what the actual name of your crime is.
 
That said, this isn't something I feel is legally defensible in most countries (and possibly not even in the U.S.), considering what the model has to be trained on.
The model does not need to be trained on CP to produce CP. If you train it enough on normal porn, the chances of it being capable of producing CP increase. It's just a small extrapolation, since neural networks are modelled after actual brain activity. It's like how you become capable of riding a motorcycle when you train to ride a bicycle.
 
I have a similar half-baked theory: a lot of the whining is amateurs complaining that they can no longer sell mediocre furry art for $X a pop, with the emphasis more on mediocre than on any particular price point.
They never made “art” in the first place. 99.99999% of the time I take the stance that poor taste and low quality doesn’t preclude art from being valuable, but I’m just not willing to pretend that shit has any artistic merit. Their drawings weren’t even “stolen” in the first place as is often the complaint because they’re awful, disgusting, and satanic. The only freaks training on that shit are the same kind of gooners that draw it. It’s fucking annoying seeing porn addicts imply the average person would ever want to buy their filth but won’t simply because they are using midjourney instead.

These models do way more than just text to image, and unsurprisingly actual artists can get more out of it than anyone else, so I’m pretty sure you’re right.

You can get around this without using illegal material, hypothetically, but if anyone has a model out that does it it's certainly not public.
Sadly don’t even have to do this. Those freaks who made ponyXL and similar trained on a shitload of loli porn. By design these models interpolate the training data (it’s a probabilistic encoding ). There are realism variants of PonyXL that can do all the anime poses with photographic style… You can piece it together. All those “pony realism” models are clearly for this purpose and they are easily the most popular on sites like civitai. Makes me sick.

That's basically just a tool built into already existing generators, a feature called image to image. I don't think that's what is going on here if this is just a reddit repost of the case discussed on here a while back.
More likely inpainting or a simple face swap; img2img re-noises the whole image and usually changes faces.

It's just a small extrapolation, since neural networks are modelled after actual brain activity. It's like how you become capable of riding a motorcycle when you train to ride a bicycle.
No, they really aren't. The perceptron vaguely models a neuron's synapses and activation potential mathematically, but beyond that it doesn't fulfill the function of a real neuron, and your brain doesn't share any structural or mechanistic similarities with a feed-forward neural network. There is no extrapolation whatsoever. With today's tech, an image model trained on trillions of bikes will never generate a motorcycle.
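For reference, the perceptron being described really is just a weighted sum pushed through an activation function, nothing more. A minimal sketch; the inputs, weights, and bias here are made-up toy values, not anything trained:

```python
import math

def perceptron(inputs, weights, bias):
    # Weighted sum of inputs plus a bias term: the loose mathematical
    # analogy to synaptic weights and a neuron's firing threshold...
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # ...squashed through a sigmoid "activation". This is the entire
    # mechanism; no other property of a biological neuron is modelled.
    return 1.0 / (1.0 + math.exp(-z))

# Toy call with two inputs and hand-picked weights.
print(perceptron([1.0, 0.0], [2.0, -1.0], -0.5))
```

Stacking layers of these gives a feed-forward network, but the "it works like a brain" claim stops at this single formula.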
 
It's amazing how hard they're trying to push the "AI child porn" narrative.
[attached screenshot]
Say "AI generated CP should be just as severely punished as drawn fictional CP" to a lolicon and I promise you'd get an insane mind-bending argument as a response. One time I saw someone say that AI CP is worse than the drawn kind because drawn child porn was made with soul.
 
I think regardless, introducing the concept of "soul" as a reason why one pornographic depiction of children is better than the other is kinda eyebrow-raising.
No that's entirely logical. Paedophiles often don't just get off to children but also specifically the destruction of innocence, there's quite a few who want to fuck a child because they want to ruin that child. If they're wanking over something that doesn't contain children then there's no innocence being stripped so they're less turned on.
 
It's gonna be weird that in 5, 10, 20 years' time people and new kids will look back on the AI-generated images of today and have some sort of fond "nostalgia" for them, considering them an aesthetic, like how people consider early CGI "cool", "interesting" and "charming" despite how ugly it looks now.
Same kind of thing as the 'dreamlike' early-late 90s CGI renders. It's surrealism.
 
Pedos in my experience are incredibly low-IQ people who love letting everyone know they're pedos if they feel safe and anonymous enough, so I remember some of these fuckers spamming AI CP to places like 4chan (which is neither safe nor anonymous, hence low-IQ people). It was basically creepy-looking naked dwarves. Considering a lot of the new models have even filtered out perfectly legal adult nudity and struggle with generating nude adults (because they literally don't know what a nude adult human looks like), I don't think any of them are capable of generating realistic naked children out-of-the-box. So if there's any realistic AI child porn, it could've only been made by tuning a model on actual images of child porn.

This is actually not that simple and needs a significant amount of effort and (real) material. It's a kinda "where there's smoke there's fire" situation: if somebody runs a local model he has tuned on generating CP, he probably also owns real CP, and these tunes could've never been made without real CP. So no matter if it's generated AI CP, children were harmed at one point to make it, and by creating or consuming AI CP you are complicit in harming these children. They are not hypothetical, at all.
 
This is actually not that simple and needs a significant amount of effort and (real) material. It's a kinda "where there's smoke there's fire" situation: if somebody runs a local model he has tuned on generating CP, he probably also owns real CP, and these tunes could've never been made without real CP. So no matter if it's generated AI CP, children were harmed at one point to make it, and by creating or consuming AI CP you are complicit in harming these children. They are not hypothetical, at all.
I actually think that even in the instance (possible in the U.S.) where it isn't found to be CSAM itself or is "protected speech" somehow, a sufficiently advanced model would at least have some evidentiary value toward establishing probable cause to look for the real stuff.
 
This guy creates impressive AI movie mini-series and (not so) recently launched a separate channel to publish the AI-generated OST for these series.


Here’s his reply regarding his process for creating music using AI.

[attached screenshot of his reply]


As you can see, it takes a lot of effort to make a genuinely good product using AI.
 
While I can get behind using AI to assist with things like improving written work, correcting post-processing mistakes, rendering artwork in 3D, etc., AI can't replicate the human touch and should only be used to enhance pieces of work, not completely replace humans. I absolutely detest AI-generated "art" and "videos" made without any effort on the person's part. They're ugly, talentless, sloppy, and lazy.

I will die on this hill and I don't care if I get negrated to oblivion for this opinion.
 