Stable Diffusion, NovelAI, Machine Learning Art - AI art generation discussion and image dump

I've been messing with this since late September. This tech is moving fast, so I thought I'd contribute some things that might help others.

First off, I personally do not want to mess with any of the online services for several reasons. If you have a good GPU, you can use these projects below to set up a local instance and generate images locally.

This one is the easiest to install, has some good features and ease of use functionality to help you.

This one is only slightly more involved to install but still easy. Has some more advanced features, too.

On my PC with two GTX Titan XP cards, I can generate an image, upscale it and apply face correction in about 20 seconds.

You can find other models to try out here: https://rentry.org/sdmodels

Be warned, this can be very addictive. It's got a lot of randomness to it, so it's like pulling the lever on a slot machine hoping to hit a jackpot over and over. Can easily pass hours messing with it.
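If you're wondering why it feels like a slot machine: each generation starts from random noise determined by a seed, so the same prompt, settings, and seed reproduce the exact same image, while a new seed is a completely fresh pull of the lever. A toy illustration in plain Python (not real Stable Diffusion code; `fake_generate` is just a stand-in for the sampler):

```python
import random

def fake_generate(prompt: str, seed: int, n: int = 4) -> list[float]:
    # Stand-in for a diffusion sampler: the (prompt, seed) pair fully
    # determines the output, just like SD's latent noise seed does.
    rng = random.Random(f"{prompt}|{seed}")
    return [round(rng.random(), 3) for _ in range(n)]

same_a = fake_generate("garden gnome", seed=42)
same_b = fake_generate("garden gnome", seed=42)
fresh = fake_generate("garden gnome", seed=43)
# same_a == same_b, while fresh is a brand new pull of the lever
```

This is why people share seeds along with prompts: with identical settings, someone else can reproduce your image exactly.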
 
The Tonight Show Starring Tupac Shakur

77609420_The_Tonight_Show_starring_Tupac_Shakur.png
 
This is a GUI version of Stable Diffusion for the computer illiterate. If you understand how to unzip a compressed file, you can use this. It's less comprehensive than the web GUI version, but if you don't want to download a bunch of shit from 8 different sites and just wanna jump into generating images, it's perfectly fine.
 
This thing hasn't been around for 2 months and people already found a way to use it to make child porn, so my opinion of it is low. Letting people download the AI with none of the content restrictions was a terrible mistake.
Yes, because completely castrating the AI to the point where it cannot do anything meaningful just so it doesn't get misused would be a better choice. Or better yet, castrating it and then never releasing it to the public due to fear of it being misused. Either accept that it will get misused or scrap the project and don't fucking bother developing an AI you're just gonna fuck up anyways.
 
This thing hasn't been around for 2 months and people already found a way to use it to make child porn, so my opinion of it is low. Letting people download the AI with none of the content restrictions was a terrible mistake.
Yes, because completely castrating the AI to the point where it cannot do anything meaningful just so it doesn't get misused would be a better choice. Or better yet, castrating it and then never releasing it to the public due to fear of it being misused. Either accept that it will get misused or scrap the project and don't fucking bother developing an AI you're just gonna fuck up anyways.

If we purposefully cripple technological progress because of a certain group of people who should be lined up against a wall, we'll just be stuck in the stone age. Just look what happened to AI Dungeon.
 
Yes, because completely castrating the AI to the point where it cannot do anything meaningful just so it doesn't get misused would be a better choice. Or better yet, castrating it and then never releasing it to the public due to fear of it being misused. Either accept that it will get misused or scrap the project and don't fucking bother developing an AI you're just gonna fuck up anyways.
Motherfucker, I don't mean destroy it beyond repair; hell, I don't even mean ban regular porn, but certainly there should have been a hardcoded way to make it recognize kids and not make naked pictures of them.

You can't tell me there's not a way, because it's been done before on other things.
 
Motherfucker, I don't mean destroy it beyond repair; hell, I don't even mean ban regular porn, but certainly there should have been a hardcoded way to make it recognize kids and not make naked pictures of them.

You can't tell me there's not a way, because it's been done before on other things.
It's open source, though, so anyone could just remove that feature if they wanted to.
 
Motherfucker, I don't mean destroy it beyond repair; hell, I don't even mean ban regular porn, but certainly there should have been a hardcoded way to make it recognize kids and not make naked pictures of them.

You can't tell me there's not a way, because it's been done before on other things.
That's specifically the sort of thing you couldn't do. At best you could reduce the weighting of child-related words or omit images of children from training entirely. If this tech were closed source and under lock and key, you could control that. But it's not, and someone can just go and train a model on nothing but images of children and petite women; that has likely already been done by pedophiles or the government. It's a Pandora's box that cannot be closed unless we EMP the whole planet.
 
We are only here to share generated images... Mass Debates is probably the better home for ethical discussions about fake naked children.

By request of @ConspicuousArdiunoDue:
Sorry for the wait. These need more adjustment, but it's a start. CFG varies between 6.5 and 10.0.


gnome1.jpg gnome2.jpg gnome3.jpg

gnome4.jpg gnome5.jpg gnome6.jpg

gnome7.jpg gnome8.jpg gnome9.jpg

gnome10.jpg gnome11.jpg gnome12.jpg
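For anyone wondering about the CFG numbers: that's the classifier-free guidance scale. At each denoising step the model makes two noise predictions, one with the prompt and one without, and the final prediction is pushed away from the unconditional one in the direction of the prompted one by that scale. A toy sketch of just the combine step in plain Python (not the actual SD code):

```python
def cfg_combine(uncond, cond, scale):
    # Classifier-free guidance: start from the unconditional prediction
    # and exaggerate the direction the prompt pulls it in.
    return [u + scale * (c - u) for u, c in zip(uncond, cond)]

# scale 1.0 returns the conditioned prediction unchanged; the 6.5-10.0
# range above trades stricter prompt adherence against image variety.
guided = cfg_combine([0.0, 0.5], [1.0, 0.5], 7.5)
```

Low CFG gives the model more freedom (looser, sometimes dreamier results); very high CFG can oversaturate and fry the image, which is why mid-range values like these are common.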
 
Motherfucker, I don't mean destroy it beyond repair; hell, I don't even mean ban regular porn, but certainly there should have been a hardcoded way to make it recognize kids and not make naked pictures of them.

You can't tell me there's not a way, because it's been done before on other things.
People are training their own models. The code is open source. You can't stop it. It's not restricted to being a SaaS; it can be run completely locally, and it is by many people with GPUs as low-end as 4 GB of VRAM. The plug can't be pulled on it at this point. There will be models made specifically to generate that type of content, and there already are if you follow any of this at all. There are models trained on actual porn, models trained on furry porn, models trained on anime porn, and more niche models trained on subsets of these categories. The way all of this works inherently opens the door to generating CSAM. The feature that did serve as an NSFW filter in the original source code is easily removed; it wasn't coded in a way that attaches it at the hip to the model.

To answer your question, the only way to stop this would require hitting a retroactive delete button on the technology's existence, or destroying every hard drive that has the source code and lining up everyone who ever used it against a wall.
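On the 4 GB VRAM figure: a rough back-of-envelope (the parameter count is approximate, not from this thread) for why the weights alone fit on low-end cards in half precision:

```python
# SD v1's UNet is roughly 860M parameters; fp16 stores 2 bytes each.
# Activations, the VAE, and the text encoder add more on top, which is
# why ~4 GB cards sit near the practical floor.
unet_params = 860_000_000  # approximate
bytes_per_param_fp16 = 2
weights_gib = unet_params * bytes_per_param_fp16 / 2**30
print(f"UNet weights alone: ~{weights_gib:.1f} GiB")
```

That leaves only a couple of gigabytes of headroom on a 4 GB card, which is why low-VRAM forks lean on tricks like attention slicing and offloading.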
 
This thing hasn't been around for 2 months and people already found a way to use it to make child porn, so my opinion of it is low. Letting people download the AI with none of the content restrictions was a terrible mistake.
That's the fedbait one. Glowies are keen on shutting this thing down by any means necessary.
 