Stable Diffusion, NovelAI, Machine Learning Art - AI art generation discussion and image dump

For the slavs of the world: what you get in a premium model should be free/local within 6-24 months, so it's worth looking at what these models are capable of.
Z-Image Turbo is already showing that promise on local, though it's yet to be seen what the full undistilled version of it will be capable of. If they incorporate NoobXL's dataset in full like people are assuming they will, and the full model will still be runnable on something like a 3080 or a 3090, then we'll have a very, very capable Apache 2.0 licensed model on our hands that might just be the one to finally dethrone all the SDXL finetunes as the king of local models.
 
On VRAM, I'm hoping that we see:
  • "Cheap" 24 GB LPDDR5X RDNA5 GPUs in 2027
    • A leak suggests that graphics chiplets can be shared between APUs and dGPUs
  • 36-48 GB VRAM for consoomer GPUs within 1-2 gens
    • 36 GB = 12x 3GB GDDR7 (192-bit or 384-bit)
    • 48 GB = 16x 3GB GDDR7 (256-bit or 512-bit)
  • 4 GB GDDR7+ modules within 2-3 gens
  • More use of LPDDR5X and LPDDR6, trading bandwidth for higher capacity at lower cost
(Fantasy land until memory crises are over.)
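The capacity/bus-width pairs in the list above fall straight out of GDDR7's 32-bit-per-device interface; a quick sketch (the clamshell option just pairs two devices per 32-bit channel, doubling capacity on the same bus):

```python
def vram_config(modules: int, gb_per_module: int = 3,
                bits_per_device: int = 32, clamshell: bool = False):
    """Total capacity (GB) and bus width (bits) for a GDDR7 memory layout.

    Each GDDR7 device exposes a 32-bit interface; in clamshell mode two
    devices share one 32-bit channel, so the bus is half as wide for the
    same device count.
    """
    capacity_gb = modules * gb_per_module
    bus_bits = modules * bits_per_device // (2 if clamshell else 1)
    return capacity_gb, bus_bits

assert vram_config(12) == (36, 384)                   # 36 GB, 384-bit
assert vram_config(12, clamshell=True) == (36, 192)   # 36 GB, 192-bit clamshell
assert vram_config(16) == (48, 512)                   # 48 GB, 512-bit
assert vram_config(16, clamshell=True) == (48, 256)   # 48 GB, 256-bit clamshell
```

Which is exactly why 36 GB shows up as "192-bit or 384-bit" and 48 GB as "256-bit or 512-bit": same module count, clamshell or not.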
 
I bought a 9070 XT for gaming about a week ago and it's great for that, but now I'm second-guessing myself: if I ever want to get into image generation, maybe I should return it and get a 5070 Ti. Am I in for a world of hurt if I want to use Windows 10 and a 9070 XT? This stuff looks so impenetrable to get into that I can't even tell what the difference is, other than "Nvidia better" (which I knew going in, it's even true for gaming).
 
Nvidia for AI is better. But in the past couple of years AMD has gone from needing lots of complex fiddling around to being fairly straightforward, and ROCm on Windows has come on enormously as well, boosting performance a lot. The 5070 Ti and 9070 XT both have 16 GB of VRAM, so the model sizes they can handle are pretty much the same; the AMD card will just be a little slower. What's the price difference, £100? If I knew I was really looking to do image generation, the Nvidia would be the better pick. But if I just wanted to mess around with it as a little hobby sometimes, I don't think it would have bothered me that much. Only you can convert financial cost into personal value based on what it means to you.

But I'm also of the opinion that unless someone has a really high-end card like a 5090 anyway, casual hobbyists are better off just renting some compute on a service like runpod.io. After all, what's better: spending several hundred extra on a 5090 and occasionally doing some AI with it, or spending $20 on Runpod, getting a couple of days of serious compute, and seeing whether you're actually into this or whether it's a fad you quickly get bored with or only touch once every couple of months?
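The buy-vs-rent argument is just a break-even calculation. A rough sketch — the $800 premium and $0.70/hr rental rate are placeholder guesses, not quotes from any vendor:

```python
def breakeven_hours(gpu_premium_usd: float, rental_usd_per_hr: float) -> float:
    """Hours of rented compute you could buy for the extra cost of the bigger card."""
    return gpu_premium_usd / rental_usd_per_hr

# e.g. ~$800 extra for a 5090-class card vs ~$0.70/hr for a rented GPU
hours = breakeven_hours(800, 0.70)
print(f"{hours:.0f} rental hours before the card pays for itself")
```

A casual hobbyist doing a few hours a month never gets anywhere near that many hours before the next GPU generation lands anyway.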
 
Which is gonna be when the bubble pops and takes the entire world economy with it?
Memory prices could start to recover late 2026, early 2027 before any bubble popping, and before next-gen GPUs (and consoles) are expected to launch.

We already know memory can be in the $1-3/GB range. It can go back there, and eventually even below that. LPDDR5X should be cheaper per gigabyte than GDDR7.

Any resulting economic crisis would be temporary, like always. The real question is how the memory industry will react to sudden oversupply and lack of demand. They don't want memory to become cheaper due to anything other than technological improvements. They are riding the gravy train but want to get off before it derails. If they can time it just right and cut production before the pop happens, they'll do it.
 
Memory prices could start to recover late 2026, early 2027 before any bubble popping, and before next-gen GPUs (and consoles) are expected to launch.
I been hearing that since forever with GPUs and the prices just keep going up

Corporate greed is real. A lot of shit is going up in price just because corpos are feeling around to see how much they can jack up prices before regular buyers stop buying, and so far it's working: they get more profit out of the same shit, and people keep buying overpriced stuff, from cars to computers.

As for excess production, these companies are going to pull some lightbulb-cartel-style bullshit (à la the Phoebus cartel) and just agree not to release any new product until stocks are gone. That way they get rid of the excess without earning less, let alone selling at cost or at a loss.
 
I been hearing that since forever with GPUs and the prices just keep going up
The difference is that with GPUs, it's Nvidia that has no competition and has been setting the prices for years, while with the memory chip shortage, everyone but Micron, Samsung and SK Hynix is getting shafted. Even Nvidia's plans for SUPER refreshes got fucked up, and RAM stick manufacturers are straight up telling people that their prices right now are whack, that they don't want them to be like this, but that it's the reality of the market. This shit reaches far wider and fucks over just about everybody in the space: smartphone manufacturers, OEMs, every-fucking-one. And they know damn well this isn't good for business, since there's a limit to how much people will be willing to pay.
 
The difference is that with GPU's, it's Nvidia that has no competition and has been setting the prices for years
Nvidia has no competition in AI, but AMD could have swiped the consumer market in the middle of all this if they wanted. Instead, AMD would rather take the short-term profit and stay second banana, charging only slightly less than Nvidia's overpriced cards.
they don't want them to be like this but it's the reality of the market.
Yeah sure they dont
And they know damn well that this isn't good for business since there will be a limit for how much people'll be willing to pay.
Like I said, they're greedy fucks and will keep raising prices for as long as people keep buying. The moment they stop, prices will come back down below the breaking point, but they're never going back to what they were before the bubble, not unless China does to PC hardware what they did to EVs, and they don't have the tech yet; those Cambricon or whatever GPUs aren't good enough.
 
Yeah sure they dont
$100 memory kits cost $1000 now. If that were sustainable, they would've bumped the prices long ago, but they know this shit cannot last forever, because people will inevitably stop buying at those prices and total income will go down. Prices will come back down to pre-bubble levels.

Here's a little thought experiment: there was a window when NVMe SSDs were piss cheap, but before and after it, prices were the same. If you want to believe that these corpos are just evil and hate you, why did they make SSDs so cheap during the period when they happened to be mass-producing storage chips thanks to high console demand? If everything corpos do is meant to screw you over and make you pay more and more, why would they lower prices during that window, if they're really the Machiavellian schemers people on this forum seem to think big corpos are nowadays?
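The "total income will go down" argument above is basic price elasticity. A toy model with an invented linear demand curve (none of these numbers are real market data) shows revenue peaking and then falling as price climbs:

```python
def units_sold(price: float, base_units: float = 1000,
               choke_price: float = 2000) -> float:
    """Invented linear demand: sales taper to zero as price nears choke_price."""
    return max(0.0, base_units * (1 - price / choke_price))

def revenue(price: float) -> float:
    """Total income at a given price point."""
    return price * units_sold(price)

# past the revenue-maximizing point (choke_price / 2 for a linear curve),
# raising the price further shrinks total income even as per-unit margin grows
assert revenue(1500) < revenue(1000)
```

Where that peak sits depends entirely on the real demand curve; the point is only that one exists, so "charge more" stops working eventually.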
 
The dynamic is different between NVMe SSDs (which can be produced by a variety of multinational manufacturers) and Nvidia GPUs (which can be produced by exactly one American corporation).

Post-COVID, American corporations figured out that the bottom 90% can be a write-off, since the top 10% of income earners account for the majority of consumer spending. Something going from $40 to $100 won't affect those with a lot of discretionary income (think $200k), but it will screw the poor. Plus you get to produce less.

The rest of the world avoids this because it doesn't make sense given their lower wealth inequality. Exports from those countries undercut American producers engaging in the 90/10 rule, making it infeasible in that sector. Luckily, with tariffs we got rid of those, so Americans can suffer in their economic prison and blame it on foreigners.
 
If you want to believe that these corpos are just evil and they hate you
I never said that they were evil, I said they were greedy, and that's a real thing. A CEO can and often does get sued and fired by shareholders for not making as much money as humanly possible; if some other corpo is squeezing its prices for everything they're worth and raking it in but you aren't, then your shareholders are going to hang you.
Post COVID American corporations figured out that the bottom 90% can be a write off, as the top 10% of income earners account for the majority on consumer spending. Something going from $40 to $100 won't effect those with a lot of discretionary income(think 200k), but it will screw the poor.
This is the real shit. It's been happening in my country for a while now; you talk to sales people and they tell you they're moving to semi-premium and up because they make more money that way.
 
(attached: IMG_5138.jpeg, IMG_5139.jpeg, IMG_5140.jpeg, IMG_5014.jpeg, IMG_5142.jpeg)

(attached: IMG_5074.jpeg)
I used AI to do the Trump, then edited him in.
 
Does anyone know of good models to make a static image lipsync with recorded dialogue? I found HuMo which did an excellent job but for some reason seems capped at 3 seconds. I don't know why and can't find much on it.
 
Ars Technica: Google’s updated Veo model can make vertical videos from reference images with 4K upscaling
Google’s Veo video AI made stunning leaps in fidelity in 2025, and Google isn’t stopping in 2026. The company has announced an update for Veo 3.1 that adds new capabilities when you provide the model with reference material, known as Ingredients to Video. The results should be more consistent, and output supports vertical video and higher-resolution upscaling.

https://blog.google/innovation-and-ai/technology/ai/veo-3-1-ingredients-to-video/ (archive)

Google DeepBlacks
 
Hey guys, what did I miss?
I used SD when it was first shared here, but are there any image generation models I can use now that have good/decent image retention (consistency between generations)? I'm considering using this for both more advanced and more basic graphics (3D renders, or pixelated 2000s-style images). I can't draw and need some sort of base to create 3D models or something, nothing concrete yet.
If SD is still good for this, how do I manage image retention? To me it seemed way too random how it generated images, but maybe it's possible to tune that, especially nowadays? Maybe use image-to-image generation.
Maybe there's something better than SD but still free and/or open source? I have an 8 GB GPU which worked fairly well with SD back in the day.
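On the "way too random" part: most of that randomness is just the starting noise, and it's controlled by a seed. A tiny stand-in demo of the principle — real pipelines (e.g. diffusers) do the same thing via a seeded torch generator, and img2img additionally starts from your own picture instead of pure noise:

```python
import random

def latent_noise(seed: int, n: int = 8) -> list[float]:
    """Stand-in for the initial latent noise a diffusion model denoises.

    Same seed -> same starting noise -> same image for a fixed prompt,
    model, and sampler settings.
    """
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(n)]

assert latent_noise(1234) == latent_noise(1234)  # fixed seed: reproducible
assert latent_noise(1234) != latent_noise(9999)  # new seed: new image
```

So lock the seed while iterating on a prompt, and use image-to-image with a low denoising strength when you want outputs anchored to a base picture rather than regenerated from scratch.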
 