There are people who build model sailboats, or Legos. Let the man cook.
I was just curious, it doesn't make much sense to me.
The legend of Half-dude lives on.
Ladies and gentlemen, I must apologize. I find myself unable to produce anthropomorphic automobiles to an acceptable standard.
I almost find that hard to believe after seeing all the horrors people make with the pony diffusion models. Something in that cesspit can do it with some taggotry autism.
This is not due to lack of effort, and in theory it is possible to do, but the technology isn't there yet aside from creating vehicles with realistic gaping assholes.
If AI can rotate this cow, it is capable of more imagination than a sizable portion of the human population.
I stated it could be done, it's just not up to what I believe is an acceptable standard of fuckable ground vehicles.
Hunyuan image to video got released.

Any info about whether or not it's possible to run it on consumer GPUs? They recommend 80GB GPUs, which only exist in the enterprise hardware sphere, or the home lunatic sphere where you NVLink four 3090s together.
You can run Hunyuan on as low as 8-12GB with specific workflows. It seems like you need at least 24GB for the img2video model at the moment, but I expect that to go down as people do their magic.
Excuse me, that's three 3090s and a 3070.
That gives me at least a Nvadia ZTX 12340, right...?
The largest-capacity PCIe GPU that Nvidia makes is the H100 96GB, which matches the capacity of four 3090s, but not the TDP, where a single H100 draws 700W while four 3090s draw 1400W, and not the price, where the H100 will cost you an arm and a leg while you may be able to get four 3090s for the MSRP of a single 5090. There's also the A100 80GB and the H100 80GB, but, again, enterprise-tier GPUs.
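For anyone who wants to check the comparison, it's just arithmetic on the figures quoted above (24GB / 350W per 3090, 96GB / 700W for the H100):

```python
# Sanity-check the four-3090s-vs-one-H100 comparison using the
# numbers from the post above (not official datasheet values).
n = 4
vram_3090, tdp_3090 = 24, 350   # GB, W per card
vram_h100, tdp_h100 = 96, 700   # GB, W

total_vram = n * vram_3090      # 96 GB, same capacity as one H100
total_tdp = n * tdp_3090        # 1400 W, double the H100's draw

print(total_vram, total_tdp)    # → 96 1400
```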
Most applications can pool memory from multiple GPUs communicating over PCIe lanes. I use llama.cpp to get 36GB from a 3090 + 3080, running a 70B with an IQ2 quant.
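If anyone wants to try it, a llama.cpp split across two mismatched cards looks roughly like this (the model filename is a placeholder, and the split ratio is just my guess for a 3090 + 3080 pair; adjust to your cards):

```shell
# Sketch: run a 70B IQ2 quant across a 3090 (24 GB) and a 3080 (12 GB).
# -ngl 99 offloads all layers to GPU; --split-mode layer splits whole
# layers across devices; --tensor-split divides them ~2:1 by VRAM.
./llama-cli -m models/llama-70b-IQ2_XS.gguf -ngl 99 \
    --split-mode layer --tensor-split 24,12
```

No NVLink needed for this; the layers just live on different cards and activations hop over PCIe, which is why it works on consumer hardware at all.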
It's worth noting those have NVLink, which Nvidia removed from consumer GPUs with the 40 series. Enterprise clients get a handful of those 80/96GB GPUs and chain them together into massive VRAM pools, yet Nvidia refuses to give more than 32GB to their most expensive consumer models, which cannot be chained, and everything below that is 16GB or less. And in case someone thought Nvidia withholds VRAM for fear of cutting into their enterprise offer: it wouldn't. They could give you 96GB in a 5090 and it still wouldn't mean much when you can't chain it together. They're just greedy.