I'm really tempted to buy one of AMD's new 7900 XTs. I know the price/performance point is pitched just at the level to make you step up to the 7900 XTX, but I think I can resist that in my case because I'm more interested in comparing it to what I have, which is an old RX 480 with 8 GB. I'm not a big gamer, and I want something with lots of RAM for some AI noodling about. Nvidia has the edge in AI, but I'm gambling that will change, and I think 20 GB of VRAM will offset it.
Main thing holding me back is that I expect very limited supply at the actual MSRP, with most cards heavily marked up. Basically, I could wait until late January, but that would miss Christmas, which is when I actually have the most free time to play around with this. After New Year's it's going to be hectic again. Fuck, why did AMD have to wait and wait and wait to release these bloody things? It's like they want me to buy an Nvidia card.
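For the "20 GB will offset it" bet, here's the back-of-the-envelope maths I'd use — a rough sketch, where the model sizes and the 20% overhead factor are illustrative assumptions, not exact figures:

```python
# Rough VRAM estimate: parameter count * bytes per parameter, plus overhead.
# fp16 weights are 2 bytes/param; the ~20% overhead for activations and
# buffers is an assumed ballpark, not a measured number.
def vram_needed_gb(n_params, bytes_per_param=2, overhead=1.2):
    """Approximate GB of VRAM to hold a model's weights in fp16."""
    return n_params * bytes_per_param * overhead / 1e9

for name, params in [("7B model", 7e9), ("13B model", 13e9)]:
    print(f"{name}: ~{vram_needed_gb(params):.1f} GB")
# 7B model: ~16.8 GB -- fits in 20 GB
# 13B model: ~31.2 GB -- would need quantization to fit
```

So by this crude estimate a 7B-parameter model in fp16 fits in 20 GB with room to spare, while 8 GB on the old RX 480 wouldn't come close.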
I doubt AMD's gunning for a serious piece of the AI market. AI/ML requires the ability to churn through enormous amounts of tensor arithmetic, and NVIDIA is miles ahead of AMD on that front, both in hardware and in the software stack. That lead comes at a cost, though: Tensor Cores eat a lot of die space and can't really be used for anything else. The architecture of the Instinct datacenter GPUs suggests AMD is heading off in an orthogonal direction. Yes, they can do AI/ML, but their real strength is general compute: churning through large arrays of FP32 and FP64 data much, much faster than NVIDIA's cards can. What they're going for looks like "best in class at everything other than AI/ML, but still adequate at AI/ML."
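The FP32-vs-FP64 trade-off behind that argument is easy to see even on a CPU with NumPy — a toy illustration (array size is arbitrary, and this is a CPU stand-in for the GPU case):

```python
import time
import numpy as np

# Double precision doubles the memory traffic per element, so a
# bandwidth-bound operation over the same element count moves twice
# the bytes -- one reason FP64 throughput matters for general compute.
n = 10_000_000
a32 = np.random.rand(n).astype(np.float32)
a64 = a32.astype(np.float64)

print(f"fp32 array: {a32.nbytes / 1e6:.0f} MB")  # 40 MB
print(f"fp64 array: {a64.nbytes / 1e6:.0f} MB")  # 80 MB

for arr in (a32, a64):
    t0 = time.perf_counter()
    arr.sum()
    print(f"{arr.dtype}: summed in {(time.perf_counter() - t0) * 1e3:.1f} ms")
```

The point being: consumer NVIDIA cards deliberately cripple FP64 rate, while AMD's compute-oriented parts don't, which is exactly the "orthogonal direction" above.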

AMD Blitzes Intel, Nvidia With New Faster EPYC, Instinct Chips | CRN
AMD revealed new EPYC Milan X CPUs with 3D chiplet technology and Instinct MI200 GPUs that will create new challenges for Intel and Nvidia.
www.crn.com
But still, for playing around, 20 GB of VRAM is a lot, and the card should be fine.