GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I'm feeling that entry-level to mid-range gulf as a Leaf: from 400 dollars for entry level to starting at 700 for mid. I got 500 syrupbucks for Christmas towards a GPU so I'm right in between. I'm keeping an eye on how things are shaping up and it's...not great. It seems like I'm going to have a very limited window of purchase between CES 2025 and Trump tariffs. 8gb entry-level cards feel like a pointless buy from a 1060 6gb, so I'm keeping my eyes on 12gb or higher cards. The sales since Black Friday have been utterly pathetic, with retailers acting like 15 bucks off a 4070 is a steal, so it's been an easy decision to wait, at least.
 
I'm feeling that entry-level to mid-range gulf as a Leaf: from 400 dollars for entry level to starting at 700 for mid. I got 500 syrupbucks for Christmas towards a GPU so I'm right in between. I'm keeping an eye on how things are shaping up and it's...not great. It seems like I'm going to have a very limited window of purchase between CES 2025 and Trump tariffs. 8gb entry-level cards feel like a pointless buy from a 1060 6gb, so I'm keeping my eyes on 12gb or higher cards. The sales since Black Friday have been utterly pathetic, with retailers acting like 15 bucks off a 4070 is a steal, so it's been an easy decision to wait, at least.
Are Trump tariffs (a promise he might not keep) expected to affect your pricing, other than the initial surge of inventory being rushed to the US ahead of tariffs? There may be no window at all for you, just a longer wait until supply arrives.

NVIDIA & AMD Rush To Ship Out Next-Gen GPUs To Avoid Trump Tariffs; GeForce RTX 5090 Estimated To Cost Up To $2,500+

There was a disappointing supposed synthetic benchmark for the RX 9070 XT that put it around 7900 GRE performance and riled up a lot of people including Coreteks (preservetube archive). But another source is putting it around 4080 performance, and there's also a wide TDP range for the cards:

AMD Radeon RX 9070 XT GPU Benchmarked In Time Spy, Delivers Better Performance Than RX 7900 GRE
AMD Radeon RX 9070 XT expected to boost up to 3.1 GHz, 260-330W TBP depending on a model
It’s worth noting that the so-called “leaked” 3DMark scores reported by some sites are not accurate. The 3DMark software does not display correct names for unreleased graphics cards this early. So please keep that in mind.

3.1 GHz is high. The highest boosting RDNA3 card was the 7600 XT at 2.75 GHz, and the 7900 XTX did 2.5 GHz.
 
I'm feeling that entry-level to mid-range gulf as a Leaf: from 400 dollars for entry level to starting at 700 for mid. I got 500 syrupbucks for Christmas towards a GPU so I'm right in between. I'm keeping an eye on how things are shaping up and it's...not great. It seems like I'm going to have a very limited window of purchase between CES 2025 and Trump tariffs. 8gb entry-level cards feel like a pointless buy from a 1060 6gb, so I'm keeping my eyes on 12gb or higher cards. The sales since Black Friday have been utterly pathetic, with retailers acting like 15 bucks off a 4070 is a steal, so it's been an easy decision to wait, at least.
Intel just brought entry level down to $230.

NVIDIA & AMD Rush To Ship Out Next-Gen GPUs To Avoid Trump Tariffs; GeForce RTX 5090 Estimated To Cost Up To $2,500+

Just what the market is demanding, a $2500 GPU that draws 600W. This is running parallel to Sony's debacle with the PS5. There just hasn't been enough visual improvement in games in a decade to get people excited about the next generation of hardware. I was posting about this in the Call of Duty thread, observing how Black Ops 6 manages to look less realistic and compelling than COD4, but there was an inflection point twenty years ago where the hardware became so powerful that your visual design team matters more than the platform's compute ability. That's not to say that a good team can't do more with more, but since the point of these GPUs is to make games pretty, the avalanche of lazily designed DEI-infused corposlop is giving very, very little reason to break the bank, let alone turn your desk into an oven.
 
Just what the market is demanding, a $2500 GPU that draws 600W. This is running parallel to Sony's debacle with the PS5. There just hasn't been enough visual improvement in games in a decade to get people excited about the next generation of hardware. I was posting about this in the Call of Duty thread, observing how Black Ops 6 manages to look less realistic and compelling than COD4, but there was an inflection point twenty years ago where the hardware became so powerful that your visual design team matters more than the platform's compute ability. That's not to say that a good team can't do more with more, but since the point of these GPUs is to make games pretty, the avalanche of lazily designed DEI-infused corposlop is giving very, very little reason to break the bank, let alone turn your desk into an oven.
The RTX 5090 has 32 GB of VRAM, while the RTX 5080 has 16 GB. It can barely be considered a gaming GPU. It's a "prosumer" workstation AI GPU that can do fast gaming. The jump from 24 GB to 32 GB is only for LLMs and other AI models, not gaming.

Also, the price Wccfkek came up with is just retardation:
While we are unaware of NVIDIA and AMD's next-gen GPU MSRPs, a reasonable estimate would likely be $1,799 for the GeForce RTX 5090. So, factoring in a 40% tariff, NVIDIA's flagship GPU can cost around $2,500. The tariff would shake up the consumer GPU markets in terms of pricing, and will also fuel demand for second-hand GPUs.

While the MSRP and especially street pricing could be that high, I'm thinking it will be closer to $1,999. But whatever it is, people will pay it. Because it's essentially another "Titan" card, but with a commensurate performance increase.
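
For what it's worth, the tariff math here is just MSRP times (1 + rate). A quick sketch below; the 40% rate is Wccftech's assumption and both MSRPs are guesses from this thread, not confirmed figures:

Code:
# Rough tariff math on hypothetical RTX 5090 MSRPs.
# The 40% rate is Wccftech's assumption; neither MSRP is confirmed.
TARIFF_RATE = 0.40

for msrp in (1799, 1999):
    tariffed = msrp * (1 + TARIFF_RATE)
    print(f"${msrp} MSRP + {TARIFF_RATE:.0%} tariff -> ~${tariffed:,.0f}")

# $1799 MSRP + 40% tariff -> ~$2,519
# $1999 MSRP + 40% tariff -> ~$2,799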

For example, 2018's Titan RTX was a $2,499 card with 24 GB VRAM, and probably less than 10% faster than the $1,199 RTX 2080 Ti with 11 GB. The gulf between the RTX 5090 and RTX 5080 will be much greater, since it looks like the 5090 has around double the cores, on a 512-bit bus (Titan RTX, RTX 3090 and 4090 were all 384-bit cards so that is an escalation).
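
To put the bus-width escalation into numbers, peak GDDR bandwidth is just bus width times per-pin data rate divided by 8. A rough sketch, using the 4090's known 21 Gbps GDDR6X and an assumed 28 Gbps GDDR7 for the 5090 (the GDDR7 speed is rumor, not a confirmed spec):

Code:
# Peak memory bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8
def gddr_bandwidth(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(gddr_bandwidth(384, 21))  # RTX 4090: 384-bit GDDR6X @ 21 Gbps -> 1008.0 GB/s
print(gddr_bandwidth(512, 28))  # RTX 5090: 512-bit GDDR7 @ assumed 28 Gbps -> 1792.0 GB/s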

And while gaming is fully DEI'd, local chud/coomer AI is not.

Intel just brought entry level down to $230.
Not with nearly zero supply and market share. We can revisit it in a year and see what effect they had.
 
The jump from 24 GB to 32 GB is only for LLMs and other AI models, not gaming.

I hate to sound like a redditor, but what's the evidence for this? 32 GB isn't much for AI, and the consumer-grade cards aren't designed to be mounted in parallel inside a small form factor. Sure, I'm acquainted with people who have built commercial HPC/AI servers out of 8x4090s mounted in parallel, but they voided the warranty on the cards to do it.

It looks to me more like NVIDIA is courting deep-pocketed gamers who get excited about numbers going up, the same sorts of people who insist they can "feel the responsiveness" when Rainbow Six Siege runs at 800 fps on their 240 Hz monitors. The evidence I'd point to here is that NVIDIA canceled the Titan line because it cannibalized their enterprise products (Titan Black was a disaster for their margins), and that the 4090 was advertised and promoted almost entirely for its gaming & viz performance.

NVIDIA does support CUDA on its gaming GPUs because it wants to build the teenage nerd-to-professional-developer pipeline on its products, but that's different from designing an entire product line expressly for those young hackers.

For example, 2018's Titan RTX was a $2,499 card with 24 GB VRAM, and probably less than 10% faster than the $1,199 RTX 2080 Ti with 11 GB.

There's a reason the Titan line got discontinued. It cannibalized the sales of their enterprise products.

Not with nearly zero supply and market share.
Selling out fast enough that market share is infinity % higher* than it was last year.

*It was 0% last year.
 
I hate to sound like a redditor, but what's the evidence for this? 32 GB isn't much for AI, and the consumer-grade cards aren't designed to be mounted in parallel inside a small form factor. Sure, I'm acquainted with people who have built commercial HPC/AI servers out of 8x4090s mounted in parallel, but they voided the warranty on the cards to do it.
I believe the 30B parameter LLMs are intended to fit into 32 GB of VRAM, and there will be other AI workloads that can benefit. Instead of optimizing things like Stable Diffusion to use less, users will be looking to see how they can use more.
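
Napkin math on why ~30B parameters lines up with 32 GB: weights take roughly parameter count times bytes per parameter, plus headroom for KV cache and runtime buffers. The quantization levels and overhead factor below are my own assumptions, not anything published:

Code:
# Rough LLM VRAM estimate: weights = params * bytes/param, times an overhead
# factor (~1.2) for KV cache and runtime buffers. All numbers are assumptions.
def vram_gb(params_billion, bytes_per_param, overhead=1.2):
    return params_billion * bytes_per_param * overhead

for label, bpp in (("FP16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)):
    print(f"30B @ {label}: ~{vram_gb(30, bpp):.0f} GB")

# 30B @ FP16: ~72 GB   (needs multiple GPUs)
# 30B @ 8-bit: ~36 GB  (just over 32 GB)
# 30B @ 4-bit: ~18 GB  (fits with room for long context)

So a 30B model only fits comfortably with quantization; the extra 8 GB over a 24 GB card mostly buys longer context and less aggressive quants.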

Previously I shared the false rumor of the 5090 being 448-bit and 28 GB (maybe 5090D Chyna edition?), which would have been the proper choice if Nvidia wanted to cheap out for gaymers and avoid some further cannibalization. Workstation cards with the same die could be packing 64 GB, or maybe 48/96 GB when using 3 GB GDDR7 modules later.

I recall the RTX 3090 supported some Quadro driver features including "NVLink bridge" for connecting multiple, but these were removed by the next generation.

"Compute" has given way to AI, and that will be the primary reason to buy the 5090. Sure, there will be rich gaymers who will buy it solely for gayming. While some cannibalization will occur with the 5090, the same dies with the best yields will likely be sold as RTX Quadros with 64-96 GB VRAM. Meanwhile, the real Blackwell shit is going to be packing 288 GB of HBM soon.

Selling out fast enough that market share is infinity % higher* than it was last year.

*It was 0% last year.
It was ~0% despite there being Arc Alchemist. What will it be after a few months of Battlemage being on the market? Selling out fast can be a result of high demand or low supply. If I search B580 on Newegg, I have the option of getting it for $379 from China. Weekly resupplies have been promised, but these will still have been produced in the low millions. It's also possible that Intel is losing money on each card, which would be unsustainable.

Bottom line is, Intel is a tricky company, and we shouldn't expect Intel's Battlemage to take significant market share or force AMD or Nvidia to adjust pricing. I will consider buying a B380 though.
 
Are Trump tariffs (a promise he might not keep) expected to affect your pricing, other than the initial surge of inventory being rushed to the US ahead of tariffs?
No idea, honestly, but I'm hedging my bets closer to "yes" because parts stores here already see low margins on video cards and will probably take anything they can get to justify squeezing a few more bucks. Blaming Trump is a get-out-of-responsibility card everyone in Canada will take advantage of. Suddenly a 400 dollar card balloons by 50 bucks? Blame Trump!

Intel just brought entry level down to $230.
I'm keeping an eye on the B580 but nobody has it. The growing pains also concern me, especially since I like to play older games, but I do hope for its continued success: someone needs to think of the poors like me who just want to play RDR2 and SM2 at 60 with some growing room as I get my setup brought from barely adequate to adequate again.
 
I believe the 30B parameter LLMs are intended to fit into 32 GB of VRAM, and there will be other AI workloads that can benefit. Instead of optimizing things like Stable Diffusion to use less, users will be looking to see how they can use more.

This is because the V100, which is now 7 years old but is still in quite a few datacenters, has 32 GB of RAM (yes, there was a 16 GB model, but it was very unpopular), making 32 GB the ideal target for mass deployment of your model.

The fact that you can run an application like MPT-30B on a 5090 is not evidence that NVIDIA is primarily targeting "AI prosumers" with this card. When has a major company like this ever made enthusiast nerds running low-end or last-gen productivity software the primary target of its flagship consumer device? The evidence I'm looking for is that gooners generating anime porn with Stable Diffusion or locally running their AI gf-bots is anything but an ultra-tiny market with irrelevant purchasing power. The 4090 could run small AI models, too, but it wasn't how NVIDIA marketed the card. The messaging around that card was wall-to-wall gaming.

I think what is far more relevant here is that the 4090 couldn't run Cyberpunk 2077 or Hitman 3 at 60 fps with 4K Ultra RT settings (link). 5090 will finally be able to breach that 60 fps barrier at maximum settings, so expect NVIDIA's messaging around marquee games, not LLM performance. Maybe they'll show off 8K benchmarks, too, since 32 GB of VRAM will make even bigger gooncave screens more viable.

It was ~0% despite there being Arc Alchemist. What will it be after a few months of Battlemage being on the market? Selling out fast can be a result of high demand or low supply. If I search B580 on Newegg, I have the option of getting it for $379 from China. Weekly resupplies have been promised, but these will still have been produced in the low millions. It's also possible that Intel is losing money on each card, which would be unsustainable.

I think they're selling out simply because it's a sub-$300 card that performs like a $400 card. Gamers--at least the ones buying mid- and entry-tier hardware on Newegg--can be pretty price-savvy. If they keep it at that price, they'll sell every single one they can make...of course, if they're losing money, they won't want to make too many more. Intel isn't exactly flush with cash these days.
 
how's it performing in Linux so far? I want a new PC sometime in the new year and I don't play many AAA/AAAA titles so they've piqued my interest
Verdict: not broken, but not as good as Windows, and the Mesa drivers are glitchy in some games. I'd probably stick with AMD for Linux.


Generally, you should search online for benchmarks rather than asking me to do it for you.
 
Right now I can get an Intel Arc A770 for $500 CAD ($345 USD), while the Arc B580 is being resold at $600 CAD ($416 USD). Assuming the A770 is somewhat stable in price, what price makes the B580 a better deal, and how long would I have to wait for that? Or is there a better card in the $400-$425 USD range?
 
Right now I can get an Intel Arc A770 for $500 CAD ($345 USD), while the Arc B580 is being resold at $600 CAD ($416 USD). Assuming the A770 is somewhat stable in price, what price makes the B580 a better deal, and how long would I have to wait for that? Or is there a better card in the $400-$425 USD range?
Your country is a temporary custodian for America's fresh water, maple and oil sand reserves and will be annexed in the future. Enjoy horrible prices.

Tom's Hardware review finds that the B580 is +18% (1440p Ultra), +20% (1080p Ultra), +33% (1080p Medium), +13% (4K Ultra) vs. the A770 16GB (raster, didn't check RT). So perhaps you could swallow +20% higher (600 leafs) immediately, but I think you should wait for the price to stabilize lower.

I checked Hardware Unboxed's initial review (12-game average raster) too. It's +24% (1080p), +21% (1440p) vs. the A770 16 GB.
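
Putting both reviews together on the "what price makes the B580 a better deal" question, here is a rough dollars-per-relative-performance comparison, taking the $500 CAD A770 price from above and an averaged ~+20% uplift at 1440p (napkin math, not a recommendation):

Code:
# Break-even price for the B580 vs a $500 CAD A770, assuming ~+20% performance.
a770_price_cad = 500
b580_relative_perf = 1.20  # ~+20% at 1440p per the Tom's Hardware / HUB numbers above

break_even = a770_price_cad * b580_relative_perf
print(f"B580 matches A770 price/perf at about ${break_even:.0f} CAD")  # ~$600 CAD

So the current $600 CAD resale price is only break-even on raw value; anything meaningfully below that makes the B580 the better buy, before factoring in power draw and the newer architecture.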
 
Kinda kicking myself because 7800 XTs went on sale across the board and now they're out of stock everywhere. Or were never in stock to begin with. Oh, what fun.
 
When has a major company like this ever made enthusiast nerds running low-end or last-gen productivity software the primary target of its flagship consumer device?
In some way they made a move like that with the Studio Drivers and enabling 10bpc on consumer cards (for a different reason).
 
In some way they made a move like that with the Studio Drivers and enabling 10bpc on consumer cards (for a different reason).

Was this for enthusiasts playing around in their free time, or for professionals who decided they didn't really need a full workstation GPU to do their work?
 
is getting VR shit worth it in the YOOL 2024?
no, the vision pro is the best thing on the market and it will be at least a couple years before the rest of the market catches up, if it even bothers to. microsoft and other companies have completely given up on VR and the entire industry is propped up by Mark Zuckerberg having an undiagnosed mental illness. it's sort of like how the entire EV market is propped up by Musk, except because the use cases are way fewer than for a fucking car, the industry is even more dead.

the few people who are still invested in VR are all moving towards those AR glasses instead; the Vision Pro sort of proved that even the enthusiast market isn't enough to make VR viable. the latest steam surveys basically said Zuckerberg has 75% of the market and 20% is using the Index, and you're gonna have a bad time if you're hoping Valve releases anything ever.

all the competitors that existed 5 years ago like Sony, HP, Samsung, and Vive have either announced they're dropping support and closing their divisions or are massively scaling back. even the porn companies have all but closed down; you regularly see lifetime passes being sold for $300 or less because these companies would prefer you give money now over paying a monthly fee of $50 that you cancel once you get bored of it. Beyond that it's chink start-ups who have been caught completely bullshitting people on specs, stealing credit card numbers, and just being massive shits.
 