GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

This all sounds great, especially AV1 encoding. I just bought a 6600 so sucks to be me I guess. I've been having issues with AMD's hardware encoding, but that's something I put down to a software issue rather than a hardware one.

It's interesting to see the praise AMD is getting. I was starting to think I might be an AMD fanboy in denial for not shitting on them 24/7.

If they make a 7400 and it outperforms a 1650 or 1060, that could be really interesting. You could buy one of those old business PCs you see all over the internet, pop one in, and bingo: a cheap, normie-friendly gaming PC.
 
If they make a 7400/7500, hopefully it doesn't have the exact same set of asinine design choices as the 6500/6400.

That would be far into next year though. Obviously, Navi 32 and 33 are not launching soon.

Also, is the 7900 XTX about to have better price/performance than the 7900 XT?
 
Cards come out for $900 and people are celebrating this...

AM I ON FUCKING CRACK? AM I THE ONLY ONE THAT THINKS THIS NORM IS FUCKING RETARDED?
Radeon HD 5970 was $599 in 2009
Radeon HD 6990 was $699 in 2011
Radeon R9 290X was $549 in 2013
Radeon R9 Fury X was $649 in 2015
and so on.

Inflation alone is 25.2% from 2015 to 2022, 38.3% from 2009 to 2022.
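Quick back-of-the-envelope, just applying those two inflation figures to the matching launch prices (a sketch, nothing here is an exact CPI number):

```python
# Apply the quoted inflation figures to the matching launch prices.
# 38.3% from 2009 to 2022, 25.2% from 2015 to 2022 (figures from the post above).
print(f"HD 5970:   $599 in 2009 ~= ${599 * 1.383:.0f} in 2022 dollars")
print(f"R9 Fury X: $649 in 2015 ~= ${649 * 1.252:.0f} in 2022 dollars")
```

So a $649 card from 2015 works out to roughly an $810 card in today's money, before you even get to the wafer and cooling costs.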

Components are more expensive in general, the cards tend to use more power and are physically larger with beefier cooling, 5nm/6nm wafers are more expensive and cost per transistor is flattening. The industry was stuck on 28nm for a while, which became really cheap. GlobalFoundries was an option until they had to abandon 7nm.

AMD wants to sell off existing RX 6000 cards, which is the same strategy Nvidia was going for by launching the flagship first.

The high-end is OK. It's the low-end that is fucked, and the mid-range somewhat fucked. High costs affect the low-end more dramatically. So you aren't seeing multiple tiers of 28nm cards below $150 anymore. The low-end cards of the last gen were so compromised that spending more money to get at least the RX 6600 made more sense. APUs could wipe out the low-end in the future.

If the 7900 XTX sells at MSRP and is 70% faster than the 6900 XT, then the 6900 XT should cost no more than $600. It looks like the 6900 XT costs more than that on average right now. So price/performance slightly improved. Later, the 7000 series cards will drop in price. It won't take long if demand crashes because of a great recession and glut of GPUs.
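For what it's worth, the arithmetic behind that (a sketch; the $999 MSRP and the 70% figure are the assumptions here):

```python
# If the 7900 XTX really is 70% faster, the 6900 XT price that gives the
# same dollars-per-frame works out to roughly $588.
xtx_msrp = 999          # assumed 7900 XTX MSRP
perf_ratio = 1.70       # "70% faster" than the 6900 XT (claimed, not benchmarked)
print(round(xtx_msrp / perf_ratio))   # -> 588
```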

Another point to consider is the competition. If anything, AMD can drive down Nvidia pricing, which has already started with the unlaunching of the so-called 4080 12 GB. There is absolutely no pressure for AMD to price the 7900 XTX at $700 or whatever you think is reasonable, if gaymers are willing to pay a lot more. The pressure could come later if Nvidia cuts prices or the cards don't sell. It's also possible that Nvidia buyers are willing to pay 30-60% more because of drivers, compute capability, RT performance which AMD seems to have fumbled, or they are brainwashed.

I barely play games now and only older ones. The only reason I would buy a high-end card is to try my hand at Stable Diffusion and other AI stuff. So it is nice to see AMD increasing VRAM, if only to force Nvidia to increase VRAM above 24 GB later. I expected AMD to stay at 16 GB, maybe allowing a 32 GB AIB model to exist.
 
So what's the deal with the Arc cards? Because the prices seem alright, but I don't see anyone hyped about those.
 
So what's the deal with the Arc cards? Because the prices seem alright, but I don't see anyone hyped about those.

More a promising start, indicating a promising future, than a promising present.

Might be a reasonable choice at the lower-mid tier once they sort their shit out with respect to drivers.
 
Remember when they pretended the prices were temporary and just down to shortages?
There were rumors about Nvidia stopping 3000 series production to create artificial scarcity, which would drive the prices up and make them the new normal. The MSRP (RIP) for 3000 series cards might have been a little too good considering their performance.
 
This all sounds great, especially AV1 encoding.
What do people actually use AV1 encoding for, anyway? I was looking into it a while back because it seems to have accreted a bunch of recommendations in archival standards documents, but the support in both tooling and playback seems very limited. Seemed like a write-only format to me - which maybe is why archivists love it.
 
What do people actually use AV1 encoding for, anyway? I was looking into it a while back because it seems to have accreted a bunch of recommendations in archival standards documents, but the support in both tooling and playback seems very limited. Seemed like a write-only format to me - which maybe is why archivists love it.
Higher quality stream broadcasting, mostly.
 
So pretty much it's WEBP for video, where they save a few bytes per frame but break compatibility with everything?
Decoding support for AV1 is pretty broad; most modern devices support it. But hardware-accelerated encoding is only just now gaining traction. That's my understanding, anyway. My own media library is limited to h264/5, which is so absurdly easy to transcode that I have a Celeron doing it with QuickSync.
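If anyone wants the shape of that kind of QuickSync job, it's roughly this (a sketch, assuming an ffmpeg build with QSV support; the file names and quality number are placeholders):

```python
# Rough sketch of an h264 -> h265 transcode on Intel QuickSync via ffmpeg.
# Assumes ffmpeg was built with QSV; filenames and quality value are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-hwaccel", "qsv",          # let the iGPU handle decode where it can
    "-i", "input_h264.mkv",
    "-c:v", "hevc_qsv",         # hardware HEVC encode on QuickSync
    "-global_quality", "24",    # quality target, tune to taste
    "-c:a", "copy",             # don't touch the audio
    "output_h265.mkv",
], check=True)
```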
 
The high-end is OK. It's the low-end that is fucked, and the mid-range somewhat fucked. High costs affect the low-end more dramatically. So you aren't seeing multiple tiers of 28nm cards below $150 anymore.
The midrange is beyond fucked. An RX 480 was around 200 USD at retail; if we account for ~2% annual inflation and the increase in component prices, it should be just below 300 USD if launched today. The 6600 XT launched at 400 USD.

Stallman had it right: "We should all make these companies fail."

I'm still going to buy N33, because I'm a hypocrite and it's going to launch right in my upgrade cycle.

So what's the deal with the Arc cards? Because the prices seem alright, but I don't see anyone hyped about those.
Bad drivers. When it performs well, it's as good as a 3070/6800 XT; and in some games it's slower than a 6600. Chips are a multi-year investment, and Intel was greedy as usual and wanted everything. They should've stayed in the datacenter for a few years, and consolidated their designs from AI/ML to desktop IGP before making a consumer card.
 
Bad drivers. When it performs well, it's as good as a 3070/6800 XT; and in some games it's slower than a 6600. Chips are a multi-year investment, and Intel was greedy as usual and wanted everything. They should've stayed in the datacenter for a few years, and consolidated their designs from AI/ML to desktop IGP before making a consumer card.
IIRC that was the plan; that's why the first Arcs were stealth-launched in China without much fanfare. But alas, the tech YouTubers found them and started testing immediately.
 
I am cautious about the new AMD cards. Probably overcompensating for the marketing wankery they threw around during the presentation.
Raster-wise I understood fuck all. They relied on numbers with FSR enabled, which doesn't tell me much in terms of raw performance.
Compute-wise, they claim gains, but I'll wait for independent testing of these cards to see if they are actually making progress in that field.
The FSR 3 announcement was unexpected; hopefully they will improve the shimmering artifacts from FSR 2 that make it a deal breaker for me.
As for pricing, I miss the RX 480 days. I snagged one at MSRP back then and it was a fantastic card. While $1,000 is on the low side for a top-end card these days, it is still too much. Hopefully the lower-end cards will be reasonably priced.
 
I am cautious about the new AMD cards. Probably overcompensating for the marketing wankery they threw around during the presentation.
I would wait until the first production units and independent benchmarks are out, but corpo talk aside, I concur that AMD played their cards well. They DID NOT need to outperform or even get close to the 4090; they needed something that fucked the Nvidia 4080/4070 series in time for those launches (several reviewers noticed AMD was oddly focused on taking shots at Nvidia).

The 4090 has become a mark of prestige that shows big numbers, but the correct question before getting one is "do I need this big-ass card that gives big-ass numbers when barely any game is going to use its full power?"
 
It's the low-end that is fucked, and the mid-range somewhat fucked. High costs affect the low-end more dramatically.
Another "problem" is that IGPUs/APUs (or whatever people call them this week) are actually getting good. IGPUs can play most modern games at 1080p low or even 720p at 30-60fps.

Not great by any stretch, and tech YouTubers look down on them, but when the low end is barely providing more than what a modern CPU can, why bother?


So what's the deal with the Arc cards? Because the prices seem alright, but I don't see anyone hyped about those.
They had potential, but according to most reviews they straight up don't work, which makes them an interesting curio rather than something practical, unless you want AV1 encoding right now.

What do people actually use AV1 encoding for, anyway? I was looking into it a while back because it seems to have accreted a bunch of recommendations in archival standards documents, but the support in both tooling and playback seems very limited. Seemed like a write-only format to me - which maybe is why archivists love it.
The big selling point of AV1 is higher quality for much less space and bandwidth. This is good for everything from archiving to streaming. The gains aren't small either: I've seen all sorts of claims, from 15% better quality at the same bit rate to 13 GB files being compressed to 800 MB.

It's also being pushed heavily as the new standard to replace the decade-plus-old h264. Everyone from YouTube to Discord to Netflix wants to use it. Archivists might also like it because it can preserve noise/film grain, whereas h264 can't.

The main downside right now is that it's slow to encode. Video always has been, but you can't use AV1 for streaming until it can be encoded in real time, and even for my personal use it's not practical to leave my computer on at full load all night to compress a 30-minute video.
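For reference, a plain software AV1 encode today looks something like this (a sketch, assuming an ffmpeg build with libsvtav1; the crf and preset values are placeholders, and the preset is what decides whether it runs all night):

```python
# Rough sketch of a CPU AV1 encode with SVT-AV1 through ffmpeg.
# Assumes ffmpeg was built with libsvtav1; all values here are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.mkv",
    "-c:v", "libsvtav1",
    "-crf", "35",        # quality target: lower = better quality, bigger file
    "-preset", "6",      # 0 = slowest/best ... 13 = fastest; this is where the hours go
    "-c:a", "copy",
    "output_av1.mkv",
], check=True)
```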

We're seeing features arrive in months that took h264 years, like consumer-level hardware encoding.

If you don't work with video, chances are you won't notice a difference. YouTube has been slowly rolling out AV1 and you likely haven't noticed.
 
The 4090 has become a mark of prestige that shows big numbers, but the correct question before getting one is "do I need this big-ass card that gives big-ass numbers when barely any game is going to use its full power?"
Prestige is a big thing, and Intel ran the same game as Nvidia in the past, where people knew that Intel* had the fastest CPU on the market. People see or hear that Nvidia is the performance king, and while they're not going to buy something like a 4090, they buy what they can afford, like a 50-, 60- or 70-series card.

*absurdly priced, limited quantity, mostly owned by review outlets. Nvidia did a really shady thing with the FX series where they made a top-end card that could barely be bought because not many were made, but it could be reviewed because they sent it out to the big review sites.

We're seeing features arrive in months that took h264 years, like consumer-level hardware encoding.
Part of why AV1 is moving faster with hardware encoding is probably that it isn't mired in licensing issues like h.264. When it comes to encoding, Bink continued to be used in vidya because it was a flat fee and not a fee per disc pressed (as in manufactured, not sold). I think that was the reason FF13 on Xbox 360 used Bink: it was three DVDs and not one Blu-ray.
 
Not great by any stretch, and tech YouTubers look down on them, but when the low end is barely providing more than what a modern CPU can, why bother?
Tech vtubers only care about the big numbers. I personally like APUs because cheap, functional small-form-factor computers are a niche that's barely touched; almost no one takes mini-ITX seriously.

Part of why AV1 is moving faster with hardware encoding is probably that it isn't mired in licensing issues
Reality is that our tech overlords decided they don't want to pay licenses, because fuck those nerds. Every single big tech company is backing it for that reason.
 