GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

AMD could price the 9060XT at $420
Seems pretty high for something that will be landing somewhere between a 7600 XT 16 GB and 7700 XT 12 GB, but not unlikely.
So I take it that the 4070 super is now the new baseline for a system, since AMD basically made a clone?
Could be, but I think the conversation is going to be about VRAM. AMD gave 16 GB to their "clone", and probably the 9060 XT (I don't trust EEC filing "leaks" but it's almost guaranteed to be the 7600 XT successor and match its VRAM). On Nvidia's side, there will be the 128-bit 4060 Ti 16 GB and it will be pretty funny if/when that outperforms the 5070.

NVIDIA’s GeForce RTX 4090 With 96GB VRAM Reportedly Exists; The GPU May Enter Mass Production Soon, Targeting AI Workloads - This is Chinese modders souping up the card AFAICT, and the 4 GB GDDR6X modules it would need don't appear to exist. A 48 GB version could happen, though.

Intel’s Panther Lake SoCs Are Rumored To Be Delayed To Mid-Q4 2025; 18A Process Likely To Be The Culprit

MLID snuck a Medusa Halo leak in his latest podcast:
If they really go to 384-bit, probably requiring a special I/O die, I assume they are going to have to cram 12 memory chips into designs using it. And that would get you to 192 GB at a minimum.
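For what it's worth, the arithmetic behind that figure checks out under some assumptions (32 bits of bus width per memory package and 16 GB packages; both are my guesses, not from the leak):

```python
# Sanity check of the 384-bit -> 12 chips -> 192 GB math.
# Assumes 32-bit channels per memory package and 16 GB per package.
bus_width_bits = 384
bits_per_package = 32
packages = bus_width_bits // bits_per_package  # 12 packages
capacity_gb = packages * 16                    # 192 GB total
print(packages, capacity_gb)  # 12 192
```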
 
MLID snuck a Medusa Halo leak in his latest podcast:
If they really go to 384-bit, probably requiring a special I/O die, I assume they are going to have to cram 12 memory chips into designs using it. And that would get you to 192 GB at a minimum.
>tfw an AMD APU is likely going to outperform Nvidia's entry-level discrete GPU for this generation
Jensensisters... not like this...
 
baroque requirements like yours
1. The OS I tried to boot on Hyper-V was a disk image of a system installation that wasn't Secure Boot configured
2. The Windows 7 machine is a spare, I did it for fun and the reason it had it enabled was because it used to have Proxmox on it beforehand
3. My current Win10 system can have Secure Boot enabled, but the last time I tried, it bricked my motherboard because Gigabyte is shit. My current system is also Win11 compatible.

Basically, Secure Boot is finicky, but a) it's optional and b) UEFI only mode is not an issue even if you're still on 7.

Anyways:

3090 = 4070 SUPER = 5070, yet if you want 24GB of VRAM you have to get a card that's two generations older, and it's a card that has NVLink, so in theory you can get 96GB with four 3090s. There's no reason for Nvidia to be this cheap on VRAM other than greed. They could've given 32GB to the 5070 and it wouldn't have cut into their enterprise sales margins.
>tfw an AMD APU is likely going to outperform Nvidia's entry-level discrete GPU for this generation
Jensensisters... not like this...
ROCm is still dead in the water and CUDA still dominates the market, so that theoretical 192GB of VRAM is meaningless. For gaming, 12-16GB is good enough, so the 9070 will do well in what it's advertised for, but the AI paradigm is shifting to hundreds of GB of VRAM per card. That's something Nvidia will never do for the consumer market, and something AMD could do, but without a good software framework it's useless, and AMD themselves have admitted that ROCm is trash.
 
>tfw an AMD APU is likely going to outperform Nvidia's entry-level discrete GPU for this generation
Jensensisters... not like this...
Strix Halo can win against the 4060 Mobile if given enough power, but not the desktop 5060, except in scenarios where the 5060's 8 GB of VRAM is the limit. I assume that's the entry level, and the RTX 5050 desktop GPU of our dreams will not materialize.

Medusa Halo? Possibly. IDK when it's coming out though.
 
Ah yes, the biggest Dunning-Kruger retard central on the Internet. I'll bet just about every retard that'll get mad at this already runs their OS in UEFI mode but doesn't know shit about computers. The real evil is Secure Boot, that's how Microsoft wants you to use their special certs to get the OS running, but it's optional even in UEFI-only mode. Remember that UEFI first released in 2006, and nowadays it has completely replaced the BIOS, even though people still call UEFI "the BIOS" when they're two separate firmware interfaces. And if you run a system that still boots in legacy mode, chances are the OS running on it is too old to support current GPU drivers anyway, so it'll only be a real issue for maybe less than a dozen people.

Quick anecdote on how Secure Boot can fuck up booting: I couldn't boot a Hyper-V VM because I enabled Secure Boot on it thinking that was the UEFI toggle, even though the VM was already UEFI by default, and I couldn't boot into Windows 7 on my old PC because it also had Secure Boot enabled. In both cases, disabling that one setting made the system boot up perfectly fine in UEFI-only mode.
I don't remember if it was on the Kiwifarms or somewhere else but I came across someone who was effectively convinced that the TPM chip was some sort of magical quantum communications chip that sent Microsoft a full dump of all data on your computer, no matter which OS you ran or what your network configuration was. That guy had lost it.
 
Could be, but I think the conversation is going to be about VRAM. AMD gave 16 GB to their "clone", and probably the 9060 XT (I don't trust EEC filing "leaks" but it's almost guaranteed to be the 7600 XT successor and match its VRAM). On Nvidia's side, there will be the 128-bit 4060 Ti 16 GB and it will be pretty funny if/when that outperforms the 5070.
What speed of RAM is it though? GDDR6, 6X, or 7? I can tell you from my experience with my 4070 Super, 12 gigs of GDDR6X kicks ass. I know from what @The Ugly One has said, stock GDDR6 is kinda slow. I'd be interested in how that translates to RAM-heavy workloads.
 
What speed of RAM is it though? GDDR6, 6X, or 7? I can tell you from my experience with my 4070 Super, 12 gigs of GDDR6X kicks ass. I know from what @The Ugly One has said, stock GDDR6 is kinda slow. I'd be interested in how that translates to RAM-heavy workloads.
AMD is using GDDR6, which keeps the costs low. Bandwidth of both 9070 XT and non-XT is at 640 GB/s using 20 Gbps memory on 256-bit, ahead of the 7800 XT that used 19.5 Gbps memory for 624 GB/s, and the 7900 GRE that used 18 Gbps for 576 GB/s. I think one of their slides for RDNA4 claimed better memory compression, but that could be a 1% improvement for all we know.

On the other hand, Nvidia's massive GDDR7 bandwidth increases for 50 series and the 5090 bus width increase did almost nothing at all for gaming performance. It made the cards more expensive to make. The 5070 has 33.3% more bandwidth than the 4070 Super, but the same performance.
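Those figures are easy to sanity-check, since GDDR bandwidth is just per-pin data rate times bus width (28 Gbps and 21 Gbps below are the published memory speeds of the 5070 and 4070 Super, both on 192-bit buses):

```python
# bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
def bandwidth_gbps(data_rate_gbps: float, bus_bits: int) -> float:
    return data_rate_gbps * bus_bits / 8

print(bandwidth_gbps(20, 256))    # 9070 XT / 9070: 640.0 GB/s
print(bandwidth_gbps(19.5, 256))  # 7800 XT: 624.0 GB/s
print(bandwidth_gbps(18, 256))    # 7900 GRE: 576.0 GB/s
# 5070 (28 Gbps GDDR7) vs 4070 Super (21 Gbps GDDR6X):
print(round(bandwidth_gbps(28, 192) / bandwidth_gbps(21, 192) - 1, 3))  # 0.333
```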

I think the 9070 review embargo ends in an hour.
 
What speed of RAM is it though? GDDR6, 6X, or 7? I can tell you from my experience with my 4070 Super, 12 gigs of GDDR6X kicks ass. I know from what @The Ugly One has said, stock GDDR6 is kinda slow. I'd be interested in how that translates to RAM-heavy workloads.

What matters are two things: the speed of the memory and the width of the bus. The overall bandwidth is speed x width. If I have 2 GHz memory on a 128-bit bus versus 1 GHz memory on a 256-bit bus, it makes no real difference, because the overall bandwidth is the same.

I have a Radeon 6700 XT. It has 16 Gbps (2 GB/s per pin) memory on a 192-bit bus, providing 2 x 192 = 384 GB/s of bandwidth. AMD positioned it against the 3070, which had 14 Gbps (1.75 GB/s per pin) memory on a 256-bit bus, providing 1.75 x 256 = 448 GB/s.
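To put explicit units on that shorthand: "2 GHz" GDDR6 is really 16 Gbps of effective data rate per pin, i.e. 2 GB/s per pin, which is why multiplying by the pin count gives GB/s directly:

```python
# "speed x width": Gbps per pin -> GB/s per pin -> total GB/s
def bandwidth(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin / 8 * bus_bits  # GB/s

print(bandwidth(16, 192))  # 6700 XT: 384.0 GB/s
print(bandwidth(14, 256))  # 3070: 448.0 GB/s
```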
 
AMD is using GDDR6, which keeps the costs low. Bandwidth of both 9070 XT and non-XT is at 640 GB/s using 20 Gbps memory on 256-bit, ahead of the 7800 XT that used 19.5 Gbps memory for 624 GB/s, and the 7900 GRE that used 18 Gbps for 576 GB/s. I think one of their slides for RDNA4 claimed better memory compression, but that could be a 1% improvement for all we know.

On the other hand, Nvidia's massive GDDR7 bandwidth increases for 50 series and the 5090 bus width increase did almost nothing at all for gaming performance. It made the cards more expensive to make. The 5070 has 33.3% more bandwidth than the 4070 Super, but the same performance.

I think the 9070 review embargo ends in an hour.
I see. I guess we'll see how cheaper memory shakes out in the reviews. Might be good, hopefully. Probably decent for AI.
What matters are two things: the speed of the memory and the width of the bus. The overall bandwidth is speed x width. If I have 2 GHz memory on a 128-bit bus versus 1 GHz memory on a 256-bit bus, it makes no real difference, because the overall bandwidth is the same.

I have a Radeon 6700 XT. It has 16 Gbps (2 GB/s per pin) memory on a 192-bit bus, providing 2 x 192 = 384 GB/s of bandwidth. AMD positioned it against the 3070, which had 14 Gbps (1.75 GB/s per pin) memory on a 256-bit bus, providing 1.75 x 256 = 448 GB/s.
Interesting. Didn't know that.
 
AMD Radeon RX 9070 (XT) “RDNA4” Graphics Cards Review Roundup

5070 Ti = 5.9% faster than 9070 XT (1440p raster)
9070 XT = 1.7% faster than 7900 XT (1440p raster)

5070 Ti = 1.4% faster (75 vs. 74 FPS) than 9070 XT (4K raster)
9070 XT = 8.8% faster than 7900 XT (4K raster)

5070 Ti = 26% faster than 9070 XT (1440p raytracing)
9070 XT = 26% faster than 7900 XTX (1440p raytracing)

5070 Ti = 34% faster than 9070 XT (4K raytracing)
9070 XT = 27% faster than 7900 XTX (4K raytracing)


Phoronix: AMD Radeon RX 9070 + RX 9070 XT Linux Performance
9070 XT is around 7900 XT, 9070 is around 3090, results change after dropping Vulkan RT tests that didn't run on the 5700 XT.
 
Looks like a solid GPU. If it actually stays in stock at the lower prices it should be a popular buy, especially considering the 5070 Ti's "real" price.
 
With less RAM and for basically the same price? Quite the value proposition.
Better at 4K in their results at least, better at raytracing, and at $600 MSRP or lower it's cheaper than the $900-MSRP 7900 XT basically ever was (I saw $620 in Nov 2024 in a Slickdeals search). But if the price creeps up instead of down over the coming months, it's definitely not an impressive value. I'm also surprised that power efficiency looks poor compared to the 7900 XT, but I haven't dug into it yet (there could be overclocked models being reviewed).

AMDiscounts tend to happen, so they could be $500 (new) eventually. 9070 non-XT requires discounts and a bigger price gap with the 9070 XT to make it interesting at all.
Probably Gordon's last on-camera appearance, about a month before his death. Genuinely heartbreaking to suddenly see what state he was in. :(
>Employer forces you to soyface at gaming CPU
>You die
 
Better at 4K in their results at least, better at raytracing, and at $600 MSRP or lower it's cheaper than the $900-MSRP 7900 XT basically ever was (I saw $620 in Nov 2024 in a Slickdeals search). But if the price creeps up instead of down over the coming months, it's definitely not an impressive value. I'm also surprised that power efficiency looks poor compared to the 7900 XT, but I haven't dug into it yet (there could be overclocked models being reviewed).

AMDiscounts tend to happen, so they could be $500 (new) eventually. 9070 non-XT requires discounts and a bigger price gap with the 9070 XT to make it interesting at all.
Don’t forget exclusive access to the supposedly greatly improved FSR4 upscaler… like it or not having access to a decent upscaler is important for modern gaming.
 
Don’t forget exclusive access to the supposedly greatly improved FSR4 upscaler… like it or not having access to a decent upscaler is important for modern gaming.
I thought about dropping that in there, but time will tell if people actually care for it, and it's possible that FSR4 AI upscaling is coming to RDNA3 + RDNA3.5, just not at launch.

It would be a bit crazy for FSR4 to not come to RDNA3.5 because that's all they will be selling in APUs (and the many gaming handhelds) for the next year, possibly longer.


9070 XT = 11.3% faster than 7900 XT (4K Ultra), 7.4% at 1440p Ultra, 5.1% at 1080p Ultra, 5.9% at 1080p Medium
 
The issue I have with it (and especially the issue back in the day) was Microsoft pushing it so you could effectively only boot Windows on the new PC you just bought.
The way it's designed where Microsoft has full access to sign whatever they please as a trusted entity while everyone else has to suffer is where the system falls down in my opinion. Linux was a total nightmare initially but with the UEFI shim, things are much easier and you can use Secure Boot without any dramas. What the shim does is essentially replace Secure Boot with its own verification that uses an embedded certificate shipped with your distro.

So the idea is that Red Hat ships RHEL with their version of shim that's Microsoft signed for Secure Boot and has the Red Hat certificate embedded so it can do its own copycat version of Secure Boot for executing the 2nd stage loader. It really shows how retarded the decision to implement centralized PKI for embedded firmware was for everyone except Microsoft.

My experience working with UEFI on servers lately has been around public cloud and it's just trash. The providers support it, but third party vendor support for basic shit like backups is just not there or unbelievably poorly tested. In the end I was forced to cobble together this horrible process of de-UEFIng VMs by converting disks back to MBR before migrating to public cloud as the vendor I'm stuck with has literally hardcoded legacy boot into their migration process.

Things are beginning to improve and that's only because Windows Server 2025 requires UEFI, which has been a huge kick up the ass for vendors and SysAdmins.
 
Sounds like I could line up the 9060 XT as a potential upgrade for my 3060 12GB in the medium term. Still happy with it for now though, since I don't game beyond 1080p and it has yet to choke on any of the near-modern games I've thrown at it (the most modern being BG3, which surprisingly ran without a hitch, even in the infamous Act 3).
 
AMD Radeon RX 9070 (XT) “RDNA4” Graphics Cards Review Roundup

5070 Ti = 5.9% faster than 9070 XT (1440p raster)
9070 XT = 1.7% faster than 7900 XT (1440p raster)

5070 Ti = 1.4% faster (75 vs. 74 FPS) than 9070 XT (4K raster)
9070 XT = 8.8% faster than 7900 XT (4K raster)

5070 Ti = 26% faster than 9070 XT (1440p raytracing)
9070 XT = 26% faster than 7900 XTX (1440p raytracing)

5070 Ti = 34% faster than 9070 XT (4K raytracing)
9070 XT = 27% faster than 7900 XTX (4K raytracing)


Phoronix: AMD Radeon RX 9070 + RX 9070 XT Linux Performance
9070 XT is around 7900 XT, 9070 is around 3090, results change after dropping Vulkan RT tests that didn't run on the 5700 XT.
Those ray tracing numbers are thrown off by path tracing not working well on the 9070 XT. Throwing out path tracing, it's something like a 10-15% RT lead for the 5070 Ti.
TL;DR: the 9070 XT is perfect for 1440p ray tracing, but AMD can't handle path tracing.
For 1440p gaming, get a 9070 XT.
 