GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The CPU draws 300W, the GPU draws 450W, so your best bet is to get a 1500W+ PSU, and even then, after one or two more generations, the power draw will be over 2000W, because Intel, AMD and Nvidia have hit a hard wall where they can't get more performance out of the silicon without increasing the power draw.
Thankfully we can limit power draw in BIOS/UEFI now. I run my 5950X at 95W and it feels just as fast; a few seconds here and there aren't that consequential.
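Back-of-envelope, the big PSU number is less about sustained draw and more about transient spikes; recent high-end GPUs can briefly pull well past their rated board power. A rough sketch in Python, where the 2x spike factor and the ~75W "everything else" budget are my own ballpark assumptions, not measured values:

```python
# Rough PSU sizing from the figures quoted above. The 2x GPU transient
# factor and the 75 W budget for drives/fans/board are assumptions;
# actual spikes vary a lot by card.
cpu_w, gpu_w, rest_w = 300, 450, 75

sustained = cpu_w + gpu_w + rest_w      # steady-state load
transient = cpu_w + 2 * gpu_w + rest_w  # worst-case momentary spike

print(f"sustained load:       {sustained} W")   # 825 W
print(f"worst-case transient: {transient} W")   # 1275 W
```

Which is roughly why the 1500W recommendation exists: it's headroom against spikes tripping overcurrent protection, not steady load.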

I'd be interested in separate power limits for the E- and P-cores. Letting the P-cores run as fast as possible while limiting the E-cores to a total of 10-15W would make for a good balance, especially when combined with pinning processes, or some setting that lets me favour E-cores.
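On Linux you can at least fake the pinning half of that today with CPU affinity. Here's a sketch using os.sched_setaffinity; the core numbering is an assumption (on an 8P+4E Alder Lake part the P-core threads usually enumerate as logical CPUs 0-15 and the E-cores as 16-19; check `lscpu --all --extended` for your chip):

```python
import os

# Assumed topology for an 8P+4E Alder Lake part (verify with lscpu):
# logical CPUs 0-15 = P-cores (with HT), 16-19 = E-cores.
E_CORES = set(range(16, 20))

# Pin a background process to the E-cores so it stays off the P-cores.
# 12345 is a placeholder PID; 0 would mean the current process.
pid = 12345
os.sched_setaffinity(pid, E_CORES)
print(os.sched_getaffinity(pid))  # -> {16, 17, 18, 19}
```

It's not a power limit, but it keeps the noisy stuff off the fast cores.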
 
So I met a guy who seemed very knowledgeable about the minutiae of GPUs, and he said that if more than one display is plugged in, the card won't downclock to save power. Anyone know anything about that?
It's dependent on the specific card and drivers. An old Radeon 270 I had would do that and heat up the whole room. Anything more modern should be smarter about it.
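Easy enough to check on an Nvidia card if nvidia-smi is installed; something like this (the query fields should all be supported, but output formatting varies by driver):

```python
import subprocess, time

# Poll idle clocks and power state. A card that downclocks properly
# at idle should sit in P8 with low memory clocks; multi-monitor
# setups sometimes keep the memory clock pinned high instead.
QUERY = "clocks.gr,clocks.mem,power.draw,pstate"
for _ in range(6):  # sample for ~30 seconds
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout.strip())
    time.sleep(5)
```

If the memory clock never drops at idle with both monitors attached, you've reproduced exactly what that guy described.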
 
The CPU draws 300W, the GPU draws 450W, so your best bet is to get a 1500W+ PSU, and even then, after one or two more generations, the power draw will be over 2000W, because Intel, AMD and Nvidia have hit a hard wall where they can't get more performance out of the silicon without increasing the power draw.

This is not really true.

A smaller process node means that you can fit more transistors on a chip and, due to lower resistance, draw more power through them, which results in more heat to dissipate. So TDP goes up every time a new process node hits production quality, because customers want the most powerful chips they can get at a given price point and traditionally don't really care about power draw. Some max TDPs from the Xeon line:

Ice Lake (10 nm): 275 W
Cascade Lake (14 nm): 205 W*
Haswell (22 nm): 165 W
Sandy Bridge (32 nm): 150 W

AMD and Intel are both releasing chips on newer processes this year. Alder Lake is on Intel's 10nm-class process, and Ryzen 7000 (Zen 4) is on TSMC's new 5nm process. That's why we're seeing power draws go up this year. Now, they could keep the TDPs constant, and they'd still be more powerful chips than the previous generation. After all, my 65W i9-12900 (10 nm) is a lot more powerful than the 65W i9-11900 (14 nm), and it curbstomps the 65W i7-2600S (32 nm). But the vast majority of buyers just care about getting the best thing, not about heat dissipation. Cooler chips are for laptops and mini PCs.

*Cascade Lake AP is actually just two Cascade Lake SP chips stuck together in a gigantic socket and has 2x the TDP, so I ignored that
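To make the point concrete: divide those max TDPs by each family's top core count (40/28/18/8 if I remember the lines right) and power per core has actually been falling even as package TDP climbs:

```python
# Max TDP vs. top core count for the Xeon families listed above.
# Core counts are from memory (Ice Lake-SP 40, Cascade Lake-SP 28,
# Haswell 18, Sandy Bridge 8); double-check before quoting me.
xeons = [
    ("Ice Lake (10 nm)",     275, 40),
    ("Cascade Lake (14 nm)", 205, 28),
    ("Haswell (22 nm)",      165, 18),
    ("Sandy Bridge (32 nm)", 150, 8),
]
for name, tdp, cores in xeons:
    print(f"{name:22} {tdp} W / {cores:2} cores = {tdp / cores:5.1f} W per core")
```

So the chips run hotter overall because vendors spend the efficiency gains on more cores, not because the transistors got worse.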
 
This is monopolization of a market niche; unless Intel burns money for a few more years and brings real competition with its billions, AMD will follow Nvidia's lead, since they don't have either company's deep pockets.
does amd even sell cards? prolly never seen one in the wild, only AIBs.

also don't think they'll copy that idea, if nvidia does the flagships themselves there's no reason to not buy the low-end stuff from them too, unless it's way more expensive - which it would have to be to give AIBs a space, otherwise there's no point for them. meanwhile amd has way more room where even AIBs can make a profit.
 
does amd even sell cards? prolly never seen one in the wild, only AIBs.

Yes, and they even have their own unique design


So what's the opinion on Zotac? $341 is the lowest I've seen a 3060 12GB go for. My initial plan was to wait until Black Friday in hopes of getting one at retail price.
I've always considered Zotac to be middle of the pack for Nvidia AIB cards, but I've never had one of theirs die on me. Unless you're worried about getting a brick, you can probably get the same card for ~$260 used off eBay. I know some people have a tendency to go "haha, miner's used goods" at that kind of thing, but I've set friends up with used cards off there ranging from a 1070 Ti to a 2080 and never had issues with them either.
 
So we're a couple of weeks away from RDNA3 announcements: https://www.anandtech.com/show/1762...-rdna-3-gpu-livestream-event-for-november-3rd

Anyone thinking of getting a current-gen Nvidia card might want to hold off a little, unless they literally don't have a card at all. There's no confirmation on availability, but the current vogue seems to be to have things ready to go: Nvidia's lineup went on sale the day after its announcement, and AMD's CPU range did the same. I'd guess they want to get out ahead of Black Friday - they'd be insane not to.

Depending on reviews and price, I might be looking at a Radeon 7700. I've been looking to get a new card for a long time but didn't want to jump in on the wrong side of the tech chasm. These should hopefully be pretty power-efficient. My guess is they'll have lower power requirements for equivalent performance and cost around £50-£100 less for the same. Anything beyond that is a bonus.
 
Interestingly, you can downclock/undervolt both the newest Intel and AMD chips to 65 W if you're so inclined (like if you want to build a mini PC). You will still be playing with reasonably powerful hardware if you do.

I was curious, so I looked it up - a Pentium 100 MHz had a TDP of just 11W. CPU heat sinks in that era typically didn't even have fans on them. The free stream from the case fan flowing over the fins of the heat sink was enough. If you're old enough to remember, video game consoles didn't have fans until the PS2 era. I remember when video cards needing fans and getting their own PSU connectors was a big deal...a Voodoo 2 didn't need that.

A Ryzen 9 7950X is a 16-core CPU with a maximum package power (PPT) of 230W and an all-core clock of about 5.06 GHz at that limit. At 125W it drops to 4.5 GHz; at 65W it's at 3.2 GHz. We can estimate from this that to hit 11W, the clock speed would probably drop to around 1.0 GHz...still a hell of a lot more powerful than a P100.
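For what it's worth, a crude model backs that estimate up. Assume dynamic power scales roughly with f³ plus a fixed static floor, fit it to the two end points, and 11W lands almost exactly at 1 GHz (toy model, not a real DVFS curve):

```python
# Fit P = a*f^3 + b to the 230 W / 5.06 GHz and 65 W / 3.2 GHz points
# quoted above, then invert it for an 11 W budget. Purely illustrative.
(f1, p1), (f2, p2) = (5.06, 230.0), (3.2, 65.0)
a = (p1 - p2) / (f1**3 - f2**3)  # dynamic coefficient, W per GHz^3
b = p1 - a * f1**3               # static/idle floor, W

def clock_at(power_w):
    return ((power_w - b) / a) ** (1 / 3)

print(f"model: P = {a:.2f} * f^3 + {b:.1f}")
print(f"125 W -> {clock_at(125):.2f} GHz  (actual: ~4.5)")
print(f" 11 W -> {clock_at(11):.2f} GHz")
```

The 125W midpoint comes out about 10% low, so treat the 11W number as a ballpark, but ~1 GHz of Zen 4 would still embarrass a P100.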
 
So we're a couple of weeks away from RDNA3 announcements: https://www.anandtech.com/show/1762...-rdna-3-gpu-livestream-event-for-november-3rd

Anyone thinking of getting a current-gen Nvidia card might want to hold off a little, unless they literally don't have a card at all. There's no confirmation on availability, but the current vogue seems to be to have things ready to go: Nvidia's lineup went on sale the day after its announcement, and AMD's CPU range did the same. I'd guess they want to get out ahead of Black Friday - they'd be insane not to.

Depending on reviews and price, I might be looking at a Radeon 7700. I've been looking to get a new card for a long time but didn't want to jump in on the wrong side of the tech chasm. These should hopefully be pretty power-efficient. My guess is they'll have lower power requirements for equivalent performance and cost around £50-£100 less for the same. Anything beyond that is a bonus.
It's going to be an interesting launch; the first on-package multi-chip GPU designs may cause headaches in some games. And the leaks don't add up at all, especially the 3+ GHz average clock speeds.

I'm waiting on whatever Navi 33 ends up being (likely the 7600 XT); a monolithic die capable of hitting 3 GHz on average and getting close to a 6800 XT is very attractive.
 
It's going to be an interesting launch; the first on-package multi-chip GPU designs may cause headaches in some games. And the leaks don't add up at all, especially the 3+ GHz average clock speeds.

I'm waiting on whatever Navi 33 ends up being (likely the 7600 XT); a monolithic die capable of hitting 3 GHz on average and getting close to a 6800 XT is very attractive.
I'm curious to see what the RAM options are on these things. I'm on 8GB right now, and frankly it feels wrong to upgrade to anything less than 12GB; 16GB feels like the minimum. VRAM is expensive, though, because it's always the fastest available, so I don't know which cards will be in budget.

3 GHz sounds a bit high to me. It's conceivable, but they'd have to push the power a lot to reach it.
 
Yes, and they even have their own unique design
Maybe that's why I've never seen them over in yuropoor (granted, I wasn't looking that hard; I usually buy Sapphire anyway).

Depending on reviews and price, I might be looking at a Radeon 7700. I've been looking to get a new card for a long time but didn't want to jump in on the wrong side of the tech chasm. These should hopefully be pretty power-efficient. My guess is they'll have lower power requirements for equivalent performance and cost around £50-£100 less for the same. Anything beyond that is a bonus.
would be hilarious if amd goes full fuck it mode and dumps them for 1/3 less, although I assume their costs don't allow that and they want to get the best margin they can while nvidia is crawling up their own ass.
 
would be hilarious if amd goes full fuck it mode and dumps them for 1/3 less, although I assume their costs don't allow that and they want to get the best margin they can while nvidia is crawling up their own ass.
Their costs almost certainly let them undercut Nvidia quite a lot if they want to; chiplet designs push yields stratospheric. I'd guarantee AMD could undercut Nvidia to a whopping degree and still make a profit.

But they won't.
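For anyone wondering why chiplets change the math that much: with the standard Poisson yield model, yield falls exponentially with die area, so a pile of small dies wastes far less silicon than one big one. Sketch below; the defect density and die sizes are illustrative numbers, not AMD's actual figures:

```python
import math

# Poisson yield model: yield = exp(-D0 * A), with D0 in defects/cm^2
# and A in cm^2. D0 = 0.1 is an assumed mature-node ballpark.
D0 = 0.1

def die_yield(area_mm2, d0=D0):
    return math.exp(-d0 * area_mm2 / 100.0)  # mm^2 -> cm^2

print(f"600 mm^2 monolithic die: {die_yield(600):.0%} good")  # ~55%
print(f" 75 mm^2 chiplet:        {die_yield(75):.0%} good")   # ~93%
```

A handful of 93%-yielding chiplets you can bin independently beats one 55%-yielding monster, and that's before counting RDNA3 putting its cache dies on a cheaper node.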
 

If this rumor is true, they could try to bludgeon the RTX 4080 16 GB with a similarly priced 7900 XT with 20 GB of VRAM, then use the 7950 XT with 24 GB against the 4090. There's also the rumor that they can do either 96 or 192 MB of Infinity Cache, the second configuration having the dies stacked on top of each other.

Too bad they are shit for Null's new Machine Learning Age.
 

If this rumor is true, they could try to bludgeon the RTX 4080 16 GB with a similarly priced 7900 XT with 20 GB of VRAM, then use the 7950 XT with 24 GB against the 4090. There's also the rumor that they can do either 96 or 192 MB of Infinity Cache, the second configuration having the dies stacked on top of each other.

Too bad they are shit for Null's new Machine Learning Age.
Yeah, that's the big downside. 20GB of VRAM (if true) isn't going to matter in most benchmarks any time soon, but it WOULD make a very nice difference for certain ML or rendering workloads. Except Nvidia is dominant in those markets.
 
I was curious, so I looked it up - a Pentium 100 MHz had a TDP of just 11W. CPU heat sinks in that era typically didn't even have fans on them. The free stream from the case fan flowing over the fins of the heat sink was enough. If you're old enough to remember, video game consoles didn't have fans until the PS2 era. I remember when video cards needing fans and getting their own PSU connectors was a big deal...a Voodoo 2 didn't need that.
What's even more absurd is how many chips back then had no heat spreader like today's; they didn't use ceramic packaging either, just plastic packaging with no heatsink or fan. The Nintendo 64, PlayStation 1, Voodoo 1 and 2 and so on were all plastic, which is a fantastic thermal conductor, as we all know.
When things started requiring heat sinks, they didn't use brackets to hold them in place; that wasn't necessary because they used glue instead of thermal paste. And they hadn't abandoned plastic either, so you might get a heat sink glued to a chip that relied on plastic to transfer the heat. But it worked: pry that tiny, tiny heat sink off a TNT and it would overheat pretty fast.
 