GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Sounds like I could line up the 9060 XT as a potential upgrade for my 3060 12GB in the medium term. Still happy with it for now though, since I don't game beyond 1080p and it has yet to choke on any of the near-modern games I've thrown at it (the most modern being BG3, which surprisingly ran without a hitch even in the infamous Act 3).
I predict the 9060 XT will be somewhere between 30-50% faster than the 3060 12 GB and short of the 7700 XT (about 60% faster than the 3060), but with 16 GB of VRAM and FSR4 for sure.
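A quick sanity check on that prediction's internal consistency; the percentages are the post's guesses, not benchmark data:

```python
# Back-of-envelope check on the relative-performance ladder above.
# All multipliers are assumptions taken from the post, not measurements.
base = 1.00                 # RTX 3060 12 GB as the baseline
rx7700xt = base * 1.60      # ~60% faster than the 3060, per the post
guess_low = base * 1.30     # predicted lower bound for the 9060 XT
guess_high = base * 1.50    # predicted upper bound

# The prediction only holds together if the whole guessed range
# sits below the 7700 XT.
assert guess_high < rx7700xt
print(f"9060 XT guess: {guess_low:.2f}x-{guess_high:.2f}x vs 3060, "
      f"7700 XT at {rx7700xt:.2f}x")
```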

Lossless Scaling 3.1 adds ‘Adaptive Frame Generation’
 
I will never care about raytracing in games. I remember when the hype was that it promised more realistic lighting and more efficient calculation. What a crock of shit. Actual garbage technology, and I bet you can do the same lighting at lower cost with fucking light probes.

These cards might be perfect for me.
 
> I will never care about raytracing in games. I remember when the hype was that it promised more realistic lighting and more efficient calculation. What a crock of shit. Actual garbage technology, and I bet you can do the same lighting at lower cost with fucking light probes.
>
> These cards might be perfect for me.
It's because devs had to use a hybrid approach of layering RT over raster. Now we're starting to see RT-only games, and their requirements are reasonable (Doom: The Dark Ages asks for an RTX 2060 for 1080p@60). You sound like every gamer who rages about new tech that later becomes standard.
 
> It's because devs had to use a hybrid approach of layering RT over raster. Now we're starting to see RT-only games, and their requirements are reasonable (Doom: The Dark Ages asks for an RTX 2060 for 1080p@60). You sound like every gamer who rages about new tech that later becomes standard.
Or he could be someone that misses when GPUs were affordable and not overbuilt space heaters that seemingly exist to cope with Unreal Engine 5 being a dumpster fire.
 
UK retailer confirms stock of 1,000 units of a single Radeon RX 9070 XT model at MSRP

 
Only five variants of the card I've seen will work with a 750W power supply; I'm hoping I can get one of the models above tomorrow. I called Best Buy today to ask whether they had any information about stock and whether showing up early at the start of the day would be the best option. The guy on the other end of the phone put me on hold for five minutes and then had nothing useful to tell me.
 
It's not really that they overestimate the wattage required; it's that they take transients into account. Both CPUs and GPUs will briefly spike their current draw to well over twice what they'll average over time, and if your power supply can't keep up, your system may become unstable or outright crash, even though the power supply is theoretically sufficient. An overspecced power supply means you'll have more of a buffer in the form of larger capacitors, contributing to stability. As a bonus, its cooling is more likely to keep up with loads without running the fans at high speed, keeping noise down.
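The sizing logic above can be sketched as a rough calculation. The 2x transient multiplier and the example wattages are illustrative assumptions, not measured figures for any particular card:

```python
# Rough PSU sizing sketch for the transient problem described above.
# transient_factor=2.0 follows the "over twice the average" claim;
# psu_load_target keeps steady headroom below the PSU's rated output.
def recommended_psu_watts(gpu_tdp, cpu_tdp, rest_of_system=100,
                          transient_factor=2.0, psu_load_target=0.8):
    """Size a PSU so a brief GPU spike plus steady CPU/system draw
    stays under a comfortable fraction of the PSU's rating."""
    worst_case = gpu_tdp * transient_factor + cpu_tdp + rest_of_system
    return worst_case / psu_load_target

# Example: 300W GPU, 150W CPU, ~100W for everything else.
# Comes out a bit over 1060W, i.e. shop in the ~1000W class.
print(recommended_psu_watts(300, 150))
```

The point is just that the GPU term gets doubled while everything else doesn't, which is why vendor recommendations look inflated next to a wattmeter reading.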
 
Don't companies overestimate the PSU wattage needed?

Tom's result for 9070 XT vs. 5070 Ti seems like an outlier.

The entire system with a 9070 XT shouldn't be using much more than 450 Watts, or 500 Watts for overclocked models.
Assuming the second power consumption chart shows just GPU power and not whole-system power, those 445W on the 3080 are far above its 320W TDP. If the GPU itself can peak at 445W when it's rated at 320W, and the PSU recommendations also need to cover any potential CPU and peripheral combo, then those recommendations carry a large safety margin for a reason. Maybe someone wants to pair it with a top-end CPU that'll pull a lot of juice as well. Most consumers are idiots, cards like to pull a little more than what's on the spec sheet, people might not understand that a CPU's turbo boost TDP is usually what it'll actually run at, and so on and so forth.

In short, they're overestimated so that people who don't know jackshit get enough power overhead to not end up tripping the PSU. Hell, I went with a 1000W PSU just to make sure I have a big safety margin and some upgrade headroom, even though 700-800W would suffice for my current build. Though really, it was because I needed to upgrade my PSU anyway, and the price difference between the 850W and 1000W models I was looking at was so small I went with the beefier one.
 
> It's because devs had to use a hybrid approach of layering RT over raster. Now we're starting to see RT-only games, and their requirements are reasonable (Doom: The Dark Ages asks for an RTX 2060 for 1080p@60). You sound like every gamer who rages about new tech that later becomes standard.
lol, 60fps? Are you still in 2012?
 
> It's not really that they overestimate the wattage required; it's that they take transients into account. Both CPUs and GPUs will briefly spike their current draw to well over twice what they'll average over time, and if your power supply can't keep up, your system may become unstable or outright crash, even though the power supply is theoretically sufficient. An overspecced power supply means you'll have more of a buffer in the form of larger capacitors, contributing to stability. As a bonus, its cooling is more likely to keep up with loads without running the fans at high speed, keeping noise down.
This. The transient power draw spikes are the thing. When properly estimated and planned for, everything's fine. Otherwise, you get lots of problems (obviously) of varying seriousness. Best case, your PSU's overload protection trips and the PC shuts off. Worst case, things start melting and releasing the Mysterious Blue Smoke™. We saw a lot of that with the higher-end 30- and 40-series Nvidia cards for a while, because the green company was putting out inaccurate information about power transients (underestimating them). Their stupid 12VHPWR connectors are still melting sometimes because the cards try to pull more power than those cables can feed without overheating.
 
I am really considering buying a 4060 before stock dries up, even if I have no real immediate use for it. I'm planning to build a small form factor PC in the future, and it's really tempting at 300 bucks.
Considering I need CUDA, AMD is a no-go, and the 5000 series is so lackluster I'm not considering the 5060.
How long does it generally take for stuff like this to go out of stock? Any rough predictions?
 
> I am really considering buying a 4060 before stock dries up, even if I have no real immediate use for it. I'm planning to build a small form factor PC in the future, and it's really tempting at 300 bucks.
> Considering I need CUDA, AMD is a no-go, and the 5000 series is so lackluster I'm not considering the 5060.
> How long does it generally take for stuff like this to go out of stock? Any rough predictions?
I mean, if this generation is anything to go by, the 5060 will be priced exactly the same as the current 4060. And I doubt scalpers will be interested in cards like that with only 8GB of VRAM; they just aren't in as much demand for most people.

But the stock of 4060s will dwindle pretty fast, because I believe they aren't manufacturing them anymore.
 
Seems like the 9070 XT is a decent card whose price point is only enticing because of the current year.

Thinking of buying one to upgrade my RTX 3070, but I'm still a salty bitch over it, because the only reason I need to upgrade is that even at 1440p the VRAM is hurting. I also don't like how AMD cards are always less power efficient.
 
> Seems like the 9070 XT is a decent card whose price point is only enticing because of the current year.
>
> Thinking of buying one to upgrade my RTX 3070, but I'm still a salty bitch over it, because the only reason I need to upgrade is that even at 1440p the VRAM is hurting.
Yep, that 8GB was a real ball buster.
 
> I will never care about raytracing in games. I remember when the hype was that it promised more realistic lighting and more efficient calculation. What a crock of shit. Actual garbage technology, and I bet you can do the same lighting at lower cost with fucking light probes.
>
> These cards might be perfect for me.
If you don't care about raytracing, then go buy a 7800 XT or something. The only improvement these cards bring is raytracing.
 
> If you don't care about raytracing, then go buy a 7800 XT or something. The only improvement these cards bring is raytracing.
The 7800 XT is going for $540 new (US). The 9070 XT at MSRP is 42-50% faster (Tom's 1440p/4K raster) for 11% more money. The card to compare it to is the 7900 XT, which is going for over $600... on the used market.

If you can't beat 'em, cut 'em.
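The value argument above works out to roughly a 28% perf-per-dollar edge at the low end of the quoted range. A quick check, where the prices and speedup are snapshots taken from the post (the ~$600 MSRP figure is an assumption implied by "11% more" over $540):

```python
# Perf-per-dollar comparison using the post's numbers.
price_7800xt = 540.0          # new, US, per the post
price_9070xt = 600.0          # assumed MSRP-ish figure (~11% more than $540)
speedup = 1.42                # low end of the quoted 42-50% raster gain

perf_per_dollar_7800 = 1.0 / price_7800xt
perf_per_dollar_9070 = speedup / price_9070xt
ratio = perf_per_dollar_9070 / perf_per_dollar_7800
print(f"{ratio:.3f}x the value")  # about 1.28x at the low end
```

At the high end (1.50x for the same prices) the gap widens further, which is why the 7800 XT only makes sense here if its street price drops.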
 