GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Reviews of the B580 are out. I think we have our 6700 XT/2080 Ti/3060 12GB successor. Between this and a refurbished 2080 Ti, I'm gonna have a hard time picking a GPU for a friend of mine.
If AMD wants to shut down a price/perf argument for the B580, they can do so pretty easily by either dropping 7700 XT prices to below $350, or the 7600 XT 16 GB to match the B580's price (similar perf in Tom's review), or by using RDNA4, where the 8600 XT using the cheap, small Navi 48 die will outperform it. They have to give that 12 or 16 GB VRAM to completely crush it though.

On Nvidia's side... maybe an RTX 4060 shouldn't cost $300‽
 
  • Like
Reactions: Brain Problems
I hope this gives the attention needed to keep the Intel Arc series growing.
Based on the GN review, it seemed they still have work to do with their drivers to improve the frame time pacing compared to nvidia and AMD in a lot of games.

I hope the review channels revisit this card after a few months so we can see how much more performance driver improvements will take these new cards.
 
Based on benchmarks, there really isn't a better GPU in the $230-$250 price band right now. And I will also toss in that XeSS is much closer to DLSS in terms of image quality than FSR is. I'm going to guess that the card's higher bandwidth is why it eclipses the 6700 XT in the 1440p benchmarks.
 
Based on benchmarks, there really isn't a better GPU in the $230-$250 price band right now. And I will also toss in that XeSS is much closer to DLSS in terms of image quality than FSR is. I'm going to guess that the card's higher bandwidth is why it eclipses the 6700 XT in the 1440p benchmarks.
A580 = 512.0 GB/s
B580 = 456.0 GB/s

Bandwidth actually regressed from the previous card.


Raster performance is nothing special against a 7600 XT with only 288 GB/s. The 6700 XT has 384 GB/s.
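Those bandwidth figures fall straight out of bus width times effective memory speed; a quick sanity check in Python (bus widths and GDDR6 speeds per the cards' public spec sheets):

```python
# Memory bandwidth (GB/s) = bus width in bits / 8 (bits -> bytes) * effective speed (Gbps)
def bandwidth_gbs(bus_width_bits: int, speed_gbps: float) -> float:
    return bus_width_bits / 8 * speed_gbps

cards = {
    "A580":    (256, 16.0),  # 256-bit GDDR6 @ 16 Gbps
    "B580":    (192, 19.0),  # 192-bit GDDR6 @ 19 Gbps
    "7600 XT": (128, 18.0),  # 128-bit GDDR6 @ 18 Gbps
    "6700 XT": (192, 16.0),  # 192-bit GDDR6 @ 16 Gbps
}
for name, (bus, speed) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, speed):.0f} GB/s")
# A580: 512 GB/s, B580: 456 GB/s, 7600 XT: 288 GB/s, 6700 XT: 384 GB/s
```

(Infinity Cache on the RDNA cards isn't modeled here, which is part of why raw GB/s alone doesn't predict raster results.)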

HUB review with 6700 XT data (raster):
[HUB 1080p and 1440p raster charts]
2-5% faster than a 6700 XT at best. Crushed by the 7700 XT, which needs to be priced lower, but more likely will be replaced by RDNA4.

Battlemage has a raytracing advantage, not that performance is very high on these low/mid cards. RDNA4 should substantially improve raytracing, probably in line with what was seen in the PS5 Pro.

B580 (flagship; forget about a B770) is 272mm^2 on TSMC N5, and is barely better than a 7600 XT, which is 204mm^2 on TSMC N6. Curbstomped by the 7700 XT (~200mm^2 GCD on N5 + ~111mm^2 of MCDs on N6), a cut-down Navi 32 sibling of the 7800 XT. All this shows is that AMD has grown complacent from only competing with Nvidia. RDNA4 will come out, and AMD can flush the B580 turd with whatever they choose to do with the 8600 XT on Navi 48.



HDMI 2.2 is set to debut at CES 2025 — the new standard brings higher resolutions, refresh rates, and bandwidth
Some speculation suggests that HDMI 2.2 might be compatible with Nvidia’s upcoming GeForce RTX 50-series and AMD’s Radeon RX 8000 series GPUs. Both companies have scheduled CES press events on January 6, coinciding with the HDMI Forum’s announcement. If confirmed, this would align the new HDMI standard with the latest DisplayPort 2.1 technologies, offering consumers expanded options for ultra-high-definition media and gaming experiences.
It's over before it even began.
 
Last edited:
So, if I'm looking for a GPU and I can afford either the Intel Arc B580 or the RTX 4060, which should I get? I planned on the former, but after an avalanche of good reviews for it, I wanted a second opinion from people here. Or would a better option be to wait for the new AMD release rumored for early 2025?

I'm just looking for something to last me a handful of years before needing an upgrade, but that also wouldn't be immediately worthless within the next year or so.
 
So, if I'm looking for a GPU and I can afford either the Intel Arc B580 or the RTX 4060, which should I get? I planned on the former, but after an avalanche of good reviews for it, I wanted a second opinion from people here. Or would a better option be to wait for the new AMD release rumored for early 2025?

I'm just looking for something to last me a handful of years before needing an upgrade, but that also wouldn't be immediately worthless within the next year or so.
I would wait to see what AMD is offering first to be honest.

Also, to truly determine what GPU will fit your needs: what resolution do you play at? Does ray tracing matter to you? Do you play mostly low-demand esports titles, or a lot of new releases/Unreal 5 games?

Knowing some of that information we can give a more informed recommendation.
 
Also, to truly determine what GPU will fit your needs: what resolution do you play at? Does ray tracing matter to you? Do you play mostly low-demand esports titles, or a lot of new releases/Unreal 5 games?
I'm thinking of trying 1440p (1080p is what I play at currently). Ray tracing does matter (I want to try Portal RTX).

I don't play a lot of new releases. My two "newest" titles are BG3 and Hogwarts Legacy, and both were rather the exceptions. That said, I'd like the ability to try out new releases.

If it matters, I plan on getting a Ryzen 5 9600X as the CPU.
 
I'm thinking of trying 1440p (1080p is what I play at currently). Ray tracing does matter (I want to try Portal RTX).

I don't play a lot of new releases. My two "newest" titles are BG3 and Hogwarts Legacy, and both were rather the exceptions. That said, I'd like the ability to try out new releases.

If it matters, I plan on getting a Ryzen 5 9600X as the CPU.
The new Indiana Jones title requires ray tracing, and that's probably going to be the case moving forward for new titles, so I would for sure wait: both Nvidia and AMD are launching new GPUs next month, and AMD's GPUs apparently have improved ray tracing. I do have a question: are you currently on an AM4 CPU? If you are, I would recommend upgrading to a 5700X3D instead of the 9600X.
 
  • Like
Reactions: Brain Problems
I'm thinking of trying 1440p (1080p is what I play at currently). Ray tracing does matter (I want to try Portal RTX).

I don't play a lot of new releases. My two "newest" titles are BG3 and Hogwarts Legacy, and both were rather the exceptions. That said, I'd like the ability to try out new releases.

If it matters, I plan on getting a Ryzen 5 9600X as the CPU.
Then I would definitely wait and see what the mid-tier offering from AMD is going to be, and see what the 5070 is priced at. Regardless, the 4070 Super tier of performance is probably what you are looking for, and Nvidia is the best at RT.
 
  • Like
Reactions: Roy Earle
I do have a question: are you currently on an AM4 CPU?
I am not. I'm currently on an extremely shitty and old Intel CPU that I plan to replace alongside the motherboard. So I'm planning on getting an AM5 motherboard and jumping straight to a 9600X.
Then I would definitely wait and see what the mid-tier offering from AMD is going to be, and see what the 5070 is priced at.
Are both ~January releases or Q1 releases? Because if it's January, I'm good to wait, but otherwise I want to upgrade ASAP (been running this shitty setup for too many years, and it's quite literally breaking down).

Regardless, the 4070 Super tier of performance is probably what you are looking for, and Nvidia is the best at RT.
Right, but if I wanted something a bit cheaper than that, and circling back to the Intel Arc B580 or RTX 4060/4060 Ti: would either last me 3-5 years without me being super into new releases, or would it quickly become worthless? Also, another thing with the 4070 Super is that I'd have to up my CPU as well, wouldn't I? The card would be substantially more powerful than the processor.
 
Right, but if I wanted something a bit cheaper than that, and circling back to the Intel Arc B580 or RTX 4060/4060 Ti: would either last me 3-5 years without me being super into new releases, or would it quickly become worthless? Also, another thing with the 4070 Super is that I'd have to up my CPU as well, wouldn't I? The card would be substantially more powerful than the processor.
I have a 4070 Super paired with an i9 12900K. The 4070 Super is basically a 3080. It kicks ass. If your CPU is in the 9000 series, that is overkill tbh; I'm having no issues on a CPU now three generations behind.
 
  • Informative
Reactions: Useful_Mistake
Right, but if I wanted something a bit cheaper than that, and circling back to the Intel Arc B580 or RTX 4060/4060 Ti: would either last me 3-5 years without me being super into new releases, or would it quickly become worthless?
These cards will run any new release for the next 10 years just fine, as long as you don't crank the graphics up to maximum settings.
Also, another thing with the 4070 Super is that I'd have to up my CPU as well, wouldn't I? The card would be substantially more powerful than the processor.
Games don't really scale that well across cores, and the single-core performance of a midrange CPU is practically identical to the top end.
 
  • Informative
Reactions: Useful_Mistake
Are both ~January releases or Q1 releases? Because if it's January, I'm good to wait, but otherwise I want to upgrade ASAP (been running this shitty setup for too many years, and it's quite literally breaking down).
Yes, the Nvidia 5000 series will be announced and launched at CES in January. AMD's RDNA4 will be as well, so you won't be waiting long.

Right, but if I wanted something a bit cheaper than that, and circling back to the Intel Arc B580 or RTX 4060/4060 Ti: would either last me 3-5 years without me being super into new releases
If you want to game with ray tracing at 1440p (with decent-to-high FPS at higher settings) over the next 3+ years, any card with 8GB of VRAM should not be in consideration. The 4060 Ti 16GB is incredibly overpriced for its relative performance. The new Arc card is interesting, but you'd be running a risk, at the mercy of Intel's still-developing drivers.

Also, another thing with the 4070 Super is that I'd have to up my CPU as well, wouldn't I? The card would be substantially more powerful than the processor.
A 9600X will definitely not bottleneck a 4070 Super. I would recommend getting an X3D variant of the 9600X, but it's not out yet and I don't know if/when that could happen.

Also want to add that when the 5000 series releases, remaining stock of 4000 series cards will likely see a price decrease to some extent.
 
2-5% faster than a 6700 XT at best.
I am on my phone, but I remember seeing some games in the first video that were 10-15% better. Bandwidth becomes a factor once you have a combination of enough frame-sized layers to blend plus textures, which of course varies by game. It isn't bounded by resolution alone.

It has been a very, very long time since I saw anything at this level of detail, but one example is Killzone 2's deferred renderer, which drew I think seven full-screen layers that it combined in different operations. I'm sure modern games have gone even higher than that. The point is that if one game does 4 operations per pixel x 8 passes and another does 2 operations per pixel x 16 passes, the second game will hit bandwidth limits at lower resolutions than the first even though they're doing the same amount of shader work, because each extra pass is another round trip through the framebuffer.
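A toy model of that point, assuming each full-screen pass reads and writes the render target once (the 8-bytes-per-pixel figure and pass counts are made up for illustration):

```python
# Framebuffer traffic scales with pass count, not with ALU ops per pixel.
BYTES_PER_PIXEL_PER_PASS = 8  # assumption: read + write a 4-byte RGBA target each pass

def traffic_gb(width: int, height: int, passes: int,
               bytes_per_pixel=BYTES_PER_PIXEL_PER_PASS) -> float:
    """Full-screen framebuffer traffic per frame, in GB."""
    return width * height * passes * bytes_per_pixel / 1e9

# Same total shader work per pixel (4 ops x 8 passes == 2 ops x 16 passes),
# but different pass counts:
game_a = traffic_gb(2560, 1440, passes=8)
game_b = traffic_gb(2560, 1440, passes=16)
print(f"game A: {game_a:.2f} GB/frame, game B: {game_b:.2f} GB/frame")
# game B moves exactly twice the framebuffer bytes despite identical ALU work
```

So at a fixed memory bandwidth budget, the 16-pass game runs out of headroom at a lower resolution (or lower frame rate) than the 8-pass one.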

So, if I'm looking for a GPU and I can afford either the Intel Arc B580 or the RTX 4060, which should I get? I planned on the former, but after an avalanche of good reviews for it, I wanted a second opinion from people here. Or would a better option be to wait for the new AMD release rumored for early 2025?

I'm just looking for something to last me a handful of years before needing an upgrade, but that also wouldn't be immediately worthless within the next year or so.

Anymore, even an entry-level GPU will last 4-5 years. I'd go with the B580 over the 4060 because it's cheaper and more powerful.

Also, another thing with the 4070 Super is that I'd have to up my CPU as well, wouldn't I? The card would be substantially more powerful than the processor.

You're thinking about it wrong. A GPU is not "more powerful" than a CPU. The only question is whether the CPU is setting up data fast enough for the GPU to draw the frame, which comes down to what frame rate you want to run at. Anymore, what the CPU does has very little to do with graphics settings, so no, getting a 4070 or 4090 does not mean the CPU will have significantly more work to do. The 9600X is currently capable of driving just about any current game at over 100 fps.

TH currently has a bunch of games benchmarked here. On nearly everything, the 9700X is posting speeds over 100 fps.
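The "CPU submits frames, GPU draws them" point boils down to a one-liner: delivered frame rate is whichever side is slower, not some abstract power ratio. A sketch (all numbers are made up for illustration):

```python
# Toy model: delivered FPS is capped by the slower of the CPU's frame-submit
# rate and the GPU's render rate; a faster GPU never slows the CPU down.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

print(delivered_fps(cpu_fps=140, gpu_fps=90))   # GPU-bound: 90
print(delivered_fps(cpu_fps=140, gpu_fps=200))  # CPU-bound: 140
```

This is why pairing a faster GPU with the same CPU just moves you from the first case toward the second; the CPU's workload per frame barely changes.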

 
I am on my phone, but I remember seeing some games in the first video that were 10-15% better. Bandwidth becomes a factor once you have a combination of enough frame-sized layers to blend plus textures, which of course varies by game. It isn't bounded by resolution alone.
I meant to say "on average". The 12 GB and higher bandwidth is helping, but not by much.
 