GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

AMD’s Ryzen Threadripper 9000 “Shimada Peak” 64-Core & 32-Core SKUs Spotted In Shipping Manifests

For the people who care about Threadripper: 16/32/64/96-core Zen 5 Threadripper might be coming (eventually).

AMD’s Ryzen “Zen 6” CPUs & Radeon “UDNA” GPUs To Utilize N3E Process, High-End Gaming GPUs & 3D Stacking For Next-Gen Halo & Console APUs Expected (archive)

China leaker goes hard. N3E for the Zen 6 CCD was already expected, and would help it get to the previously rumored 12 cores instead of 8 cores (Zen 5 CCD uses N4X). Zen 4 and Zen 5 use the same TSMC N6 I/O die, so moving that to N4C would allow memory controller improvements, probably a newer iGPU and maybe an XDNA1/2 NPU. The N4C node is TSMC's "budget" version of the N5/N4 family, so it's a direct replacement for N6.

UDNA1 desktop GPUs will use N3E, and a flagship is back on the table. That can always be cancelled so don't count on it.

Next-gen Strix Halo (Medusa Halo) could use 3D stacking for CPU and GPU. This probably means 3D V-Cache and 3D Infinity Cache, but could refer to some new packaging technique. PS6 will use 3D stacking while Microsoft is "not sure yet", which tracks with them being less certain about Xbox's future.

Intel Arc B570 “Battlemage” Graphics Cards Review Roundup

Tom's Hardware: Intel Arc B570 review featuring the ASRock Challenger OC: A decent budget option with a few deep cuts

$249 B580 MSRP to $219 B570 MSRP is a 12% price cut, but you lose ~17% VRAM (-2 GB). What about the performance? Raster: -15% (1080p Ultra), -12% (1080p Medium), -18% (1440p Ultra).

B570 is very similar to the RTX 3060's performance in this review. It's +9.3% faster than the 7600 non-XT at 1080p Ultra, but 10% slower at 1080p Medium. 1080p Ultra also sees the 7600 XT (16 GB) 28.6% faster than the 7600 non-XT (8 GB), showing off the benefit of sufficient VRAM.
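Those deltas are just simple ratios off the $249/$219 MSRPs and the 12 GB/10 GB memory configs:

$$\frac{249-219}{249}\approx 12\%, \qquad \frac{12-10}{12}\approx 17\%$$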

[Tom's Hardware geomean average fps and 1% lows charts]

1% lows not great for the B570/B580 in these geomean charts either.

The B570 price should have been $200, but if the B580 remains unobtainium, I could see grabbing it instead. But it's no better than an RTX 3060, and even 8 GB cards are fine if you lower settings.

Samsung teases next-gen 27-inch QD-OLED displays with 5K resolution
 
AAA slop plays just fine on a PS5, which basically has a midrange GPU from 2023, and much of it on a PS4, which has a midrange GPU from 2014. A game you can't play comfortably on a midrange card is more likely to be some obscure indie title utilizing Unity assets in a completely retarded way, like that city simulation game from Paradox that used hilariously detailed models for individual citizens, or a user mod for a game where some moron imported models directly from Blender without reducing the vertex count and added 500,000-polygon hats to all the enemies.
I guess there is no reason for me to even think about upgrading?

I'm still rocking an 8GB RTX 3050 from the GPU drought a couple of years ago and a 5600G CPU, and I'm happy with 1080p medium/high in most games (both of my monitors are 1080p 60Hz). I suppose the first port of call if I'm looking to upgrade would be a higher-end monitor.
 
I wouldn't upgrade until you have a concrete reason to, like a game that struggles or looks like crap on your card, or you see some game on YouTube that looks phenomenal but is just meh on your card.
 
As much as you can criticize Battlefield 1, I think it's genuinely insane how good looking the game is and that you can still max the settings on a GTX 680 (even the 2GB version of the card). You just can't do that with 2024 titles on 2024 hardware even at 1080p. And you could argue that games look worse today than they did in 2017 thanks to upscaling and TAA.

 
i just want to confirm some advice i gave earlier. you should have a monitor that can output the fps you're getting. like if you just buy a regular 60hz then it doesn't matter if your gpu gives you 400fps.

right?
 
As much as you can criticize Battlefield 1, I think it's genuinely insane how good looking the game is and that you can still max the settings on a GTX 680 (even the 2GB version of the card).

Battlefield 1 was developed primarily with the PS4/XbOne as the lead platforms, and the GTX 680 is a high-end card that was only 2 years old when those consoles launched. And yet, the game can't even stay at a stable 60 fps at 1080p ultra.

Battlefield 2042 had the PS5/Xbox Series as the lead platforms, and the RTX 2080 Ti was 2 years old when the PS5 launched. The 2080 Ti can run Battlefield 2042 at over 100 fps at 1080p Ultra.



You just can't do that with 2024 titles on 2024 hardware even at 1080p.

Black Ops 6 is a 2024 release. It runs at 180 fps on 2024's 4080 Super.

Helldivers II can run 1080p ultra at 140 fps on the 4080 Super.

EDIT: Holy shit every game really does look the same now.
 
Developers want more DLSS because it means less optimization work to do. Most games these days can't even run at more than 60 fps at 1080p on modern GPUs without DLSS, and that in itself should be a crime. DLSS should only be needed at 1440p or 4K.

What games can't run over 60 fps on a modern GPU at 1080p without DLSS?
 
What games can't run over 60 fps on a modern GPU at 1080p without DLSS?
I haven't kept up with all the new games, but I am aware that the new Indiana Jones game and Star Wars Outlaws run like dogshit, and so do some of the other recent releases.

RTX 3060 (12 GB) performs better than RTX 4060 (8 GB) in these native 1080p benchmarks

Thanks to new benchmarks published by German outlet PC Games Hardware (PCGH), we can see how well the game performs at full HD (1080p) max settings. According to the test, the RTX 4060 averaged just 25.2 FPS with 1% lows of 18, while the last-gen RTX 3060 enjoyed an average of 57 and 1% lows of 51.
Source
 
I haven't kept up with all the new games, but I am aware that the new Indiana Jones game and Star Wars Outlaws run like dogshit, and so do some of the other recent releases.

Runs just fine on a 4070 Ti. In 2024, you can't expect to max out settings on an entry-level card with only 8 GB. People who paid for better cards with 12-24 GB of RAM expect games to have settings available to actually use what they bought. Not shipping high-quality assets isn't "optimization."
 
Intel's Arrow Lake fix doesn't 'fix' overall gaming performance or match the company's bad marketing claims - Core Ultra 200S still trails AMD and previous-gen chips

Arrow Lake "fixes" do nothing, possibly lead to small performance regressions.

Perhaps more importantly, compared to the fastest patched 285K results on the MSI motherboard, the Ryzen 9 9950X is now 6.5% faster (it was ~3% faster in our original review), and the Ryzen 7 9800X3D remains nearly 40% faster than the 285K – it isn’t close. That means the fix has not altered Arrow Lake’s competitive positioning in a positive way versus AMD’s processors.

More concerning for Intel is that its previous-gen Core i9-14900K experienced a much stronger uplift than the Core Ultra 9 285K from updating to the new version of Windows. We only updated the OS for the updated 14900K config – no new firmware had been released for our test motherboard since the 285K review. As you can see, the 14900K is now 7% faster than in the testing with the older version of Windows. It appears that Windows has corrected some sort of issue with all Intel processors here, leading to the 14900K now being 14% faster than the 285K.
 
Runs just fine on a 4070 Ti. In 2024, you can't expect to max out settings on an entry-level card with only 8 GB. People who paid for better cards with 12-24 GB of RAM expect games to have settings available to actually use what they bought. Not shipping high-quality assets isn't "optimization."
There is no game that needs more than 8 GB of GPU RAM for gaming at standard high definition. If a game can't perform properly with that, it's broken. It's fine if games ship with excessively high 'quality' assets that no one will ever see, as long as they also have functional ones for gaming at reasonable resolutions, from 1280x1024 to 1600x1200 to widescreen ones like 1920x1080.
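Back-of-the-envelope, assuming plain RGBA8 render targets: the screen buffers themselves are tiny at 1080p, so it's the texture and mesh assets, not the resolution, that fill VRAM:

$$1920 \times 1080 \times 4\ \text{bytes} \approx 8.3\ \text{MB per full-screen RGBA8 target}$$

Even a dozen or so full-screen G-buffer/post-processing targets (some at higher precision) add up to roughly a couple hundred MB out of an 8 GB budget; the rest goes to textures, geometry, and whatever the engine caches.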
 
Runs just fine on a 4070 Ti. In 2024, you can't expect to max out settings on an entry-level card with only 8 GB. People who paid for better cards with 12-24 GB of RAM expect games to have settings available to actually use what they bought. Not shipping high-quality assets isn't "optimization."
What is "optimization"? It feels like a buzzword people use without knowing what it means.
 
There is no game that needs more than 8 GB of GPU RAM for gaming at standard high definition. If a game can't perform properly with that, it's broken. It's fine if games ship with excessively high 'quality' assets that no one will ever see, as long as they also have functional ones for gaming at reasonable resolutions, from 1280x1024 to 1600x1200 to widescreen ones like 1920x1080.

Every game shipping right now has assets that work just fine in 8 GB VRAM at 1080p. Many will even work in 4 GB. Both the Indiana Jones game and the Star Wars game people are crying about run at 60-70 fps on a 4060, prior to any upscaling or frame gen, if you actually use the textures & assets designed for 8 GB cards instead of loading the assets designed for 12-16 GB cards.
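To put that concretely, here's a hypothetical sketch (not any particular engine's API, just the general idea) of the tiering I'm talking about: the game ships several asset tiers, and the 8 GB tier is the one an 8 GB card is meant to run.

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical sketch, not any particular engine's API: the game ships
// several texture/asset tiers and picks one based on how much VRAM the
// card reports. The cutoffs below are made up for illustration.
enum class TextureTier { Low, Medium, High, Ultra };

constexpr std::uint64_t GiB = 1ull << 30;

TextureTier pickTextureTier(std::uint64_t vramBytes) {
    if (vramBytes >= 16 * GiB) return TextureTier::Ultra;   // 16 GB+ cards
    if (vramBytes >= 12 * GiB) return TextureTier::High;    // 12 GB cards
    if (vramBytes >= 8 * GiB)  return TextureTier::Medium;  // the 8 GB asset set
    return TextureTier::Low;                                // 4-6 GB cards
}

int main() {
    const char* names[] = {"Low", "Medium", "High", "Ultra"};
    for (std::uint64_t gb : {8, 12, 16}) {
        std::printf("%llu GB card -> %s tier\n",
                    static_cast<unsigned long long>(gb),
                    names[static_cast<int>(pickTextureTier(gb * GiB))]);
    }
}
```

The exact cutoffs are invented; the point is that the 12-16 GB tiers exist for the people who paid for that VRAM, while the 8 GB tier is what the 4060-class cards are supposed to use.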

What is "optimization"? It feels like a buzzword people use without knowing what it means.

To me, "optimization" means you're not unnecessarily wasting compute resources due to bad programming, like using a linked list when you should have used an array, or a binary tree when you should be using a hash map, or shipping uncompressed sound that makes your game take two fucking hundred gigabytes, Call of Duty. To some people, "optimization" means "this game runs at 60 fps ultra settings on my 10-year-old GPU," i.e. the game was ported from Xbox 360 to PC.
 
Every game shipping right now has assets that work just fine in 8 GB VRAM at 1080p. Many will even work in 4 GB. Both the Indiana Jones game and the Star Wars game people are crying about run at 60-70 fps on a 4060, prior to any upscaling or frame gen, if you actually use the textures & assets designed for 8 GB cards instead of loading the assets designed for 12-16 GB cards.



To me, "optimization" means you're not unnecessarily wasting compute resources due to bad programming, like using a linked list when you should have used an array, or a binary tree when you should be using a hash map, or shipping uncompressed sound that makes your game take two fucking hundred gigabytes, Call of Duty. To some people, "optimization" means "this game runs at 60 fps ultra settings on my 10-year-old GPU," i.e. the game was ported from Xbox 360 to PC.
So, no different from the standard bitching that has happened throughout gaming history.
 
Saying "people have always complained about optimization" is a cop-out. Games today look no better than their predecessors a decade ago. So what you get is the same looking shit at much worse performance. And things like RTX compound this where the graphical "improvements" do not outweigh the massive hit to performance over baked lighting.
And it goes beyond graphics too. In the case of Battlefield 2042, it has worse physics, less interaction, less props, worse destruction, etc. than it's predecessors.

Please stop defending modern developers.
 