GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Hell just get a Steam deck or one of the many other PC handhelds. You don't need to play AAA games on max graphics to enjoy games. The best games are the older ones being remastered or brought to PC through ports.
At London Drugs I saw the MSI Claw for 40% off so I don't think the Intel handhelds are doing well.
 

* Zen 6 will be on AM5 💯%, with 12-core unified CCDs and up to 24 cores for desktop.
* Zen 6 supports 3D V-Cache stacked directly on the CCDs (or left off), as usual.
* APUs share CCDs with desktop, and could have 3D V-Cache, but again, no indications other than it's possible.
* The Medusa Point APU is on FP10.
* Medusa Point appears to have 8 workgroups / 16 CUs again. Which is fine, since the current Strix Point APU doesn't scale well from 12 to 16 CUs; bandwidth is more important. No confirmation of anything like Infinity Cache yet.
* Medusa Ridge desktop chips have a 155mm^2 I/O die. He speculates that it's getting larger graphics and/or an NPU.
* From the description: It might be called Olympic Ridge instead of Medusa Ridge.
* MLID is now going with TSMC N2 for the Zen 6 CCD, but it could be N3.
* The CCD and I/O die are right next to each other now, and there is a bridge die embedded in the base die underneath them. UMC is making the bridge dies, with packaging done by SPIL. This will lead to a "massive" latency reduction and is a "Zen 2 moment".

AMD was arguably phoning it in with the chiplet-based designs from Zen 2 to Zen 5, since it was very economical to share cheap, high yield chiplets between desktop and server chips. Now they are effectively fusing them together with the bridge dies, and this packaging is apparently good enough to be used in laptops.

Other sources have said that the Medusa Ridge I/O die will use TSMC N4C. Moving from 122mm^2 and 3.4 billion transistors on TSMC N6 to 155mm^2 on N4C means it's 27% larger. N6 to N4C should be about 1.62x density (I went with (1.8 / 1.18) * 1.06), so that should be a little over double the transistors or roughly 7 billion. Enjoy your desktop NPU.
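That estimate can be sanity-checked in a couple of lines. The scaling factors here are my own assumptions stitched together from TSMC's public density claims, not anything AMD has confirmed:

```python
# Rough sanity check of the Medusa Ridge I/O die transistor estimate.
# Assumed density factors: ~1.8x logic density N7 -> N5, divided by 1.18
# because N6 already gains ~18% over N7, times ~1.06x for the N4-family
# refinement. All figures are rumors/estimates, not confirmed specs.

n6_area_mm2 = 122        # current I/O die on TSMC N6
n6_transistors = 3.4e9
n4c_area_mm2 = 155       # rumored Medusa Ridge I/O die on TSMC N4C

density_gain = (1.8 / 1.18) * 1.06       # ~1.62x going N6 -> N4C
area_gain = n4c_area_mm2 / n6_area_mm2   # ~1.27x larger die

estimate = n6_transistors * density_gain * area_gain
print(f"density x{density_gain:.2f}, area x{area_gain:.2f}, "
      f"~{estimate / 1e9:.1f}B transistors")
```

That lands at roughly 7 billion transistors, matching the back-of-envelope number above.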

One unanswered question for me is the L3 cache. It wasn't mentioned but the laptop APUs will suddenly have the same cache per core as desktop chips. Cezanne/Rembrandt/Phoenix have 16 MiB, Strix Point has split 16+8 MiB, while desktop chips had 32 MiB. But I wonder if AMD will increase the CCD L3 cache to 36 MiB to align better with the move to 12 unified cores.
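For reference, the per-core L3 arithmetic behind that guess. A quick sketch; the Zen 6 rows are pure speculation from the post, not confirmed figures:

```python
# Per-core L3 for known AMD configurations vs. the speculated Zen 6 CCD.
# (name, total L3 in MiB, core count)
configs = [
    ("Desktop Zen 3-5 CCD",         32, 8),
    ("Strix Point (16+8 split)",    24, 12),
    ("Zen 6 CCD if L3 stays at 32", 32, 12),  # speculation
    ("Zen 6 CCD at 36",             36, 12),  # speculation
]
for name, l3_mib, cores in configs:
    print(f"{name}: {l3_mib}/{cores} = {l3_mib / cores:.2f} MiB per core")
```

36 MiB divides evenly into 3 MiB per core across 12 cores, which is the "aligns better" part.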
 
Last edited:
>be a new gamer
>wants PC
>check parts
>$500+ GPU
>over budget
>pass

>pick console
>no games
>subscription for online play
>pass

>choose laptop
>use for few years
>becomes e-waste
>thanks Niggersoft

Seeing how the Nvidia-AMD-Intel GPU situation has gotten worse over time, I think now is the worst time ever to get into gaming.

Also, I am so screwed if the 6750 XT shits itself in a decade or so.
2500K and GTX 1060 6GB. You do not need more.
 
He mentioned Deus Ex as an example of one of those old games that just ran wonderfully well, compared to Monster Hunter Wilds, which needs DLSS to achieve 100 fps at 4K or whatever. When it launched in mid-2000, the GeForce 2 GTS was about the best GPU money could buy. Deus Ex managed to chug along at around 30 fps at 800x600 and high settings. This is what people's rose-tinted goggles are remembering as "well-optimized games" (it also ran like shit compared to many Unreal Engine games).
You're applying rationality and critical thinking here. You're supposed to join the "new tech bad, programmers are bad" circlejerk.
 
it seems to be a problem specifically with AMD's implementation of ReBAR
but is a good deal if you have an Intel CPU.
What a coincidence, happy accident I'm sure...
This Hardware Unboxed video is a good demonstration of how much performance you can expect to lose compared with a 4060 as you go down in CPU performance.

What a shitshow, it craps out even with a 5xxx series Ryzen, so who the hell is going to buy this thing to pair with a 9xxx series? If Intel did this on purpose, thinking its barely supported GPU would boost CPU sales, they didn't just shoot themselves in the foot, they blew their entire foot off with a shotgun.
 
If intel did this on purpose

They're not putting code in their drivers to detect low-end 5000 series AMD CPUs and mistime draw commands based on the specific model. The most likely cause is that their test bench had few if any of the competition's CPUs in it and there's something screwy in AMD's resizable BAR implementation.
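If you want to see whether resizable BAR is actually in play on a given Linux box, the BAR sizes are visible from lspci. A diagnostic sketch; exact output formatting varies by GPU and driver:

```shell
# List display-class devices (PCI class 0300) with their memory regions.
# With ReBAR active, the VRAM BAR is sized close to the full VRAM
# (e.g. [size=16G]) instead of the legacy 256M window. Needs root for
# the capability listing.
sudo lspci -vv -d ::0300 | grep -iE "VGA|Region|Resizable|size"
```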

AMD CPUs have had quite a few bugs and vulnerabilities, while Intel's QA has slipped a lot since the halcyon days of Tick-Tock, so it's probably both.
 
I'm trying to take the AMD pill because I want to switch to Linux in the future once win10 goes end of life. It's also marginally cheaper and at least more in stock near me. I just bought a 7600XT, is that a decent card? I'm not trying to run things with a billion fps in 4k with raytracing, I just wanted something kind of mid range.
 
I'm trying to take the AMD pill because I want to switch to Linux in the future once win10 goes end of life. It's also marginally cheaper and at least more in stock near me. I just bought a 7600XT, is that a decent card? I'm not trying to run things with a billion fps in 4k with raytracing, I just wanted something kind of mid range.
This is based on data from TechPowerUp.
 
I'm trying to take the AMD pill because I want to switch to Linux in the future once win10 goes end of life. It's also marginally cheaper and at least more in stock near me. I just bought a 7600XT, is that a decent card? I'm not trying to run things with a billion fps in 4k with raytracing, I just wanted something kind of mid range.
Generally you want something at or above the level of an RTX 3060 for most gaming. If you're only playing games ten years old or older you can get by with less, and if you're running 4K and want all the bells and whistles you'll want something a bit more.
 
so rich enough to upgrade your laptop every other year
Every other year? Those are poor people numbers. To be fair I mostly have a gaming laptop because I can run a bunch of VMs and segregate customer VPN/Remote Desktop access to their own sandbox. And sometimes game on the road.

Only real annoyance is the 2021 model I have only goes to 40GB RAM. Maybe I'll see what Strix Halo brings.
 
I'm trying to take the AMD pill because I want to switch to Linux in the future once win10 goes end of life. It's also marginally cheaper and at least more in stock near me. I just bought a 7600XT, is that a decent card? I'm not trying to run things with a billion fps in 4k with raytracing, I just wanted something kind of mid range.
I chipped in to buy a 7600 XT for a third-world friend of mine (to prevent him from wasting his money on garbage like a secondhand RX 580 from some pajeet mining rig) and he reports that he's getting pretty good performance in Baldur's Gate 3 on Linux (keeping in mind that his CPU is a lower end Zen 2 APU right now so he's CPU bottlenecked). For 1080p gaming, it's more than sufficient. The 7600 XT is roughly on par with the base PS5 in terms of graphical power and if you're mostly sticking to 1080p then it's going to play pretty much anything out right now at good framerates.

For gaming specifically, you should look at getting RADV set up in Linux. AMD has an official open-source Vulkan implementation (amdvlk) that some distros ship alongside Mesa, but anecdotally I've heard that RADV (Mesa's driver) performs much better in games.
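If both ICDs end up installed, here's a hedged way to check which driver the Vulkan loader picks and force RADV for a session. The manifest path below is the common Mesa location, but it varies by distro:

```shell
# Show which driver the Vulkan loader currently selects
vulkaninfo --summary | grep -i driver

# Force RADV by pointing the loader at Mesa's ICD manifest
# (VK_ICD_FILENAMES is the older spelling of the same variable)
export VK_DRIVER_FILES=/usr/share/vulkan/icd.d/radeon_icd.x86_64.json
```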
 
$2K to $3K isn't that much money.
Well, that's random_text.txt content right there. If I have no other planned major expenses, it would take me 3-4 months to save up for one without having to dip into savings for unexpected or monthly expenses. If I do have planned expenses, it would take even longer. For some people it would take a year or more, and that's while cutting into other important purchases they may need to make.
 