GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I would be really curious to see the performance jump from the Z1 Extreme to this, and how much more expensive this chip would make a new Steam Deck or ROG Ally.
It's the 12-core Strix Point die with 4 cores disabled, so that should lower the cost. The only other Strix Point APU to disable a Zen 5 core is the Ryzen AI 7 PRO 360, which has the same 3+5 arrangement but only 12 CUs enabled; the NPU is also disabled on the Z2 Extreme. Whether it would get 12 or 16 CUs was a matter of debate in the leaks up to this point, but they went with the full 16.

At launch it seemed like Strix Point would benefit from fewer CPU cores, and that the 16 CUs were clocked low for efficiency. So this seems like a good move for the handheld version. The Zen 5 and Zen 5c cores are in different complexes, which can introduce latency. Maybe that will affect some games.
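
If you're curious where that split falls on a particular chip, the L3 topology gives it away. A minimal sketch (assuming Linux and the standard sysfs cache layout, nothing APU-specific) that groups logical CPUs by which L3 slice they share:

```python
# Group logical CPUs by shared L3 cache to see the Zen 5 vs. Zen 5c complexes.
# Assumes Linux and the standard sysfs layout; index3 is the L3 on current AMD parts.
import glob
from collections import defaultdict

complexes = defaultdict(list)
for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cache/index3/shared_cpu_list"):
    cpu = path.split("/")[5]              # e.g. "cpu7"
    with open(path) as f:
        shared = f.read().strip()         # e.g. "0-5" or "6-15"
    complexes[shared].append(cpu)

for shared, cpus in complexes.items():
    print(f"L3 shared by CPUs {shared}: {sorted(cpus, key=lambda c: int(c[3:]))}")
```

On a Strix Point part you'd expect two groups, one per complex; a game whose threads stay inside one group never pays the cross-complex hop.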


Based on these, you can only expect about 15-20% more performance than the Z1 Extreme. Notebookcheck found +10-32% across five games at 17/25/30W TDPs comparing the Z1 Extreme to the Ryzen AI 9 HX 370. The Z2 Extreme could be slightly better.

Ryzen Z1 Extreme/Radeon 780M (1080p/High)

Game | 17 W TDP | 25 W TDP | 30 W TDP
Far Cry 5 | 36 FPS | 44 FPS | 45 FPS
Witcher 3 | 39 FPS | 45 FPS | 47 FPS
GTA V | 57 FPS | 65 FPS | 67 FPS
Cyberpunk 2077 | 19.8 FPS | 24.5 FPS | 25.6 FPS
CoD MW3 | 26 FPS | 33 FPS | 35 FPS
CoD MW3 (FSR) | 41 FPS | 51 FPS | 54 FPS

Ryzen AI 9 HX 370/Radeon 890M (1080p/High)

Game | 17 W TDP | 25 W TDP | 30 W TDP
Far Cry 5 | 42 FPS (+17%) | 51 FPS (+16%) | 53 FPS (+18%)
Witcher 3 | 43 FPS (+10%) | 51 FPS (+13%) | 54 FPS (+15%)
GTA V | 64 FPS (+12%) | 75 FPS (+15%) | 77 FPS (+15%)
Cyberpunk 2077 | 24.9 FPS (+26%) | 28.4 FPS (+16%) | 29.7 FPS (+16%)
CoD MW3 | 33 FPS (+27%) | 38 FPS (+15%) | 45 FPS (+29%)
CoD MW3 (FSR) | 54 FPS (+32%) | 60 FPS (+18%) | 65 FPS (+20%)
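
For reference, the uplift percentages are just (new - old) / old. A quick sketch recomputing them from the two tables, with the FPS values copied straight from the Notebookcheck numbers above:

```python
# Recompute the 890M-vs-780M uplift from the Notebookcheck numbers above.
# FPS values are (17 W, 25 W, 30 W) for each game.
z1e = {  # Ryzen Z1 Extreme / Radeon 780M
    "Far Cry 5": (36, 44, 45),
    "Witcher 3": (39, 45, 47),
    "GTA V": (57, 65, 67),
    "Cyberpunk 2077": (19.8, 24.5, 25.6),
    "CoD MW3": (26, 33, 35),
    "CoD MW3 (FSR)": (41, 51, 54),
}
hx370 = {  # Ryzen AI 9 HX 370 / Radeon 890M
    "Far Cry 5": (42, 51, 53),
    "Witcher 3": (43, 51, 54),
    "GTA V": (64, 75, 77),
    "Cyberpunk 2077": (24.9, 28.4, 29.7),
    "CoD MW3": (33, 38, 45),
    "CoD MW3 (FSR)": (54, 60, 65),
}

for game, old in z1e.items():
    gains = [100 * (new - o) / o for new, o in zip(hx370[game], old)]
    print(game, [f"+{g:.0f}%" for g in gains])
```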

Valve has said there won't be a Steam Deck 2 anytime soon, that they are waiting for a certain level of uplift, etc. They might not bother unless they can get an iGPU 3x faster than Steam Deck 1, which these APUs don't deliver. Valve may be working on a SteamOS box console using the Z2 (Phoenix), faster than the Steam Deck but not by so much, and not needing to manage a battery:

https://old.reddit.com/r/SteamDeck/comments/1h7521g/looks_like_valve_is_working_on_a_steamos_device/
 
"30W TDP" and "handheld" are also not two things I'd say belong in the same sentence. With current battery tech, that'll never be practical.
 
"30W TDP" and "handheld" are also not two things I'd say belong in the same sentence. With current battery tech, that'll never be practical.
Consider the Asus ROG Ally X. They moved the microSD slot because the heat was killing cards in the previous version. 30W is the "Turbo" mode, but the maximum TDP is 53W. Even in "Turbo" it's unlikely to actually be using 30 Watts all the time during gaming, and if you went all the way up, you would be fine... if you have it plugged in.

 
You can like DLSS, you can defend DLSS, but you are in denial when it comes to its many negatives

I never mentioned NVIDIA frame generation at all, since I've never used it, to be in affirmation or denial about its negatives.

All the polygons in modern games don't change the fact that games look worse than they did 10+ years ago thanks to a decline in motion clarity.

I'm not really into gaming at 30-45 fps like this guy apparently is (he's running most of the games about which he's complaining at that speed and using FG to double the perceived fps), but I definitely prefer motion blur over frame generation at that speed, like they did 10+ years ago on the consoles.
 
if you went all the way up, you would be fine... if you have it plugged in.
That's the thing, I have this mental illness that when somebody sells me a device as "mobile" or "handheld" I actually assume I can use it like that, without it being wired to a power socket. There seems to be a fundamental misunderstanding between me and x86 "mobile" device manufacturers regarding this. I actually really liked that Lenovo Legion Go from a design perspective when I stumbled across it the other day, but if we're being entirely honest 1-4 hours battery life is a joke and at the end of the day, just poor engineering and false advertising. I mean I'm sure there are people that are just fine with having a USB-C cable sticking out of their thing at all times, but meehhh.

Also I can promise you the lithium battery won't enjoy being exposed to that kind of heat all the time, either. Then you'll probably have scenarios where the SoC will boost and the USB-C supply can't deliver (be it because of the cable or power supply itself) and the battery will have to compensate so you don't get a brown-out. It's not great. I feel these things will be kinda dead-ends until battery tech improves or x86 SoCs get a lot more efficient.

I think I wrote it in the other thread: the 7840U is basically the Z1 Extreme, just with an NPU and a slightly higher TDP. There are definitely mini PCs with that SoC and they'd make good and probably quite cheap x86 game consoles. With a small 13-16" "portable" screen that would be a very neat and still somewhat portable game station (and also a very capable regular PC).
 
Valve has said there won't be a Steam Deck 2 anytime soon, that they are waiting for a certain level of uplift, etc. They might not bother unless they can get an iGPU 3x faster than Steam Deck 1, which these APUs don't deliver. Valve may be working on a SteamOS box console using the Z2 (Phoenix), faster than the Steam Deck but not by so much, and not needing to manage a battery:

https://old.reddit.com/r/SteamDeck/comments/1h7521g/looks_like_valve_is_working_on_a_steamos_device/

Would Steam Deck be showing up as Arch Linux in their hardware survey? Because if it is, it's got at best a 0.2% share of gamers.

That's the thing, I have this mental illness that when somebody sells me a device as "mobile" or "handheld" I actually assume I can use it like that, without it being wired to a power socket.

You missed the earlier laptop slapfight where @Susanna and I were on team "laptops should be portable" against the mongrel hordes of "you're always 10 ft from a socket, always having to tether my ASUS Furnacebox to the wall is no big deal!"
 
laptops should be portable
I can totally get "I have limited room, I don't want a big PC tower" as an argument (I personally LOVE small computers), but I wouldn't buy a laptop in that case, now that mini PCs are plentiful and good. My last PC build was a mini ITX with a Fractal Node 202 case, and while it's nice and reasonably small, it still feels too big sometimes. The next "high performance" PC I buy will probably also be a mini PC with something along the lines of these APUs; I don't think I need a dGPU anymore, personally. Of course it's less upgradable, but computers just don't age like that anymore, anyways.
 
I never mentioned NVIDIA frame generation at all, since I've never used it, to be in affirmation or denial about its negatives.
But I thought this was all technology advancing stuff and we are just gaming neanderthals? You've moved the goalposts to the point of absurdity.
The initial argument is that developers are over-reliant on this stuff while at the same time requiring more exclusive hardware. This is a bad scenario to find yourself in, especially when the technology has flaws.
 
But I thought this was all technology advancing stuff and we are just gaming neanderthals? You've moved the goalposts to the point of absurdity.

When you change the subject to a different technology than the one I was talking about, and I don't change with you, I'm not the one moving the goalposts. Besides, I actually agreed with you. For gaming at low fps, the 10+ year old tech of motion blur gives better image quality. So what are you bitching about?
 
Then you'll probably have scenarios where the SoC will boost and the USB-C supply can't deliver (be it because of the cable or power supply itself) and the battery will have to compensate so you don't get a brown-out.
The ROG Ally X ships with a 65W charger, over 20% more than the APU itself is ever going to use, and it supports 100W charging, so some people use 100W chargers instead. Devices can display a warning if an insufficient power supply is detected.
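
As a quick sanity check on that headroom claim, using the 53W maximum TDP mentioned earlier in the thread:

```python
# Raw charger headroom over the platform's maximum TDP.
# 65 W charger and 53 W max TDP are the figures quoted earlier in the thread;
# this ignores battery charging and conversion losses.
charger_w = 65
max_tdp_w = 53
headroom = 100 * (charger_w - max_tdp_w) / max_tdp_w
print(f"{headroom:.0f}% headroom")  # ~23%
```
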
I think I wrote it in the other thread: the 7840U is basically the Z1 Extreme, just with an NPU and a slightly higher TDP. There are definitely mini PCs with that SoC and they'd make good and probably quite cheap x86 game consoles. With a small 13-16" "portable" screen that would be a very neat and still somewhat portable game station (and also a very capable regular PC).
Valve would seem to agree with you. They went with a custom APU with only 4 cores, while competitors used laptop APUs known to have more CPU than handheld gaming needs. Their unannounced product seems to be a mini PC-like console, and they are waiting for a much better APU for a Steam Deck 2 instead of jumping on each incremental upgrade.
Would Steam Deck be showing up as Arch Linux in their hardware survey? Because if it is, it's got at best a 0.2% share of gamers.

This autist seems to think it's closer to 1.1% (the same as every other Linux gaming PC in existence put together) and that it shows up as two different APUs in the results. But I would be cautious about drawing conclusions from the Steam Survey.

Whatever it is, Valve is happy with how it's doing, and committed to making another generation. It sells enough to justify subsidizing the cheap one and making it back on Steam games. They've also gotten Lenovo to adopt SteamOS.
I actually really liked that Lenovo Legion Go from a design perspective when I stumbled across it the other day, but if we're being entirely honest 1-4 hours battery life is a joke and at the end of the day, just poor engineering and false advertising.
If all the information is correct, the imminent Lenovo Legion Go S pulls back on CPU cores, increases graphics, and uses SteamOS. Basically a souped-up Steam Deck. We'll see how that goes.
 
When you change the subject to a different technology than the one I was talking about, and I don't change with you, I'm not the one moving the goalposts. Besides, I actually agreed with you. For gaming at low fps, the 10+ year old tech of motion blur gives better image quality. So what are you bitching about?
We are discussing DLSS. DLSS3 is frame generation. Who knows what they are going to cook up for DLSS4. It's all relevant to the initial argument: over-reliance on band-aid fixes instead of proper optimization, forced onto the consumer via expensive and exclusive hardware.
 
  • Agree
Reactions: geckogoy
We are discussing DLSS. DLSS3 is frame generation.

My detailed posts have been exclusively discussing AI-based upscaling, which is DLSS2. DLSS3 is DLSS2 + FG. I have no opinion on FG, because I've never used it. Unlike you, I don't have belligerent opinions on technologies I've never used.

Who knows what they are going to cook up for DLSS4. It's all relevant to the initial argument: over-reliance on band-aid fixes instead of proper optimization, forced onto the consumer via expensive and exclusive hardware.

  • Your initial argument was that nobody uses DLSS, and it makes software support a nightmare.
    • NVIDIA's own statistics show that the first statement is false.
    • The simplicity of the new DirectX API makes the second statement false.
  • You also argued that NVIDIA would soon be dropping the technology entirely.
    • I showed that investment in AI-based image enhancement is increasing, as is its use and prevalence.
    • I additionally pointed out that only old GPUs can't use the tech, and what old hardware can't do has never stopped software technology from advancing
  • You then changed your argument to NVIDIA's telemetry somehow being bullshit or falsified
    • You provided no evidence for this
    • I pointed out that AMD and Intel's response and poor market share suggest that it is indeed a selling point, for which you had no response
  • You then switched your argument to insisting that the telemetry doesn't count, since DLSS is on by default
    • This was, to everyone, an obvious case of special pleading
  • You then switched your argument to DLSS making developers lazy, since it now requires less hand-optimization to achieve good results.
    • I pointed out that this is true of technologies dating to at least the early 1990s; you had no explanation for why DLSS was somehow different
    • You additionally provided zero examples of games which would be running better had DLSS not existed during development
    • I also pointed out that most games are developed primarily for PlayStation consoles, which have no access to DLSS technology (you had no answer to this)
    • I then provided DLSS2 screencaps from my own machine of a game that was developed without DLSS, and it was quite impressive.
  • You then accused me of being dishonest and insisted that it would look like shit in motion (your new argument is that DLSS2 upscaling looks like shit, actually)
    • I provided video demonstrating that this is not the case
    • I gave a detailed breakdown of all the technologies I've used and my opinion on each
    • I'm basically the only person in this argument who has recent NVIDIA and AMD video cards to compare with.
  • You then switched away from upscaling entirely to focusing on frame generation, and showed that first-generation frame gen technology looks bad in a few marquee games running well below the recommended FPS for FG.
    • I pointed out that this was not the technology I had been talking about earlier, and I have no opinion on FG.
    • This was the first argument you had any evidence for, but the problem is it's a subject on which I have no opinion, so I provided you no conflict to win. This made you MATI.
This is what moving the goalposts looks like. Over a couple days, you've moved from "DLSS makes software support impossible" to "frame generation sucks." Each time you move the goalposts, your new argument typically is completely lacking in evidence. It is patently obvious that DLSS just makes you really mad, so you're thrashing about for any argument you can. Your arguments also suggest you've never used it, have a really old GPU, and are seething that recent games can't hit 60+ fps at the settings you want to use and are blaming DLSS for those lazy developers not making the game run faster on your machine. Angry emotion drips out of every single one of your posts.
 
Your initial argument was that nobody uses DLSS
Never claimed this.
also argued that NVIDIA would soon be dropping the technology entirely
Never claimed this.
NVIDIA's telemetry somehow being bullshit or falsified
I asked how they came to their conclusions. You offered nothing. And you had no answer to the point about DLSS being enabled by default.
You then switched your argument to DLSS making developers lazy, since it now requires less hand-optimization to achieve good results.
  • I pointed out that this is true of technologies dating to at least the early 1990s; you had no explanation for why DLSS was somehow different
  • You additionally provided zero examples of games which would be running better had DLSS not existed during development
  • I also pointed out that most games are developed primarily for PlayStation consoles, which have no access to DLSS technology (you had no answer to this)
  • I then provided DLSS2 screencaps from my own machine of a game that was developed without DLSS, and it was quite impressive.
I don't care about your cope that it's like some older technologies; it is objectively true that developers are using upscaling as a band-aid.
Most games aren't developed primarily for anything anymore. Regardless, PlayStation has its own upscaling in multiple forms, and upscaling exists on all platforms. Another irrelevant point.
You provided dishonest stills for a video game that runs at least 60 stills a second.
You then accused me of being dishonest and insisted that it would look like shit in motion (your new argument is that DLSS2 upscaling looks like shit, actually)
  • I provided video demonstrating that this is not the case
  • I gave a detailed breakdown of all the technologies I've used and my opinion on each
  • I'm basically the only person in this argument who has recent NVIDIA and AMD video cards to compare with.
You can provide all the videos in the world, and so can I. The point is that DLSS is not a perfect technology.

You then switched away from upscaling entirely to focusing on frame generation, and showed that first-generation frame gen technology looks bad in a few marquee games running well below the recommended FPS for FG.
  • I pointed out that this was not the technology I had been talking about earlier, and I have no opinion on FG.
  • This was the first argument you had any evidence for, but the problem is it's a subject on which I have no opinion, so I provided you no conflict to win. This made you MATI.
This is just bloviated cope over the fact that I am right: over-reliance on mediocre technology, and segmenting it behind specific hardware, is not good for games or consumers.

you've moved from "DLSS makes software support impossible"
Again, never claimed this.
You are too autistic to understand the argument and keep making up things in your head. It's possible English is not your first language, or maybe you lack reading comprehension. But you are very insistent on defending your expensive hardware and mediocre software.
And you keep acting like you're the only person in this thread that has used this technology. For the record, I've used all of it: DLSS 2, 3, 3.5, FSR 2, 3, etc. I played through the entirety of Cyberpunk 2077 on max settings (path tracing) with DLSS 3.5. You are not special.
 
  • Dumb
Reactions: N Space
I never mentioned NVIDIA frame generation at all, since I've never used it, to be in affirmation or denial about its negatives.
Either it's not working for me, or it's decent enough that I don't notice it being on. I just turn it on and don't think about it; there's some smearing SOMETIMES, but that's likely from certain effects by the usual suspects. Nvidia did a good job on all things DLSS.
Remember when people were hating on Remedy for upscaling Alan Wake 1 instead of being true and honest about their internal rendering resolution?

Question: Who hated on T&L other than 3dfx fans?
 
  • Like
Reactions: Brain Problems
This is what moving the goalposts looks like.

You are too autistic to understand the argument and keep making up things in your head.
1736178538267.gif
 
Still considering the B580, but I'm also trying to get into AI so I'm not completely unemployable in the future. BUT I can't find anything about Ollama or some other similar local solution running on this or other Arc GPUs; every video is some shitty top-10 list or other spam.
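
For what it's worth, the client side is the easy part. A minimal sketch using the official ollama Python client (pip install ollama) against a locally running server; the model name is just an example, and whether that server actually ends up on the Arc GPU rather than falling back to CPU is exactly the part I can't find good info on:

```python
# Minimal smoke test with the official `ollama` Python client (pip install ollama).
# Assumes an Ollama server is already running locally and that "llama3.2" has been
# pulled beforehand (`ollama pull llama3.2`). Whether the server is actually using
# an Arc GPU or quietly falling back to the CPU has to be checked in its own logs.
import ollama

reply = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(reply["message"]["content"])
```
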
but if we're being entirely honest 1-4 hours battery life is a joke and at the end of the day, just poor engineering and false advertising.
It's the modern Game Gear, though the Switch gets 7 hours at best, mostly 3 hours.
There are definitely mini PCs with that SoC and they'd make good and probably quite cheap x86 game consoles
Prices are still too high; you can get a gaming laptop with the same specs plus an actual GPU for the same price.
With a small 13-16" "portable" screen
So a laptop?
 
Question: Who hated on T&L other than 3dfx fans?
Third-worlders with TNT2 cards. In general, for any discussion where people are mad about hardware advancements, the loudest complainers are people from the developing world who are relying on secondhand parts and thus see requirements to have new hardware as 'unfairly' locking them out.
 
  • Informative
  • Dumb
Reactions: geckogoy and Vecr
Third-worlders with TNT2 cards. In general, for any discussion where people are mad about hardware advancements, the loudest complainers are people from the developing world who are relying on secondhand parts and thus see requirements to have new hardware as 'unfairly' locking them out.
T&L wasn't required for quite a while (if counting in dog years, things moved fast back then). There was often a software fallback, unlike hardware acceleration in general, where software renderers were either complete garbage or dropped entirely. With the exception of Unreal, that is a lovely software renderer. Stunning to this day.
 
  • Like
Reactions: The Ugly One
T&L wasn't required for quite a while (if counting in dog years, things moved fast back then). There was often a software fallback, unlike hardware acceleration in general, where software renderers were either complete garbage or dropped entirely. With the exception of Unreal, that is a lovely software renderer. Stunning to this day.
A lot of discussions about this stuff aren't rooted in any kind of rationality. People seethed around the clock about Alan Wake 2 system requirements when the techtubers did videos about it, but how many of those people were even going to play Alan Wake 2 in the first place?

It's not about actually playing games for a lot of people, but rather the comfortable illusion of, "yea I could play that game if I wanted to and can do so whenever I feel like it because PC master race amirite." Once games exist that you can't play, people start to feel personally attacked even if they weren't gonna play them in the first place.
 