GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I tested some games myself. It turns out, according to notebookcheck.net, the UHD 770 in my Alder Lake CPU is about 1/2 to 1/3 as powerful as the Radeon 680M in my Ryzen 6800U, which just confirms that Intel still wasn't taking iGPUs too seriously as of 12th gen. I also found that some games will crash on an iGPU if you have a dGPU installed and have ever run them on it once.

Divinity: Original Sin II (2017) - Runs just fine on the UHD 770 at ultra settings, 1080p. Seems to mostly maintain 60 fps. Crashed on the 680M.
Deep Rock Galactic - On the Radeon 680M, 1080p + Ultra + FSR 2.1 gets 60 fps. Crashed when trying to run it on the UHD 770.
Modern Warfare 2 - The UHD 770 struggled with this one. I could only get 30 fps at 1080p using FSR 1 or NIS, both of which looked terrible. It looked better using XeSS or FSR 2.1 at 720p, but still ran at 30 fps. The 680M, by contrast, easily got 60 fps at 1080p with just about any upscaler and a mix of medium and low settings. The graphics engine for MW2 is kind of ass, and textures flicker in the distance. Upscalers exacerbate this, since draw distance scales down with the internal resolution, and the flicker gets upscaled along with everything else.
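For context on those upscaler settings: FSR 2 renders internally at a fixed fraction of the output resolution per quality preset (the per-axis factors below are from AMD's published presets, as I understand them), which is part of why distance artifacts get worse as the preset gets more aggressive. A quick sketch of the math:

```python
# FSR 2 per-axis scale factors by preset (per AMD's published docs).
PRESETS = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def internal_resolution(width: int, height: int, preset: str) -> tuple[int, int]:
    """Resolution the game actually renders at before upscaling to (width, height)."""
    s = PRESETS[preset]
    return round(width / s), round(height / s)

print(internal_resolution(1920, 1080, "Performance"))  # -> (960, 540)
print(internal_resolution(1280, 720, "Quality"))       # -> (853, 480)
```

So "FSR 2.1 Performance at 1080p" is really a 540p render, which is where the draw-distance flicker comes from.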

But overall, it looks like you could say that the current lineup of iGPUs is on par with the PS4. The next generation supposedly will be on par with the 3050, and at 1080p, my 3050 Ti Mobile can handle just about anything out now at a very high level of fidelity, especially with DLSS on. A Meteor Lake S or Ryzen 8000 series mini PC could be a surprisingly capable game box if you value form factor and low temperatures over absolutely maxing out raytracing, resolution, and fps.
 
I purchased an RX 6950 XT recently to replace my aging GTX 1070. I experienced seemingly random crashes despite multiple OS reinstalls. Long story short, the new PSU I bought alongside the card (an 850W Corsair) apparently wasn't enough; upgrading to a decent 1000W PSU did the trick, and there's been no more crashing.

I was so fucking ready to RMA the card; good thing the situation turned out the way it did. I'm really happy I went with AMD this time around, but if Nvidia ever decides to make decently priced cards again I might hop back, who knows. AMD is good enough at this point that you won't notice a difference except in very specific circumstances.

That being said, the green boys have to work on their software suite; there's a lot of cool stuff in Adrenalin that I've come to like.
that's one of my favorite anecdotes whenever MUH NVIDIA MUH AMD comes up.

had an nvidia card that worked fine till it simply died one day. amd replacement greyscreened every few days. fucking amd, right?
turns out when upgrading the board I noticed one of the capacitors had blown, which was fucking with the power. new board, and that "shit" amd card ran without a fuss, for years.
so, what is the "better" card? the one that runs till it doesn't, or the one that runs with hiccups?

that's why it's almost never MUH GPU, it's a whole system where everything affects everything else. and that's before even considering the shitheap that is windows and the state some people keep it in (I've worked in service, I've seen some shit).
 
that's one of my favorite anecdotes whenever MUH NVIDIA MUH AMD comes up.

had an nvidia card that worked fine till it simply died one day. amd replacement greyscreened every few days. fucking amd, right?
turns out when upgrading the board I noticed one of the capacitors had blown, which was fucking with the power. new board, and that "shit" amd card ran without a fuss, for years.
so, what is the "better" card? the one that runs till it doesn't, or the one that runs with hiccups?

that's why it's almost never MUH GPU, it's a whole system where everything affects everything else. and that's before even considering the shitheap that is windows and the state some people keep it in (I've worked in service, I've seen some shit).

Yeah, at the end of the day you should just buy whatever makes the most sense at the moment unless there are specific features you can't live without. Good luck trying to get the average Consoomer to use their brain though.
 
I tested some games myself. It turns out, according to notebookcheck.net, the UHD 770 in my Alder Lake CPU is about 1/2 to 1/3 as powerful as the Radeon 680M in my Ryzen 6800U, which just confirms that Intel still wasn't taking iGPUs too seriously as of 12th gen. I also found that some games will crash on an iGPU if you have a dGPU installed and have ever run them on it once.
Looks like the UHD 770 might be a little better than the 2 CU RDNA2 iGPU in Mendocino/desktop Ryzen 7000. So if 12 CUs of RDNA2 (the 680M) is only 2-3x faster than that, it says something about scaling. These CPUs/APUs need more memory bandwidth or something.
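To put rough numbers on that scaling argument (back-of-envelope, using only the ratios cited in this thread, not benchmarks):

```python
# Back-of-envelope CU scaling, from the rough ratios above.
cus_680m = 12      # Radeon 680M: 12 CUs of RDNA2
cus_baseline = 2   # Mendocino / desktop Ryzen 7000 iGPU: 2 CUs of RDNA2

# UHD 770 ~ the 2 CU part, and ~1/3 to 1/2 of the 680M,
# so the 680M is only ~2-3x faster than the 2 CU baseline.
for speedup in (2.0, 3.0):
    efficiency = speedup / (cus_680m / cus_baseline)
    print(f"{speedup:.0f}x the speed from 6x the CUs -> {efficiency:.0%} scaling efficiency")
```

Getting 2-3x from 6x the shader hardware is 33-50% scaling efficiency, which is what you'd expect if the iGPU is starved for memory bandwidth rather than compute.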


But overall, it looks like you could say that the current lineup of iGPUs is on par with the PS4. The next generation supposedly will be on par with the 3050, and at 1080p, my 3050 Ti Mobile can handle just about anything out now at a very high level of fidelity, especially with DLSS on. A Meteor Lake S or Ryzen 8000 series mini PC could be a surprisingly capable game box if you value form factor and low temperatures over absolutely maxing out raytracing, resolution, and fps.
Strix Point will be interesting, and Strix Halo would be a complete game changer. Meteor Lake also looks promising, and it appears it will have an L4 cache ("Adamantine"). If the drivers work, that could act a lot like AMD's Infinity Cache. Rumor is that Meteor Lake's top iGPU will be better than Phoenix's 780M, though maybe not Strix Point's.

Raytracing on an iGPU? Why not.

Considering we're going to have years, maybe the rest of the decade, of games targeting the XSX and PS5, APUs are on track to hit a non-moving target. The CPU performance of these mobile chips is already way better than the consoles' 8-core Zen 2, and GPU performance will become sufficient for 1080p. Strix Halo and anything succeeding it would raise the bar (and the amount of silicon used), and then you could talk about 1440p/4K.
 
that's one of my favorite anecdotes whenever MUH NVIDIA MUH AMD comes up.

had an nvidia card that worked fine till it simply died one day. amd replacement greyscreened every few days. fucking amd, right?
turns out when upgrading the board I noticed one of the capacitors had blown, which was fucking with the power. new board, and that "shit" amd card ran without a fuss, for years.
so, what is the "better" card? the one that runs till it doesn't, or the one that runs with hiccups?

that's why it's almost never MUH GPU, it's a whole system where everything affects everything else. and that's before even considering the shitheap that is windows and the state some people keep it in (I've worked in service, I've seen some shit).
I'd want to know whether the first GPU had such unstable power management that it was stressing the power delivery with very sharp spikes.
 
I'm wondering if Microsoft's involvement with the ROG Ally (despite Asus' QC totally fucking that project) was to test the waters for the future of the Surface line with AMD, or more broadly, as @The Ugly One speculated, for midtier laptops without a dGPU. Since it'll just be yearly-to-semiannual revisions of RDNA lining up with laptop manufacturers' yearly refreshes, it'll end up as "laptops are Steam Decks with two-year revision cycles and Windows": machines that can play even AAA titles with some sacrifices and get better battery life than a dGPU. Relatively cheap to replace compared to "gaming" laptops, and hitting 60-ish fps at 1080p for better-than-last-gen performance.
 
I'm wondering if Microsoft's involvement with the ROG Ally (despite Asus' QC totally fucking that project) was to test the waters for the future of the Surface line with AMD, or more broadly, as @The Ugly One speculated, for midtier laptops without a dGPU. Since it'll just be yearly-to-semiannual revisions of RDNA lining up with laptop manufacturers' yearly refreshes, it'll end up as "laptops are Steam Decks with two-year revision cycles and Windows": machines that can play even AAA titles with some sacrifices and get better battery life than a dGPU. Relatively cheap to replace compared to "gaming" laptops, and hitting 60-ish fps at 1080p for better-than-last-gen performance.

People just don't replace computers that often. (As a side note, I've noticed reviewers always talk about this mythical user who upgrades their parts annually, someone I have never met.) I think it's more that when people see something like Warzone running at 60 fps on a laptop without the +$200 uplift for a dGPU, they'll just have no interest at all in a 4070 mobile.

If Meteor Lake is going to deliver 3050-tier graphics, that means it's going to meet or exceed the dGPU in my ASUS gaming laptop, which is a 45W chip that makes the laptop blazing hot when really pushed. On the desktop side, the only growth sector has been all-in-ones and mini-ITX. Tower rigs are just collapsing in sales, and I don't think triple-slot GPUs that gobble a kilowatt of power are helping.
 
I have a rolling upgrade across 3 desktops and 2 servers. Typically no full system gets replaced in a year, but there are absolutely new drives, memory, or CPU upgrades. And then when there's a need, one desktop's motherboard+CPU+RAM becomes a server and a new desktop is cobbled together with higher-spec parts.

Last year desktop #3 (the oldest) blew. So I took a spare MB and got a new CPU and RAM for #1. That one became #2, and the #2 CPU and RAM went into a spare MB for #3 (#3 is ITX, so I couldn't just move #2 straight over). #3 is now an i7-4790K, so only 9 years old or so.
 
I have a rolling upgrade across 3 desktops and 2 servers. Typically no full system gets replaced in a year, but there are absolutely new drives, memory, or CPU upgrades. And then when there's a need, one desktop's motherboard+CPU+RAM becomes a server and a new desktop is cobbled together with higher-spec parts.

Last year desktop #3 (the oldest) blew. So I took a spare MB and got a new CPU and RAM for #1. That one became #2, and the #2 CPU and RAM went into a spare MB for #3 (#3 is ITX, so I couldn't just move #2 straight over). #3 is now an i7-4790K, so only 9 years old or so.
Yes, the rolling upgrade is how I do it also. If I get a new processor, the old one goes into the server, and the server's old one goes into the backup server. I swap out CPUs and motherboards the most regularly, and add SSDs and HDDs whenever I approach 50% capacity used.
 
People just don't replace computers that often.
I didn't mean to imply that they do, only that it'll keep laptops on that two-year refresh cycle everyone seems to have mostly settled on, where manufacturers can shit out marketing about how it's worth upgrading, not that people are consooming that hard. And not just for battery life: I feel the fixation on performance-per-watt is going to get increasingly retarded thanks to "green" legislation.
 
I have a rolling upgrade of 3 desktops and 2 servers. Typically no full system gets replaced in a year, but there are absolutely new drives, memory or CPU upgrades. And then when there's a need one of the desktop motherboard+CPU+RAM becomes a server and a new desktop is cobbled together with higher-spec parts.

Last year desktop #3 (the oldest) blew. So I took a spare MB and got new CPU and RAM for #1. That one became #2 and the #2 CPU and RAM went into a spare MB for #3(#3 is ITX so I couldn't just move #2 straight over) #3 is now a i7-4790K, so only 9 years old or so.
Yes, the rolling upgrade is how I do it also. If I get a new processor, the old one goes into the server, and the server's old one goes into the backup server. I swap out CPUs and motherboards the most regularly, and add SSDs and HDDs whenever I approach 50% capacity used.
Where do the working parts go once they hit the end of your line?
 
Where do the working parts go once they hit the end of your line?
On a shelf. I’ve given a few things away to my nephew but apart from that it’s a fairly complete collection of 2010s computer parts.
 

To be fair, you're more of a professional use case; I was really talking about the gamers the reviewers are addressing. The reviewers are always saying things like, "AMD really screwed up this year - this year's new Ryzen won't fit into last year's socket, meaning gamers who bought last year's model and expect to upgrade to this year's are FUCKED. Surely Intel will claw back market share now!" Or, "Damn, this year's GeForce XX70 can't run Cyberpunk 2077 10x faster than last year's XX70, I guess NVIDIA's just absolutely boned when all those gamers who upgrade their GPU every year don't buy it." What fraction of gamers actually buys a new GPU and CPU every year to stick into their motherboard?

I didn't mean to imply that they do, only that it'll keep laptops on that two-year refresh cycle everyone seems to have mostly settled on, where manufacturers can shit out marketing about how it's worth upgrading, not that people are consooming that hard. And not just for battery life: I feel the fixation on performance-per-watt is going to get increasingly retarded thanks to "green" legislation.

The biggest trend right now is just putting more stuff on the package. While you technically can put 40 cores in a desktop CPU with current technology, writing parallel code is hard, and most programmers are extremely bad at it, so anything beyond 8 cores is really getting into workstation/prosumer territory (video encoding, for instance, is embarrassingly parallel). So the on-die GPU is getting bigger, they're adding matrix processors (which they're calling "machine learning engines"), and in Apple's case, the RAM has moved onto the package as well - I wouldn't be surprised to see Intel and AMD follow suit.
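To illustrate the "embarrassingly parallel" point: work like video encoding splits into independent chunks that scale across however many cores you have, with zero coordination between workers. A minimal Python sketch (the hash is just a stand-in for per-chunk encoding work):

```python
from multiprocessing import Pool
import hashlib

def encode_chunk(chunk: bytes) -> bytes:
    # Stand-in for per-chunk encoding: each chunk is independent,
    # so workers never need to talk to each other.
    return hashlib.sha256(chunk).digest()

if __name__ == "__main__":
    frames = [bytes([i]) * 1_000_000 for i in range(64)]  # fake frame data
    with Pool() as pool:  # one worker per core by default
        results = pool.map(encode_chunk, frames)
    print(f"encoded {len(results)} chunks in parallel")
```

Most application code doesn't decompose this cleanly, which is why core counts past 8 mostly pay off for prosumer workloads.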
 
To be fair, you're more of a professional use case; I was really talking about the gamers the reviewers are addressing. The reviewers are always saying things like, "AMD really screwed up this year - this year's new Ryzen won't fit into last year's socket, meaning gamers who bought last year's model and expect to upgrade to this year's are FUCKED. Surely Intel will claw back market share now!" Or, "Damn, this year's GeForce XX70 can't run Cyberpunk 2077 10x faster than last year's XX70, I guess NVIDIA's just absolutely boned when all those gamers who upgrade their GPU every year don't buy it." What fraction of gamers actually buys a new GPU and CPU every year to stick into their motherboard?
Oh, yeah, that's fair. I tend to be very slow to upgrade my graphics card (now that AMD CPUs have iGPUs I may eventually not bother at all and just stick extra SSDs in instead), and the games I play are all things that run just fine on a 2013 MacBook Air iGPU. My upgrades focus on CPU, RAM, and storage. The oldest part of my workstation right now, barring the NIC and the USB DOM I boot off of, is the 2022 SN850X SSDs. But a gamer wouldn't think much of me, because I spent almost all the money on CPU, RAM, and four SSDs, when all he cares about is the GPU. Which is a mediocre 6900 XT and will likely stay that way for half a decade or more. I picked this card because of the GPU shortage; if I'd had the option I would have gotten a very un-gamery Radeon Pro W6800.
 
People just don't replace computers that often
I keep my computers running till they break. Currently in the process of sticking a new M.2 into my laptop. I'll upgrade as long as I can before it burns out, because I hate transferring shit. That, and I like getting what I paid for, imagine that!
 
The biggest trend right now is just putting more stuff on the package. While you technically can put 40 cores in a desktop CPU with current technology, writing parallel code is hard, and most programmers are extremely bad at it, so anything beyond 8 cores is really getting into workstation/prosumer territory (video encoding, for instance, is embarrassingly parallel). So the on-die GPU is getting bigger, they're adding matrix processors (which they're calling "machine learning engines"), and in Apple's case, the RAM has moved onto the package as well - I wouldn't be surprised to see Intel and AMD follow suit.
AMD's Phoenix APU added the XDNA AI accelerator. AMD is cagey about bringing it to desktop so far; they see it as more important for mobile. Intel's Meteor Lake also adds an AI accelerator. It's unclear if these can be faster than the iGPU, but they are more power efficient and can run while the iGPU is busy with graphics.

AMD's regular Strix Point APU is supposed to go to 12 cores and perhaps 16 CUs of graphics, up from 8 cores and 12 CUs. Meteor Lake adds 2 more E-cores in the SoC tile that may be primarily for idling and intra-SoC communication. The iGPU has been nerfed from 192 to 128 execution units, but should still be a big improvement over the current 96 slower ones.

Meteor Lake might also add an L4 cache (codenamed "Adamantine"), years after Intel stopped putting its beloved eDRAM L4 in a handful of mobile and desktop parts. AMD hasn't done anything interesting with cache on APUs since boosting the usable amounts per core way above Renoir's. If AMD and Intel aren't willing to put whole-ass amounts of RAM on their APUs and CPUs yet, meaning 8 GB or more, they can at least use bigger caches to compensate for the lack of memory bandwidth.
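As a rough illustration of why cache can stand in for bandwidth (a toy model with made-up numbers, not actual Meteor Lake specs): every request served out of an L4 is traffic the DRAM bus never sees, so effective bandwidth scales with the hit rate. This is essentially the pitch AMD makes for Infinity Cache.

```python
# Toy model: cache hits offload traffic from the DRAM bus.
# All numbers are illustrative assumptions, not real chip specs.
def effective_bandwidth(dram_gb_s: float, l4_hit_rate: float) -> float:
    # Only misses touch DRAM, so the iGPU behaves as if it had
    # dram_gb_s / miss_rate of bandwidth for the same workload.
    return dram_gb_s / (1.0 - l4_hit_rate)

for hit_rate in (0.0, 0.3, 0.5):
    print(f"{hit_rate:.0%} L4 hit rate -> ~{effective_bandwidth(100.0, hit_rate):.0f} GB/s effective")
```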

So yeah, fun times, and people are going to end up with around 16 cores on the top mobile x86 products, in addition to all the other stuff. What's next, an FPGA? A room-temperature quantum processing unit?
 
So yeah, fun times, and people are going to end up with around 16 cores on the top mobile x86 products, in addition to all the other stuff. What's next, an FPGA? A room-temperature quantum processing unit?
We might see more on-package RAM in mobile chips, like the Apple Silicon chips have? Large amounts of fast RAM help for AI systems. The leaked AI models were far more memory intensive than CPU intensive, with the capability of the AI you could run completely dependent on the amount of RAM you had.
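For a sense of scale (back-of-envelope, using the LLaMA-family parameter counts that were floating around after the 2023 leak): weight memory is roughly parameter count times bytes per parameter, so RAM, not compute, is the gate.

```python
# Rough rule of thumb: weight memory ~ parameter count * bytes per parameter.
# KV cache and activations add more on top of this.
def model_ram_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 2**30

for b in (7, 13, 30, 65):  # LLaMA-family sizes
    print(f"{b}B params: ~{model_ram_gb(b, 2):.0f} GB at fp16, "
          f"~{model_ram_gb(b, 0.5):.0f} GB at 4-bit")
```

Even quantized to 4-bit, the 65B model wants ~30 GB just for weights, which is why soldered pools of fast unified memory look attractive.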
 
AMD's Phoenix APU added the XDNA AI accelerator. AMD is cagey about bringing it to desktop so far; they see it as more important for mobile. Intel's Meteor Lake also adds an AI accelerator. It's unclear if these can be faster than the iGPU, but they are more power efficient and can run while the iGPU is busy with graphics.

AMD's regular Strix Point APU is supposed to go to 12 cores and perhaps 16 CUs of graphics, up from 8 cores and 12 CUs. Meteor Lake adds 2 more E-cores in the SoC tile that may be primarily for idling and intra-SoC communication. The iGPU has been nerfed from 192 to 128 execution units, but should still be a big improvement over the current 96 slower ones.

Meteor Lake might also add an L4 cache (codenamed "Adamantine"), years after Intel stopped putting its beloved eDRAM L4 in a handful of mobile and desktop parts. AMD hasn't done anything interesting with cache on APUs since boosting the usable amounts per core way above Renoir's. If AMD and Intel aren't willing to put whole-ass amounts of RAM on their APUs and CPUs yet, meaning 8 GB or more, they can at least use bigger caches to compensate for the lack of memory bandwidth.

So yeah, fun times, and people are going to end up with around 16 cores on the top mobile x86 products, in addition to all the other stuff. What's next, an FPGA? A room-temperature quantum processing unit?

I think they're starting out with these "AI engines" in mobile to kind of see how it plays out. If evaluating lots and lots of many-parameter functions really does become the next big thing in computing, it'll be deployed in every desktop CPU soon enough. If ChatGPT ends up being a huge fad, then no worries, they didn't go all-in. Intel went all-in on AVX-512 prematurely, and it ended up being a hot, expensive waste of silicon (in part because, again, most programmers are too retarded to write SIMD-friendly code).
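To make "SIMD-friendly" concrete, a minimal Python/NumPy sketch (NumPy is just a convenient way to show the branch-free, whole-array style that vector units like AVX want; the same idea applies in C or anything else):

```python
import numpy as np

# SIMD-hostile shape: a data-dependent branch per element, one at a time.
def clamp_scalar(xs: list[float], limit: float) -> list[float]:
    out = []
    for x in xs:
        out.append(limit if x > limit else x)
    return out

# SIMD-friendly shape: one branch-free operation over the whole array.
# NumPy dispatches this to vectorized kernels under the hood.
def clamp_vector(xs: np.ndarray, limit: float) -> np.ndarray:
    return np.minimum(xs, limit)

data = np.random.rand(1_000_000)
assert np.allclose(clamp_vector(data, 0.5), clamp_scalar(data.tolist(), 0.5))
```

Code written in the first shape is what compilers and wide vector units choke on, which is part of why AVX-512 sat idle on most desktops.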

I can almost guarantee on-package RAM is coming. The reason it's not going to be in Meteor Lake or Strix Point is that the basic design parameters for those chips were locked in ~2 years ago, when there were still a lot of question marks around whether Apple's move was the right one. And, as it turns out, it was the right move. Most people do not, in fact, ever upgrade their RAM. The US PC market (meaning all desktops + laptops, not just Windows) is down 7.3% YoY, but in the same period Apple's sales were up 18%. I don't need any inside info to know that everyone else has noticed that computers with no dGPU and fixed RAM are growing like crazy, and they're telling Intel and AMD they want a competitive product.



 