GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

In that case, I imagine it would be bigger since the iGPU feeds off main memory
Intel dGPUs.
AMD and NVIDIA have decades of experience making GPUs, their drivers were optimised in a pre-ReBAR era, and those optimisations still work. Intel doesn’t have this backlog, their drivers require ReBAR to perform well. It’s not more complicated than that.
 
9600X3D spotted in drivers

Intel Arc GPU Guide Suggests 10th Gen CPU or Newer Required
Arc A770 Loses Up to 24 Percent Performance Without Resizable Bar
Intel Arc A770 Tested with PCIe 3.0 and Resizable BAR Off
You really need resizable-BAR for the Arc A770 to perform as advertised. The card is simply unusable without it, and Intel said as much throughout the marketing events leading up to this launch. With Resizable-BAR disabled, performance drops like a rock down to just 77% of the performance you get with it enabled at 1080p, and just 76% at 1440p—nearly a quarter of the performance lost. But that's averaged across all game tests, and presents an incomplete picture.
It's been known for a long time, and admitted by Intel. It made their position even more questionable, since they are/were the budget brand with $100-300 cards (like the A380), yet Nvidia or AMD could be a better choice for ancient but otherwise acceptable CPUs. I think it spurred some interest in forcing ReBAR to work on older systems/mobos. I assume they will never fix it, both because of a lack of resources and because almost 100% of new systems going forward will have ReBAR enabled.
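To put the percentages quoted above into plainer terms: dropping to 77% of the ReBAR-on result means losing about 23% of your performance, which is where the "nearly a quarter" figure comes from. A quick sanity check on the quoted numbers:

```python
# Relative performance with ReBAR disabled, as reported in the quoted A770 review.
results = {"1080p": 0.77, "1440p": 0.76}  # fraction of the ReBAR-on performance

for resolution, rel_perf in results.items():
    loss = 1.0 - rel_perf
    print(f"{resolution}: {loss:.0%} of the performance lost with ReBAR off")
# 1080p: 23% of the performance lost with ReBAR off
# 1440p: 24% of the performance lost with ReBAR off
```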
 
Last edited:
  • Like
Reactions: Dawdler
i was hit by a bunch of apple fanboy nonsense the other day about their in-house processors and i've been trying to wrap my head around it. im trying to peel back the woo and its kind of hard because the product is the woo and not the silicon itself and they clearly want to make the nerds buy one to mess with it to answer their questions

its a beefed up chromebook with an interpreter like their old powerpc>intel one for intel>arm that works well enough to play old video games. but their marketing leads people to say they figured out some cpu innovation. did they? or were they the first to the market on the next gen process node? the new intel laptop cpus on passmark seem to perform about the same now
 
  • Thunk-Provoking
Reactions: Brain Problems
its a beefed up chromebook with an interpreter like their old powerpc>intel one for intel>arm that works well enough to play old video games. but their marketing leads people to say they figured out some cpu innovation. did they? or were they the first to the market on the next gen process node? the new intel laptop cpus on passmark seem to perform about the same now
They were first to TSMC N5 (Apple M1), and ARM performs better than x86 at lower power levels (for now). They use memory-on-package which improves efficiency but isn't magic (like DRAM as 3D L4 could be in the future) and hurts upgradeability. Intel adopted memory-on-package for Lunar Lake but pulled back. They have efficiency cores, which AMD didn't at the time. Now AMD does have their take on efficiency cores, and will probably go even further by adopting separate "LP" cores like in Intel's Meteor Lake, allowing CPU chiplets to be turned off when not in use.

Apple are generous with the memory channels and bandwidth above the 128-bit entry-level product. This wasn't matched until AMD launched Strix Halo with a 256-bit memory bus for AI/gaming.
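As a rough illustration of why the bus width matters so much: peak theoretical bandwidth is just bus width (in bytes) times transfer rate. The configurations below are back-of-the-envelope examples (exact memory speeds vary by SKU), not measured figures:

```python
# Peak theoretical bandwidth = (bus width in bytes) * (transfer rate in MT/s).
# Illustrative configurations only; real SKUs ship with a range of memory speeds.
configs = {
    "128-bit LPDDR4X-4266 (Apple M1 class)":   (128, 4266),
    "256-bit LPDDR5X-8000 (Strix Halo class)": (256, 8000),
    "512-bit LPDDR5-6400 (M1 Max class)":      (512, 6400),
}

for name, (bus_bits, mts) in configs.items():
    gb_per_s = (bus_bits / 8) * mts / 1000  # MB/s -> GB/s
    print(f"{name}: ~{gb_per_s:.0f} GB/s")
# 128-bit LPDDR4X-4266 (Apple M1 class): ~68 GB/s
# 256-bit LPDDR5X-8000 (Strix Halo class): ~256 GB/s
# 512-bit LPDDR5-6400 (M1 Max class): ~410 GB/s
```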
 
i was hit by a bunch of apple fanboy nonsense the other day about their in-house processors and i've been trying to wrap my head around it. im trying to peel back the woo and its kind of hard because the product is the woo and not the silicon itself and they clearly want to make the nerds buy one to mess with it to answer their questions

its a beefed up chromebook with an interpreter like their old powerpc>intel one for intel>arm that works well enough to play old video games. but their marketing leads people to say they figured out some cpu innovation. did they? or were they the first to the market on the next gen process node? the new intel laptop cpus on passmark seem to perform about the same now
They've built their chips around what end-users actually experience instead of chasing surrogate metrics from the tech mags. The M-series have good memory bandwidth, excellent low-power performance (their equivalent of e-cores are quite capable and sip power), and they own the OS so they don't have to worry about the windowsjeets dragging their feet to make sure scheduling actually works.

They also poached most of their talent from established big-players in the CPU space so you should assume their architecture is roughly as competently put-together as Intel or AMD's, just with an ARM ISA.

Anecdotally I can say that I use my macbook for everything that isn't gaming (including some really large compilation jobs and running a bunch of local containers simultaneously) and it's as snappy as my 7950X desktop when it comes to these things.

The big thing though is the efficiency. My macbook is just as performant when unplugged as it is when plugged in and that's something that's still not super common among x86 laptops (although AMD is getting there). Apple also controls the OS and bullies application vendors so you rarely have situations where applications will wake up the macbook and guzzle power when it's supposed to be in sleep mode.
 
Last edited:
They were first to TSMC N5 (Apple M1), and ARM performs better than x86 at lower power levels (for now). They use memory-on-package which improves efficiency but isn't magic (like DRAM as 3D L4 could be in the future) and hurts upgradeability. Intel adopted memory-on-package for Lunar Lake but pulled back. They have efficiency cores, which AMD didn't at the time. Now AMD does have their take on efficiency cores, and will probably go even further by adopting separate "LP" cores like in Intel's Meteor Lake, allowing CPU chiplets to be turned off when not in use.

Apple are generous with the memory channels and bandwidth above the 128-bit entry-level product. This wasn't matched until AMD launched Strix Halo with a 256-bit memory bus for AI/gaming.
i guess the nutshell explanation is they totally re-did the software and hardware stack of their ecosystem around a processor they designed, to deliver maximum efficiency and bandwidth and minimal memory latency at the lowest power point, at the expense of being able to expand any of it. those intel processors have double (probably triple) the TDP at their peak, but you can plug an external graphics card into them and other such nonsense.

those aren't really good things for laptops, but they are for AIO imac type computers. it's mostly the desktop offerings that make apple silicon seem questionable, and as part of the apple woo they dont really want to say 'yeah the reason its so good is that they designed it around laptops so the desktops got compromised'
 
First time using an AMD CPU. Also my first time using an 8 core CPU on a personal machine. Didn't expect the damn thing to idle at nearly 50C with my fans at stock everything. Fixed that with an insanely aggressive fan curve.
 
First time using an AMD CPU. Also my first time using an 8 core CPU on a personal machine. Didn't expect the damn thing to idle at nearly 50C with my fans at stock everything. Fixed that with an insanely aggressive fan curve.
What cooler? My 3900x (12 core) idles at 38c with a be quiet! Dark Rock. I don't know how the newer Ryzen chips run.
 
What cooler? My 3900x (12 core) idles at 38c with a beQuiet dark rock. I don't know how the newer Ryzen chips run.
Noctua U9S Redux. Don't like huge bulky coolers. CPU is a 9700X. The high idle temps could be due to it being summer. House gets hot this time of year. As long as it isn't in the 90s during heavy loads, it's probably fine for now.
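For reference, a fan curve is just a set of temperature-to-duty breakpoints that the board interpolates between. A rough sketch of what an "aggressive" curve looks like (the breakpoints here are made up, not any particular BIOS preset):

```python
# Hypothetical aggressive fan curve: (CPU temp in degC, fan duty in %) breakpoints.
# Most boards linearly interpolate between points like these.
CURVE = [(40, 35), (50, 60), (60, 80), (70, 100)]

def fan_duty(temp_c: float) -> float:
    """Fan duty (%) at a given CPU temperature, by linear interpolation."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(48))  # ~55% duty already at warm-idle temperatures
```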
 
It's been known for a long time, and admitted by Intel. Made their position even more questionable since they are/were the budget brand with $100-300 cards (

"Cheap" and "supports old machines" aren't really equivalent terms. People with 6-year-old or older machines planning to plug a brand-new GPU into that old box are probably a meaningless sliver of the market compared to OEMs, DIYers building all-new machines, and people updating machines <= 5 years old.

i was hit by a bunch of apple fanboy nonsense the other day about their in-house processors and i've been trying to wrap my head around it. im trying to peel back the woo and its kind of hard because the product is the woo and not the silicon itself and they clearly want to make the nerds buy one to mess with it to answer their questions

its a beefed up chromebook with an interpreter like their old powerpc>intel one for intel>arm that works well enough to play old video games. but their marketing leads people to say they figured out some cpu innovation. did they? or were they the first to the market on the next gen process node? the new intel laptop cpus on passmark seem to perform about the same now

Memory bandwidth and the square-voltage power scaling law aren't woo. Apple Silicon trades speed during compute-bound work for speed during memory-bound work, which from an energy-efficiency standpoint is a huge win.
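For anyone who hasn't seen it spelled out, the "square-voltage" part is the dynamic CMOS power relation, roughly P ≈ C·V²·f: frequency costs you linearly, but the voltage needed to reach that frequency costs you quadratically. A toy comparison with made-up voltage/frequency points (not real chip data):

```python
# Dynamic CMOS power scales roughly as P ~ C * V^2 * f.
# The voltage/frequency operating points below are made up for illustration.
def rel_power(volts: float, freq_ghz: float) -> float:
    return volts ** 2 * freq_ghz  # the capacitance term cancels out in ratios

fast = rel_power(volts=1.30, freq_ghz=5.0)  # chasing peak clocks
slow = rel_power(volts=0.90, freq_ghz=3.5)  # wider-and-slower operating point

print(f"power ratio: {fast / slow:.1f}x")  # ~3.0x the power...
print(f"perf ratio:  {5.0 / 3.5:.2f}x")    # ...for ~1.43x the per-core speed
```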
 
  • Like
Reactions: Brain Problems
those aren't really good things for laptops, but they are for AIO imac type computers. it's mostly the desktop offerings that make apple silicon seem questionable and as part of the apple woo they dont really want to say 'yeah the reason its so good is that they designed it around laptops so the desktops got compromised"
Well also keep in mind that laptops are what sells. The overwhelming majority of people who buy computers buy laptops. And the overwhelming majority of people who buy laptops never upgrade them. Making a really kickass machine that fits into like 80-90% of the buying public's needs is part of their overall strategy to move volume.

The Mac desktops do have some niche uses - the Mac Studio and Mac Pro are where you can find Threadripper-tier core counts at significantly lower power draw, giving you access to workstation-class performance without the heat or energy consumption. And the Mac Mini is just a generally solid desktop machine for normies that's absurdly cheap in its base configuration.

I don't think this mobile-first approach is going to stay Apple-exclusive. We're already seeing vendors like Minisforum offer mobile chipsets in desktop form factors because of the cost and energy savings they provide. I could very easily see a future where the entire consumer desktop space is just souped up mobile processors.
 
"Cheap" and "supports old machines" aren't really equivalent terms. People with 6-year-old or older machines planning to plug a brand-new GPU into that old box are probably a meaningless sliver of the market compared to OEMs, DIYers building all-new machines, and people updating machines <= 5 years old.
Intel's share of the discrete GPU market drops to 0% as sales in the overall market increase

Intel Arc doesn't have any brand recognition among gaymers. It must be better than zero, but not by much. Tech enthusiasts who read too much news know they exist, and are exponentially more likely to try to rejuvenate an old CPU with a new GPU (GPU typically being the bottleneck for gaming). By offering display output tier GPUs near $100 like the A310 and A380, Intel could have had an ideal choice for older systems and CPUs. And you do get Intel's decent Quick Sync on steroids, with AV1 hardware encode support in Alchemist. But ReBAR/CPU overhead can be a problem for gaming.

I have seen some OEM systems pairing Alder/Raptor Lake and Arc GPUs, not many though.




Expect Zen 6 to hit at least 6 GHz. He won't give up an exact number he's heard, but it could be in that 6.5-7.0 GHz range.
 
Last edited:
i was hit by a bunch of apple fanboy nonsense the other day about their in-house processors and i've been trying to wrap my head around it. im trying to peel back the woo and its kind of hard because the product is the woo and not the silicon itself and they clearly want to make the nerds buy one to mess with it to answer their questions

its a beefed up chromebook with an interpreter like their old powerpc>intel one for intel>arm that works well enough to play old video games. but their marketing leads people to say they figured out some cpu innovation. did they? or were they the first to the market on the next gen process node? the new intel laptop cpus on passmark seem to perform about the same now
Well, it's more a combination of things.

1. The M series is basically a beefed up version of their A series processors, so the chip from their iPhones, etc.
2. Apple sells a HEAP of MacBook Airs and Pros. I'd estimate about 80-85% of Macs on Steam are laptops, and for those, efficiency is really important.
3. Apple invests heavily into TSMC, and in return, gets their nodes a year ahead of everyone else.

The most interesting thing about Apple Silicon is that, for laptops, it's really fucking good. A fanless laptop that runs at a very good speed, lasts hours between charges, and doesn't lose performance when unplugged. That's goddamn nice.
 
Noctua U9S Redux. Don't like huge bulky coolers. CPU is a 9700X. The high idle temps could be due to it being summer. House gets hot this time of year. As long as it isn't in the 90's during heavy loads, it's probably fine for now.
I have the same issue on my 9700x. I think it's normal, as I have an AIO that had no problems cooling an 11900k and keeping it under 85c at peak loads. I think it's just due to how they report temperatures, because when running the cpu in a benchmark it uses 150w and "hits" 95c, but my 11900k would hit 200w and stay under that by 10c. It could also be due to the high idle power of these cpus.
 
I have the same issue on my 9700x. I think its normal as I have an aio that had no problems cooling an 11900k and keeping it under 85c at peek loads. I think its just due to how they report temperatures because when running the cpu in a benchmark it uses 150w and "hits" 95c but my 11900k would hit 200w and be under that by 10c. It could also be due to the high idle power of these cpus
It's because the freaking lid is like an eighth of an inch thick; delidding is practically mandatory for higher-power AM5.
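Rough steady-state math backs that up: junction temperature is about T_ambient + P × R_θ (junction to ambient), so a part that burns less power can still report hotter if its thermal resistance is worse (small, dense chiplet hotspot plus a thick IHS). The resistance values below are made up purely to illustrate the shape of the problem:

```python
# Steady-state estimate: T_junction ~= T_ambient + P * R_theta (junction to ambient).
# R_theta values are invented for illustration, not measured for any real chip.
def t_junction(ambient_c: float, power_w: float, r_theta_c_per_w: float) -> float:
    return ambient_c + power_w * r_theta_c_per_w

print(t_junction(25, 200, 0.30))  # 85.0 degC: big monolithic die, lower R_theta
print(t_junction(25, 150, 0.47))  # 95.5 degC: dense chiplet + thick IHS, higher R_theta
```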
 
  • Like
Reactions: Brain Problems