GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Considering I really just want to play games, which it can do out of the box with its iGPU, it'll be sufficient for now.

The desktop iGPUs from Intel are significantly weaker than the laptop versions. I know, I've benchmarked the shit out of mine. That said, some games will run at 60 fps on low settings, so you'll survive until you get your GPU.

Looks like I'm moving back to team green, I was able to get a great deal on a 4090 (1200USD).

Stupid dual-slot case. *grumbles* Even the 4070 is too fucking fat for my case.
 
The desktop iGPUs from Intel are significantly weaker than the laptop versions. I know, I've benchmarked the shit out of mine. That said, some games will run at 60 fps on low settings, so you'll survive until you get your GPU.
Really I'm just happy it'll give me a web browser; I'm not relying on it too hard, but that is indeed good to know. It's just nice having a backup.
 
Stupid dual-slot case. *grumbles* Even the 4070 is too fucking fat for my case.
Yeah... Mine's a DIY case, so I've only got myself to blame for not predicting that GPUs would grow so obscenely fat. I did design it around watercooling so it'll still fit a 2.5 slot aircooled card, but the 4090 is like 3.5 slots and ultra long, so... Fortunately a lot of that length is just heatsink, so once I get a water block it'll actually be a significantly shorter GPU than my 6900XT, which is also super long and half the PCB is unpopulated because apparently ASUS decided that a "flowthrough heatsink" is too fancy a feature for a lowly AMD GPU. I think the GPU will fit in the case if I remove the front braces, and front/top and side panel, otherwise I'll unscrew the bifurcation card and just lay the GPU down on a book or something until I can get it watercooled.
Just look how dense the Nvidia PCB is vs. the AMD!
[attached: Nvidia vs. AMD PCB comparison photos]
Really I'm just happy it'll give me a web browser; I'm not relying on it too hard, but that is indeed good to know. It's just nice having a backup.
AMD iGPUs are pretty solid for games, but last I checked Intel actually had a few compatibility issues. Arc's existence has fixed some of that, but don't be too disappointed if some of your games refuse to launch.
It absolutely can run a browser and software like that, though. Just about every single office computer uses Intel's integrated graphics.
 
AMD iGPUs are pretty solid for games, but last I checked Intel actually had a few compatibility issues. Arc's existence has fixed some of that, but don't be too disappointed if some of your games refuse to launch.
It absolutely can run a browser and software like that, though. Just about every single office computer uses Intel's integrated graphics.
Really, as long as it can work on a temporary basis until I get the card, life is good. And to be fair, my laptop hasn't had issues with its iGPU so far, so I have an idea of what I can run. I'm not planning on running a simulation of the heat death of the universe even with the card, so I think I'll be fine.
 
AMD iGPUs are pretty solid for games, but last I checked Intel actually had a few compatibility issues. Arc's existence has fixed some of that, but don't be too disappointed if some of your games refuse to launch.

When I was doing my sperglord tests, I discovered some Steam games will permanently shit the bed on iGPUs if you switch back and forth on the same computer, or sometimes different computers. It seems to be related to how they save hardware settings to the cloud (why???!!!?). So for example, I tested a game on my laptop iGPU & dGPU, then my desktop iGPU & dGPU, but going back to the laptop, it now crashes only on the iGPU. Another game now crashes in the reverse - only on the desktop iGPU.

re: Meteor Lake news, rumor is the Intel 4 process is only barely scraping by, which is why they shitcanned Meteor Lake on the desktop. i.e. the chips aren't coming out at a high enough quality to survive really high, hot clock speeds. Raptor Lake S is a stopgap solution. However, given that x86 laptop battery life is horrendous, Apple's gobbling laptop market share, and your average CPU is already powered far beyond 99% of consumer needs, and Meteor Lake is a seismic change in CPU design, I wonder if Intel isn't deprioritizing speed in their design so they can get power consumption under control.
 
Luckily, I always opt for full-size tower cases so I can fit my giant monkey-man hands into the case to work on stuff. Major boon in this thicc GPU environment.

re: Meteor Lake news, rumor is the Intel 4 process is only barely scraping by, which is why they shitcanned Meteor Lake on the desktop. i.e. the chips aren't coming out at a high enough quality to survive really high, hot clock speeds. Raptor Lake S is a stopgap solution. However, given that x86 laptop battery life is horrendous, Apple's gobbling laptop market share, and your average CPU is already powered far beyond 99% of consumer needs, and Meteor Lake is a seismic change in CPU design, I wonder if Intel isn't deprioritizing speed in their design so they can get power consumption under control.
It almost feels like we're back in the netburst days. Except back then, Intel eventually released Core and even won over Apple. These days I don't know what their gameplan is.
 
AMD Extends PyTorch + ROCm Support To The Radeon RX 7900 XT

I wonder if Intel isn't deprioritizing speed in their design so they can get power consumption under control.
That seems fine to me. Maybe consumers have unrealistic expectations of laptops, seeing as they buy almost 3x as many of them as desktops, and often leave them on the desk forever. And the OEMs feel the pain because of that.

I want my laptops to be cheap, relatively weak, with high battery life. I can connect them remotely to a desktop for more power. Intel's 4+4 Lunar Lake would probably be a good fit for me.
 
Dafuk is a rentable unit? You have to pay a subscription to use all your cores?
No, it's just really unfortunately named.
Basically, hyperthreading means the CPU keeps a second thread's state resident while it's working on a primary thread. If there's a gap in the workload for the first thread (for example, a really slow instruction where the CPU has to wait for a return from the FPU before doing the next instruction), it can briefly hop over to the second thread and do some work there while it waits for the first thread to be ready again. Hyperthreading gets more work done than a single thread, but nowhere near as much as two actual cores working on separate threads would. 2 cores > 1 HT core > 1 core.
Rentable Unit is a way of dividing a workload between multiple cores simultaneously. It solves some of the efficiency issues inherent to Intel's current heterogeneous architecture: simple instructions can be executed by e-cores while more complex ones are handled by the more fully featured p-core.

Basically, a "core" would consist of a Rentable Unit that presents itself to the OS as a single core. The RU divides work between its components, which currently seems to be one p-core and two e-cores. This way each component core can be kept busier than if you just divide threads between p-cores and e-cores at random. Currently, for example, a workload may consist of four threads, where two go to a p-core and the other two to two e-cores. The p-core will likely finish first, then sit idle while it waits for the e-cores to finish. If you could instead send the entire workload to the RU, it would divide the work evenly between all three cores, and idle time is reduced.

Each RU-P-E unit has a common cache and some registers, which lets a single thread be executed in parallel across multiple cores, each one doing what it's most optimised for.

I don't really get it either. It seems like this would multiply the number of registers needed per core, and they're already one of the more expensive parts of a CPU die.
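The idle-time argument is easy to sanity-check with a toy scheduling model. All the numbers here (core speeds, work units) are invented for illustration; they're not real Intel figures.

```python
# Toy model: why pooling a workload across heterogeneous cores beats
# pinning threads to them statically. Speeds/work amounts are made up.

P_SPEED = 3.0          # hypothetical p-core throughput (work units/sec)
E_SPEED = 1.0          # hypothetical e-core throughput
THREADS = [10.0] * 4   # four threads of equal work

# Static split: two threads run back to back on the p-core,
# one thread on each of two e-cores.
p_time = (THREADS[0] + THREADS[1]) / P_SPEED    # ~6.67s, then the p-core idles
e_time = max(THREADS[2], THREADS[3]) / E_SPEED  # 10s on each e-core
static_makespan = max(p_time, e_time)

# Pooled, RU-style: the whole workload is split across all three cores
# in proportion to their speed, so nobody sits idle.
pooled_makespan = sum(THREADS) / (P_SPEED + 2 * E_SPEED)

print(static_makespan)  # 10.0 (the p-core idles for the last ~3.3s)
print(pooled_makespan)  # 8.0
```

Same total work, 20% shorter wall-clock time, purely from not letting the fast core go idle.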
 
Dafuk is a rentable unit? You have to pay a subscription to use all your cores?
A new way of assigning instructions to cores, replacing SMT hyperthreading (2 threads per core on current Intel CPUs):


If it works, it could be much more efficient at allocating resources. If it fails, maybe it's a Bulldozer moment.
 
They are trying to push AVX10 to fix the AVX-512 mess, and are rumored to be ditching hyperthreading for "rentable units" within a couple generations. Maybe they can have another Itanium moment.
They can't just admit Arm did it right with SVE and copy it.

We always disabled HT at work. More trouble than it's worth. It can actually cause performance regressions for some demanding workloads, both the Intel & AMD versions.
 
If it works, it could be much more efficient at allocating resources. If it fails, maybe it's a Bulldozer moment.
Yep. Either way it'll probably solve a lot of the issues I've had with e-cores, since to the OS you're no longer dealing with separate cores of wildly different capability, but one really big super-high performance core that handles the division of labour internally. If Intel can make it work it could make them competitive in the server market again. I don't have high hopes though.
 
Yeah, Apple sells insane amounts of laptops.

A quick look at the Steam numbers, plus working it out, gives us:

MacBook Pro: 51.24%
MacBook Air: 33.51%
iMac: 8.15%
Mac Mini: 4.94%
Mac Studio: 1.37%
MacPro: 0.46%
MacBook: 0.34% (think pre-2015)

That's just from the Steam hardware survey, but it doesn't leave much doubt.

Incidentally, for those curious, October 2020 gives us:

MacBook Pro: 52.84%
MacBook Air: 26.54%
iMac: 16.47%
Mac Mini: 1.98%
MacPro: 0.77%
MacBook: 1.48%
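Working out the shift between those two snapshots is just subtraction of the percentages quoted above (Mac Studio didn't exist in 2020, so it's left out of the diff):

```python
# Percentage-point shift in Mac model share between the two Steam
# hardware survey snapshots quoted above (Oct 2020 vs. the recent one).
oct_2020 = {"MacBook Pro": 52.84, "MacBook Air": 26.54, "iMac": 16.47,
            "Mac Mini": 1.98, "MacPro": 0.77, "MacBook": 1.48}
latest = {"MacBook Pro": 51.24, "MacBook Air": 33.51, "iMac": 8.15,
          "Mac Mini": 4.94, "MacPro": 0.46, "MacBook": 0.34}

deltas = {m: round(latest[m] - oct_2020[m], 2) for m in oct_2020}
for model, d in sorted(deltas.items(), key=lambda kv: kv[1]):
    print(f"{model}: {d:+.2f} pp")
# iMac: -8.32 pp ... MacBook Air: +6.97 pp
```

The iMac losing ~8 points while the Air gains ~7 is the whole desktop-vs-laptop story in miniature.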
 
Good news: Newegg has the Sapphire Pulse 7900 XT for only $740 after a $10-off code is applied. So not all hope's lost on getting away from a 2019 GPU and having enough fuckoff-sized VRAM to deal with machine-learning-based audio tools like UVR, without encountering the joys of program/system instability brought about by the devs not caring enough to make ROCm cache refreshing work.
 
Yeah, Apple sells insane amounts of laptops.
Skimming around the latest professional guesstimates, MacOS is about 30% of desktop OS marketshare in the US. Considering how much more expensive Mac hardware is, you know shit is fucked up in win86 land for it to be losing so much ground.
 
Yeah, Apple sells insane amounts of laptops.

A quick look at the Steam numbers, plus working it out, gives us:

MacBook Pro: 51.24%
MacBook Air: 33.51%
iMac: 8.15%
Mac Mini: 4.94%
Mac Studio: 1.37%
MacPro: 0.46%
MacBook: 0.34% (think pre-2015)

That's just by the Steam hardware survey, but it doesn't leave much doubt.

Incidentally, for those curious, October 2020 gives us:

MacBook Pro: 52.84%
MacBook Air: 26.54%
iMac: 16.47%
Mac Mini: 1.98%
MacPro: 0.77%
MacBook: 1.48%

And since a MB is as powerful as most consumer desktops (more so for the Pro), if you get one, just get a dock & monitor for your desk. The wife used to have MB Air + iMac, now it's just MB Pro. There's just no reason for the iMac to exist now.

Skimming around the latest professional guesstimates, MacOS is about 30% of desktop OS marketshare in the US. Considering how much more expensive Mac hardware is, you know shit is fucked up in win86 land for it to be losing so much ground.

Right, the important story in the consumer space right now isn't Intel vs AMD, it's Apple vs Win86 (nice portmanteau, stealing). Meteor Lake appears to me to be targeted at Apple, not AMD.
 
I'd say part of it is... it's kinda hard to recommend an iMac.

M3 iMac starts at $2199 AUD. M2 Mac Mini starts at $999 AUD.

That's goddamn over $1000 to throw on a keyboard, mouse, monitor. Or to throw at the Mini to bump the specs up.

iMac's faster than the Mini currently, but it's pretty good odds the Mini'll get an M3 as well.
 
I'd say part of it is... it's kinda hard to recommend an iMac.

M3 iMac starts at $2199 AUD. M2 Mac Mini starts at $999 AUD.

That's goddamn over $1000 to throw on a keyboard, mouse, monitor. Or to throw at the Mini to bump the specs up.

iMac's faster than the Mini currently, but it's pretty good odds the Mini'll get an M3 as well.
The major upsell for the iMac is the display. A Mac Mini + a prosumer-grade 4K monitor with muh accurate colors is already going to put you in the ballpark of an iMac price-wise but with a far less cohesive experience. And if you want the nearest fully cohesive Apple experience with a Mac Mini, you'll be paying more than $2000 to get a Mini plus Apple Studio Display.

The iMac's not something I'd personally recommend (especially since the iMac display is essentially useless w/o the accompanying computing bits, thanks to the removal of target display mode), but I can definitely see why certain types gravitate towards it.
 