GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

That's another thing I've noticed - I can barely tell the difference among different detail levels these days. The only visible difference between Darktide on high and medium for me is that high drops more frames.
That's why the Steam Deck is such a successful and good system despite its low specs. I guess the same could be said about the Switch.
 
I found a pretty good article comparing Intel 4 and TSMC N5. The TL;DR is that Intel 4 achieves higher density than TSMC N5 because its transistors aren't as tall. https://semiwiki.com/semiconductor-manufacturers/intel/314047-intel-4-presented-at-vlsi/

A very interesting thing in this article is that Intel 4 has only high-performance cells, while TSMC N5 has both high-performance and high-density cells. Note that high-density cells use less power and have lower performance. I have zero information on why this is. It could be a deliberate design decision, or it could be a sacrifice made to get the Intel 4 node to market. However, it could certainly explain why in benchmarks, we're not seeing Meteor Lake deliver high efficiency. You're also going to have energy losses at tile interfaces, so that might also be part of it.

Oh, and there's a problem with their pre-release BIOS, too. Up to 12% performance loss. Oops.

Meteor Lake is the most complicated x86 CPU yet, with a 6+8+2 CPU configuration. 2 of those cores live on the completely separate SoC tile, so excessive context switching between the SoC and the CPU tile is going to cause a lot of power losses. Early versions of Alder Lake were extremely prone to excessive context switching, which could obliterate the gains you should have gotten from E-cores.

I'm a little curious as to what the real battery life will be. As far as I can tell, every laptop site is run by liars, charlatans, and/or morons when it comes to battery life. I have an ASUS TUF A15, and review sites show benchmarks with 8-9 hours of battery life, but in real life, I have never gotten more than three. My MacBook Pro actually has all-day battery life. Intel is trying out a completely new power management strategy with Meteor Lake, something a Cinebench run isn't going to capture. Alder Lake preferentially started tasks on the P-cores and moved them to E-cores if they turned out not to be demanding, while Meteor Lake completely reverses the hierarchy, starting on the LP E-cores and then moving up to E-cores and finally P-cores.
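To make that reversal concrete, here's a toy Python sketch of the placement policy as I just described it. This is not actual Thread Director or Windows scheduler logic; the 0-to-1 "demand" score and the thresholds are made up purely for illustration.

```python
# Toy sketch of the reversed core-preference order described above -- NOT real
# Thread Director logic; the demand score and thresholds are invented.

def alder_lake_placement(demand):
    # New tasks start on a P-core; light ones get demoted to an E-core later.
    return "P-core (stays)" if demand > 0.5 else "P-core -> E-core"

def meteor_lake_placement(demand):
    # New tasks start on the SoC tile's LP E-cores and only get promoted
    # (LP E -> E -> P) if they turn out to be demanding.
    if demand < 0.2:
        return "LP E-core (stays)"
    if demand < 0.6:
        return "LP E-core -> E-core"
    return "LP E-core -> E-core -> P-core"

for demand in (0.1, 0.4, 0.9):
    print(f"demand={demand}:  ADL: {alder_lake_placement(demand):20}  "
          f"MTL: {meteor_lake_placement(demand)}")
```

The point of the LP E-cores living on the SoC tile is supposedly that light background work never has to wake the compute tile at all, which is exactly the kind of thing a Cinebench loop won't show.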
 
However, another big part of the problem is that he refused to get a better monitor. He thinks his piece of shit 75 Hz Sceptre monitor is "amazing" because "it was 80 bucks and it's curved!" He doesn't realize he wouldn't be experiencing screen tearing on a better monitor, even with the 1660 Super.
I never got why people cheap out on monitors, or don't follow the old adage of spending as much on it as on your GPU. A shit monitor is going to make a better GPU worthless.
 
I never got why people cheap out on monitors, or don't follow the old adage of spending as much on it as on your GPU. A shit monitor is going to make a better GPU worthless.
I've tried explaining this to many people who want builds. I assume it's partly because it's harder to know which monitors are good, and what they even want from one. Monitor marketing is among the worst at being confusing and outright deceptive.
 
You know how annoying it is when people (friends/family) come to you for tech support because you're the "computer guy". Even more annoying is when they don't listen to your advice. Bit of a sperg but I needed to vent. My buddy wanted me to build him a new budget computer to replace his old budget computer; he was going to get another prebuilt from META, and I convinced him to let me build it instead, free labor. So I sent him a PCPartPicker list within his budget, which would have been an amazing computer for the price. Instead, he decided to do his own PCPartPicker build, and it included a 1000W PSU and a 4060. I told him he barely needs half of those 1000W: get a smaller PSU and we can get you a way better GPU. A 6700 XT would be in budget, and even a 3060 Ti if we change some other parts. Nope. He wasn't having it. So whatever, he buys his shitty computer, I build it, and now he's complaining that he barely notices a difference over his old 1660 Super. Lmao.
Should have treated it like a prebuilt -- and by that, I mean "take his money and buy the puter parts for him."
 
A very interesting thing in this article is that Intel 4 has only high-performance cells, while TSMC N5 has both high-performance and high-density cells. Note that high-density cells use less power and have lower performance.
TSMC has a capability they call "FinFlex" which can mix different cell types on a single die, but only starting with the N3 node AFAIK. It could be a good choice for monolithic "hybrid" APUs (beyond Strix Point, which will be on N4 again).

Intel apparently tested a Meteor Lake design that had P-cores and E-cores on different chiplets, but went with the one we're seeing instead.

I never got why people cheap out on monitors, or don't follow the old adage of spending as much on it as on your GPU. A shit monitor is going to make a better GPU worthless.
I will cheap out until I can buy an 8K MicroLED display for $200. Wait, I would still be cheaping out...
 
Have you considered a middle ground? Something that is good enough to demonstrate the capabilities of your video card without falling into diminishing returns?
I don't game enough (or play anything new enough) to care, and I also passed on buying into cheap 4K (like TCL sets at Saturday-night-special pricing and quality). OLED is genuinely interesting. MicroLED will be superior but is 7+ years out, so I jokingly say to wait for that. But is it a joke?
 
I don't game enough (or play anything new enough) to care, and I also passed on buying into cheap 4K (like TCL sets at Saturday-night-special pricing and quality). OLED is genuinely interesting. MicroLED will be superior but is 7+ years out, so I jokingly say to wait for that. But is it a joke?
Fuck if I know, I just have a 1.5x 1080p ultrawide with low refresh rates.
 
I never got why people cheap out on monitors, or don't follow the old adage of spending as much on it as on your GPU. A shit monitor is going to make a better GPU worthless.

This is part of why I think gaming PCs are steadily declining in sales. When you can't tell the difference in image quality between medium, high, and ultra settings, and all there really is to gain is driving an ultrawide monitor at 120 FPS, most people just quit caring. Personally, my eyes aren't sharp enough to tell the difference between 1440p and 4K. The 6700XT struggles a little at 4K, so having a $250 160 Hz 1440p monitor is plenty for me.
 
This is part of why I think gaming PCs are steadily declining in sales. When you can't tell the difference in image quality between medium, high, and ultra settings, and all there really is to gain is driving an ultrawide monitor at 120 FPS, most people just quit caring. Personally, my eyes aren't sharp enough to tell the difference between 1440p and 4K. The 6700XT struggles a little at 4K, so having a $250 160 Hz 1440p monitor is plenty for me.
My bigger issue was with PPI and monitor sizes. I used a 24-inch 1080p monitor for the longest time, but nowadays they all seem to be 27 and 32 inches, or even bigger. At 27 inches, 1440p would barely get me any real sharpness increase, and at 32 inches it would be the exact same PPI as 24-inch 1080p. I ended up going with a 28-inch 4K monitor to get a real, tangible difference I could appreciate.
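The PPI math checks out, for what it's worth. Quick throwaway Python; nothing assumed beyond the resolutions and diagonals mentioned above:

```python
# PPI = diagonal resolution in pixels / diagonal size in inches
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

for name, w, h, d in [('24" 1080p', 1920, 1080, 24),
                      ('27" 1440p', 2560, 1440, 27),
                      ('32" 1440p', 2560, 1440, 32),
                      ('28" 4K',    3840, 2160, 28)]:
    print(f"{name}: {ppi(w, h, d):.1f} PPI")
# ~92, ~109, ~92 and ~157 PPI -- 32" 1440p really is a wash against 24" 1080p,
# while 28" 4K is a genuine jump.
```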
 
This is part of why I think gaming PCs are steadily declining in sales. When you can't tell the difference in image quality between medium, high, and ultra settings, and all there really is to gain is driving an ultrawide monitor at 120 FPS, most people just quit caring. Personally, my eyes aren't sharp enough to tell the difference between 1440p and 4K. The 6700XT struggles a little at 4K, so having a $250 160 Hz 1440p monitor is plenty for me.
Really, that's why I'm willing to wait on a better monitor. I can pick one up for cheap at a thrift store for setup purposes, then once everything is running, buy a nicer one. Even older ones are 1080p, and even if they're DVI I can still game on that; my eyes aren't that bad. It's diminishing returns with new monitors.
 
I want this computer to be viable for roughly 10 years.
I think this is extremely unlikely to happen even if you buy the tippy top-end. There are certainly periods where the industry develops collective retardation and shit doesn't move much in consumer processors for a decade at a time (the early 1980s until around 1990, whatever the fuck was going on from like 2010 until 2020 etc), but we're definitely not in that kind of environment anymore. 5-7 years? Pretty reasonable, you could probably swing that. 10 years? Unlikely. Just heterogeneous designs in CPUs themselves becoming a big deal is probably going to be a whole nother revolution we'll be riding over the next decade or two where things change dramatically on 5 year spans.

As an aside - it's pretty crazy to me watching zoomers get mindbroken by PCs actually changing. You can tell they've been conditioned by the last ten years into thinking that PC technology always progresses at a glacial pace and being able to put together a high-end gaming PC for like $800 is normal.

As far as I can tell, every laptop site is run by liars, charlatans, and/or morons when it comes to battery life. I have an ASUS TUF A15, and review sites show benchmarks with 8-9 hours of battery life, but in real life, I have never gotten more than three.
They either regurgitate what the manufacturer claims or they do something goofy like playing 24-hour YouTube videos until the battery dies. I know Apple typically builds prototype machines and hands them out to employees to test extensively via daily use, but I'm not sure if any of the other manufacturers do that. Maybe Sony did at one point (RIP VAIO, my beloved).
 
I think this is extremely unlikely to happen even if you buy the tippy top-end. There are certainly periods where the industry develops collective retardation and shit doesn't move much in consumer processors for a decade at a time (the early 1980s until around 1990, whatever the fuck was going on from like 2010 until 2020 etc), but we're definitely not in that kind of environment anymore. 5-7 years? Pretty reasonable, you could probably swing that. 10 years? Unlikely. Just heterogeneous designs in CPUs themselves becoming a big deal is probably going to be a whole nother revolution we'll be riding over the next decade or two where things change dramatically on 5 year spans
It's still possible to plan around what that change will look like.

For example, the next big thing is going to be at-home AI for personal use or gaming, which will be available from a dedicated AI card in your computer. So you just need a spare full-speed PCIe slot on your motherboard and you'll be fine. In the next couple of years developers may be forced to confront how horribly optimized their code is, and if they do, newer games may actually run better on current hardware than current games do.
 
I think this is extremely unlikely to happen even if you buy the tippy top-end. There are certainly periods where the industry develops collective retardation and shit doesn't move much in consumer processors for a decade at a time (the early 1980s until around 1990, whatever the fuck was going on from like 2010 until 2020 etc), but we're definitely not in that kind of environment anymore. 5-7 years? Pretty reasonable, you could probably swing that. 10 years? Unlikely. Just heterogeneous designs in CPUs themselves becoming a big deal is probably going to be a whole nother revolution we'll be riding over the next decade or two where things change dramatically on 5 year spans.

Agreed that 10 years is a little much, but with Dennard scaling dead, Moore's Law dying, developers already not using most of the available power, and old games hanging around forever (how old are TF2 and LoL now?), it's less about the hardware itself becoming useless and more about the software support, e.g. how Win 11 doesn't support 8-year-old CPUs now.
 
Win 11 doesn't support 8-year-old CPUs now.
Maybe not officially, but I tried the Win11 ISO that you get directly from Microsoft on this Ivy Bridge computer and it installed with no complaints or anything.

The GPU driver does refuse to run unless I turn off the "memory integrity" option in Defender, but everything seems to work fine with it disabled.
Piece of junk from 2011 using beta Win10 drivers from 2016, by the way.
 
For example, the next big thing is going to be at-home AI for personal use or gaming, which will be available from a dedicated AI card in your computer. So you just need a spare full-speed PCIe slot on your motherboard and you'll be fine. In the next couple of years developers may be forced to confront how horribly optimized their code is, and if they do, newer games may actually run better on current hardware than current games do.
Next-gen consoles from Sony and Microsoft are expected around 2026-2028. They're likely to include a dedicated AI accelerator in the APU, like Intel Meteor Lake and AMD Phoenix/Hawk Point do, but it probably won't be super fast, and a discrete GPU in your PC might be able to handle it and graphics at the same time. Games won't make good use of it until years after the consoles release. I'll be interested to see if anyone feels compelled to buy a separate discrete AI card for gaming when the capability is already baked into their CPU. I'm not sure PCIe x16 is necessary; an accelerator in an M.2 slot might be sufficient.
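On the x16 vs. M.2 question, M.2 slots are electrically PCIe x4, so here's the raw bandwidth ceiling from the standard line-rate math (ignoring everything except the 128b/130b encoding):

```python
# Rough PCIe bandwidth: GT/s per lane * 128/130 encoding / 8 bits per byte * lanes
RATES = {"PCIe 3.0": 8, "PCIe 4.0": 16, "PCIe 5.0": 32}  # gigatransfers/s per lane

def gbytes_per_s(gen, lanes):
    return RATES[gen] * (128 / 130) / 8 * lanes

for gen in RATES:
    print(f"{gen}: x4 ~{gbytes_per_s(gen, 4):.1f} GB/s, x16 ~{gbytes_per_s(gen, 16):.1f} GB/s")
# PCIe 4.0: x4 ~7.9 GB/s vs x16 ~31.5 GB/s
```

Whether ~8 GB/s is enough really depends on how much data has to cross the bus at runtime; if the model weights sit in the accelerator's own memory, the link mostly just has to load them once.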

Winblows 12 might be mandating around 45 TOPS of AI performance, which is low and could be a squishy requirement. It was speculated that Hawk Point got bumped to 16 TOPS just so it could meet a requirement when combined with the iGPU, which gets it to 39 TOPS total instead of 33.
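Back-of-the-envelope on those TOPS numbers, taking the 33/39 totals above at face value and assuming the commonly quoted 10 TOPS for the Phoenix NPU:

```python
# Sanity check on the TOPS totals above. The 10 TOPS Phoenix NPU figure is an
# assumption (commonly quoted); the rest comes from the numbers in the post.
phoenix_total, hawk_point_total = 33, 39
phoenix_npu, hawk_point_npu = 10, 16

cpu_plus_igpu = phoenix_total - phoenix_npu    # ~23 TOPS from CPU + iGPU
print(cpu_plus_igpu + hawk_point_npu)          # 39 -- the totals line up
print(45 - hawk_point_total)                   # still ~6 TOPS short of a 45 TOPS bar
```

So the entire 33-to-39 bump is the NPU going from 10 to 16 TOPS, and the platform still comes up about 6 TOPS short of a 45 TOPS bar unless that requirement really is squishy.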

Computing progress has slowed significantly; none of this shit matters too much. It's nice to get the latest video decode/encode, AI accelerators, 8-16 blazing fast cores, and iGPUs capable of 1080p gaming, but it's not very necessary unless artificial requirements are involved. We will need some amazing 3D chip developments to make everything sold from 2010 to 2025 hopelessly obsolete.

Maybe not officially, but I tried the Win11 ISO that you get directly from Microsoft on this Ivy Bridge computer and it installed with no complaints or anything.
That won't stop millions of office PCs from hitting the landfill, since everything up to Skylake isn't "officially supported". Anyone want to go dumpster diving with me?
 
I'll be interested to see if anyone feels compelled to buy a separate discrete AI card for gaming when the capability is already baked into their CPU.

I think that advances in process node technology are going to primarily drive consolidation of all chip features onto SoCs. Apple is already there, with M2 Ultra just being two M2 Max chips mashed together. Sound cards are dead, dGPUs are dying, and consumer-grade discrete NPUs are stillborn. You may be old enough to remember when floating-point units were dedicated accelerators.


If anyone reading this thread has a recent iGPU, you should really try running some games at 1080p on it and see what kind of settings you need to get 60 fps (spoiler: you won't get 60 fps in MW2 on an Intel iGPU unless you have a laptop). It gives a good glimpse into why I think things are going toward SoCs.
 