GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Advanced graphics settings will impact your GPU. DLSS + FG will significantly reduce their impact by reducing the total number of pixels directly rasterized. My guess is a lot of high CPU usage comes from decompressing assets as you roam around the game world.
Ah, so turn those on. As for the CPU usage, you are probably right. Even with it on a very fast SSD, you can notice the world load. Just barely, but you can on the fringes. And a lot is happening. There is a fuck ton of NPCs, lights, sounds, action, just a lot of shit happening all at once. I'm just surprised it's taxing a 12th-gen i9 when it's rated for a 12th-gen i7 as recommended.
 
I think Intel showed us with Lunar Lake and Arrow Lake that a lot of that energy efficiency has to do with SoC packaging. When you move the RAM, storage, etc. on package, you save system power and improve performance, at the cost of increased production costs, switching costs, and SKU count. After all, now you need to pay for RAM and storage in addition to the CPU, and can't swap them out.
I think the efficiency benefits of on-package memory are severely overrated, because of Apple fanboys or autisms. It can reduce power consumption and allow increased memory speed somewhat but it's not a magic bullet. Also, the storage isn't moving on there. That's something to think about when we finally get universal memory in 10+ years.

There are a lot of optimizations and design changes in Lunar Lake, from Meteor Lake-U. Very notably it goes from 4 chiplets (not counting Foveros base) to just 2. It moves graphics back onto the compute tile, using TSMC N3B instead of Intel 4 for CPU and TSMC N5 for graphics. I don't know how N3B compares to Intel 4 but I imagine it's better. Lunar Lake surprisingly drops the "low power E-cores" after just one generation, having the massively improved (according to PowerPoint slides) Skymont E-cores handle low-power situations. Instead of 2 P-cores, 8 E-cores, and 2 LP E-cores (2+8 has been the config since Alder Lake), it uses 4 P-cores and 4 E-cores. Lessons were learned and have been applied in Lunar Lake.

I don't think we know enough about Arrow Lake mobile/desktop to say a lot, or I need to pay more attention. But I did read that there will still be an Arrow Lake-U, which means Lunar Lake is not a replacement for the "-U" die if that's true. I do hope we see "Adamantine" L4 cache during this half of the decade.

Realistically speaking, where will we go with the 5000 series? Everything since the 2000 series feels like diminishing returns.
The only reason I can think of to upgrade my GPU is to run more local AI models and maybe Helldivers, since that's the only game I struggle to run. Everything else is just mediocre, and most games I play are older than 5 years anyway.
We can expect the 5090 and 5080 to launch first, so if you have no interest in spending $1,000+ on a GPU, you will have to wait well into 2025 to care.

From the info that has leaked about the dies, it looks like Nvidia is continuing the strategy of giving the 90-class card an impressive 50%-or-better generational gain, while the cards below it shuffle tiers around again. We can be pretty sure the 5080 won't beat the 4090, because sanctions would then prevent it from being sold in China. Maybe it can win in gaming thanks to the bandwidth, but not in compute?

If you are feeling no pressure to upgrade for gaming, you can safely ignore GPU launches besides glancing at them.

For AI, the 5090 is rumored to use a 448-bit bus (cut down from the 512-bit bus that will be used on pro cards) for 28 GB of memory, instead of 384-bit / 24 GB this time. And it will use much faster GDDR7. That could translate into some great improvements for AI, although I wouldn't be surprised if keeping it under 32 GB is a trick to stop it from running some models as well as the professional cards can. A 4 GB cut doesn't seem like a lot, but 32 GB is ~14% more than 28 GB, and there have already been 32 GB pro cards. Some models are probably designed to use that much VRAM.
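For what it's worth, the rumored capacities fall straight out of the bus widths if you assume one 2 GB GDDR7 module per 32-bit channel (that module size is my assumption, not confirmed):

```python
# Capacity from bus width, assuming one 2 GB module per 32-bit channel.
def vram_gb(bus_width_bits, gb_per_module=2):
    channels = bus_width_bits // 32
    return channels * gb_per_module

rumored_5090 = vram_gb(448)  # 14 channels -> 28 GB
pro_card     = vram_gb(512)  # 16 channels -> 32 GB
headroom = (pro_card - rumored_5090) / rumored_5090 * 100
print(rumored_5090, pro_card, round(headroom, 1))  # 28 32 14.3
```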

If you're looking for something cheaper, then based on those memory buses you're likely to see the 5070 with 12 GB, and the 5080 and maybe a 5060 Ti with 16 GB. Same as before. But all of those will use GDDR7, and the increased memory bandwidth should be helpful. The lowest die listed, for 8 GB budget cards, uses GDDR6.
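The bandwidth gain is easy to ballpark: bandwidth = bus width × per-pin data rate. (The 21 Gbps and 28 Gbps per-pin speeds below are my own guesses for GDDR6X and GDDR7, not confirmed specs.)

```python
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    # bus_width_bits pins, each moving gbps_per_pin gigabits/s; divide by 8 for GB/s
    return bus_width_bits * gbps_per_pin / 8

print(bandwidth_gb_s(192, 21))  # 504.0 GB/s, e.g. a 192-bit GDDR6X card
print(bandwidth_gb_s(192, 28))  # 672.0 GB/s, same 192-bit bus with 28 Gbps GDDR7
```

Same die, same bus, ~33% more bandwidth just from the memory swap.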

Cyberpunk 2077 was the first and last game release I felt hyped for.
I want to feel something for TES VI, but a lot has changed for the worse since 2011. So I'll keep my expectations low, maybe pirate it 5 years from now and be pleasantly surprised.
 
with the 4070 super roughly at 70.
If you are only getting 70% GPU utilization, you need to up the settings and/or get a better monitor with higher resolution/FPS… if you only want to game at 1080p, you should get a monitor that can push above 144 Hz with a 4070 Super.
 
I think the efficiency benefits of on-package memory are severely overrated, because of Apple fanboys or autisms. It can reduce power consumption and allow increased memory speed somewhat but it's not a magic bullet. Also, the storage isn't moving on there. That's something to think about when we finally get universal memory in 10+ years.

As of 11nm, fetching a pair of 64-bit floats from RAM takes 200x-300x the energy of doing an operation on them. Moving the data from somewhere on-chip takes considerably less energy. New process nodes are reducing the energy cost of processing, but data movement not so much.
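A quick sanity check on that ratio. The energy figures below are illustrative guesses on my part (the real values depend heavily on the process node), just to show the scale involved:

```python
# Illustrative energy costs in picojoules (rough guesses, not measurements).
flop_pj       = 20        # one 64-bit floating-point op
dram_fetch_pj = 2 * 2500  # fetching a pair of 64-bit floats from DRAM
sram_fetch_pj = 2 * 50    # fetching the same pair from on-chip SRAM

print(dram_fetch_pj / flop_pj)  # 250.0 -> inside the 200x-300x range
print(sram_fetch_pj / flop_pj)  # 5.0   -> "considerably less energy" on-chip
```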


I don't know how N3B compares to Intel 4 but I imagine it's better.

They completely skipped high-efficiency cells for Intel 4, which made it a poor general-purpose node. Intel 3's yields are low enough that its entire capacity is going to the Xeon line, where AMD has been absolutely eating their lunch.

Realistically speaking, where will we go with the 5000 series?

Depends, how much current can your fuse box handle?
 
If you are only getting 70% GPU utilization, you need to up the settings and/or get a better monitor with higher resolution/FPS… if you only want to game at 1080p, you should get a monitor that can push above 144 Hz with a 4070 Super.
I'm planning on upgrading the monitor, the one I have was a thrift store find, temporary.
 
As of 11nm fetching a pair of 64-bit floats from RAM takes 100x the energy of doing an operation on them. Moving the data from somewhere on-chip takes considerably less energy.
The data still has to travel a "long distance" in both the on-package and off-package cases. It might be a ~20% picojoules-per-bit improvement, shaving a fraction of a watt off total power consumption, which is usually around 10 to 30 watts for low-power x86 mobile. And then they use the power savings to run it at LPDDR5X-8533 instead of LPDDR5X-7500 or whatever. It's helpful but not revolutionary.

If we want the real shit, main memory needs to be moved into a 3D layered arrangement nanometers away from the cores, so that 100x becomes 1.5x or something.
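To put the "fraction of a watt" claim in numbers, here's a toy model. Every value in it is my own guess for illustration, not a measurement:

```python
# Toy model: I/O power = traffic (bits/s) x energy per bit.
avg_traffic_bytes = 30e9   # assume ~30 GB/s average memory traffic
pj_per_bit_off    = 5.0    # off-package LPDDR I/O energy per bit (guess)
pj_per_bit_on     = 4.0    # on-package, ~20% better (guess)

def io_watts(bytes_per_s, pj_per_bit):
    return bytes_per_s * 8 * pj_per_bit * 1e-12

saving = io_watts(avg_traffic_bytes, pj_per_bit_off) - io_watts(avg_traffic_bytes, pj_per_bit_on)
print(round(saving, 2))  # 0.24 -> a fraction of a watt against a 10-30 W budget
```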
 
Depends, how much current can your fuse box handle?
It's a Gigabyte Windforce 2070 with an i7-9700K. I got the whole thing before the pandemic and the crypto boom, so I got lucky there. It can still handle most if not all content at pretty decent quality and a good framerate, except Helldivers, which I've mentioned, where I had to drop to 1080p to get at least a decent framerate, and Deadlock, which just runs like shit, but that's just Valve. That's pretty much it.
If I want to change GPUs, I also have to change the CPU to avoid bottlenecks, and for that I'd need a new mobo with a socket compatible with a new-gen CPU. New GPUs are also ridiculously big, so I'd need a new case, and by that point I'd end up buying a new PC, since the only thing I'm missing is the cooling.
If you are feeling no pressure to upgrade for gaming, you can safely ignore GPU launches besides glancing at them.
That's how I've been feeling for the past 2 years, so I guess I'll just continue like that, though I do wonder if I should still get a new case to future-proof, and along the way get better cooling and more storage, since I'm still using a 500 GB SSD and a 1 TB hard drive where I store most things.
But then again, we're expecting a big leap in storage thanks to the HAMR shit that's coming up soon™, so maybe it would be better to just wait there too.
 
Anyone have an RTX 3060? How are they holding up? Wife is dipping her toes into gaming, so I built a PC out of older parts, and all it's missing is a GPU. There's an RTX 3060 12 GB on Newegg for $300 that seems pretty decent.
You can find a used 3060 on eBay for less than $300. I've seen 3060 Tis for less than $300 too.
 
The data still has to travel a "long distance" in both the on-package and off-package cases. It might be a ~20% picojoules-per-bit improvement, shaving a fraction of a watt off total power consumption, which is usually around 10 to 30 watts for low-power x86 mobile. And then they use the power savings to run it at LPDDR5X-8533 instead of LPDDR5X-7500 or whatever. It's helpful but not revolutionary.

If we want the real shit, main memory needs to be moved into a 3D layered arrangement nanometers away from the cores, so that 100x becomes 1.5x or something.

Moving data from DIMMs consumes about 2x the energy of moving data from the package (source). Most of the power consumption of a memory network isn't directly from the memory chips on the DIMMs; you have to consider the I/O network as a whole, which can consume over half of a system's energy budget under load (source).

except Helldivers which I've mentioned, where I had to decrease to 1080p

With modern games, you should lower the upscaling quality rather than lowering the resolution.

If I want to change GPUs, I also have to change the CPU to avoid bottlenecks

That CPU has no problem keeping up with 60 fps in Helldivers 2, so it sounds like you're GPU limited. But you probably don't need to upgrade. There are a lot of settings to fiddle with in that game. I don't think my 6700XT is much better than a 2070, and I can pull 70 fps without much trouble.
 
You can find a used 3060 on eBay for less than $300. I've seen 3060 Tis for less than $300 too.
Wife ended up getting a 4060 for a little over $300, as the performance is better than a 3060 for just a few bucks more.

Now I need to think about whether I want to replace my still-amazing 980 Ti for current features or not. This thing has been with me for almost a decade, and it's STILL playing games amazingly. I'm not sure if it's completely worth it yet. Plus, I like the ones with built-in liquid coolers because they are so quiet, and those are way more expensive than they used to be.
 
I found out what was happening with Cyberpunk yesterday. The power plan for the PC was set to Balanced, not Performance, so everything was getting throttled. Now CPU load on the i9-12900K at its high points is hitting 80%, and usually sits in the 70s. High, but not nuclear for what I'm seeing on screen. It was a me fuck up.
 
I watched the videos and they want you to remove the retaining mechanism on LGA1700 and add washers? Whatever happened to simplicity?
 
Competitors have always beaten Noctua in performance. Where they dominate is in low noise levels and beige colour schemes. But you can get that by swapping in Noctua fans on a non-Noctua cooler.
 
Competitors have always beaten Noctua in performance. Where they dominate is in low noise levels and beige colour schemes. But you can get that by swapping in Noctua fans on a non-Noctua cooler.
They lose at noise too, and idk, the beige is debatable, given how many people want the chromax for even more money.
 
Am I the only one who finds these giant Noctua coolers absolutely useless nowadays? I understand that a decade ago, cases with top rad mounts were much more expensive, and AIO liquid coolers were also expensive and largely unproven. Air coolers were great because they could fit in most cheaper cases without modifications, and Noctua's in particular were much quieter while dissipating a lot of heat. They were basically the midpoint between loud air coolers and expensive/complicated liquid setups.

But nowadays? My wife's cheap $80 case came with 480mm worth of top radiator mounting area. AIO liquid coolers are cheap and fairly reliable. You also get basically the same amount of noise reduction. There's almost no reason to use an air cooler except for maybe cost, but Noctua is more expensive than a 120mm AIO cooler now, so not even price is on their side. I just don't see why nowadays you would go for an expensive cooler that dumps all of the heat directly into the case, having to run a whole bunch of fans to push that out, when you could just dump the hot air straight out of the case with a cheap AIO. Am I missing something here?
 