GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

My GPU sits at minimum usage while my CPU regularly caps out at 100% in game menus, and it doesn't really change with resolution.

Not just in Unreal Engine 5 games either.
This CPU is on its maxed stable 4.7 GHz overclock.
The only effect screen resolution could possibly have on your CPU is that lowering it lets the GPU run even faster, pushing your CPU closer to its limits: at 120 fps, your CPU has to finish its tasks 2x faster than at 60 fps.
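To put numbers on it (the frame rates here are just example figures):

```python
# CPU frame-time budget shrinks as fps climbs (example frame rates only).
for fps in (60, 120, 240):
    budget_ms = 1000 / fps  # milliseconds the CPU gets per frame
    print(f"{fps:>3} fps -> {budget_ms:5.2f} ms per frame")

# Output:
#  60 fps -> 16.67 ms per frame
# 120 fps ->  8.33 ms per frame   (half the budget of 60 fps)
# 240 fps ->  4.17 ms per frame
```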

Two things to check here. One is to make sure your frame rate isn't unbounded in the menus. In Diablo IV and MW2019, I can really heat up my CPU by increasing the in-menu frame rate. Second thing is to see if you've picked up some malware, or if some stray always-on process is hogging the CPU. Sometimes, even dumb little programs like Corsair's iCue can cause the CPU to run at full speed.
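For the second check, Task Manager works fine, or a quick Python sketch with psutil (assuming you have it installed) to list the worst offenders:

```python
# Spot a stray always-on process hogging the CPU (needs: pip install psutil).
import time
import psutil

# First pass primes the per-process counters; the second pass, after a
# short sleep, reads usage over that window.
for p in psutil.process_iter():
    try:
        p.cpu_percent(interval=None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(1.0)

usage = []
for p in psutil.process_iter(attrs=['name']):
    try:
        usage.append((p.cpu_percent(interval=None), p.info['name']))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

# Top ten offenders; percentages are per logical core, so >100% is possible.
for pct, name in sorted(usage, key=lambda t: t[0], reverse=True)[:10]:
    print(f"{pct:6.1f}%  {name}")
```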
 
Is an Nvidia 2060 a good GPU for 1080p gaming, for games from 5+ years ago and titles like Baldur's Gate 3 or PowerWash Simulator?
 
Is an Nvidia 2060 a good GPU for 1080p gaming, for games from 5+ years ago and titles like Baldur's Gate 3 or PowerWash Simulator?
Depends on how little you are paying. I have a 12GB model and it was adequate for playing eSports crap like Overwatch, though it is now sitting in a server.
 
Is an Nvidia 2060 a good GPU for 1080p gaming, for games from 5+ years ago and titles like Baldur's Gate 3 or PowerWash Simulator?
2060 is the exact recommended spec for BG3. Although if you're buying secondhand anyway, I would just get a 3060 12GB because it's only like $20 more on the used market from what I can see.
 
"There's more. This new release unlocks frame rate generation multipliers up to x20."

See, to me, this is what PC gaming is really about. It's not about having the best specs. It's about being able to do any damn fool thing you want to do, and if I want to lock my speed at 15 fps and use 10x frame generation to game at 150 fps in the blurriest mess possible just to show I can do it, I should be able to.
 
I think you came up with that, but they have basically admitted that's what they want to do:

Nvidia Hints at DLSS 10 Delivering Full Neural Rendering, Potentially Replacing Rasterization and Ray Tracing

No, that is not an April Fool's headline.

Think about it this way - we already use various tricks, like normal mapping, to make a low-fidelity model resemble a very high-fidelity model. Imagine if, instead of chopping things down to low-fidelity assets and using rendering tricks to make them look nice, those high-fidelity assets are used to train AI models, and then you inference at rendering time based on input primitives that would basically be tagged skeletons. Inferencing already does an excellent job when just using a low-res frame and motion vectors, so I imagine it could do much more with better input.
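To make that concrete, here's a toy sketch of what the inference step could look like. This is entirely my speculation; the model, inputs, and shapes are all made up for illustration and have nothing to do with Nvidia's actual pipeline:

```python
# Toy "render by inference" sketch -- every name and shape here is made up
# to illustrate the idea, not Nvidia's actual pipeline.
import torch
import torch.nn as nn

class ToyNeuralRenderer(nn.Module):
    def __init__(self):
        super().__init__()
        # Stand-in for a network trained on high-fidelity assets.
        self.net = nn.Sequential(
            nn.Conv2d(3 + 2 + 1, 32, kernel_size=3, padding=1),  # RGB + motion + tags
            nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),          # final RGB frame
        )

    def forward(self, low_res_frame, motion_vectors, tag_mask):
        # Cheap per-pixel inputs in, full rendered frame out.
        x = torch.cat([low_res_frame, motion_vectors, tag_mask], dim=1)
        return self.net(x)

renderer = ToyNeuralRenderer()
frame  = torch.rand(1, 3, 270, 480)  # low-res RGB frame
motion = torch.rand(1, 2, 270, 480)  # per-pixel motion vectors (dx, dy)
tags   = torch.rand(1, 1, 270, 480)  # stand-in for "tagged skeleton" primitives
print(renderer(frame, motion, tags).shape)  # torch.Size([1, 3, 270, 480])
```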

There's a lot more to DLSS 4 here. It isn't just upping the frame generation multiplier from 2x to 4x.

Main things:
  • There's a new frame generation model that's both higher quality and 40% faster than DLSS 3's
  • Frame pacing for generated frames has moved from the CPU to the GPU
  • The convolutional neural networks used for DLAA + upscaling + ray reconstruction have been replaced by a vision transformer. I don't really know what either of those terms means. The new thing is fancier and much less prone to visual artifacts.
  • The new multi-frame gen technology is only possible on 50 series GPUs, but the CNN -> VT upgrade will be available on 20 series cards and up.
Showing the difference between CNNs & VTs:
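Roughly, the structural difference is this (a toy illustration of my own, nothing to do with Nvidia's actual models): a CNN mixes each pixel with a small local neighborhood, while a vision transformer chops the frame into patches and lets every patch attend to every other one.

```python
# One image tensor, two ways of looking at it. Educational toy only.
import torch
import torch.nn as nn

img = torch.rand(1, 3, 64, 64)  # (batch, channels, height, width)

# CNN: every output pixel is computed from a small local window (3x3 here).
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
print(conv(img).shape)  # torch.Size([1, 16, 64, 64])

# Vision transformer view: slice the image into 8x8 patches, flatten each
# into a token, then self-attention relates every patch to every other.
patches = img.unfold(2, 8, 8).unfold(3, 8, 8)       # (1, 3, 8, 8, 8, 8)
tokens = patches.permute(0, 2, 3, 1, 4, 5).reshape(1, 64, 192)
attn = nn.MultiheadAttention(embed_dim=192, num_heads=4, batch_first=True)
out, _ = attn(tokens, tokens, tokens)
print(out.shape)  # torch.Size([1, 64, 192]) -- 64 patches with global context
```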
 
Think about it this way - we already use various tricks, like normal mapping, to make a low-fidelity model resemble a very high-fidelity model. Imagine if, instead of chopping things down to low-fidelity assets and using rendering tricks to make them look nice, those high-fidelity assets are used to train AI models, and then you inference at rendering time based on input primitives that would basically be tagged skeletons. Inferencing already does an excellent job when just using a low-res frame and motion vectors, so I imagine it could do much more with better input.
I am going to remain wary of deceptive marketing, but I understand that we will probably give in eventually, and frame generation/quadrupling/octupling is ultimately a necessity for 1000+ Hz displays.
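The arithmetic behind that necessity claim, with my own made-up native frame rates:

```python
# How big a multiplier a 1000+ Hz display would demand (example native fps).
target_hz = 1000
for native_fps in (60, 125, 250):
    print(f"{native_fps} fps native -> {target_hz / native_fps:.0f}x frame gen needed")

# 60 fps native -> 17x frame gen needed
# 125 fps native -> 8x frame gen needed
# 250 fps native -> 4x frame gen needed
```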

It's Jensen's world, we're just rendering it.
 
I am going to remain wary of deceptive marketing, but I understand that we will probably give in eventually, and frame generation/quadrupling/octupling is ultimately a necessity for 1000+ Hz displays.

It's Jensen's world, we're just rendering it.
I'd rather see this in 60 fps than anything at 1000 fps:

 
I look forward to the day when Nvidia GPUs no longer "brute force" my frames at all, but instead use AI to imagine what my game should look like.
What a terrifying statement. Guess I'll keep my 6750 XT, especially if the 9070 XT turns out to be a dud.

Also, while I was cleaning my desktop, I decided to do a little thing.
(attached photos: 20250110_182217.jpg, 20250110_182204.jpg)
"There's more. This new release unlocks frame rate generation multipliers up to x20."

See, to me, this is what PC gaming is really about. It's not about having the best specs. It's about being able to do any damn fool thing you want to do, and if I want to lock my speed at 15 fps and use 10x frame generation to game at 150 fps in the blurriest mess possible just to show I can do it, I should be able to.
rip Linux users. Fuck you guys for having really lucky shit.
 
What a terrifying statement. Guess I'll keep my 6750 XT, especially if the 9070 XT turns out to be a dud.
The 9070 might turn out to be a decent upgrade from a 6750. I have a 6900, which is in a bit of a crappy position: the newer high-end cards support better software but don't add meaningful performance gains.
 
The 9070 might turn out to be a decent upgrade from a 6750. I have a 6900, which is in a bit of a crappy position: the newer high-end cards support better software but don't add meaningful performance gains.
Maybe I'll just aim for the non-XT version.
 
I have an RX 6600 XT. Is it worth getting a 7800 XT for cheap once the new series is out?
Does increasing the number of RGB lights in your computer make it go faster?
Only if you set them to red.
 