GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I was looking at the minimum specs for some newer AAA releases and here's something that stuck out to me:
[Image: minimum system requirements for the Silent Hill 2 remake]

These are the minimum specs for Silent Hill 2 (remake). You can play this ultra whizzbang AAA game that stresses current gen consoles with an upper midrange card from... 7 years ago. Like I know the idea of midrange is a little bit different between 2002 and 2017 but even a flagship card that was 5+ years old in the mid-2000s would have been unusable junk.
Half-Life 2 on a GeForce 2 MX 400. This entry-level GPU was just 3 years old when HL2 came out:

That's like if 2024 games could only run on a 3060 at 720p, absolutely minimal settings, and barely maintain 45 fps.
 
These ridiculous prices of PC parts, both new and used, will end up killing PC enthusiasm in general. I was reminded of this when watching Dawid Does Tech Stuff's Japan video; seeing those prices reminded me how much parts cost now. It's no wonder the general population just goes to consoles and phones; why bother with extreme prices for a computer when """budget""" options like a console, smartphone, or laptop exist to fulfill the same thing, even though that tech is just a solution to a manufactured problem and is planned to become obsolete a few years from the purchase date.

What you need to pay to game is highly exaggerated. You can get a 1080p-capable GPU for around $200 (that video used a low-profile 75-watt RTX 3050 6GB to work in a cheap OptiPlex). Console price/performance mostly isn't being beaten, but a PC is a general-purpose computer you can use for anything until it breaks. If your budget is $500, you can probably get a reasonably good mini PC. Laptops are PCs and have been the majority of sales for years now. They start out very expensive but can fall to reasonable levels. There have been Lunar Lake laptops for around $500 recently, which would probably be OK for 1080p.

I think we'll see DIY socketed desktops slowly decline in favor of cheap soldered BGA options like the Minisforum motherboards with 16-core mobile chips, or mega APUs like Strix Halo that can't be socketed and may not even have upgradeable RAM. You'll eat shit if the price is right (like buying a console).

 
Like the ridiculous prices of PC parts in the 90s killed PC enthusiasm in general? Nigger, a midrange gaming PC in 2002 would have run you $1500. That's over $3k today. And yet PC gaming was doing just fine. And that midrange gaming PC would have been lucky to do 60 fps/medium/800x600 on games two years later. Meanwhile people get a 4060 nowadays and complain that they might have to turn things down from Ultra to High in order to maintain above 100 fps. Oh, and that's not even getting into the fact that medium presets in 2002 looked like garbage while low presets in 2025 look barely distinguishable from ultra.

PC maintains a strong following because it does things consoles can't. That was how it was in 2002 and that's how it is today.
Translation "I can't afford a 5090 or 5080 so I'm going to doom post", not to you but to the nigga you replied to.
 
It's interesting that current ultra high end GPUs are basically equivalent to SLI flagships of yesteryear. Like if you'd slap together two 980 Tis, it'd come out to about the same price in today's money as a 4090.
 
That's like if 2024 games could only run on a 3060 at 720p, absolutely minimal settings, and barely maintain 45 fps.
That would be more annoying if the visual standards had declined for the prior 10 years, as is the case today. This was one of the best looking games in the world at the time and people were really excited to try it and actually interested in getting new hardware for something pushing the envelope.

The retarded niggers that can barely sidegrade a "remaster" onto hardware 500% more powerful than what the original ran on don't get the same consideration.
 
It's interesting that current ultra high end GPUs are basically equivalent to SLI flagships of yesteryear. Like if you'd slap together two 980 Tis, it'd come out to about the same price in today's money as a 4090.
I had a 980 Ti for a few months through 2023-2024; it was surprisingly capable of running most of the games I play at medium settings, and it was a decent stopgap until I got my current 6750 XT. Helped too that I had 1080p/60Hz monitors until now.
 
It's interesting that current ultra high end GPUs are basically equivalent to SLI flagships of yesteryear. Like if you'd slap together two 980 Tis, it'd come out to about the same price in today's money as a 4090.
Now consider this:
A 3060Ti or a 4060 is as good as a 1080Ti. The 1080Ti still being a very capable GPU for 1080p gaming.
A 3090 is as good as two 1080Ti's. That gives you a ton of buffer for 1080p and good 1440p capability.
3090 = 4070Ti = 5070.

Even if you're not aiming for the best of the best, the current midrange offerings are on a really high performance level. And remember: a 1060 was able to run games from 2016 at 1080p just fine. I played The Witcher 3 and GTA V on one. Moore's Law was exponential, but the evolution of video game graphics and their processing demand wasn't. What's happening now is not the "death of Moore's Law" as Jensen Huang wants everyone to believe, it's an industry wide competency crisis where devs are incapable of properly harnessing the hardware power we have on hand.

In short: Moore's Law kept on going, and we've already reached the sweet spot of hardware power, graphics capability and processing demand. SLI is dead because Moore's Law superseded the need for it. You get a 4060, you have a 1080Ti. You get a 4070Ti, you have two 1080Ti's.
 
Like if you'd slap together two 980 Tis, it'd come out to about the same price in today's money as a 4090.
980 Ti = 601mm^2 on 28nm, 8 billion transistors, 2816 shaders, 6 GB, 100% relative performance ($649 2015 MSRP / $800-810 adjusted to 2022 dollars)
4090 = 609mm^2 on TSMC 4N (5nm), 76.3 billion transistors, 16384 shaders, 24 GB, 555% relative performance ($1599 2022 MSRP)

Some of the 4090 die area has been used for raytracing/AI instead of raster performance.
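Quick back-of-the-envelope on those figures (a throwaway sketch; it just plugs in the numbers quoted above, including the rough 2022-dollar adjustment for the 980 Ti):

```c
#include <stdio.h>

/* Plugs in the die/transistor/price figures quoted above to compare
   transistor density and transistors-per-dollar, 980 Ti vs 4090. */
int main(void)
{
    /* 980 Ti: 601 mm^2, 8B transistors, ~$805 in 2022 dollars */
    double t980_area = 601.0, t980_xtors = 8e9, t980_price = 805.0;
    /* 4090: 609 mm^2, 76.3B transistors, $1599 MSRP (2022) */
    double t4090_area = 609.0, t4090_xtors = 76.3e9, t4090_price = 1599.0;

    double density_gain = (t4090_xtors / t4090_area) / (t980_xtors / t980_area);

    printf("Transistor density gain: %.1fx\n", density_gain);               /* ~9.4x */
    printf("Transistors per 2022 dollar: 980 Ti %.1fM, 4090 %.1fM\n",
           t980_xtors / t980_price / 1e6, t4090_xtors / t4090_price / 1e6); /* ~9.9M vs ~47.7M */
    return 0;
}
```

So per mm^2 the 4090 packs roughly 9-10x the transistors, while the quoted 555% figure shows not all of that turned into raster performance, which lines up with the raytracing/AI point.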
In short: Moore's Law kept on going, and we've already reached the sweet spot of hardware power, graphics capability and processing demand. SLI is dead because Moore's Law superseded the need for it. You get a 4060, you have a 1080Ti. You get a 4070Ti, you have two 1080Ti's.
SLI could spiritually return as multi-chiplet GPUs, allowing reticle limits to be busted within a single card, although AMD used only one GCD for RDNA3 and pulled back on RDNA4, and Intel's multi-chiplet (gaming) plans are a distant memory.
 
What's happening now is not the "death of Moore's Law" as Jensen Huang wants everyone to believe, it's an industry wide competency crisis where devs are incapable of properly harnessing the hardware power we have on hand.

Moore's Law has been dead for years. AMD and Intel are trying to pretend it's not by holding up massive arrays of power-gobbling chiplets, but the fact they have to do that proves it's dead.

Also, you've never told me what "properly harnessing the hardware power" even means. Apparently it's not more complex lighting or higher resolution assets, so what exactly is it? What software technology is not being implemented in games that could be on current systems?

It looks to me like exactly what Satoru Iwata predicted has been happening for 15 years now, each objective doubling of compute power leads to a smaller and smaller subjective impression of improvement.
 
you've never told me what "properly harnessing the hardware power" even means
Your constant denial of optimization just defies reality. You act as if there are any modern developers on the same level as Carmack or Chris Sawyer, when in reality modern developers know only the basics of Unreal Engine which is bloated to hell.

Ignoring this guy's autism, do you have a response to videos like this:
 
Also, you've never told me what "properly harnessing the hardware power" even means. Apparently it's not more complex lighting or higher resolution assets, so what exactly is it? What software technology is not being implemented in games that could be on current systems?
Qwen 2.5b and AI voice synthesis to make NPCs react dynamically to conversations.
 
Moore's Law has been dead for years. AMD and Intel are trying to pretend it's not by holding up massive arrays of power-gobbling chiplets, but the fact they have to do that proves it's dead.

Also, you've never told me what "properly harnessing the hardware power" even means. Apparently it's not more complex lighting or higher resolution assets, so what exactly is it? What software technology is not being implemented in games that could be on current systems?

It looks to me like exactly what Satoru Iwata predicted has been happening for 15 years now, each objective doubling of compute power leads to a smaller and smaller subjective impression of improvement.
I still remember all the 2015 games that were seen as graphics pinnacles getting complaints at the time that they were hard to run and needed more optimization.
 
You act as if there are any modern developers on the same level as Carmack or Chris Sawyer

You act as if a ten-line optimization as simple as Carmack's fast inverse square root will double frame rates in modern games. Doom 3 barely ran at 30 fps on then-recent hardware anyway, so I don't know how you point to that as "good optimization" while chimping out about a modern game "only" running at 70 fps.
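For anyone who hasn't actually seen it, the ten-line trick being referenced is the Quake III fast inverse square root (popularly credited to Carmack, though he has said he didn't originate it). Roughly as released, with the 32-bit integer type made explicit:

```c
#include <stdint.h>
#include <stdio.h>

/* Quake III's fast approximation of 1/sqrt(x): reinterpret the float's bits,
   apply a magic-constant initial guess, then refine with one Newton-Raphson step.
   (The original used `long`, which was 32-bit on the platforms of the day;
   the pointer casts are the original's bit-level hack.) */
static float Q_rsqrt(float number)
{
    const float threehalfs = 1.5F;
    float x2 = number * 0.5F;
    float y = number;

    int32_t i = *(int32_t *)&y;          /* treat the float's bits as an integer */
    i = 0x5f3759df - (i >> 1);           /* magic-constant first approximation */
    y = *(float *)&i;                    /* back to float */
    y = y * (threehalfs - (x2 * y * y)); /* one Newton-Raphson refinement */
    return y;
}

int main(void)
{
    printf("Q_rsqrt(4.0) = %f (exact: 0.5)\n", Q_rsqrt(4.0f));
    return 0;
}
```

Clever for 1999, but it shaves a handful of cycles off one math helper; it's not the kind of thing that turns a 70 fps game into a 140 fps one.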
 
I need to replace my island of misfit toys spare parts rig that I use as a stream/steamlink/browsing/light computing machine. Basically want to accomplish the same things plus a little more power in a smaller form factor so I can move it into my entertainment center. I think I've narrowed down to a 7840HS ryzen mini-PC. Any opinions on brands/what to avoid?
 
I need to replace my island of misfit toys spare parts rig that I use as a stream/steamlink/browsing/light computing machine. Basically want to accomplish the same things plus a little more power in a smaller form factor so I can move it into my entertainment center. I think I've narrowed down to a 7840HS ryzen mini-PC. Any opinions on brands/what to avoid?
Minisforum is the main big company doing this stuff nowadays. They're all largely fine but I would make sure whatever you're buying isn't using liquid metal as badly sealed liquid metal systems have been a major problem for these cheapo mini PCs.

Also possibly cursed suggestion - M4 Mac Mini. As long as the stuff you want to do runs on MacOS, the M4 is going to feel much snappier and is also going to come with a much better GPU than the 780M you see in a lot of 7840HS mini PCs.
 
Minisforum is the main big company doing this stuff nowadays. They're all largely fine but I would make sure whatever you're buying isn't using liquid metal as badly sealed liquid metal systems have been a major problem for these cheapo mini PCs.

Also possibly cursed suggestion - M4 Mac Mini. As long as the stuff you want to do runs on MacOS, the M4 is going to feel much snappier and is also going to come with a much better GPU than the 780M you see in a lot of 7840HS mini PCs.
Plus the Apple chips have pretty good video transcoding abilities; the Jellyfin devs say it's even better than the Intel transcoding. File storage will be a problem though.
 
Wow, I finally experienced my very first SSD failure ever after something like 15 years of using them. Zero warning signs of hardware problems - everything was perfectly fine one minute and I got a hard lock-up and my laptop wouldn't even POST until I pulled the drive. I tested it in a USB enclosure and it wouldn't even recognize a drive was there and eventually froze my desktop too.

Thank God I'm pretty paranoid about backing important stuff up, but that's still something I'm going to have to take into account in the future. I'm used to some amount of wonky behavior and a grace period before catastrophic data loss.
 
I went ahead and found some old numbers on the supposedly wonderfully well-optimized Battlefield 3. When BF3 launched in 2011, the 500 series was cutting edge, and the 9 series was 3 years old. When BF2042 launched in 2021, the 30 series was cutting edge, and the 20 series was 3 years old. So relative to their respective launch dates,

GTX 590 = 3090
9800 GT = 2080 Ti

Looking at 1080p ultra in both games -- 1920x1440 monitors had been around for 10 years already in 2011 -- here's what I found:

Battlefield 3, GTX 590: 98 fps
Battlefield 3, 9800 GT: 20 fps

Battlefield 2042, 3090: 157 fps
Battlefield 2042, 2080 Ti: 120 fps

Everybody who thinks the 2010s was this gaming nirvana where all these hyper-optimized games ran at high framerates at maxed-out settings on all recent GPUs either wasn't there or is just misremembering everything. It was actually an era where if your GPU was more than 3 years old, you might as well just have dog shit in your PCIe slot. Imagine if your 3090 could only manage 20 fps in the hottest games of 2025 unless you ran at absolute minimal quality. That's what the early 2010s were like.
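Putting that in relative terms (a trivial sketch using just the benchmark numbers quoted above):

```c
#include <stdio.h>

/* Share of the contemporary flagship's frame rate that the 3-year-old
   card delivered, using the benchmark figures quoted above. */
int main(void)
{
    printf("BF3 (2011):    9800 GT vs GTX 590 = %.0f%%\n", 100.0 * 20.0 / 98.0);   /* ~20% */
    printf("BF2042 (2021): 2080 Ti vs 3090    = %.0f%%\n", 100.0 * 120.0 / 157.0); /* ~76% */
    return 0;
}
```

A 3-year-old flagship went from roughly 20% of current-flagship performance in 2011 to roughly 76% in 2021.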

 
PC has this nice feature called a settings menu; look at all the things you can tune to balance looks and performance:

[screenshot: in-game graphics settings menu]

I was playing BF3 on an i3-2100 and a GTX 550 Ti at 1080p with a mix of medium and high settings at ~60fps in 2012. It could do ultra, but not very well (obviously).
 