How about, for once, you talk about the 2015-2016 era, Pascal GPU's like the 1060 and the 1080Ti, and games like The Witcher 3, GTA V and Metal Gear Solid V? That's what the early 2010s were like.
How about, for once, you talk about the 2015-2016 era, Pascal GPU's like the 1060 and the 1080Ti, and games like The Witcher 3, GTA V and Metal Gear Solid V?
Game | 3-year entry (low) | 3-year performance (ultra) | 1-year entry (ultra) | 1-year performance (ultra)
Witcher III | 10 fps | 25 fps | 33 fps | 58 fps
Helldivers II | 108 fps | 118 fps | 69 fps | 158 fps
on a 630
So you're talking about something completely else, okay then.
1080p ultra
Pascal GPUs came out in 2016, one year after the Witcher III came out. A developer can't optimize his software for hardware that doesn't exist yet. Nobody complaining about optimization is saying, "Argh! This game that just came out probably won't run well on next year's GPUs!" Nobody says, "This game was so well optimized...I only had to wait a whole year for a GPU to come out that could handle it!"
While constantly refusing to acknowledge the Pascal GPU's
Game | 1060 or 4060, 1080p Ultra | 1070 or 4070, 1440p Ultra | 1080 or 4080, 4K Ultra
Witcher III | 44 fps | 56 fps | 50 fps
Hitman 3 | 164 fps | 163 fps | 185 fps
Transistor count kept growing year after year. Exponentially. Most graphs that represent it use a linear scale, which is misleading.
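To see why a linear axis hides this, here's a minimal Python sketch using a purely hypothetical series that doubles every two years (illustrative numbers only, not real chip data): the same data looks like a flat line followed by a hockey stick on a linear axis, but a straight line on a log axis.

```python
import matplotlib.pyplot as plt

# Purely illustrative: a hypothetical count that doubles every two years,
# starting at 1 billion in 2010. Not real transistor-count data.
years = list(range(2010, 2025))
counts = [1e9 * 2 ** ((y - 2010) / 2) for y in years]

fig, (ax_lin, ax_log) = plt.subplots(1, 2, figsize=(10, 4))

# Linear y-axis: the early years look flat, which is the misleading part.
ax_lin.plot(years, counts)
ax_lin.set_title("Linear y-axis")

# Log y-axis: steady doubling shows up as a straight line.
ax_log.plot(years, counts)
ax_log.set_yscale("log")
ax_log.set_title("Log y-axis")

plt.tight_layout()
plt.show()
```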
Which is exactly what I said. I even highlighted that part in the post above in hopes your autism would be able to catch it but I guess it wasn't enough. So:
Nobody complaining about optimization is saying, "Argh! This game that just came out probably won't run well on next year's GPUs!" Nobody says, "This game was so well optimized...I only had to wait a whole year for a GPU to come out that could handle it!"
Oh I remember the past very well, because I played at 1080p at Medium on a 1060. But if you insist on doing your comparisons where you scale with the heaviest possible scenarios instead of the ones 99% of people played with then be my guest. No one played Witcher 3 at 1080p on Ultra on a 1060. Which I've already mentioned in the post you were replying to.
Seems you aren't remembering the past very well.
GPU's. We're talking about GPU's. They're still monolithic. Now you're suddenly switching to talking about CPU's. I shouldn't have ever mentioned Moore's Law, since you'll now hyperfixate on how ackshually we weren't exactly doubling transistor count every year so Moore's Law didn't apply for years. Even though that wasn't the core of my fucking argument. You'd know what the core of my argument was if you'd actually read what I wrote instead of having your autism lock onto Moore's Law, so I'm not even going to try to reiterate it.
Newer processors
Nvidia hasn't moved to chiplets for gaming GPUs yet. I brought up how the 980 Ti and 4090 have roughly the same die size on their respective nodes, and the 4090 is ~5.5x faster (didn't look up games, just used TechPowerUp score) despite using some (much?) of the ~9.5x transistor budget for raytracing and AI instead of raster.
Newer processors are actually arrays of chiplets made on multiple different processes. The only way they're able to pretend Moore's Law is still alive is by putting this massive array of chips on a package the size of my hand on the same graph with single-chip CPUs that fit on my fingertip:
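For what it's worth, the ratios in that post can be sanity-checked with a quick back-of-the-envelope script. This is only a sketch: the transistor and die-size numbers below are the commonly cited spec figures for GM200 (980 Ti) and AD102 (RTX 4090), and the ~5.5x performance factor is taken from the post itself (TechPowerUp relative score), not measured here.

```python
# Back-of-the-envelope check, assuming commonly cited spec figures:
#   GM200 (GTX 980 Ti): ~8.0e9 transistors, ~601 mm^2
#   AD102 (RTX 4090):  ~76.3e9 transistors, ~608.5 mm^2
gm200 = {"transistors": 8.0e9, "die_mm2": 601.0}
ad102 = {"transistors": 76.3e9, "die_mm2": 608.5}

transistor_ratio = ad102["transistors"] / gm200["transistors"]  # ~9.5x budget
die_ratio = ad102["die_mm2"] / gm200["die_mm2"]                 # ~1.01x, similar die size

perf_ratio = 5.5  # taken from the post (TechPowerUp relative score), not computed here
perf_per_transistor = perf_ratio / transistor_ratio             # ~0.58x of GM200

print(f"Transistor budget: {transistor_ratio:.1f}x on a {die_ratio:.2f}x die")
print(f"Performance per transistor: {perf_per_transistor:.2f}x of GM200")
```

The last line is the point being argued: a chunk of the ~9.5x transistor budget doesn't show up as that 5.5x score because it went to RT and tensor hardware instead of raster.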
I can definitely tell the difference visually between games made 10 years ago and games made today. Maybe you should get your eyes checked.
To reiterate: the argument everyone is making is that we already have good hardware that can deliver good visuals. It's the game devs that are wasting this potential and forcing everyone to keep buying newer hardware because of their own incompetence. That's what everyone's talking about, not how unoptimized games were X years ago on a GT 9800 or whatever the fuck. Modern iGPU's have more computing power than that; it doesn't matter for today's issues.
Doesn't that use the same engine as the previous two entries, and isn't it basically just the last one in an episodic series that started much earlier? Either way I don't think they're comparable, since the engines those games run on utilize the CPU differently, and TW3 was basically the most demanding game of its time bar none while Hitman wasn't.
Hitman 3 seems like a reasonable choice to me.
Except that also applies to the 2000s. Games in that period typically had a turnaround time of 18 months, and they certainly used off-the-shelf engine slop like Gamebryo and RenderWare.
Instead of performance overhead being used to pump better visuals, it's being leaned on to support unsustainably fast development times enabled by unoptimized one-size-fits-all engine technologies,
Maybe 1920x1080 on the high end. The GeForce 500 series struggled at 1080p, and even with the then-new GTX 680 not every game would run well at 1080p (keeping in mind that 'running well' means ~70 fps).
And in terms of performance from the 2000s, keep in mind that resolution and fidelity were actively being pushed. From 2000 to 2012 or so, games went from 800x600 as standard to 1920x1080. From the early 2010s to now, we've been stuck at that same resolution with big publishers and studios having no desire to increase it.
Most big-budget games nowadays target 4K. And even putting that aside, I don't know a single person still gaming at 1080p on PC. Everyone's moved on to at least 1440p.
From the early 2010s to now, we've been stuck at that same resolution with big publishers and studios having no desire to increase it.
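Just to put numbers on those resolution jumps (plain arithmetic, nothing game- or GPU-specific), here's a small sketch of the raw pixel-count ratios being argued about:

```python
# Pixel counts for the resolutions mentioned in the thread.
resolutions = {
    "800x600": 800 * 600,
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K": 3840 * 2160,
}

base = resolutions["800x600"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.1f}x the pixels of 800x600")

# The 2000s jump (800x600 -> 1080p) is ~4.3x the pixels;
# the jumps at stake now are 1080p -> 1440p (~1.8x) and 1080p -> 4K (4.0x).
print(f"1080p -> 1440p: {resolutions['1440p'] / resolutions['1080p']:.2f}x")
print(f"1080p -> 4K: {resolutions['4K'] / resolutions['1080p']:.2f}x")
```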
Which is kinda his point: even with midrange Turing cards, most new games run well above 60 fps as long as you tune the settings down towards medium or high.
PC has this nice feature called a settings menu; look at all the things you can tune to balance looks and performance:
[attached screenshot: an in-game graphics settings menu]
I was playing BF3 on an i3 2100 and a GTX 550 Ti at 1080p with a mix of medium and high settings at ~60 fps in 2012. It could do ultra, but not very well (obviously).
I see the 8600G at $189 on Amazon/Newegg and the 8700G at $255; Newegg is giving $5 off either with a code. The 8600G without overclocking looks OK for 1080p LOW. What price makes the 8600G (which has more than enough CPU perf) attractive? 5600G prices bottomed out at around $100-120 with some rare $80 deals. There is a 5600GT successor at about $130 for AM4, but the Cezanne iGPU is still too weak. The 8500G is a compromised 4 CU Phoenix2 APU and is way too high at $149, and the 8300G also has 4 CUs but hasn't seen a retail release.
And then you get into the absolutely abysmal state of entry-level. The 4060 is an okay card but $300 is not where entry-level needs to be. The lack of anything for people who simply don't have $300 to spend on a new card is frustrating if you're a kid or not from the first world. APUs were supposed to be the new entry-level but their pricing is just as bad.
It seems bizarre to me that a 5050 would have access to MFG but a 4090 couldn't.
Rumors say an RTX 5050 desktop GPU will exist, with a slightly higher TDP than the 4060. I think the best you could hope for is for it to be slightly faster than the 4060 at $250... MSRP.
THE ARGUMENT EVERYONE IS MAKING IS THAT WE ALREADY HAVE GOOD HARDWARE THAT CAN DELIVER GOOD VISUALS. IT'S THE GAME DEVS THAT ARE WASTING THIS POTENTIAL AND FORCING EVERYONE TO KEEP BUYING NEWER HARDWARE BECAUSE OF THEIR OWN INCOMPETENCE
Oh I remember the past very well, because I played at 1080p at Medium on a 1060. But if you insist on doing your comparisons where you scale with the heaviest possible scenarios instead of the ones 99% of people played with then be my guest. No one played Witcher 3 at 1080p on Ultra on a 1060. Which I've already mentioned in the post you were replying to.
GPU's. We're talking about GPU's. They're still monolithic. Now you're suddenly switching to talking about CPU's.
None of these people you are invoking actually agree with you, and your statements are not logically downstream of their statements.
You know who else thinks so? John Carmack.
We have no reason to believe you are any good at this.
But what the fuck do I know, I'm just a big fat idiot who spent all week optimizing GPU-accelerated physics code.
Dell Inspiron 14 2-in-1 (open-box): 14" FHD+ IPS touch, Ryzen 5 8640HS, 8GB DDR5, 512GB SSD, $289.99
I need to replace my island-of-misfit-toys spare-parts rig that I use as a stream/Steam Link/browsing/light computing machine. Basically I want to accomplish the same things, plus a little more power, in a smaller form factor so I can move it into my entertainment center. I think I've narrowed it down to a 7840HS Ryzen mini-PC. Any opinions on brands/what to avoid?
The PS3 and PS4 generations were both abnormally extended due to 1) the 2008 recession and 2) Covid, so Pascal basically had a ridiculously long glory time, as games didn't really shift. Now that we're seeing the generational jump finally happen, it's throwing people off.
EDIT: I think really the issue is vibes. Pascal was a sweet spot that spoiled a lot of consumers (even when it was new it was insane value compared to Maxwell). From 2020 onwards, it feels like we've been moving in the opposite direction of what the Pascal era was, and even though the numbers show that performance tiers haven't changed much, it doesn't feel great to have something like the 4090/5090 hanging around that customers need to think about (almost like an inverse halo product).
The PS3 and PS4 generations were both abnormally extended due to 1) the 2008 recession and 2) Covid, so Pascal basically had a ridiculously long glory time, as games didn't really shift. Now that we're seeing the generational jump finally happen, it's throwing people off.
It doesn't help that TSMC demand is insane, with AI, professional cards, and the switch from consoles to PC all happening at once.
People are judging how old games run through the scope of modern hardware. Old games ran, and still do run, like absolute shit for the visuals offered lol. I'm getting around 110-144 fps in Spider-Man Remastered on my 7900 XTX. Now, how many fps do you think I'm getting on the PC port of Spider-Man 3 (the game) from 2007? I'm getting similar if not worse performance despite the game looking 10x worse. It's not even just an issue that plagues low-budget games: Saints Row 3 is legit dipping into the mid 50s at times on my system while GTA 5 is chugging along at 120 Hz ez.
I had a GeForce4 MX (which was a rebadged GeForce 2) and would have died in shock if NVIDIA came out with a technology that allowed it to run new games when it was 2 years old at 800x600x60 fps. Now you have people with 7-year-old cards bitching that they can't get 120 fps at max settings and crying about the "good old days" when "everything was optimized."
I thought I heard some dooming about XeSS recently, but I forgot what it was about. Seems it's still alive, for now.
Unfortunately, it seems that Intel has not released XeSS 2 as open source. The algorithms still come as precompiled libraries, and despite promises dating back even before the Arc Alchemist launch (and the first XeSS), the company has never published any open-source upscaling technology to date. Perhaps launching XeSS 1 as open-source would save Intel’s face, but it looks like there are no such plans either.