GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

You seem to think most people buy a computer every year, since you're fixated on Newegg prices on vintage parts.
"Build pc every year" No? 2016-2017 was quite a while ago. If someone is fine with 2gb of ram since then, I really doubt they even factor into the market for games because 2gb gets you....absolute shit now. Got you not much better than shit back then, too. A 1050 buyer was never actually interested in game performance.

If you want to hang on to a card for that long, the 1080ti was a much better buy. You could still be rocking games at max 1080p and even some 1440p. Hell with FSR you could even be doing more. If you care about longevity, you always buy more up front. Unless you really don't care about being stuck on substandard graphics, or you only want to play old stuff/indies. If it's the latter, then why should anyone care? Nobody talks about them because there's no money to be made.

I'd hope someone would have at least updated their cpu by then and not be using one that's 6-7 years old, riddled with security mitigations and whatnot. God, they might even be using a spinner drive for games.

Idk man. I get it. "If it still works for what I'm doing". But like, sometimes just spending $100 can deliver such a huge improvement to the pc experience. I'd be losing my mind if I was still running a bottom of the barrel setup (i3/1050 2g, probably a spinner drive) from 2016 in 2023. Or I just don't use my pc.
 
That bell curve graph shows usage, but if it were to show profit vs. usage it would have a weird but not insignificant little secondary hump over to the right. The reason is Halo products. Pricing models follow cost vs. benefit for the customers: find the points where people will pay a bit extra to get a bit extra, or settle for a little less in return for paying a little less - those are your price points, and they follow a sensible(ish) correlation between value and cost.

Halo Infinite's min spec GPU is the 4 GB 1050 Ti...so Microsoft cast its net about where you'd expect.

I say all this because your graph shows profit from a game producer's point of view. But we were originally talking about a card manufacturer's point of view, and that one looks a little different.

A couple times in this thread, people opined that 12 GB cards will become unusable in just a few years. I am pointing out why, in fact, this is not going to happen. Internet forums are inordinately skewed toward nerds.

Hell with FSR you could even be doing more. If you care about longevity, you always buy more up front. Unless you really don't care about being stuck on substandard graphics, or you only want to play old stuff/indies. If it's the latter, then why should anyone care? Nobody talks about them because there's no money to be made.

No, nobody talks about them because there's nothing new to say about GPUs that came out 7 years ago, and the people still using them aren't the kinds of turbo-dorks that post about GPUs on internet forums. I'm pretty sure very few people in this thread talking about GPUs make any money at all on games or gaming GPUs.

If someone is fine with 2gb of ram since then, I really doubt they even factor into the market for games because 2gb gets you....absolute shit now.

There are about 3x as many gamers with 2 GB of VRAM as there are with 16 GB or more. Unsurprisingly, quite a few of Steam's top-grossing games in 2022 can run well on old, low-end cards. Gotta capture that revenue.

Idk man. I get it. "If it still works for what I'm doing". But like, sometimes just spending $100 can deliver such a huge improvement to the pc experience. I'd be losing my mind if I was still running a bottom of the barrel setup (i3/1050 2g, probably a spinner drive) from 2016 in 2023. Or I just don't use my pc.

The only buyer who cares what you want to buy is you. You keep conflating your personal preferences with market trends. It's not a good way to make money.
 
A 1050 buyer was never actually interested in game performance.

If you want to hang on to a card for that long, the 1080ti was a much better buy.
The 1080ti cost as much as seven 1050s and it wasn't seven times as fast. There's absolutely a bit of Vimes Boots going on at the absolute low end, but the $100 1050 wasn't low end; it was entry level for gamers and perfectly respectable at that level of price and performance. The previous 970/980 had 4GB of vram, so a low cost 2GB card wasn't an outrage.
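To put rough numbers on that, here's a quick back-of-the-napkin sketch. The prices are the launch MSRPs as I remember them, and the ~4x relative performance figure is just an assumption, so plug in your own benchmark index if you care:

```python
# Rough value comparison for the 1050 vs 1080ti argument above.
# Prices are launch MSRPs from memory; relative_perf is an assumed
# ballpark multiplier, not a benchmark result.
cards = {
    "GTX 1050 2GB": {"price_usd": 109, "relative_perf": 1.0},  # baseline
    "GTX 1080 Ti":  {"price_usd": 699, "relative_perf": 4.0},  # assumed ~4x faster
}

baseline = cards["GTX 1050 2GB"]
for name, card in cards.items():
    price_x = card["price_usd"] / baseline["price_usd"]
    perf_x = card["relative_perf"] / baseline["relative_perf"]
    dollars_per_perf = card["price_usd"] / card["relative_perf"]
    print(f"{name}: {price_x:.1f}x the price, {perf_x:.1f}x the performance, "
          f"${dollars_per_perf:.0f} per unit of performance")
```

Roughly 6.4x the money for somewhere around 4x the frames, which is the point: the halo card lasts longer, but it was never the better value per dollar.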
I'm pretty sure very few people in this thread talking about GPUs make any money at all on games or gaming GPUs.
Not a single person who has ever posted in this thread has ever made a cent from games or gaming GPUs or anything related to them. This is a casual thread.
 
There is some drama about the 4060ti launch, with some outlets giving the product their seal of approval and recommendation in their reviews, while others (mostly YouTubers like Gamer's Nexus, Hardware Unboxed and others) are bashing Nvidia and its overpriced card hard.
Apparently JayzTwoCents originally sided with the positive reviews, only to be "forced" to delete and apologize:
Something fishy might be going on with the very polarized reviews.
 
https://youtu.be/Y2b0MWGwK_U
GN's analysis is that the 4060ti is slower than the 3060ti in some instances. Well, statistically the same. But hey, dlss3.

DLSS isn't a minor feature. It's core to NVIDIA's GPU technology. NVIDIA is betting that bigger tensor cores for better inferencing will give far more real-world gains than just slapping on more standard CUDA cores, so focusing on how well the GPU performs with one of its key features disabled is missing the point.
 
I think Nvidia is missing the boat on the 4060ti.

The 3060 12GB was genius really. Take all of your partially defective GPUs, slap 'em on a board with a little more memory, and you can sell them at a midrange premium. You give the buyer a little something extra, you get a better price in return. I have no idea why they are advertising the successor to a tradeoff card as a premium product now. I also have no idea why anyone would buy a product that doesn't fit its market segment.
 
If you want to hang on to a card for that long, the 1080ti was a much better buy. You could still be rocking games at max 1080p and even some 1440p. Hell with FSR you could even be doing more.
Can confirm that the 1080ti is still pretty comfy. I generally game at ultrawide 1440 unless a game doesn't support it (or I'm going turbo-autist in EvE with a window snapped to every corner), and Dead Space was the first game that had me even contemplating upgrading: it was low 30s if not 20s no matter what settings I tried, and even resolution scaling didn't help. I waited a week, the game updated a few times, I grabbed the newest drivers, and it ended up being fine, medium at 70-80 FPS, which isn't unpleasant at all with gsync.
 
DLSS isn't a minor feature. It's core to NVIDIA's GPU technology. NVIDIA is betting that bigger tensor cores for better inferencing will give far more real-world gains than just slapping on more standard CUDA cores, so focusing on how well the GPU performs with one of its key features disabled is missing the point.
Sure, if you price it lower, as the 4060 TI is basically a 3060 TI but better at DLSS. Given the gains DLSS gives to supported (typically more popular) games compared to previous gen tensor cores, I think there's a place in the market for a $300 GPU that gives the kind of very specific performance uplifts that the 4060 TI can provide. Sure, it might not be a good general purpose card, but for those who basically want a computer that can play 10 or so popular DLSS-supporting games at a good framerate/quality ratio, it'd be perfect.

But when its younger brother, the 3060 TI, the greatest mining card ever made (PBUH), is in a dead heat with it on a lot of tests not involving DLSS, that really makes it hard to justify the $400 price tag for something that isn't really better overall.

The 3060 TI FE can be had used for between $200 and $300, and given the way miners tuned them, underpowered and underclocked to shit with a high memory OC, they're ironically probably in pretty good nick as long as you don't want to overclock their memory...and if you do get a dud, ebay has the seller over a fucking barrel so long as you discover the problem promptish.
 
Sure, if you price it lower, as the 4060 TI is basically a 3060 TI but better at DLSS. Given the gains DLSS gives to supported (typically more popular) games compared to previous gen tensor cores, I think there's a place in the market for a $300 GPU that gives the kind of very specific performance uplifts that the 4060 TI can provide. Sure, it might not be a good general purpose card, but for those who basically want a computer that can play 10 or so popular DLSS-supporting games at a good framerate/quality ratio, it'd be perfect.

According to NVIDIA, 301 games support DLSS 2, up from 5 games that supported DLSS 1. 34 games already support DLSS 3.

But when its younger brother, the 3060 TI, the greatest mining card ever made (PBUH), is in a dead heat with it on a lot of tests not involving DLSS, that really makes it hard to justify the $400 price tag for something that isn't really better overall.

OEMs, who are the main buyers of GPUs, are going to drop 3060 for 4060 for the same reason they dropped 2060 for 3060 - NVIDIA will discontinue the last-gen product.

It's still not going to revitalize PC sales, but they seem to be right about the way tech is going - ML inferencing is gaining adoption as fast as programmable pixel pipelines and hardware T&L did.
 
OEMs, who are the main buyers of GPUs, are going to drop 3060 for 4060 for the same reason they dropped 2060 for 3060 - NVIDIA will discontinue the last-gen product.
Something tells me the OEMs aren't paying $400 a pop for 4060 TIs though.

Let's put it this way, if you're going to buy a new GPU in a box to put in your computer, the 4060 TI is a terrible buy at $400, which is the market for hardware reviews like that.
 
Can confirm that the 1080ti is still pretty comfy.
Hell, I was surviving on a 700 series, then a 900 before I fucking finally got something modern.

Sincerely glad at this point that I went with my heart and stayed with a 3080 for Win 7 compatibility instead of the latest-greatest, because it's sure seeming like the entire 4000 line is full of fail.
 
The main value I can see for the 4060ti is that it's a 16 GB card with CUDA support. For hobbyist AI/ML work, that means it's a $500 alternative to the 3080, 4080, and 4090 which are all significantly more expensive. Still a pretty disappointing card overall though.
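If anyone wants to try that use case, here's a quick sanity check (assuming a PyTorch build with CUDA support) to confirm the card and its 16 GB actually show up before loading anything big:

```python
# Check that CUDA sees the GPU and report total VRAM before loading a model.
# Assumes a PyTorch build with CUDA support; nothing here is 4060ti-specific.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024**3
    print(f"{props.name}: {total_gb:.1f} GB VRAM, compute capability {props.major}.{props.minor}")
else:
    print("No CUDA device visible - check the driver and the PyTorch build.")
```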
 
Hell, I was surviving on a 700 series, then a 900 before I fucking finally got something modern.

Sincerely glad at this point that I went with my heart and stayed with a 3080 for Win 7 compatibility instead of the latest-greatest, because it's sure seeming like the entire 4000 line is full of fail.
The cards are good IMO, the issue is how they are sold:
- there is too much of a distance between high(er) end products and mid-tier
- there is too much of a distance between mid-tier cards
- cards are obviously mislabeled, with the 4060ti really being a 4050, the 4070 a 4060, and the 4080, eh, more like a 4070ti maybe. This is made quite clear by the fact that Nvidia tried to sell the 4070ti as a 4080.
- cards are sold at a huge profit; the chips making up the additional 8GB of VRAM on the upcoming 4060ti 16GB are like 30USD from what I've seen (haven't verified yet), yet the card is sold at a +100USD premium (rough math at the end of this post)
Other than these financial and marketing aspects, DLSS has gotten better and better, frame gen is OK-ish with some downsides but good to have, the AV1 hardware encoder is great to have too, RT performance is superior to the competition, Tensor cores are useful for a number of other tasks, and power efficiency is greatly improved, minus the 4090 where they just went all in, which isn't an issue for a top end card.
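To put the VRAM margin point in rough numbers, taking the ~30USD figure at face value (unverified, as noted above):

```python
# Rough margin math for the 16GB 4060ti premium, using the unverified
# ~30USD figure quoted above for the extra 8GB of GDDR6.
extra_vram_cost_usd = 30    # claimed component cost of the additional 8GB
price_premium_usd = 100     # retail price bump over the 8GB card

extra_margin_usd = price_premium_usd - extra_vram_cost_usd
markup = price_premium_usd / extra_vram_cost_usd
print(f"~{extra_margin_usd}USD extra gross margin per card, "
      f"~{markup:.1f}x markup on the added memory")
```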
 