GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

They have nobody to blame but themselves.
It's been sliding steadily downward since 2007. Interestingly, that's the year COD4 came out. I think there's a strong case to be made that fewer and fewer people care about each successive advance in graphics fidelity.

 
It's been sliding steadily downward since 2007. Interestingly, that's the year COD4 came out. I think there's a strong case to be made that fewer and fewer people care about each successive advance in graphics fidelity.

Some of that, possibly. Consoles have also been gaining in capability, and a lot of PC-centric titles are now geared towards consoles. Combine that with the push to make midrange GPUs cost as much as a console and a TV... and well, is it any surprise? Nvidia has been wanting to ditch the gaming-centric market for a while now. They want enterprise. The PC crowd merely gets the scraps at ever higher cost.
 
APUs will eat into discrete graphics. The Steam Deck's APU is good enough for 720p, while more powerful ones can take on 1080p, which is still the resolution most players use.

That reminds me of the ASUS ROG Ally, which could be priced lower than expected:

 
Given how ancient some of these games are, I think it shows that graphics just don't matter like they did in the old days. Imagine if, in 2010, the #1 multiplayer game had still been UT '99.

  1. CS:GO - 11 yr old
  2. DOTA 2 - 10 yr old
  3. Apex Legends - 4 yr old
  4. PUBG - 6 yr old
  5. Path of Exile - 10 yr old
  6. Destiny 2 - 6 yr old
  7. Rust - 5 yr old
  8. GTA V - 10 yr old
  9. TF2 - 16 yr old
  10. MW II - < 1 yr old
Exactly one game in the top 10 came out in the last 12 months, and the average game is 8 years old.


 
Given how ancient some of these games are, I think it shows that graphics just don't matter like they did in the old days. Imagine if, in 2010, the #1 multiplayer game had still been UT '99.

  1. CS:GO - 11 yr old
  2. DOTA 2 - 10 yr old
  3. Apex Legends - 4 yr old
  4. PUBG - 6 yr old
  5. Path of Exile - 10 yr old
  6. Destiny 2 - 6 yr old
  7. Rust - 5 yr old
  8. GTA V - 10 yr old
  9. TF2 - 16 yr old
  10. MW II - < 1 yr old
Exactly one game in the top 10 came out in the last 12 months, and the average game is 8 years old.


Graphics don't matter for competitive multiplayer games that need to run fast and work on a huge range of crappy computers to sustain a large player base.
 
Given how ancient some of these games are, I think it shows that graphics just don't matter like they did in the old days. Imagine if, in 2010, the #1 multiplayer game had still been UT '99.

  1. CS:GO - 11 yr old
  2. DOTA 2 - 10 yr old
  3. Apex Legends - 4 yr old
  4. PUBG - 6 yr old
  5. Path of Exile - 10 yr old
  6. Destiny 2 - 6 yr old
  7. Rust - 5 yr old
  8. GTA V - 10 yr old
  9. TF2 - 16 yr old
  10. MW II - < 1 yr old
Exactly one game in the top 10 came out in the last 12 months, and the average game is 8 years old.


The point of diminishing returns has been well surpassed and we're now on an asymptote towards "photorealism". MW II's graphics aren't so much better that an 11-year-old game looks antiquated next to it, unlike in 2010, when games like UT99 and Q3A looked hopelessly dated. The only reason to upgrade now is to keep up with the bloat of AAA games built on the same boring Skinner box formula that was perfected in the PS3 generation.
 
They have nobody to blame but themselves.
I don’t think Nvidia gives nearly as much of a fuck about the consumer GPU market anymore, especially after the mining bubble popped. Nvidia’s future growth and revenue are primarily going to come from AI technology and hardware. Selling a couple thousand units of high-end consumer GPUs is nothing compared to the vast number of high-end machine learning/data center cards like the A100 that continue to grow in demand.
 
APUs will eat into discrete graphics. The Steam Deck's APU is good enough for 720p, while more powerful ones can take on 1080p, which is still the resolution most players use.

That reminds me of the ASUS ROG Ally, which could be priced lower than expected:

This wasn't an April Fools joke?
 
4070: 5888 CUDA cores @ 1.9-2.5 GHz, 12 GB GDDR6X @ 504.2 GB/s
4060: 3840 CUDA cores @ 2.3-2.5 GHz, 8 GB GDDR6 @ 288 GB/s

Overpriced or not, those aren't the same card.
The reason people say "the 4070 is actually a 4060" is that its hardware (and performance) is what an x60-class GPU used to be.
The x70 class used to have >=50% as many cores as the biggest GPU (56% in the case of the 3070 compared to the 3090).
The x60 class generally had less than that, around 30-40%.
The 40"70" has 39% of the cores of a 4090.

The x70 class used to have a 256-bit memory bus (even my 970 has this, just like the 3070).
The x60 cards got 192-bit.
The 40"70" has 192-bit.

All of this shows up in performance, of course.
The x70 class used to perform the same as or better than the previous-gen x80 Ti/high-end card.
The x60 class slightly worse than or the same as the x80 base model.
The 40"70" runs like a 3080 10 GB at 1080p and up to 10% worse at higher resolutions.

So people don't just imagine that this is an x60 (Ti)-class card out of autism and gamer salt;
it would have been exactly that in any generation before this 40xx shitshow.

Obviously Nvidia is going to cobble together some joke as its 40"60", but that does not change the fact that the 4070 is equipped with what has historically been 60-class hardware and performance.
At the end of the day it does not really matter what they call them, either. At $400 this would not be a (huge) problem.
But Nvidia is pricing this at a premium compared to the 3070 despite downgrading everything to the level of an x60-class card.

It would make a nice 4060 (at x60-class prices) too, which makes this extra irritating.
Because now the "real" 4060 is probably going to get x50-class hardware (like a 128-bit bus, lol) and run like the 3070,
despite the fact that the x60 class used to get you previous-generation x80 performance.
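
A quick sanity check on those ratios, as a minimal Python sketch. The core counts, bus widths, and memory data rates are pulled from public spec sheets rather than from this thread, so treat the exact figures as assumptions:

Code:
# Back-of-the-envelope check: core-count ratio vs. the flagship, and peak
# memory bandwidth from bus width x data rate. Specs are assumed values
# taken from public spec sheets; adjust them if your source differs.
cards = {
    # name: (CUDA cores, memory bus width in bits, memory data rate in Gbps)
    "RTX 3070": (5888, 256, 14.0),
    "RTX 3090": (10496, 384, 19.5),
    "RTX 4070": (5888, 192, 21.0),
    "RTX 4090": (16384, 384, 21.0),
}

def core_ratio(card: str, flagship: str) -> float:
    """Fraction of the flagship's CUDA cores that a given card has."""
    return cards[card][0] / cards[flagship][0]

def mem_bandwidth_gb_s(card: str) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * Gbps per pin."""
    _, bus_bits, rate_gbps = cards[card]
    return bus_bits / 8 * rate_gbps

print(f"3070 vs 3090 cores: {core_ratio('RTX 3070', 'RTX 3090'):.0%}")  # ~56%
print(f"4070 vs 4090 cores: {core_ratio('RTX 4070', 'RTX 4090'):.0%}")  # ~36%
print(f"3070 bandwidth: {mem_bandwidth_gb_s('RTX 3070'):.0f} GB/s")     # 448 GB/s
print(f"4070 bandwidth: {mem_bandwidth_gb_s('RTX 4070'):.0f} GB/s")     # 504 GB/s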
 
The reason people say "the 4070 is actually a 4060" is that its hardware (and performance) is what an x60-class GPU used to be.
The x70 class used to have >=50% as many cores as the biggest GPU (56% in the case of the 3070 compared to the 3090).
The x60 class generally had less than that, around 30-40%.
The 40"70" has 39% of the cores of a 4090.

The x70 class used to have a 256-bit memory bus (even my 970 has this, just like the 3070).
The x60 cards got 192-bit.
The 40"70" has 192-bit.

All of this shows up in performance, of course.
The x70 class used to perform the same as or better than the previous-gen x80 Ti/high-end card.
The x60 class slightly worse than or the same as the x80 base model.
The 40"70" runs like a 3080 10 GB at 1080p and up to 10% worse at higher resolutions.

So people don't just imagine that this is an x60 (Ti)-class card out of autism and gamer salt;
it would have been exactly that in any generation before this 40xx shitshow.

Obviously Nvidia is going to cobble together some joke as its 40"60", but that does not change the fact that the 4070 is equipped with what has historically been 60-class hardware and performance.
At the end of the day it does not really matter what they call them, either. At $400 this would not be a (huge) problem.
But Nvidia is pricing this at a premium compared to the 3070 despite downgrading everything to the level of an x60-class card.

It would make a nice 4060 (at x60-class prices) too, which makes this extra irritating.
Because now the "real" 4060 is probably going to get x50-class hardware (like a 128-bit bus, lol) and run like the 3070,
despite the fact that the x60 class used to get you previous-generation x80 performance.

This isn't how corporate branding works at all, not at Intel, not at AMD, and not at NVIDIA. You're not guaranteed anything on the spec sheet by the brand beyond low / mid / high / highest. You can look across the entire range, from the 400 series to the 4000 series, and there has never been a universal rule. Sometimes the x70 matches the previous generation's x80 Ti, sometimes it's slower. There's certainly no consistent ratio for transistor or core counts, or memory bandwidth (bus width alone is irrelevant).

I just looked at some examples at random and immediately started finding counterexamples to all your claims, which shouldn't be surprising. None of these companies have a branding bible drawing a strict correspondence between brand and clock speed, transistor count, bandwidth, etc.
 
I found a better chart. This is what an overshot market looks like. Across all platforms, market interest in better graphics has been sliding.

 
I found a better chart. This is what an overshot market looks like. Across all platforms, market interest in better graphics has been sliding.

What about consoles? If it were a population-wide trend, we'd still be making 2011-era console games with record sales. Even the Switch is better than the Wii. And I would assume, for the comparison to hold, that the chart is normalized for market size.

And I don't think a loss of interest in graphics is what caused the peak; it's mobile gaming. 2010-2012 is when phones got good enough to play 3D games.
 
There are people who worship Nvidia literally because they're "sharks". Like, why would a consumer ever respect a company's ability to screw the market to the wall? It makes no sense unless you're a shareholder.

Same shit in the bulldozer days when Intel fanbois were cheering on the destruction of AMD. Dumb niggers apparently would be happy if your only cpu choice was Intel.

AMD can't write drivers for shit. That's why Nvidia owns them: they can actually write driver code.

I wish AMD was better, but they aren't. Nvidia wipes their ass.

I've got a 3090 and wouldn't trade it for an AMD.
 
AMD can't write drivers for shit. That's why Nvidia owns them: they can actually write driver code.

I wish AMD was better, but they aren't. Nvidia wipes their ass.

I've got a 3090 and wouldn't trade it for an AMD.
On Windows. The Linux Nvidia drivers are pure trash, while AMD and Intel work just fine.
 
What about consoles? If it were a population-wide trend, we'd still be making 2011-era console games with record sales. Even the Switch is better than the Wii. And I would assume, for the comparison to hold, that the chart is normalized for market size.


Through 2017 (src)

Obviously, storytelling based on charts is speculative, but it looks to me like the last time the world went gaga for graphics was the first PlayStation. Since then, sales & revenue have been flat, minus the Wii phenomenon.

And I don't think a loss of interest in graphics is what caused the peak; it's mobile gaming. 2010-2012 is when phones got good enough to play 3D games.

Mobile isn't cannibalizing PC. In fact, there's been a small amount of growth in PC gaming. People just aren't rushing out to buy the latest hardware. And why should they, when a 7-year-old graphics card can play games like CS:GO, WoW, Fortnite, Roblox, Minecraft, and Modern Warfare II? If you look at the Steam hardware survey, there are a lot of really old 1000-series GeForces on there (link), and a fair number of Intel integrated GPUs.

Just had my worst ever driver crash. Looked like this for about a minute until it came back.
[screenshot attachment]
I've been having this issue for about 2 years; usually the screens just go grey/white for a couple of seconds and come back.

You sure it's drivers and not thermal? I had a 5700 XT do weird stuff like that until I stopped trying to drive it at 4K.
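
If you want to rule out thermals before blaming the drivers, here's a minimal sketch that polls the GPU temperature while you reproduce the glitch. It assumes Linux with the amdgpu driver and a single GPU exposed as card0 (the sysfs path can differ per system, and on Windows you'd watch temps in vendor tooling instead):

Code:
# Poll the amdgpu hwmon temperature sensor once a second while reproducing
# the glitch. amdgpu reports temperature in millidegrees Celsius.
# Assumes a single GPU at /sys/class/drm/card0; adjust the card index if needed.
import glob
import time

def read_gpu_temp_c() -> float:
    paths = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/temp1_input")
    if not paths:
        raise RuntimeError("no hwmon temperature sensor found for card0")
    with open(paths[0]) as f:
        return int(f.read().strip()) / 1000.0

if __name__ == "__main__":
    while True:
        print(f"GPU temp: {read_gpu_temp_c():.1f} C")
        time.sleep(1)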
 
AMD can't write drivers for shit. That's why Nvidia owns them: they can actually write driver code.

I wish AMD was better, but they aren't. Nvidia wipes their ass.

I've got a 3090 and wouldn't trade it for an AMD.
Literal reddit-tier regurgitation.

Funny how everyone has first-hand AMD driver knowledge despite AMD selling far fewer cards.

I had black screens on an early 5700 XT, which eventually got fixed, and absolutely zero issues across three 6700 XTs and a 6800 XT. I thought the 6800 XT was giving me instability, but unlike redditors I actually isolated the issue.

It was my motherboard. The card works great in a new AM5 setup.

Retards have a screen glitch, go online, see other retards go "muh AMD drivers six years ago...", and don't isolate anything. There are cases where just replacing a cable fixed "driver issues".
 
Literal reddit-tier regurgitation.

Funny how everyone has first-hand AMD driver knowledge despite AMD selling far fewer cards.

I had black screens on an early 5700 XT, which eventually got fixed, and absolutely zero issues across three 6700 XTs and a 6800 XT. I thought the 6800 XT was giving me instability, but unlike redditors I actually isolated the issue.

It was my motherboard. The card works great in a new AM5 setup.

Retards have a screen glitch, go online, see other retards go "muh AMD drivers six years ago...", and don't isolate anything. There are cases where just replacing a cable fixed "driver issues".

Hey guy, I'm a programmer and I've written driver code. Look up "hooking the system service table" sometime to see what I mean. I reverse engineer shit all the time.

AMD is shit at driver code. I used to run AMD cards, but I don't anymore because I don't want to deal with their shit.
 