Just Some Other Guy
kiwifarms.net
- Joined
- Mar 25, 2018
They have nobody to blame but themselves.

I think it is NVIDIA trying to find a way to reinvigorate GPU sales, which are at their lowest point in ages.
They have nobody to blame but themselves. I think it is NVIDIA trying to find a way to reinvigorate GPU sales, which are at their lowest point in ages.

Some of that, possibly. Consoles have also been gaining in capability, and a lot of PC-centric titles are now geared towards consoles. Combine that with the push to make midrange GPUs the price of a console and TV... and well, is it any surprise? Nvidia has been wanting to ditch the gaming-centric market for a while now. They want enterprise. The PC crowd merely gets the scraps at ever higher cost.

It's been organically sliding downward steadily since 2007. Interestingly, that's when COD4 came out. I think there's a strong case to be made that fewer and fewer people care about each successive advance in graphics fidelity.
View attachment 5061130
View attachment 5061131
Graphics don't matter for competitive multiplayer games that need to run fast and work on a huge range of crappy computers to sustain a large player base.

Given how ancient some of these games are, I think it shows that graphics just don't matter like they did in the old days. Imagine if in 2010, the #1 multiplayer game was still UT '99.
Exactly one game in the top 10 came out in the last 12 months, and the average game is 8 years old.
- CS:GO - 11 yr old
- DOTA 2 - 10 yr old
- Apex Legends - 4 yr old
- PUBG - 6 yr old
- Path of Exile - 10 yr old
- Destiny 2 - 6 yr old
- Rust - 5 yr old
- GTA V - 10 yr old
- TF2 - 16 yr old
- MW II - < 1 yr old
View attachment 5062602
The point of diminishing returns has been well surpassed and we're now on an asymptote towards "photorealism". MW II's graphics aren't so much better that an 11-year-old game looks antiquated next to it, unlike in 2010, when games like UT99 and Q3A looked hopelessly dated. The only reason to upgrade now is to keep up with the bloat of AAA games built on the same boring Skinner-box formula that was perfected in the PS3 generation.
They have nobody to blame but themselves.

I don’t think Nvidia gives nearly as much of a fuck about the consumer GPU market anymore, especially after the mining bubble popped. Nvidia’s future growth and revenue will primarily derive from AI technology and hardware. Selling a couple thousand high-end consumer GPUs is nothing compared to the vast number of machine learning/data center cards like the A100 that continue to grow in demand.
this wasn't an april fools joke?

APUs will eat into discrete graphics. The Steam Deck's is good enough for 720p, while more powerful ones can take on 1080p, still the resolution most players use.
Reminds me of the ASUS ROG Ally, which could be priced lower than expected:
ASUS ROG Ally: New AMD Zen 4 and RDNA 3 gaming handheld launching soon with Steam Deck competitive pricing
ASUS has teased that its first Windows gaming handheld could almost be upon us, only a few weeks after teasing the device. The ASUS ROG Ally is also now expected to be priced at Steam Deck levels, despite featuring an AMD Zen 4 and RDNA 3-based APU. (www.notebookcheck.net)
archived 17 Apr 2023 00:46:08 UTC (archive.ph)
this wasn't an april fools joke?

Yup. They went with the ol' "do something real on April Fools'" marketing strategy.
The reason why people say "The 4070 is actually a 4060" is because its hardware (and performance) is what an x60-class GPU used to be.

- 4070: 5888 CUDA cores @ 1.9-2.5 GHz, 12 GB GDDR6X @ 504.2 GB/s
- 4060: 3840 CUDA cores @ 2.3-2.5 GHz, 8 GB GDDR6 @ 288 GB/s

Overpriced or not, those aren't the same card.
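Those bandwidth figures fall straight out of bus width times per-pin data rate. A minimal sketch of the arithmetic (the 21 Gbps and 18 Gbps per-pin rates are assumptions to match the quoted numbers, not specs from the post):

```python
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak GDDR bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * gbps_per_pin

# 192-bit bus at an assumed 21 Gbps (GDDR6X) gives the ~504 GB/s figure above.
print(peak_bandwidth_gbs(192, 21))  # 504.0
# 288 GB/s is what a 128-bit bus at an assumed 18 Gbps works out to.
print(peak_bandwidth_gbs(128, 18))  # 288.0
```

Note that 288 GB/s lines up with a 128-bit bus, not 192-bit, which is part of why people expected the eventual 4060 to be cut down further.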
The reason why people say "The 4070 is actually a 4060" is because its hardware (and performance) is what an x60-class GPU used to be.
The x70 class used to have >=50% as many cores as the biggest GPU (56% in the case of the 3070 compared to the 3090).
The x60 class generally had less than that, around 30-40%.
The 40"70" has 36% of the cores of a 4090.
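A quick back-of-envelope check of those fractions, using the 3070/4070 core counts quoted in the thread plus Nvidia's published flagship counts (10496 for the 3090, 16384 for the 4090):

```python
# CUDA core counts: 3070/4070 figures as quoted in the thread;
# flagship counts are Nvidia's published specs.
cuda_cores = {"3070": 5888, "3090": 10496, "4070": 5888, "4090": 16384}

def flagship_fraction(card: str, flagship: str) -> float:
    """Fraction of the flagship's CUDA cores a card gets."""
    return cuda_cores[card] / cuda_cores[flagship]

print(f"3070 vs 3090: {flagship_fraction('3070', '3090'):.0%}")  # 56%
print(f"4070 vs 4090: {flagship_fraction('4070', '4090'):.0%}")  # 36%
```

By this yardstick the 4070's core budget sits squarely in the historical x60 band (30-40%), which is the whole argument above.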
The x70 class used to have a 256-bit memory bus (even my 970 has this, just like the 3070).
The x60 cards had 192-bit.
The 40"70" has 192-bit.
All of this leads to performance of course.
The x70 class used to perform better/the same as the previous gen 80ti/high-end card.
The x60 class slightly worse or the same as the x80 base model.
The 40"70" runs like a 3080 10 GB at 1080p and up to 10% worse at higher resolutions.
So people don't just imagine that this is an x60(Ti)-class card out of autism and gamer salt;
it would have been exactly that in any generation before this 40xx shitshow.
Obviously Nvidia is going to cobble together some joke as their 40"60", but that does not change that the 4070 is equipped with historically 60-class hardware and performance.
At the end of the day it does not really matter what they call them either; at $400 this would not be a (huge) problem.
But Nvidia is pricing this at a premium compared to the 3070 despite downgrading everything to the level of an x60-class card.
It would be a nice 4060 (at x60-class prices) too, which makes this extra irritating.
Because now the "real" 4060 is probably going to get x50-class hardware (like a 128-bit bus, lol) and run like the 3070,
despite the fact that the x60 class used to get you previous-generation x80 performance.
I found a better chart. This is what an overshot market looks like. Across all platforms, market interest in better graphics has been sliding.
View attachment 5065377
There are people who literally worship Nvidia because they're "sharks". Why would a consumer ever respect a company's ability to screw the market to the wall? It makes no sense unless you're a shareholder.
Same shit in the Bulldozer days, when Intel fanboys were cheering on the destruction of AMD. They'd apparently be happy if your only CPU choice was Intel.
AMD can't write drivers for shit. That's why Nvidia owns them, because they can actually write driver code.
I wish AMD was better but they aren't; Nvidia wipes their ass.
I've got a 3090 and wouldn't trade it for an AMD card.

On Windows. The Linux Nvidia drivers are pure trash, while AMD and Intel work just fine.
What about consoles? If it's a population-wide trend, we'd still be making 2011-era console games with record sales. Even the switch is better than the Wii. I would think, if it is to be similar, it's normalized for market size.
And I don't think the loss of interest in graphics is what caused the peak, it's mobile gaming. 2010-2012 is when phones got good enough to play 3D games.
Just had my worst ever driver crash. Looked like this for about a minute until it came back.
View attachment 5065631
Been having this issue for about 2 years, usually the screens will just go grey/white for a couple seconds and come back.
Literal reddit-tier regurgitation.
Funny how everyone has first-hand AMD driver knowledge despite AMD selling far fewer cards.
I had black screens on an early 5700 XT that eventually got fixed, and absolutely zero issues across three 6700 XTs and a 6800 XT. Thought the 6800 XT was giving me instability, but unlike redditors I actually isolated the issue.
It was my motherboard. The card works great in a new AM5 setup.
Retards have a screen glitch, go online, see other retards going "muh AMD drivers 6 years ago...", and never isolate anything. There are cases where just replacing a cable fixed "driver issues".