Nvidia RTX series

Two words: diminishing returns.

I have a 1080 Ti and a Ryzen 1600. I can play just about everything in 4K at 60fps, subject to optimisation (Kingdom Come lagged at times when it first came out). Even Cyberpunk 2077 was running in 4K on a 1080 Ti at its gameplay demo, and CP77 is one of the few new games I'd definitely want under any circumstances, so I suspect this "ray tracing" is a bit of a gimmick for now.

GPUs have been climbing steeply in price over the past few generations, mainly because the top end is effectively a monopoly. The GTX 600 series was fairly reasonably priced because it had very stiff competition in the form of the excellent Radeon HD 7970. The GTX 700 series was more expensive, and while it just about outdid the Radeon R9 290 and 290X ("Hawaii"), especially once the latter was paired with a good cooler, it didn't justify the difference in price. Then the GTX 900 series came out and trashed AMD's offerings hard enough that AMD doesn't appear to have recovered at the top end. The GTX 970 was faster, cheaper, cooler, and physically smaller than its equivalent R9 290, and traded blows with the 290X.

So what does Nvidia do knowing that the only competition they'll have will be a development of a card that was good in theory but in practice was underwhelming? Puts prices up, of course.

GTX 780 Ti - cost £350 at launch.
GTX 980 Ti - cost £500 at launch.
GTX 1080 Ti - cost £700 at launch.

Sorry, but ray tracing isn't important enough to me to justify spending £750 on an RTX 2080 or £1,200 (!) on an RTX 2080 Ti.
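Just to put numbers on those jumps, here's a quick sketch using the launch prices above (plus the 2080 Ti's £1,200):

```python
# Launch prices (GBP) for Nvidia's top-end cards, as quoted above.
prices = [("GTX 780 Ti", 350), ("GTX 980 Ti", 500),
          ("GTX 1080 Ti", 700), ("RTX 2080 Ti", 1200)]

# Percentage rise from each generation to the next.
for (prev_name, prev), (name, curr) in zip(prices, prices[1:]):
    rise = (curr - prev) / prev * 100
    print(f"{prev_name} -> {name}: +{rise:.0f}%")
```

That works out to +43%, +40%, then +71% generation on generation, so the 20 series isn't just continuing the trend, it's accelerating it.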
 
Definitely can't justify the extra $$$$ on the 20 series; I could if VR weren't still more or less in its first generation. As someone who runs mostly at 1080p and occasionally 4K, I'm looking forward to picking up a 1080 Ti when the price drops.
 
I've been planning to buy a 1080 once the new GPUs come out (that's how I nabbed my current R9 Fury for $250 a couple years ago), but if everyone else has the same idea I don't think it'll do much good. I've been seeing more hype for the cheap 1080s than the new 2080s, which means that in the end they won't actually get any cheaper.
 

I beg to differ, considering the 290 cards aged far more gracefully than the 780/780 Ti. I know Hardware Unboxed did a video comparing the two cards recently. I also believe the 390 will age far better than the 970, especially with that 3.5 GB limit (yes, I know it has 4 GB).

But I think that's the biggest issue with AMD: they can't win, especially with consumers. Remember the GTX 480/580? They were hot and loud too, yet they sold more than any of AMD's offerings at the time.

Honestly, if AMD does drop out of GPU manufacturing, I wouldn't be surprised. Nobody buys their cards when they're the performance king, so why should they keep making them? And I own a Vega 64.
 
I just want to replace my fucking trash 970 with something new but all these fucking cards have price points set by a mad jew
 
Meh, technology sounds really impressive but there’s no killer games that I must play that require such hardware.

I seem to only upgrade hardware once per console cycle so I’ll wait.
 
Yeah, I wouldn't be concerned with getting a ray-tracing card soon. AAA games will invariably have a fallback graphics option for older cards, DirectX 12 is hard as balls to learn for your average indie dev, and Unity is made out of technology and libraries from the year 1812, so you don't have to worry about Slender clones getting technologically impressive any time soon.
 
Yeah, idk. I see comments all the time saying "if only AMD had something competitive", but I'm skeptical that people would have bought AMD even if they'd offered 1080 Ti performance cheaper and cooler, because that's exactly what's happened in the past. As for me, I wanted a 580, but miners nuked that plan, so I went with a used 980 Ti instead; both companies lose a sale.
 
AdoredTV puts Nvidia in pretty good perspective, and I agree with him. Nvidia is still coasting on the "we have no meaningful competition" situation from before the Fury launched, so they ratchet up the price and skimp on performance increases despite the premium price point. I pretty much refuse to buy their products because the price doesn't justify the performance increase. If each new GPU generation were leaps and bounds ahead of its predecessor, like we had at the turn of the century, I could probably justify the cost. But not now.
 
It's funny you should mention that. AMD can't win because they are AMD, and constantly fuck themselves in the ass.

The GTX 400 line was a complete embarrassment for nvidia, and a huge win for AMD. They gained something like 12% marketshare in a single year, UNHEARD OF at the time. It was one of the best years in ATi's history, and they had nearly a 50/50 market split.

AMD's response to this? Fucking themselves with a dildo. The HD 6000s were just rebranded 5000s, with only the 6900 cards having a slightly updated arch (the 300 line was not the first time AMD pulled this stunt). When Nvidia actually somewhat fixed Fermi with the 500s, AMD was caught completely off guard. Nvidia cards were offered with more VRAM, and the 570 was tied with the 6970 in games. The 580 was untouchable performance-wise, which looked bad for AMD. Furthermore, AMD's drivers were absolute dogshit at this time. Crossfire was completely broken, and in this era dual GPUs were popular. SLI was far more stable and just worked, so most dual-GPU builds went Nvidia. Even on single GPU, AMD was taking months to get game fixes out; some bugs would go for more than a year before getting fixed. Older games were constantly broken. This was absolutely a time when you kept multiple Catalyst drivers around depending on the game you wanted to play. Not a problem on Nvidia, who were fixing games the day of launch and providing SLI profiles two days after launch.

The fact that AMD sold as well as they did with the 5000s is a goddamn miracle.

The 7000s were genuinely better cards than the 600 series, and their ageing has proven that. The 290X stomped the 780 at launch, and the 780 Ti was neck and neck. The problem, again, was drivers. It took years for AMD to fix a memory bug that cost the 7000 cards something like 10% of their max performance. The boost was great, but it came long after the 7970 came out; that kind of thing would have been really valuable at launch, when first impressions are made. Then the 200 series came out, rebranding the 7000 line except for the high end. This was troubling, but Nvidia was also doing it, so whatever. The 200 series, specifically the 290/X, sold quite well, especially to miners.

Then Nvidia's 900 series came out, overclocking like champions and running crisp and cool. AMD's response? Rebrand the 200 series into the 300 series! And hey, let's make it nine months late to boot! *yawn*

The worst part was that AMD pulled an Evergreen again with the 300s. They had GCN 1.2 present in the R9 285: about 5% faster per clock, about 5% less power, and needing less memory bandwidth. Everybody thought AMD would do the smart thing and port all the 200 cards to GCN 1.2, but nope, AMD only made two GCN 1.2 cards: the 380 (just a 285) and the 380X, which was the full chip the 285 was cut from (and which took over half a year after the other 300 series cards to actually release). The lower-end cards were SOL, the big Hawaii cards that could have used the efficiency boost were SOL, etc.

And remember, while this was going on, AMD was pouring resources into mantle and vulkan instead of focusing on making good GPUs, and their CPU division was huffing poison gas.

And none of this is even getting into Nvidia working closely with developers, helping to optimize games and fix bugs, while AMD couldn't be bothered to respond to emails from their "featured" developers. I believe it was the DiRT Showdown devs or the Saints Row devs who stated that AMD's support was beyond nonexistent.

AMD's biggest foe is AMD. Their support and release track records are cum-encrusted horseshit. That changed somewhat with the RX 400 line, where AMD finally got their driver game together, but AMD went and fucked themselves AGAIN by not releasing anything above 1060 level, when they HAD THE ARCH TO DO IT. Instead, they spent two more years half-assing Vega while letting Nvidia have full control over the high end, and then it took them another six months to get Vega's drivers figured out, on top of massive supply shortages. The 3dfx strategy of releasing two different arches let Nvidia milk the consumer market for two years with old GPUs, going into a third year, providing team green with tons of juicy cash and name recognition.

Raja was in charge of RTG the whole time this was happening, and thank Christ he is gone. RTG seems to be keeping mum on their next cards; it looks like they are trying to pull a Ryzen: wait a year or so and come out swinging with a new arch. AMD will never let go of GPUs; APUs are a huge market for them between consoles and PCs, and they need RTG for that. AMD would be successful if they would just consistently put out good GPUs for a few generations. I've been mostly happy with my 480 (despite some occasional driver weirdness), just picked up a Vega 64 cheap, and would have gone with the 7970 instead of 680s if AMD hadn't fucked their driver releases to oblivion with dual GPU at the time. If they focus on making a decent arch with Navi and pour money into driver support, they could very much steal a good portion of sales from Nvidia, especially with the RTX 2000s looking like half-baked Pascal successors.
 
https://wccftech.com/amd-confirms-new-7nm-radeon-graphics-cards-launching-in-2018/

They're going for it. BTW, what do you think of Ryzen?
 
There is a LOT of old 10-series stock at AIB facilities, so at some point you will see price drops happen. IMO, this is part of the reason for the hyper-inflated pricing: to push people into buying the old stock.

I honestly don't see a reason for anyone to go with RTX cards, especially in my use case, a small form-factor PC, where power and heat management are paramount. Just look at how many triple-slot coolers are going to be on the market.
 
Speaking as a Ryzen 1600 user, I like it. Made Intel sit up and beg for once.
https://wccftech.com/amd-1st-gen-ryzen-threadripper-cpus-price-cut-300-8-core-400-12-core/

That specific processor is down to $200, since the second-gen chips are out and Intel's 9th gen is around the corner.

Prices are already coming down, since the 2080 and 2070 are around the corner.

https://www.tomshardware.com/news/best-nvidia-gpu-deals,37668.html
 

This. There will be blower-cooled ones for SFF users like you and me, but I'm sceptical that a blower will be up to it, given that even the reference cards are dual-fan.
 
Based on the power connectors, I think the only one that may be decent is the 2070 (single 8-pin); the rest are just far too power-hungry (6-pin + 8-pin for the 2080, and 2x 8-pin for the 2080 Ti) for blowers to be effective.
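For anyone wondering where those connector loadouts put the power ceilings, here's a rough sketch using the standard PCIe ratings (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin). These are the in-spec maximums a card with those connectors can draw, not the actual TDPs, which will be lower:

```python
# Nominal PCIe power limits in watts: the x16 slot plus each PEG connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_budget(six_pins=0, eight_pins=0):
    """Maximum in-spec board power for a card with the given connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_power_budget(eight_pins=1))              # 2070 (1x 8-pin): 225 W
print(board_power_budget(six_pins=1, eight_pins=1))  # 2080 (6+8-pin): 300 W
print(board_power_budget(eight_pins=2))              # 2080 Ti (2x 8-pin): 375 W
```

A blower struggles past roughly 180 W sustained, so only the 2070's 225 W ceiling is anywhere near blower territory.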
 