GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

All the major Australian parts suppliers had their sites crash at the exact release time - lol. Glad I'm not in the market for one of these - figure my 2080Ti is more than enough for now and I can wait for the 3080 Super (or even the 4000 series)... It'll probably be my 9700k's lack of hyperthreading that prompts an overhaul, if anything.
I wouldn't be so sure. Look how long the 2500k hung in there. Next-gen consoles are still only going to allow 6 cores for games, with the other 2 reserved for the OS, so the 9700k will likely last through this entire generation unless you're an FPS snob looking for 200+ FPS.
I honestly would be wary of buying a GPU with only 8GB of VRAM these days. Fine for an older card, but for anything you're counting on to be future-proof, I think that's low.
The 3060 is maybe 2080 level. Its performance wouldn't warrant a larger framebuffer; the GPU would bottleneck on processing before VRAM limitations came into play.

The 290X can maintain 1080p ultra with 4GB no problem right now, so the 3060 will likely be fine with 8GB.
I really hope this one is good. The cooler looks great, but the 256-bit rumors and the lack of any leaks showing greater than 2080 Ti performance are very concerning.
 

Given that over on Scan the RTX 3080 is now described as "end of life" just a week after it came out, and that mine (which I paid for, with e-mails showing the order was confirmed) completely missed its projected delivery date, Ampere has vaporware written all over it. I'll give it a few weeks before demanding my money back, though.

Unfortunately this means that all those people will be scrambling to get themselves one of these instead, assuming the performance is as good as AMD claims. With a bit of luck this won't just be a paper part.
 
Anyone have a good resource on GPU computing, maybe focusing on shit like discrete logarithms and modular arithmetic? Discrete math in general with GPU computing.
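Not a full resource, but a concrete starting point: the bread-and-butter primitive for brute-force discrete-log experiments is batched modular exponentiation, and it maps onto CUDA almost trivially because every element is independent. A minimal sketch; the kernel names and the toy modulus are my own, not from any particular library:

```cuda
// Sketch: batched modular exponentiation on the GPU.
// Computes out[i] = base[i]^exp[i] mod m for many inputs in parallel.
#include <cstdio>
#include <cstdint>
#include <cuda_runtime.h>

// Square-and-multiply. Assumes m < 2^32 so every product fits in 64 bits.
__device__ uint64_t pow_mod(uint64_t b, uint64_t e, uint64_t m) {
    uint64_t r = 1 % m;
    b %= m;
    while (e) {
        if (e & 1) r = r * b % m;
        b = b * b % m;
        e >>= 1;
    }
    return r;
}

__global__ void pow_mod_kernel(const uint64_t* base, const uint64_t* exp,
                               uint64_t m, uint64_t* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = pow_mod(base[i], exp[i], m);
}

int main() {
    const int n = 1 << 20;
    const uint64_t m = 1000000007ULL;  // arbitrary 30-bit prime modulus
    uint64_t *base, *exp, *out;        // unified memory keeps the demo short
    cudaMallocManaged(&base, n * sizeof(uint64_t));
    cudaMallocManaged(&exp,  n * sizeof(uint64_t));
    cudaMallocManaged(&out,  n * sizeof(uint64_t));
    for (int i = 0; i < n; i++) { base[i] = 5; exp[i] = (uint64_t)i; }

    pow_mod_kernel<<<(n + 255) / 256, 256>>>(base, exp, m, out, n);
    cudaDeviceSynchronize();

    printf("5^10 mod p = %llu\n", (unsigned long long)out[10]);  // 9765625
    cudaFree(base); cudaFree(exp); cudaFree(out);
    return 0;
}
```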
 
It will definitely be 192/256-bit buses on the two main cards if they ship with the rumored amounts of VRAM. Gut feeling is that their best card might be around a 3070 in performance, and even that's a bit optimistic. Price is really where AMD will have to compete.
 
Buying an AMD GPU at launch is begging for niggling irritations because their drivers always suck for months. That's why they should stick to being the value option until they can finally get this shit right. But they do get it right eventually. It's bugging me now that a 5500 XT is performing nearly as well as my 1660 Super.
 
Hmmm, I was on AMD from 2011 until 2014, when I got the almighty GTX 970, and I never had any problems with drivers. In terms of ease of use, I found their Catalyst Control Centre, as it then was, far more congenial than the Nvidia equivalent.

In your case, though, it's probably because the GTX 16 series is basically designed to suck. If they had made it any better it would have cannibalised sales of the RTX 2060, and they wanted to encourage people to pay a premium for ray tracing.

I've stuck with my GTX 1080 Ti since getting it in 2017, and will until my Ampere card falls through the letterbox (whenever that might be). It is just that good. Only the RTX 2080 Ti beats it (the RTX 2080 Super merely matches it), and there were not enough games with ray tracing to make an upgrade worth it. Basically the RTX 2000 series was an early adopter trap.
 
Are they really that bad?
Several friends of mine (keep in mind this was many years ago) had their ADATA SSDs die early controller-related deaths. When it comes to SSDs I usually recommend sticking with brands that control their own supply, e.g. Micron/Crucial, Samsung, Toshiba, WD, SanDisk, etc., or one of the other reputable brands. Sabrent has become the new up-and-comer stealing Samsung's thunder lately.
I wouldn't let it bug me; after all, the 5500 XT took nearly a year to get to where it needs to be, and the lower-end RTX 3000 series GPUs will likely be out by the end of Q1 2021.

That's the problem with AMD FineWine: by the time the GPUs have all their bugs fixed and perform the way they should, they're a whole generation behind Nvidia's current lineup.
 
Not on Linux, where AMD's drivers are great and Nvidia's are garbage. Best of both worlds; Windows users are really missing out.
 
Several of the more knowledgeable YouTubers have started saying that Big Navi is going to be competitive with Nvidia's offerings, just not able to reach the 3090. But there's still some good stuff about the Radeon cards if they're right. For one, Nvidia is likely to have poor yields because their chips are fucking huge, so AMD might have the stock advantage. More significantly, Nvidia's 3000 series seem to be monsters in terms of power draw. The 3080 has, what, about 25-30% performance gain over the 2080? It also draws around 25% more power to do so (see the quick calculation below)! Big Navi is supposed to have some major performance-per-watt gains.
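Taking those quoted figures at face value (they're rumors and rough numbers, not measurements), the generational efficiency change is just the ratio of the two:

```latex
% ~30% more performance at ~25% more power:
\[
\frac{\text{perf/W}_{3080}}{\text{perf/W}_{2080}}
  \approx \frac{1.30}{1.25} \approx 1.04
\]
% i.e. only about 4% better performance per watt, which is why
% RDNA2's claimed perf-per-watt jump would matter so much.
```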

For me, the big advantage Nvidia has, if these RDNA2 rumours are true, isn't the hardware but the ecosystem. As well as CUDA, Nvidia has some very neat gimmicks (and not-so-gimmicks) in terms of software support: background noise elimination, voice changers, cool stuff like that. For gaming it doesn't matter, but AMD need to step up their game in terms of ecosystem. Maybe the consoles will help with that.
 
Yeah, having to use those Samsung wafers hurt Ampere. It's really just a scaled-up 2080 Ti, hence the massive power draw.

Idk, I hate gaming culture and hate that AMD has to challenge the top nvidia stuff for people to consider them in general. Like when the 2080ti was top dog and people were so fixated on Navi beating it. Almost nobody owns a 2080ti anyways. Just give me the best card for my budget.

Grabbed a 5700xt for $300 and threw a block on it. Worked great except for those damn black screen issues. I'll probably give AMD another shot if they can deliver what the 3070 is supposed to be. Give me a 2080ti or better for $500 and no black screens and I'm sold.
 
Hey, gamers used to feel the same about AMD CPUs, and actually shilled Intel's shit even when what Intel offered was subpar.
 

They will definitely be competitive with some of Nvidia's current products, but I would be surprised if they could trade blows with a 3080. Ray tracing is a real dark horse in all of this: they haven't released a card with hardware support for it, but they have been working on both next-gen consoles, including working closely with Microsoft, who's behind DXR. Microsoft works with Nvidia in the same way too, but unlike with Nvidia, they have worked closely with AMD to get the GPU right for the Xbox, and AMD has done the same with Sony (on OpenGL ES, I guess) for the PS5, so they have that experience.

Their rumored 12GB card would be using 12 chips for a 384-bit interface instead of a 192-bit one, and the 20GB card could have a 512-bit interface instead of 256-bit (the chip-count arithmetic is sketched below). AMD have been pretty extreme with memory bandwidth from time to time (with HBM), but they haven't used anything above a 256-bit interface since the R9 390. The upcoming consoles, on the other hand... they've spent a couple of years creating a powerful and cost-efficient product, so there's always that.
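For reference, here's the arithmetic behind that, assuming standard GDDR6 with one 32-bit channel and 1GB per chip (the 16 Gbps data rate is my illustrative assumption, not a leak):

```latex
% Each GDDR6 chip contributes one 32-bit channel:
\[
\text{bus width} = n_{\text{chips}} \times 32\,\text{bit}
\qquad\Rightarrow\qquad
12 \times 32\,\text{bit} = 384\,\text{bit}, \quad 12 \times 1\,\text{GB} = 12\,\text{GB}
\]
% Peak bandwidth follows from bus width and per-pin data rate:
\[
\text{bandwidth} = \frac{\text{bus width}}{8} \times \text{data rate}
= \frac{384}{8} \times 16\,\text{Gbps} = 768\,\text{GB/s}
\]
```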


On the topic of CUDA: it's good, but the rumor is that Nvidia is playing dirty and sandbagging their OpenCL support to make CUDA look better. The same productivity tests on the same card (an Nvidia one, of course) will score better using CUDA, which makes people think that CUDA is simply much faster. That impression makes OpenCL scores on AMD hardware seem like they could be better "if only they were using CUDA", which is only available on Nvidia cards, so maybe it's best to buy one of those. The Radeon VII is a compute monster, though.
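If you want to sanity-check those API comparisons yourself instead of trusting vendor-tuned benchmarks, the usual move is to time an identical kernel under each API on the same card. A minimal CUDA-side harness as a sketch (the SAXPY kernel and sizes are placeholders I picked, not from any specific benchmark suite); pair it with the equivalent OpenCL version and compare ms/launch:

```cuda
// Time a trivial kernel with CUDA events; a warm-up launch is
// excluded from the measurement so JIT/clock ramp-up doesn't skew it.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(float a, const float* x, float* y, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 24;
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));
    cudaMalloc(&y, n * sizeof(float));
    cudaMemset(x, 0, n * sizeof(float));
    cudaMemset(y, 0, n * sizeof(float));

    saxpy<<<(n + 255) / 256, 256>>>(2.0f, x, y, n);  // warm-up

    cudaEvent_t t0, t1;
    cudaEventCreate(&t0);
    cudaEventCreate(&t1);
    cudaEventRecord(t0);
    for (int rep = 0; rep < 100; rep++)
        saxpy<<<(n + 255) / 256, 256>>>(2.0f, x, y, n);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, t0, t1);
    printf("saxpy: %.3f ms/launch\n", ms / 100.0f);

    cudaFree(x); cudaFree(y);
    return 0;
}
```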
 
I still only care about rasterization performance. I hope not a single percent of it was sacrificed for any amount of RT performance at the price.
 
Nvidia's claim of twice the power of the previous gen was based on ray tracing, and going forward RT will be important, so I wouldn't be surprised if AMD leaned into it even though it won't be required/essential on the PC for another two years; that's my prediction. Rasterization is without a doubt the most important thing right now, but people benchmark the stupidest shit.
Whatever they release, 1080p won't be a problem, and if the price is right and the RT performance is decent then they have some solid cards to sell. Nvidia might be forced to announce an RTX 3050 at an xx50 price point, and that's where it gets interesting.
 