GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Okay, time for an update on my quest for an upgrade.

I failed to buy an RX 6800 XT. I was feverishly refreshing my phone on Scan and OcUK (the latter of which was borked due to people trying to connect) and did manage to get one in my basket. But in the time between me adding to basket and tapping the checkout icon, it was taken out because it was sold out.

Their entire stock was gone by 2.05 pm for a 2.00 pm launch. Incidentally, the RX 6800 XT actually is as good as it's made out to be. Big Navi lived up to expectations. Pity its stock didn't.

In other news, my queue position for my RTX 3080 has improved to 188 from 300 back at the start of October.
 
please, no crypto miners.
I hate crypto miners. I couldn't upgrade to a 1070 from my 970 during the height of the mining craze because of the inflated prices. No way in hell was I going to pay $100 or more over MSRP because of blockchain speculation.

I eventually settled for a 1660 Ti that does well enough, after upgrading from my old i5 4690K to an R5 2600. Went from ~40 FPS in most titles at 1440p to 60 FPS. Felt good.
I only upgrade once every 5 years or so, but I feel for those of you stuck on aging hardware. It's no fun watching your computer age in real time as new games release while the new shiny is out of reach thanks to market fuckery, scalpers, and cryptokiddies.
 
The 6800 XT looks amazing: 3080, occasionally 3090, performance for $650 with 16 GB of VRAM, overclocks that scale well, and even more promise from third-party boards using higher-spec/faster VRAM and higher power limits. 18 Gbps GDDR6 exists, and would serve an 8+8+6-pin 6800 XT/6900 XT quite well.
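For anyone wondering what those 18 Gbps modules would actually buy: peak bandwidth is just per-pin rate times bus width. A quick back-of-the-envelope sketch, assuming the 6800 XT's 256-bit bus (that part is from AMD's spec sheet; the rest is plain arithmetic):

```python
# Peak GDDR6 bandwidth: per-pin data rate (Gbps) times bus width (bits),
# divided by 8 to get bytes. Assumes the 6800 XT's published 256-bit bus.

def gddr6_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

stock = gddr6_bandwidth_gbs(16, 256)   # stock 16 Gbps -> 512 GB/s
faster = gddr6_bandwidth_gbs(18, 256)  # 18 Gbps modules -> 576 GB/s
print(f"16 Gbps: {stock:.0f} GB/s, 18 Gbps: {faster:.0f} GB/s, "
      f"+{100 * (faster / stock - 1):.1f}%")
```

A 12.5% bandwidth bump won't translate one-to-one into frames, but on a card leaning on a 256-bit bus plus cache, it's not nothing.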

The fact this card costs $650 while a dual-die 5950X with a smaller combined die area is $750 is a bit off-putting, though.
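To put rough numbers on that (die sizes here are as reported in press die-shot analyses, so treat them as approximate, and the 5950X's I/O die sits on a cheaper 12nm process):

```python
# Rough 7nm-silicon-per-dollar comparison, using approximate die sizes from
# press coverage. Ignores the 5950X's ~125 mm^2 I/O die, which is on an
# older, cheaper 12nm process.

navi21_mm2 = 520.0        # Navi 21 (6800 XT / 6900 XT), TSMC 7nm, reported
zen3_ccds_mm2 = 2 * 80.7  # two Zen 3 chiplets in a 5950X, TSMC 7nm, reported

print(f"6800 XT: {navi21_mm2 / 650:.2f} mm^2 of 7nm per dollar")     # ~0.80
print(f"5950X:   {zen3_ccds_mm2 / 750:.2f} mm^2 of 7nm per dollar")  # ~0.22
```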
 
The fact this card costs $650 while a dual-die 5950X with a smaller combined die area is $750 is a bit off-putting, though.
no one on kiwifarms needs a 5950x.
 
The fact this card costs $650 while a dual-die 5950X with a smaller combined die area is $750 is a bit off-putting, though.

Binning: how many dies can run all 16 cores at full speed? Those that can't are hived off to become the 5900X, or recycled into the single-chiplet 8-core 5800X. This is why the top-end Intel CPUs, which still use monolithic dies rather than chiplets, are even more expensive than the top-end Ryzens: perhaps one die in a thousand can be a top-end i9. With chiplets, you can recycle the dies that don't quite make it far more effectively.
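To put a toy number on the binning argument (the per-core defect rate below is invented purely for illustration; real fab yields are trade secrets): nearly every small chiplet is sellable as something, while a big monolithic die has to win the lottery on every core at once.

```python
from math import comb

CORE_DEFECT_P = 0.05  # hypothetical 5% chance any given core is bad

def p_at_least_good(cores_total: int, cores_needed: int) -> float:
    """Probability that at least `cores_needed` of `cores_total` cores work,
    treating each core as an independent pass/fail."""
    p_good = 1 - CORE_DEFECT_P
    return sum(comb(cores_total, k) * p_good**k * CORE_DEFECT_P**(cores_total - k)
               for k in range(cores_needed, cores_total + 1))

print(f"8-core chiplet, all 8 good (5950X material):   {p_at_least_good(8, 8):.1%}")    # ~66%
print(f"8-core chiplet, 6+ good (5900X/5600X salvage): {p_at_least_good(8, 6):.1%}")    # ~99%
print(f"16-core monolithic die, all 16 good:           {p_at_least_good(16, 16):.1%}")  # ~44%
```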

Also, 6800XT performance seems to be equal to the RTX 3080 in standard rasterisation but behind it in ray tracing. Now, ray tracing is still a meme, frankly, but soon enough it won't be. The real problem is that AMD's FidelityFX Super Resolution isn't done yet. The hardware is there on the chips but the software isn't. Once it is done, it may be an answer to DLSS.

At least it's less of a paper launch than Ampere was. People have actually managed to buy 6800XT reference cards, and the third-party variants that appear next week might have even more stock. Compare and contrast Nvidia, whose entire stock of RTX 3080 FEs went to compliant reviewers and then to bots and scalpers as a loss leader, while the third-party cards were all over £700 and still like gold dust.

AMD could still win here. If they can get enough 6800XTs on shelves that cards are generally for sale by Christmas, rather than stuck in pre-order limbo, they will beat Nvidia by virtue of actually having cards to sell. Nvidia can bounce and squeak about DLSS and the RTX 3080 Ti all they want, but the fact is they rushed out vaporware. Also, there's still the 6900XT. This is tipped to be almost as good as the RTX 3090 but a massive £500 cheaper. If any of these are available at launch then there is absolutely no point in buying an RTX 3090, ever, unless you want the bragging rights or need 24 GB of video memory for hardcore number crunching. And if you are in the market for hardcore number crunching you're better off getting a Tesla or Quadro, because those have the certified professional-grade drivers.
 
It was odd how Nvidia pushed out a 3090 at the same time as the rest of the RTX series, but now that I'm looking over benchmarks for the new AMD GPUs, the RTX 3090 really does skew perception at a glance. It is a ridiculous card that almost no one should, will, or can buy, but it is a gaming card, so it will be right there in the benchmarks showing that Nvidia is still the fastest.

They did a similar thing years ago with the GeForce FX 5800 Ultra - a card that mainly existed in the hands of a few reviewers for benchmarking purposes.
 
This is why the top-end Intel CPUs, which still use monolithic dies rather than chiplets, are even more expensive than the top-end Ryzens.

Intel's inability to push past 14nm on their consumer desktop processors is a real hurdle for them. Like you said, their monolithic-die approach has become an Achilles' heel, both in R&D and in large-scale CPU sales.
For the last 5 years Intel has been struggling to find a new material or lithography that would facilitate something below 7nm. Intel has even looked at specialized metalloid alloys and outright phosphates to break past this barrier. While TSMC claims a 5nm node, the common wisdom is that, name for name, Intel's transistor density tends to be higher than that of its rivals.
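The commonly cited peak density figures back that up, name for name (these are approximate, marketing/WikiChip-derived peaks; shipping products land well below them):

```python
# Commonly cited peak transistor densities in millions of transistors per
# mm^2 (approximate, marketing-derived figures). Real chips come in well
# under these peaks, but node-for-node they show why the names mislead.
density_mtr_per_mm2 = {
    "Intel 14nm": 37.5,
    "TSMC 7nm (N7)": 91.2,
    "Intel 10nm": 100.8,
    "TSMC 5nm (N5)": 171.3,
}
for node, density in sorted(density_mtr_per_mm2.items(), key=lambda kv: kv[1]):
    print(f"{node:>14}: {density:6.1f} MTr/mm^2")
```

On paper Intel's 10nm is denser than TSMC's 7nm; the problem was never the design target, it was yield.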

What AMD has that Intel doesn't is a proven 7nm node, courtesy of TSMC, that reduces power draw and heat output. A Zen 3 processor will often need fewer watts to perform the same task at a comparable or superior speed to a Skylake+ processor. And I refuse to call any "new" 14nm Intel design anything else; it's been incremental Skylake tocks for the last half decade at this point. Skylake+ is archaic in comparison, and trying to squeeze yet more performance out of it has resulted in a reversion to a soldered IHS, 100+ W TDPs on the now-midrange i7s, the now-infamous air conditioner overclock that Intel showed off at a tech conference, and actual OEM Peltier AIO liquid coolers for their 10th gen processors. There is nowhere left for 14nm to go.

Intel has managed to make their 10nm Ice Lake node functional for consumer-grade laptops, but it has yet to be ramped up for desktop chips, or even to provide replacement capacity for their 14nm products. Last December they even brought back Haswell to help relieve the strain on their 14nm production chain. If they need to revive the same 22nm architecture that brought us my scrappy 4690K to save its geriatric younger sibling, then things aren't going well over at Team Blue.
 
Also, 6800XT performance seems to be equal to the RTX 3080 in standard rasterisation but behind it in ray tracing. Now, ray tracing is still a meme, frankly, but soon enough it won't be.
RT will be a meme for some time. The current-gen consoles have far worse RT performance than the 6800, let alone the XT, and that won't get fixed for another 6+ years. By the time RT is a major component of games (or, IMO, produces anything noticeable in terms of image quality) the 6000 series will be ancient history.
no one on kiwifarms needs a 5950x.
Bitch, I need two fully enabled CCDs with 64 MB of L3 cache just to load Kiwifarms.
 
This has some hidden upsides. I'm running a Skylake-X processor that still registers as "6th generation" and therefore has full Windows 7/8 compatibility.
You can still install and run Windows 7/8.1 with newer processors/chipsets; it's just far more of a pain in the ass than it really needs to be.

(Attached: Ryzen 9 3950X - Windows 7.jpg)
 
While I'm glad AMD is sticking it to both Intel and Nvidia, I'm still completely ass blasted about the pricing.

But, given the stock situation, I'm kinda glad I got a 2060 last year. Since I'm still on a 1080p display and have well over a dozen games on my Steam backlog, I should be good for another year or two.
RT will be a meme for some time.
Yeah, pretty much. I really hate the marketing push for raytracing.
 
Why not get a 3950X?
The 5950X is noticeably more capable and more efficient. Choosing less efficient hardware for owning literally shaking snowflakes is unacceptable.
But, given the stock situation, I'm kinda glad I got a 2060 last year. Since I'm still on a 1080p display and have well over a dozen games on my Steam backlog, I should be good for another year or two.

Yeah, pretty much. I really hate the marketing push for raytracing.
Same. I'm back on my old 1200p ProArt display; I really just don't like 16:9, or screens over 24", for PCs. There is no game out there today my Vega 64 can't handle at 60 FPS at this resolution, so I've cooled my heels on upgrading anything for now.
 
The 5950X is noticeably more capable and more efficient. Choosing less efficient hardware for owning literally shaking snowflakes is unacceptable.

There is no game out there today my Vega 64 can't handle at 60 FPS at this resolution, so I've cooled my heels on upgrading anything for now.
If I can do my posting duty on an overclocked 2600 then surely you can make do with a 3950X.

Jokes aside, I feel you on the Vega 64. It isn't the best card out there, but it's more than adequate for your chosen resolution. Like I said earlier, I game at 1440p; why would I want a card that drives resolutions and framerates higher than my monitor allows? Why buy into the next PhysX-style gimmick from Nvidia? I only bought a 1660 Ti because there were no other non-RTX options. Admittedly I could've waited for the RX 5700 XT or the 5600 XT, but they were announcement vaporware at that point and my GTX 970 wasn't getting any younger.

Pretty graphics are nice but if the game is shit overall then it's just makeup on a pig.
 
Why not get a 3950X?
Zen 3 has a higher IPC (instructions per clock).

I was completely wrong about AMD's new GPUs; they look great and competitive. My prediction was competitive with the 3070 at best, and that didn't go well... I also thought they would be competitive in RT (not so), similar to how they kicked Nvidia in the knackers with unified shaders after working on the 360, or with async compute and Vulkan. Nvidia's 3000 series was such a massive leap in rasterization that it seemed hard to beat, and Nvidia then muddying the charts with the 3090 seems to me like they knew something. Let's see how Nvidia pushes fancier RT on developers now to make up for it. And to be fair, the titles available to benchmark were built using Nvidia cards as a reference, so maybe it will balance out a bit with new titles.
 
I was considering picking up a 6800 when they're in stock, but I feel they've botched its pricing, especially here in Australia. Why would I get the 6800 when I can cough up the extra hundge for an XT and get way more performance? I mean, you're already paying an exorbitant amount for a graphics card anyway.

(Attachments: 6800.PNG, 6800xt.PNG)
 