Bad news, everyone!
"AMD Radeon RX 6800 may be 50% faster than GeForce RTX 3090 for cryptomining"
Not necessarily. If that's so, there might be people cancelling their Ampere orders if they're still in the queue.
I hate crypto miners. I couldn't upgrade from my 970 to a 1070 during the height of the mining craze because of the inflated street prices; no way in hell was I going to pay $100 or more over MSRP because of faggot blockchain shekels. Please, no retarded cryptofags.
no one on kiwifarms needs a 5950x.

The 6800 XT looks amazing: 3080, occasionally 3090, performance for $650 with 16 GB of VRAM; overclocks scale well, and it shows even more promise with third-party boards using higher-spec/faster VRAM and higher power limits. 18 Gbps GDDR6 exists and would serve an 8+8+6-pin 6800 XT/6900 XT quite well.
The fact this card costs $650 while a dual-die 5950X with a smaller combined area is $750 is a bit off-putting, though.
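For the memory claim, quick napkin math: peak GDDR6 bandwidth is just bus width times per-pin data rate, and the reference 6800 XT pairs a 256-bit bus with 16 Gbps chips. A minimal sketch (the 18 Gbps case is the hypothetical faster-VRAM board speculated about above):

```python
# Peak GDDR6 bandwidth: (bus width in bits / 8) * per-pin data rate in Gbps.
# 256-bit and 16 Gbps are the 6800 XT's reference specs; 18 Gbps is the
# faster GDDR6 grade the post speculates a third-party board could use.

def gddr6_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

print(gddr6_bandwidth_gbs(256, 16))  # 512.0 GB/s: stock 6800 XT
print(gddr6_bandwidth_gbs(256, 18))  # 576.0 GB/s: ~12.5% more on the same bus
```

On the die-area point: Navi 21 is roughly 520 mm² of 7 nm silicon, while a 5950X combines two roughly 81 mm² 7 nm CCDs with a roughly 125 mm² 12 nm I/O die, about 286 mm² combined, so a fair chunk of the CPU's area is on a much cheaper process and the comparison isn't quite apples-to-apples.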
Binning: how many dies can run all 16 cores at full speed? Those that can't are hived off to become the 5900X, or recycled into the single-chiplet 8-core 5800X. This is why the top-end Intel CPUs, which still use monolithic dies rather than chiplets, are even more expensive than the top-end Ryzens; maybe literally one die in a thousand can be a top-end i9. At least with chiplets you can recycle the dies that don't quite make it far more effectively.
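To put toy numbers on the binning argument: if each core comes out good independently with probability p, the chance that all n cores on one die are good is p^n. A minimal sketch (the 97% per-core yield is a made-up illustrative figure, not anything from AMD or Intel):

```python
# Toy yield model: every core on a die must be "good" for the top bin,
# assuming each core is independently good with probability p_core.
# Real binning also involves clock and voltage grades; illustrative only.

def all_cores_good(p_core: float, n_cores: int) -> float:
    """Probability that all n_cores on a single die are functional."""
    return p_core ** n_cores

p = 0.97  # hypothetical per-core yield, chosen for illustration

mono_16 = all_cores_good(p, 16)   # monolithic: 16 good cores on ONE die
chiplet_8 = all_cores_good(p, 8)  # chiplet: only 8 cores have to be good

print(f"monolithic 16-core die fully good: {mono_16:.1%}")   # ~61%
print(f"8-core chiplet fully good:         {chiplet_8:.1%}")  # ~78%
```

Pair any two fully good chiplets and you have a 5950X; a chiplet with a dead core can still ship with six cores enabled in a 5900X, while a monolithic 16-core die with one dead core is a far bigger write-off. That gap only widens as per-core yield drops.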
Also, the 6800 XT's performance seems equal to the RTX 3080 in standard rasterisation but behind it in ray tracing. Now, ray tracing is still a meme, frankly, but soon enough it won't be. The real problem is that AMD's FidelityFX Super Resolution isn't done yet: the hardware is there on the chips, but the software isn't. Once it is, it may be an answer to DLSS.
At least it's less of a paper launch than Ampere was. People have actually managed to buy 6800 XT reference cards, and the third-party models appearing next week might bring even more stock. Compare and contrast Nvidia, whose entire stock of loss-leader RTX 3080 FEs went to compliant reviewers and then to bots and scalpers, while the third-party cards were all over £700 and still like gold dust.
AMD could still win here. If they can get enough 6800 XT stock that cards are generally on sale by Christmas, rather than stuck in pre-order limbo, they'll beat Nvidia by virtue of actually having cards to sell. Nvidia can bounce and squeak about DLSS and the RTX 3080 Ti all they want, but the fact is they rushed out vaporware. And there's still the 6900 XT, tipped to be almost as good as the RTX 3090 but a massive £500 cheaper. If any of those are available at launch, there's no point in buying an RTX 3090 unless you want the bragging rights or need 24 GB of video memory for hardcore number crunching; and if you're in the market for hardcore number crunching, you're better off with a Tesla or Quadro, since those come with the certified professional-grade drivers.
This has some hidden upsides. I'm running a Skylake-X processor that still registers as "6th generation" and therefore has full Windows 7/8 compatibility; it's been incremental Skylake tocks for the last half decade at this point.
RT will be a meme for some time. The current-gen consoles have far worse RT performance than the 6800, let alone the XT, and that won't get fixed for another six-plus years. By the time RT is a major component of games (or, IMO, produces anything noticeable in terms of image quality), the 6000 series will be ancient history.
Bitch, I need 2 fully enabled CCDs with 64 MB of L3 cache to load Kiwifarms while literally murdering trannies with my whiteness.
Why not get a 3950X?
You can still install and run Windows 7/8.1 with newer processors/chipsets; it's just far more of a pain in the ass than it really needs to be.
Yeah, pretty much. I really hate the marketing push for raytracing.
BUT MINECRAFT IN REAL!!!!
The 5950X is noticeably more capable and more efficient. Choosing less efficient hardware for terminating literally shaking snowflakes is unacceptable.
While I'm glad AMD is sticking it in both Intel and Nvidia, I'm still completely ass blasted about pricing.
But, given the stock situation, I'm kinda glad I got a 2060 last year. Since I'm still on a 1080p display and have well over a dozen games on my Steam backlog, I should be good for another year or two.
If I can do my troon termination duty on an overclocked 2600, then surely you can make do with a 3950X.
Same. I'm back on my old 1200p pro art display; I really just don't like 16:9 or screens over 24" for PCs. There is no game out there today my Vega 64 can't handle at 60 FPS at this resolution, so I've cooled my heels on upgrading anything for now.
Zen 3 has a higher IPC (Insinuations Per Cuicide).
I know, but the 3950X is still a capable CPU.
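To put a rough number on what that IPC uplift buys: single-thread performance scales roughly as IPC times clock. Taking AMD's claimed ~19% average Zen 2 to Zen 3 IPC uplift at face value and using spec-sheet boost clocks (real workloads will vary):

```python
# Rough single-thread comparison: performance ~ IPC * clock.
# The 19% IPC uplift is AMD's own Zen 2 -> Zen 3 marketing figure,
# and the clocks are spec-sheet max boost, so treat this as ballpark.

zen2_ipc = 1.00    # Zen 2 baseline (normalised)
zen3_ipc = 1.19    # AMD's claimed average uplift
boost_3950x = 4.7  # GHz, 3950X rated max boost
boost_5950x = 4.9  # GHz, 5950X rated max boost

ratio = (zen3_ipc * boost_5950x) / (zen2_ipc * boost_3950x)
print(f"5950X vs 3950X single-thread, roughly: +{ratio - 1:.0%}")  # ~+24%
```

So "still capable" is fair, but the gap is real: on paper, around a quarter more single-thread throughput.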