GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Also, SFF users will be pleased to note that this uses 20W less power than Ampere. That's important because the largest SFX power supply you can get is an 850W Silverstone brick.

You were the one that managed to cram in a 2080 or something into a SFF case, right?

Lol remember when the GTX 780 was $500? Both companies have been going extreme Jew on GPU pricing.

Costs are going up. The 3080 die is 628 square millimeters compared to the 780's 561, so it's somewhat larger, but the key difference is transistor count: 7 billion versus 28 billion. As far as I know, and this is really old, GPUs are designed in a way that is more cost-effective in man-hours than CPUs; that's why they've always had lower clock speeds. That has probably changed now that physical limits are getting tighter, so R&D costs have gone up significantly. The manufacturing cost of the chip itself won't be much compared to the final sales price. Then the cost of fabs for new processes has always been rising. I've seen it likened to a kind of Moore's law in cost: prices go way up, but the fabs won't necessarily produce way more chips, just chips of a similar size to older ones with way more transistors, meaning the number of chips they get per wafer stays roughly the same.
They need to recoup that cost and turn a profit so they can build the next fab and fund the development of the next process. Wafers aren't really what they charge for; the cost of having the equipment to make the chip is what Nvidia and AMD pay for. Intel is in a special position: they run their own fabs, but people have speculated on how long that's feasible for them, just look at their 10nm problems and how much it costs to not have it up and running like it should be. Here's a quote from 2010:
“The most expensive thing on the planet is a half-empty fab,” says Brian Krzanich, general manager of Intel’s manufacturing and supply chain.
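The chips-per-wafer point is easy to sanity-check with the classic first-order dies-per-wafer estimate (die sizes from the post above; a 300mm wafer is assumed and yield is ignored, so this is just a rough sketch):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order estimate: gross dies minus a correction for
    partial dies lost around the wafer edge."""
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius ** 2
    gross = wafer_area / die_area_mm2
    edge_loss = (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

print(dies_per_wafer(561))  # GTX 780 class die
print(dies_per_wafer(628))  # RTX 3080 class die
```

Both come out in the same rough ballpark (high 80s to high 90s per wafer), which is the point: a new process packs 4x the transistors into a similar-sized die, so candidate chips per wafer barely move while the wafer got far more expensive to process.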

Half-empty 10nm fab?

Looking for some sources, I saw that TSMC invested close to 10 billion USD into building Fab 15; the next big one, Fab 18, is priced at 17 billion USD, with about four years between the starts of construction. According to a list on Wikipedia, so who knows.
 
You were the one that managed to cram in a 2080 or something into a SFF case, right?

GTX 1080 Ti in a Silverstone SG13. Since then I've done a case transplant when I moved over to a B550 motherboard with a Ryzen 3700X. Still have the 1080 Ti, but it's in a Kolink Rocket.

Absolute bugger of a case to work on btw. Makes Silverstone's puzzle boxes seem logical. When my RTX 3080 arrives I'm probably going to have to either hotrod it or widebody the side panels with standoffs. I also podged a spare Noctua fan into the base with twisty ties to keep the air moving.

Might see if I can get some thick rubber feet and just drill extra airholes in the base so as to be able to podge another fan in there.
 
GTX 1080 Ti in a Silverstone SG13. Since then I've done a case transplant when I moved over to a B550 motherboard with a Ryzen 3700X. Still have the 1080 Ti, but it's in a Kolink Rocket.

Absolute bugger of a case to work on btw. Makes Silverstone's puzzle boxes seem logical. When my RTX 3080 arrives I'm probably going to have to either hotrod it or widebody the side panels with standoffs. I also podged a spare Noctua fan into the base with twisty ties to keep the air moving.

Might see if I can get some thick rubber feet and just drill extra airholes in the base so as to be able to podge another fan in there.

You can probably buy a set of those conical one inch high rubber feet at IKEA for peanuts to raise it, turn them upside down to get a larger footprint if needed, maybe roughen up the bottom of them with sandpaper if you're worried about grip.
 
Lol remember when the GTX 780 was $500? Both companies have been going extreme Jew on GPU pricing.
The GTX 780 was a much smaller die on a less complicated process that turned out to have amazing yields: 28nm. It also wasn't the top-tier GPU; the 780 Ti was, at $650. You also have the compounding issue of the rising price of new memory: GDDR6X was developed by Micron for Nvidia, and that doesn't come cheap. The last couple of years have seen dramatic increases in GPU R&D costs, and that also has to be recouped somewhere.

Also, the 8800 Ultra launched with an $830 asking price all the way back in 2007. High prices for good silicon are nothing new.


OTOH, the biggest driving force for GPU demand today is resolution and refresh rate. If you're still playing at 1080p60 you can get away with $200 GPUs running modern games at ultra. At 1440p90 my Vega 64 is still holding out, and even the 3070 will shit all over that particular card. You don't need to spend big money unless you're pushing 144+ FPS or 4K, or both.
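The resolution/refresh argument is really just pixel-throughput arithmetic; a quick sketch using the display modes mentioned above:

```python
def pixels_per_second(width, height, fps):
    """Raw pixels a GPU must shade per second at a given mode."""
    return width * height * fps

baseline = pixels_per_second(1920, 1080, 60)  # 1080p60

for name, w, h, fps in [("1080p60", 1920, 1080, 60),
                        ("1440p90", 2560, 1440, 90),
                        ("4K144",   3840, 2160, 144)]:
    ratio = pixels_per_second(w, h, fps) / baseline
    print(f"{name}: {ratio:.1f}x the pixel throughput of 1080p60")
```

1440p90 is roughly 2.7x the work of 1080p60, and 4K144 is 9.6x, which is why the budget cards stop being an option well before you get there. (Shading cost doesn't scale perfectly linearly with pixels, so treat these as rough proportions.)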
Costs are going up. The 3080 die is 628 square millimeters compared to the 780's 561, so it's somewhat larger, but the key difference is transistor count: 7 billion versus 28 billion. As far as I know, and this is really old, GPUs are designed in a way that is more cost-effective in man-hours than CPUs; that's why they've always had lower clock speeds. That has probably changed now that physical limits are getting tighter, so R&D costs have gone up significantly. The manufacturing cost of the chip itself won't be much compared to the final sales price. Then the cost of fabs for new processes has always been rising. I've seen it likened to a kind of Moore's law in cost: prices go way up, but the fabs won't necessarily produce way more chips, just chips of a similar size to older ones with way more transistors, meaning the number of chips they get per wafer stays roughly the same.
They need to recoup that cost and turn a profit so they can build the next fab and fund the development of the next process. Wafers aren't really what they charge for; the cost of having the equipment to make the chip is what Nvidia and AMD pay for. Intel is in a special position: they run their own fabs, but people have speculated on how long that's feasible for them, just look at their 10nm problems and how much it costs to not have it up and running like it should be. Here's a quote from 2010:
Memory prices are also on the rise. GDDR5X started a trend of more expensive VRAM as higher speeds were demanded, and those speeds began to affect things like signaling and board design. GDDR6X is so fast now that it has to sit practically on top of the GPU die for signal integrity. Look at the older GDDR5 cards and how far away the memory is compared to the 3080. And as these roadblocks become more apparent we start seeing things like wider memory buses in a smaller space, which drives up price.
 
Memory prices are also on the rise. GDDR5X started a trend of more expensive VRAM as higher speeds were demanded, and those speeds began to affect things like signaling and board design. GDDR6X is so fast now that it has to sit practically on top of the GPU die for signal integrity. Look at the older GDDR5 cards and how far away the memory is compared to the 3080. And as these roadblocks become more apparent we start seeing things like wider memory buses in a smaller space, which drives up price.

I didn't even think about the signaling issues.

Part of that cost loops back to the cost of the equipment and plants building the RAM chips. To simplify board design, HBM (or something similar) will probably be common at some point in the future, and manufacturing of it will pick up, because it seems a bit dire now. For Vega it was said that AMD had to vacuum up and stockpile HBM chips for close to a year (I think?) to be able to have a product to sell, and I don't know how many of those cards were made.

Maybe in the future the sign of a video card for poors is VRAM on the board.

Also the GTX 8800 ultra launched with a $830 asking price all the way back in 2007. High prices for good silicon are nothing new.

Then the 8800 GTS 512 arrived at $350 and it was cheap and glorious.
 
Lol remember when the GTX 780 was $500? Both companies have been going extreme Jew on GPU pricing.
The pricing would be fine if they'd increased the VRAM amount decently on these cards gen over gen, but people are kidding themselves if they think 10GB vs 8GB is going to make a difference in most titles.
 
I really hope we get a TOXIC just for the meme appeal. I've been on Nvidia for ages and wouldn't mind a change, especially because AMD actually has a modern-looking control panel and doesn't require a fucking account for a $500+ product.

I also think RT perf will probably be a bit higher than initial benchmarks suggest. DXR 1.1 offers inline RT, which seems to have been a central point behind RDNA2's design. That huge cache should mean the state issues of inline RT become lol-who-cares tier. If lots of games prefer inline over dynamic-shader RT, and I'd expect a lot of console ports to do just that, then we may end up in a goofy scenario where Nvidia has a substantial RT lead in some games but almost none in others.
 
I really hope we get a TOXIC just for the meme appeal. I've been on Nvidia for ages and wouldn't mind a change, especially because AMD actually has a modern-looking control panel and doesn't require a fucking account for a $500+ product.

I also think RT perf will probably be a bit higher than initial benchmarks suggest. DXR 1.1 offers inline RT, which seems to have been a central point behind RDNA2's design. That huge cache should mean the state issues of inline RT become lol-who-cares tier. If lots of games prefer inline over dynamic-shader RT, and I'd expect a lot of console ports to do just that, then we may end up in a goofy scenario where Nvidia has a substantial RT lead in some games but almost none in others.

I'm hoping we get a proper dual-slot rather than these triple-slot-with-dual-slot-bracket monstrosities that seem to be all the rage. Just admit your card is triple slot or spend more on cooler development.
 
I really hope we get a TOXIC just for the meme appeal. I've been on nvidia for ages and wouldn't mind a change, especially because AMD actually has a modern looking control panel
I just switched to the newer AMD drivers with the "modern" 2020 Adrenalin control panel, and you have to track down amdow.exe and move/delete/rename it to keep it from launching at startup and automatically tracking your game playtime without your consent. I'm pretty sure both sides have shitty GPU configuration apps.
 
then we may end up in a goofy scenario where nvidia has a substantial RT lead in some games but almost none in others.

That doesn't sound so strange, it might be Vulkan and async compute all over again.

The pricing is fine if they increased the vRam amount decently on these cards gen over gen but people are kidding themselves if 10Gb vs 8GB is going to make a difference in most titles.

10GB means ten memory chips on a 320-bit bus instead of eight chips on 256-bit, which is 25% more bandwidth at equal speeds; that's probably what they're going for.
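That 25% falls straight out of the bus-width arithmetic, since each GDDR6X chip sits on its own 32-bit channel. A quick sketch (19 Gbps per pin is assumed here as the 3080's memory speed):

```python
def bandwidth_gbs(bus_width_bits, speed_gbps_per_pin):
    """Peak memory bandwidth in GB/s: bus width (bits) times
    per-pin data rate (Gbps), divided by 8 bits per byte."""
    return bus_width_bits * speed_gbps_per_pin / 8

eight_chips = bandwidth_gbs(256, 19)   # 8 x 32-bit channels
ten_chips = bandwidth_gbs(320, 19)     # 10 x 32-bit channels

print(eight_chips, ten_chips, ten_chips / eight_chips - 1)
```

With ten chips you get 760 GB/s versus 608 GB/s for eight, a flat 25% gain, entirely from the wider bus rather than the extra capacity itself.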
 
That doesn't sound so strange, it might be Vulkan and async compute all over again.



10GB means ten memory chips on a 320-bit bus instead of eight chips on 256-bit, which is 25% more bandwidth at equal speeds; that's probably what they're going for.

If I'm honest with you, I'm slightly leery about how the RTX 3080 has only 10GB while the GTX 1080 Ti I currently run has 11GB. I know fast memory beats big memory, but if you start running out, that's gonna hurt performance badly.
 
Rumor has it the 6700 XT will have 12GB of VRAM. That might be around the $450 mark as well.
I could see it. It would definitely give them longer legs. Not everybody (in fact very few) upgrades every single year. Having more than enough RAM now is a good way to get long-term performance (and value retention).
 
Fast RAM don't mean shit when you run out. Just ask the Fury X.

10GB on a $700 (lol, no) GPU seems very anemic nowadays. Especially with AMD giving 16GB all the way down to the 6800.

The more I think about it, the more I'm thinking I should cancel my RTX 3080 anyhow. But I really don't want to be stuck in a similar situation with the Radeon 6800 XT.

Speaking of which, I rang Scan today to find out firstly how much stock they're expecting for Big Navi and secondly what measures are being put in place to stop bots and scalpers. They answered:

1. They have no idea yet but might have some idea if I ring back in a couple weeks.

2. They are instituting a captcha system and also a limit of one card per customer enforced by credit card billing address.
 