GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The answer is that anything they do on their 7nm process can be done better on TSMC's 2nm process. Chip companies steal ideas from each other all the time. You are only ever one generation away from being smoked by the competition if you rest on your laurels, and if you're already two generations behind in your fab, you're not going anywhere fast.
People really underestimate how difficult fabrication is. I saw a take on twitter from a popular 'investment guy' who said something along the lines of, "If China wanted to, they could just import an Nvidia GPU, copy its design, and make it cheaper." And people eat this shit up and bet their futures on it.
 
People really underestimate how difficult fabrication is. I saw a take on twitter from a popular 'investment guy' who said something along the lines of, "If China wanted to, they could just import an Nvidia GPU, copy its design, and make it cheaper." And people eat this shit up and bet their futures on it.
I feel like advanced lithography is the closest thing on earth to alchemy/magic.
 
It's cutting-edge nanotechnology that is forced to get better by massive demand. And we may only be scratching the surface (literally, hurr hurr) because we haven't seen what monolithic 3D can do.
And there's honestly hundreds of millions, if not billions, being put into it each and every year.

Not to mention, these are fucking big machines. This is ASML's TWINSCAN EXE:5000, a High-NA EUV lithography machine; ASML can build about 8 to 12 of them a year.

[Photo: ASML's TWINSCAN EXE:5000]


That's 165 tons and $380 million per machine. It processes 185 wafers per hour and ships in 250 crates.
It takes a team of engineers roughly six months to install.

Intel has one, but it's still being installed. TSMC's holding off. Also note, that's just one of the machines needed to make CPUs.

Think of it as the machine that cuts pizza dough into circles. Quite a few steps between the actual dough and a pizza.
 
It's also no longer relevant. Nvidia's "open" drivers work just fine; AMD's advantage has been reduced from "works at all" to "works out of the box", since Nvidia's driver has yet to make it into the kernel proper and still requires an install from your package manager, which is generally a single command anyway.

I like AMD, but I no longer recommend their GPUs even for Linux users. Nvidia's driver works just as well, and nvidia-smi is a smoother way to handle power limits and resource management than AMD's method of directly writing values (which looks like "echo 50000000 > /sys/class/drm/card0/device/hwmon/hwmon0/power1_cap", vs. Nvidia's "nvidia-smi -pl 50" for setting the power limit to 50W). AMD can run AI with things like rocm-pytorch, but Nvidia just runs CUDA directly. Nvidia's raster performance per dollar is still a bit lower, but in exchange you get DLSS, which is just better than XeSS and FSR. And because Nvidia are more popular, they get more support.
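To make that comparison concrete, here's a rough sketch of what setting a 50 W limit looks like on each side. The value AMD's hwmon interface takes is in microwatts, and the card0/hwmon0 indices are just examples that vary from system to system, so treat the paths as placeholders rather than something to paste blindly.

# AMD: write the cap straight into hwmon, in microwatts (50 W = 50,000,000 uW).
# card0 / hwmon0 are example indices; they differ between systems.
echo 50000000 | sudo tee /sys/class/drm/card0/device/hwmon/hwmon0/power1_cap

# Nvidia: nvidia-smi takes the limit in watts.
sudo nvidia-smi -pl 50

# Read both limits back to confirm.
cat /sys/class/drm/card0/device/hwmon/hwmon0/power1_cap    # AMD, microwatts
nvidia-smi --query-gpu=power.limit --format=csv            # Nvidia, watts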

It's a shame, the market desperately needs AMD and Intel to be competitive.
I look at the games that use DLSS and I'm still not seeing the need for it. Part of me wants a 4070 SUPER, but another part tells me that, for the games I play, DLSS is not mandatory at all (or even used).
 
I look at the games that use DLSS and I'm still not seeing the need for it. Part of me wants a 4070 SUPER, but another part tells me that, for the games I play, DLSS is not mandatory at all (or even used).
You may not want it today, but games keep getting more demanding (in part because we have DLSS now, so they can), and DLSS means you can display a game rendering at 1440p on a 4K monitor. From my experience the artefacts the AI introduces are so small you have to actively look for them; they're not visible during gameplay.
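For a sense of scale (my own back-of-the-envelope maths, not anything from a review): a 1440p internal render is well under half the pixels of native 4K, which is where most of the performance headroom comes from.

# Pixels per frame: 1440p render vs. native 4K output.
echo $(( 2560 * 1440 ))                        # 3686400 pixels rendered
echo $(( 3840 * 2160 ))                        # 8294400 pixels on the 4K display
echo $(( 2560 * 1440 * 100 / (3840 * 2160) ))  # 44, i.e. ~44% of the native pixel work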

Don't buy a GPU based on what you may need in the future, but don't discount DLSS as "just fake frames" or "only good for unoptimised games"; it's becoming more necessary every day and it is genuinely good.
 
Tom's Hardware: Nvidia GeForce RTX 5080 Founders Edition review: Incremental gains over the previous generation
Right from the start, the 4K ultra rasterization results give us a reasonable idea of what to expect. The RTX 5080 outperforms the RTX 4080 Super by 9% — and by extension, we'd expect it to be about 12% faster than the vanilla RTX 4080 (which we haven't retested yet). That's... not a lot. In fact, it's a pretty minimal generational improvement.

The 1440p ultra and 1080p ultra/medium results are even worse. At those settings, the 5080 beats the 4080 Super by 7%, 4%, and 3%, respectively. There are a few games where it's actually slower, though we'd chalk those up to immature Blackwell drivers rather than the GPU itself. CPU bottlenecks are definitely a factor at 1080p as well.
The raytracing results are basically identical with an outlier removed.


4K raster = +11%, 4K raytracing = +9%. In two reviews it looks like the RT improvement is worse than raster.

It consistently uses less power than the 4080 Super, and doesn't even hit its 360W TDP on average in the Tom's review.
 
Tom's Hardware: Nvidia GeForce RTX 5080 Founders Edition review: Incremental gains over the previous generation

The raytracing results are basically identical with an outlier removed.


4K raster = +11%, 4K raytracing = +9%. In two reviews it looks like the RT improvement is worse than raster.

It consistently uses less power than the 4080 Super, and doesn't even hit its 360W TDP on average.
obvious outcome looking at the core counts
anyone who has a Lovelace card should definitely be waiting for the next node jump at the very minimum
 
You may not want it today, but games keep getting more demanding (in part because we have DLSS now, so they can), and DLSS means you can display a game rendering at 1440p on a 4K monitor. From my experience the artefacts the AI introduces are so small you have to actively look for them; they're not visible during gameplay.

Don't buy a GPU based on what you may need in the future, but don't discount DLSS as "just fake frames" or "only good for unoptimised games"; it's becoming more necessary every day and it is genuinely good.
I'm not sure I buy this. I enjoy 1080p; I have played games in native 720p, not properly upscaled, on a 36" 1080p TV and it didn't bother me. I couldn't care less about performance above about 30 fps at 1080p native, and if the game's fun enough (Maneater) I'll happily play it at lower FPS.

I will probably upgrade from my shitty 4 GB AMD card in the next couple of years, but it won't be to run higher resolutions. The only thing that excites me about 'high resolution' on LCDs is the chance to run CRT shaders at 4K with retro games, and I'm sure that'll be well within the capabilities of integrated graphics by then if it isn't already.
 
I look at the games that use DLSS and I'm still not seeing the need for it.

It's pretty simple. Games run at faster frame rates at higher settings with it. The practical outcome is that for any given AMD GPU, you can get the same quality by going a full step down with NVIDIA. Sure, my 6700 XT is not exactly stabbing my eyes out, but a 3060 12 GB was $100 cheaper and could run everything just as well by using DLSS. For example, Cyberpunk 2077 runs at 1440p Medium @ 78 fps on a 3060 12 GB with DLSS, while on a 6700 XT it actually runs a bit slower, closer to 70 fps.

Is Cyberpunk 2077 perfectly fine at 70 fps? Of course it is. But I paid $100 more and run my computer hotter for the experience.

The only reason to turn DLSS off completely is if you feel moral outrage at knowing some percentage of the pixels are inferenced rather than rasterized. Because, like @Susanna said, the only way you can tell it's on is to take screenshots and pixel-hunt.
 
Tom's Hardware: Nvidia GeForce RTX 5080 Founders Edition review: Incremental gains over the previous generation

The raytracing results are basically identical with an outlier removed.


4K raster = +11%, 4K raytracing = +9%. In two reviews it looks like the RT improvement is worse than raster.

It consistently uses less power than the 4080 Super, and doesn't even hit its 360W TDP on average in the Tom's review.
B-b-but 5070 is gonna be like a 4090!

Niggertech strikes again.

Good news is my 4080 laptop will feel about the same, relatively speaking, for another generation.
 
It's pretty simple. Games run at faster frame rates at higher settings with it. The practical outcome is that for any given AMD GPU, you can get the same quality by going a full step down with NVIDIA. Sure, my 6700 XT is not exactly stabbing my eyes out, but a 3060 12 GB was $100 cheaper and could run everything just as well by using DLSS. For example, Cyberpunk 2077 runs at 1440p Medium @ 78 fps on a 3060 12 GB with DLSS, while on a 6700 XT it actually runs a bit slower, closer to 70 fps.

Is Cyberpunk 2077 perfectly fine at 70 fps? Of course it is. But I paid $100 more and run my computer hotter for the experience.

The only reason to turn DLSS off completely is if you feel moral outrage at knowing some percentage of the pixels are inferenced rather than rasterized. Because, like @Susanna said, the only way you can tell it's on is to take screenshots and pixel-hunt.
It depends on the implementation. Some games, like Cyberpunk, have a great DLSS implementation.
Other games, like War Thunder, have a dogshit implementation.
 
I'm not sure I buy this. I enjoy 1080p; I have played games in native 720p, not properly upscaled, on a 36" 1080p TV and it didn't bother me. I couldn't care less about performance above about 30 fps at 1080p native, and if the game's fun enough (Maneater) I'll happily play it at lower FPS.
On a TV at normal TV-viewing distances, 1080p is fine and you're going to struggle to resolve more detail. On monitors though, the differences are definitely apparent when you jump from 1080p to 1440p or 4K.

The bigger issue is that a lot of games are increasingly being designed for 1440p or higher (Forspoken was the first major example) and look awful at 1080p. Now Forspoken sucks ass but its existence does seem to imply that we're moving towards a world where 1440p is no longer midrange but now the minimum for big-budget games.

Good news is my 4080 laptop will feel about the same, relatively speaking, for another generation.
No one with a Lovelace GPU should really consider upgrading. The only card I'm considering is a 5090 and that's only because I've recently gotten into fucking around with local AI models.
 
The only card I'm considering is a 5090 and that's only because I've recently gotten into fucking around with local AI models.
Keep an eye on Strix Halo. Machines with 64-128 GB of unified memory may be able to beat the 4090 and 5090 once the model is too large for their 24-32 GB of VRAM. It needs to be launched and independently reviewed though.
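Rough arithmetic behind that (illustrative numbers, not benchmarks): a model quantized to 4 bits is about half a byte per parameter, so the weights of a 70B model alone come to roughly 35 GB before any context cache. That spills out of a 24 GB 4090 or a 32 GB 5090, but fits comfortably in 64-128 GB of unified memory.

# Illustrative only: rough weight footprint of a quantized local model.
# GB of weights ~= parameters (in billions) * 0.5 bytes per parameter (4-bit).
echo "70 * 0.5" | bc    # ~35 GB: too big for a 24 GB 4090 or 32 GB 5090
echo "8 * 0.5" | bc     # ~4 GB: a small 8B model fits almost anywhere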


On the other hand, if you buy a 5090 Founders Edition at the $1,999 MSRP, you can resell it for an ounce of gold.

People Are Already Camping Out for an Nvidia RTX 5090, Despite Retailer Warning About the January Cold
 
I'm probably late as fuck but I just heard Nvidia is getting JUSTed or something regarding AI, what happened?
 
I'm probably late as fuck but I just heard Nvidia is getting JUSTed or something regarding AI, what happened?
Chinese company DeepSeek released a new AI model. It's free, it's open source, and, more relevant to Nvidia specifically, it requires far fewer resources while outperforming most other models.
 
is this a good deal or will it explode the moment i plug it in?
In my experience, as long as you verify that the seller has a good reputation and plenty of sales, AliExpress wholesale is fine for some electronics. Got a Ryzen 7 7700 for 40% off there (minus the fancy branded packaging) after seeing some plebbitors post about it, can recommend. GPUs might be different though; it's a market full of scalpers and scammers.

edit: 40% not 50%
 
In my experience as long as you verify the seller as having a good reputation and plenty of sales, AliExpress wholesale is fine for some electronics. Got a Ryzen 7 7700 for 50% off there (minus fancy branded packaging), can recommend. GPUs might be different though, it's a market full of scalpers and scammers
The seller for that particular link appears to be a huge scam based on the reviews.
 