GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I cannot find concrete answers on the web so I will ask here.
How bad are Nvidia Quadro cards on Linux compared to Windows? By this I mean feature-wise, performance-wise and stability-wise.
I know Nvidia has a bad rep on Linux gaming-wise, but compute-wise I know nothing, and I am having a hard time finding shit that is not related to gaming.
I do not yet have the full machine at my disposal so I cannot test it myself. My workload specifically is architectural renderings, meaning VRAM-intensive scenes.
Weirdly enough, Nvidia's largest priority on Linux is maintaining their Quadro and Tesla drivers. You should be fine. Just use the proprietary Nvidia driver instead of the open source one.
Edit: Your mileage may vary on the difficulty of getting CUDA and nvidia-smi running. What distro are you planning on running? Maybe we can give you some pointers.
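If you want a quick way to confirm both actually came up once the box is built, something like this rough Python sketch works (assuming nvidia-smi and nvcc ended up on your PATH; nvidia-smi ships with the driver, nvcc with the CUDA toolkit):

```python
# Rough sanity check: is the proprietary driver visible, and is the CUDA toolkit installed?
# Assumes nvidia-smi and nvcc are on PATH; adjust if your distro puts them elsewhere.
import shutil
import subprocess

def check(cmd, args):
    path = shutil.which(cmd)
    if path is None:
        print(f"{cmd}: not found on PATH")
        return
    result = subprocess.run([path, *args], capture_output=True, text=True)
    out = (result.stdout or result.stderr).strip()
    print(f"{cmd}: {out.splitlines()[0] if out else '(no output)'}")

# Driver check: ask the card for its name, VRAM and driver version.
check("nvidia-smi", ["--query-gpu=name,memory.total,driver_version", "--format=csv,noheader"])
# Toolkit check: nvcc reports the installed CUDA version.
check("nvcc", ["--version"])
```

If both report cleanly, whatever CUDA-based renderer you end up using should at least be able to see the card.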
 
Weirdly enough, Nvidia's largest priority on Linux is maintaining their Quadro and Tesla drivers. You should be fine. Just use the proprietary Nvidia driver instead of the open source one.
Edit: Your mileage may vary on the difficulty of getting CUDA and nvidia-smi running. What distro are you planning on running? Maybe we can give you some pointers.
I am planning on using Arch or something based off Arch. The machine just needs to run some compute tasks on demand, so I figure going light is the way. Not that it's gonna be a problem on a workstation, but the less stuff there is installed, the less chance something fucks up.
 
I am planning on using Arch or something based off Arch. The machine just needs to run some compute tasks on demand, so I figure going light is the way. Not that it's gonna be a problem on a workstation, but the less stuff there is installed, the less chance something fucks up.
I would really advise you to use Ubuntu or Debian, unless you really know what you are doing with Arch.
 
Who knows, in a month or two when I'm ready AMD might have announced that RDNA 3 support is right around the corner and I'll just hang on.
Is this enough support for you?
 
Weirdly enough, Nvidia's largest priority on Linux is maintaining their Quadro and Tesla drivers.

When the Ampere line launched, those A100 GPUs were $20K a pop. A DGX box ran $300K. Datacenter hardware brings oceans of margin compared to gamer hardware.

I am planning on using Arch or something based off Arch. The machine just needs to run some compute tasks on demand, so I figure going light is the way. Not that it's gonna be a problem on a workstation, but the less stuff there is installed, the less chance something fucks up.

Strongly recommend you do not do this. NVIDIA's CUDA drivers are validated on Ubuntu LTS, RHEL, and SLES. Rolling-release versions of Linux are a dodgy proposition.
 
Strongly recommend you do not do this. NVIDIA's CUDA drivers are validated on Ubuntu LTS, RHEL, and SLES. Rolling-release versions of Linux are a dodgy proposition.
I was not planning on updating that machine often under Arch anyway, but I will go with Ubuntu. I assume I can use Xubuntu or Kubuntu fine? I cannot stand GNOME. I assume the differences are mainly the DEs?
 
I've been tempted to do something similar with a PS1, Dreamcast, or PS2 collection. I think the PS2 set runs a few TB for the whole thing. Would want it external though.
Why use an NVMe SSD for that? I've seen several people thinking about going down this particular path and I feel like I'm missing something. PS2 and Xbox games on modded consoles worked just fine over network shares on 100 Mbit Ethernet; there's no need to be able to read the entire ISO in one second. Is there something I'm missing? Even if they're in CHD on crappy mechanical SMR drives, those will blow the native PS2 disc read speed and random access times out of the water.
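Rough numbers, in case anyone wants them. The PS2 drive speeds below are the commonly cited maximums (treat them as ballpark, not gospel), and the network/disk figures are theoretical peaks:

```python
# Back-of-envelope comparison of how fast each transport can feed a PS2 ISO.
# PS2 drive speeds are the commonly quoted maximums (assumption); the rest are
# theoretical peaks, so real-world numbers land a bit lower across the board.
sources = {
    "PS2 DVD drive (~4x DVD)":             4 * 1.385,   # 1x DVD ~ 1.385 MB/s
    "PS2 CD drive (~24x CD)":              24 * 0.150,  # 1x CD ~ 0.150 MB/s
    "100 Mbit Ethernet (theoretical)":     100 / 8,
    "Cheap SMR HDD, sequential (ballpark)": 100.0,
}

dvd = sources["PS2 DVD drive (~4x DVD)"]
for name, mb_per_s in sources.items():
    print(f"{name:40s} {mb_per_s:6.1f} MB/s  ({mb_per_s / dvd:4.1f}x the stock DVD drive)")
```

Even saturated 100 Mbit comfortably outruns the stock drive, and any spinning disk is in another league entirely, so the SSD buys you nothing the console can actually use.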
Steam usage statistics show us what people are gaming with. The ~1% of gamers who have a 1080 Ti are happy with it, sure. But the fact that about 10x as many people are gaming with 1050 & 1060 variants as 1080s shows that a product can survive a long time without being high end these days.
I think 1080 owners were the majority of customers buying 2080s, just like 2080 owners were the ones most likely to buy 3080s and so on. They're the consumers that are chasing the high end.
You can emulate those with a 3.0 GHz Core 2 Duo from 2009 and a little GPU like a GT 710.
Gonna call bullshit on this. I can maybe see Dolphin running on those specs but PCSX2? No chance.
There's a fork of Dolphin that runs well on the Intel iGPU found in years-old cheap laptops. I know this because Budget Builds is a fun channel on YouTube where a guy sees what can be done with old/cheap computer hardware that sometimes can be had for so little it wouldn't even buy you a Burger King meal.
 
I think the problem with Nvidia/AMD is that they are trying to make the best card they can to beat the other. The issue is that requirements aren't keeping up with performance, so cards get more expensive and more powerful for no reason. To the point where people are fine buying used cards from 8 years ago and still getting decent frames at 1080p, which is the most common resolution. There is going to come a time when Nvidia and AMD are going to have a bloodbath with their new lineups, and we are seeing that right now with the 4000 series on the Nvidia side. I'm sure AMD isn't far behind. $200-300 is the ideal price range for a graphics card that is midrange at 1440p and high-end at 1080p, and eventually the market is going to reflect that.
Agreed, we're in this stretch now where every techtuber is telling us we need more VRAM for shitty new games that aren't graphically impressive whatsoever. Thus, we need GPUs so expensive that they don't make sense for nearly everyone.

I have a huge 4K monitor and a 3060 Ti. It plays every game fine. Sure, some newer titles only run at 60 fps with lower settings, or I can drop to 1080p if I really want high framerates. They're still more than playable. I guess I have that old mindset from when you upgraded your PC just to be able to play games at all.
 
I was not planning on updating that machine often under Arch anyway, but I will go with Ubuntu. I assume I can use Xubuntu or Kubuntu fine? I cannot stand GNOME. I assume the differences are mainly the DEs?
Use Mint XFCE or MATE and call it a day. 21.1 is supported until 2027 and there's no Snap Store bullshit to deal with.
 
I think 1080 owners were the majority of customers buying 2080s, just like 2080 owners were the ones most likely to buy 3080s and so on. They're the consumers that are chasing the high end.

If the majority of people who buy high-end cards are just upgrading last year's high-end card, then that means future-proofing isn't a concern on the high end? Regardless of how often people who buy high-end cards upgrade, if you add up all the 1080, 2080, 3080, and 4090 variants on Steam right now, you come out to something like 6% of the total market. The only point here is that people chasing the high end vastly underestimate how long low and midrange hardware is viable for gaming, and as a consequence completely mispredict when game publishers are going to make yesteryear's high-end card the min spec. That's not even considering the fact that the top games these days are all ancient. The top games on Steam will run on potatoes.
 
If the majority of people who buy high-end cards are just upgrading last year's high-end card, then that means future-proofing isn't a concern on the high end? Regardless of how often people who buy high-end cards upgrade, if you add up all the 1080, 2080, 3080, and 4090 variants on Steam right now, you come out to something like 6% of the total market. The only point here is that people chasing the high end vastly underestimate how long low and midrange hardware is viable for gaming, and as a consequence completely mispredict when game publishers are going to make yesteryear's high-end card the min spec. That's not even considering the fact that the top games these days are all ancient. The top games on Steam will run on potatoes.
Sure (edit: I mean that owners of the newest card are likely to buy the next newest card), those I know who go for the latest 80/90 cards do so because there's a new one that is better than the last generation and they have the money (but even that enthusiasm is dropping off).
That does not apply to everyone who buys those cards, but there is absolutely a segment that both Nvidia and AMD cater to by making sure that the high-end cards launch before anything else. Point is that some people are chasing the high end just to have the high end and replace the last thing. I personally think phone consumers are the biggest fish for that kind of thinking; at least PC gamers have some idea of what they're buying.
 
I was not planning on updating that machine often under Arch anyway, but I will go with Ubuntu. I assume I can use Xubuntu or Kubuntu fine? I cannot stand GNOME. I assume the differences are mainly the DEs?
You are correct, the only difference on those is the DE, and you can install any other DE with apt at any time.

Use Mint XFCE or MATE and call it a day. 21.1 is supported until 2027 and there's no Snap Store bullshit to deal with.

Second this. Networkd is an abortion of a system that will make you hate Nano. NetworkManager master race.
 
Slight PL here, because people have some weird reactions to me.

I am saying the things I say because I've been in the enterprise hardware/software biz for a long time, and as the lameness of the PS4 rekindled my interest in gaming hardware, I've largely seen the same trends happening.

Gaming software is largely developed the same way as enterprise software:

[Attached image: bell curve of the gaming hardware market]


You have the occasional game that deliberately targets only the top 50%, or even the top 25% of the market (note that Crytek nearly went broke targeting the top ~2% and had to abandon their founding vision).

That middle 90% is where it's at if you want to make big money. Right now, that's gamers with video cards ranging from 3 GB to 12 GB. I'm just eyeballing by clicking through new games on Steam, but most of them require 2 GB to 4 GB VRAM, and a few require 8 GB. So if we want to kind of guesstimate when 16 GB will be the min spec for so many games that it's basically the lowest you can get away with (i.e. you might miss out on some AAA titles, but there's still plenty for you to enjoy), given that 3 GB was high end in 2012, and 16 GB was high end in 2022 (24 GB is more like "ultra"), I'm going to guess it becomes the minimum some time in 2030 or even 2032.
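If you want the guesstimate as actual arithmetic (using only the numbers above, and assuming the min spec keeps trailing the high end by roughly the same 8-10 year gap it has today), it pencils out like this:

```python
# Pencil out the guess using only the numbers in the post: 3 GB was high end in
# 2012, 16 GB was high end in 2022, and today's typical minimums (2-4 GB) sit
# roughly where the high end was 8-10 years ago.
high_end = {2012: 3, 2022: 16}  # GB

growth = (high_end[2022] / high_end[2012]) ** (1 / (2022 - 2012))
print(f"High-end VRAM growth: ~{(growth - 1) * 100:.0f}% per year")

# If min spec lags the high end by 8-10 years, 16 GB (high end in 2022)
# becomes the effective floor around:
for lag_years in (8, 10):
    print(f"  {lag_years}-year lag -> 16 GB min spec around {2022 + lag_years}")
```

That lag assumption is the whole guess, but it's consistent with where the 2012 high end sits in today's requirements, and it lands right in the 2030-2032 window.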
 
Slight PL here, because people have some weird reactions to me.
Lol my whole thing is yes, people can use potato hardware just fine. But why would anyone willingly buy something like a 1050 2 GB nowadays? Or even a 1050 Ti 4 GB? You can get an RX 580 for under $100 nowadays. You can even get a 1080 Ti for under $200.

Idc if a 1050 is "good enough". It's great if that's what you have and it's still working. It's a shit card nowadays that can't be sold cheaply enough to justify. Idk if I'd give $50 for one. Any recent APU will be better if you're really aiming for SFF. If people are fine playing 720p on ultra low settings, more power to them. The midrange is larger than ever atm with so many cheap choices that I cannot see why anyone would ever be like "yeah, 2 GB is enough for me".

*Edit* Worth noting these are all US prices before someone from Europe gets their panties in a bunch.
 
Lol my whole thing is yes, people can use potato hardware just fine. But why would anyone willingly buy something like a 1050 2 GB nowadays? Or even a 1050 Ti 4 GB? You can get an RX 580 for under $100 nowadays. You can even get a 1080 Ti for under $200.

Idc if a 1050 is "good enough". It's great if that's what you have and it's still working. It's a shit card nowadays that can't be sold cheaply enough to justify. Idk if I'd give $50 for one. Any recent APU will be better if you're really aiming for SFF. If people are fine playing 720p on ultra low settings, more power to them. The midrange is larger than ever atm with so many cheap choices that I cannot see why anyone would ever be like "yeah, 2 GB is enough for me".

*Edit* Worth noting these are all US prices before someone from Europe gets their panties in a bunch.

You seem to think most people buy a computer every year, since you're fixated on Newegg prices on vintage parts. They don't - the secondary market is borderline irrelevant, so why people buy old shit online doesn't really matter. The vast majority of people gaming on 1050s purchased them in 2016-17, when they were new, and if they're not upgrading, it's in part because not upgrading costs $0.00, and they're still fine with what that gives them access to.
 
That middle 90% is where it's at if you want to make big money. Right now, that's gamers with video cards ranging from 3 GB to 12 GB. I'm just eyeballing by clicking through new games on Steam, but most of them require 2 GB to 4 GB VRAM, and a few require 8 GB. So if we want to kind of guesstimate when 16 GB will be the min spec for so many games that it's basically the lowest you can get away with (i.e. you miss out on some big AAA titles), given that 3 GB was high end in 2012, and 16 GB was high end in 2022, I'm going to guess it becomes the minimum some time in 2030 or even 2032.
"min spec" to me means the game does not outright refuse to run, and could mean as low as 720p, low settings, with a target of 30 FPS (hello Steam Deck). There is no standard for "minimum" and "recommended" requirements and they are flexible. I saw a comparison of Hogwarts Legacy running on GTX 960 2 GB and 4 GB models. Yeah, it will run, but there are compromises. You will need more VRAM for 1440p/4K, and adding ray-tracing on top takes more VRAM.

Just some tard math here: 4K120 is 8 times the pixel rate of 1080p60, 36 times 720p30. A game can be fully playable with a GPU 2% as powerful as what someone else is using, and a fraction of the VRAM. So it's possible for a developer to target the middle 90%, the top 5%, and maybe some of the bottom 5%.
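The arithmetic behind those ratios, for anyone who wants to check the tard math:

```python
# Pixel throughput (pixels drawn per second) for the modes mentioned above.
modes = {
    "720p30":  (1280, 720, 30),
    "1080p60": (1920, 1080, 60),
    "4K120":   (3840, 2160, 120),
}

rates = {name: w * h * fps for name, (w, h, fps) in modes.items()}
for name, rate in rates.items():
    print(f"{name:8s} {rate / 1e6:7.1f} Mpixels/s")

print(f"4K120 is {rates['4K120'] / rates['1080p60']:.0f}x the pixel rate of 1080p60")
print(f"4K120 is {rates['4K120'] / rates['720p30']:.0f}x the pixel rate of 720p30")
```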

The problem comes when a top 5-10 percenter expects their powerful 8 GB VRAM card like the RTX 3070 Ti to maintain high settings forever. Nope, you got fucked by weird VRAM segmentation and you'll have to turn some things down. Or the game will do it for you by sneaking in lower quality assets.
 
The problem comes when a top 5-10 percenter expects their powerful 8 GB VRAM card like the RTX 3070 Ti to maintain high settings forever. Nope, you got fucked by weird VRAM segmentation and you'll have to turn some things down. Or the game will do it for you by sneaking in lower quality assets.

Yeah, I never understood this. Graphics algorithms are fundamentally scalable. "Max settings" is just whatever number the developer decided was the highest anything needed to scale today for various effects - highest number of light sources, deepest level of AA, largest texture size, highest accuracy for shadows, etc. What, you thought nobody would ever come up with a bigger number?
 
Yeah, I never understood this. Graphics algorithms are fundamentally scalable. "Max settings" is just whatever number the developer decided was the highest anything needed to scale today for various effects - highest number of light sources, deepest level of AA, largest texture size, highest accuracy for shadows, etc. What, you thought nobody would ever come up with a bigger number?
So what you’re saying is that game developers are hurting us graphics card enthusiasts by putting in sliders instead of text boxes where we can just enter arbitrary numbers?
 
So what you’re saying is that game developers are hurting us graphics card enthusiasts by putting in sliders instead of text boxes where we can just enter arbitrary numbers?

Purely algorithmic things are already typically unbounded in the software - they just grab whatever's available from your drivers, populate the UI with options, and scale. A long time ago, the highest resolution anyone had was 1080p. Today, it's 4K or more. All that changed the definition of "max settings" is number go up. If you bought a GPU under the expectation that nobody would ever invent a number bigger than 1080, you were really disappointed when the first 4K monitors hit.
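A toy sketch of that, just to illustrate (the function and mode list are made up, not any real engine or driver API): the settings screen enumerates whatever the driver reports, and "max" is simply the biggest entry in today's list, so it moves the moment something bigger shows up.

```python
# Toy illustration only: "max settings" is just the top of whatever list the
# driver reports today. query_display_modes() is a made-up stand-in, not a real API.
def query_display_modes():
    # Pretend this is what the driver reports on this particular machine.
    return [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]

def build_resolution_options():
    modes = sorted(query_display_modes())
    options = [f"{w}x{h}" for w, h in modes]
    options[-1] += "  (max)"  # "max" is relative to today's list, nothing more
    return options

print(build_resolution_options())
# Plug in a monitor the driver reports as (7680, 4320) and "max" silently moves up.
```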
 
Is this enough support for you?

Ah-ha-ha-ha! Talk about timing. Lisa Su confirmed to be lurking on KF! :D

Need to go and see if this checks out. Much obliged.

Slight PL here, because people have some weird reactions to me.

I am saying the things I say because I've been in the enterprise hardware/software biz for a long time, and as the lameness of the PS4 rekindled my interest in gaming hardware, I've largely seen the same trends happening.

Gaming software is largely developed the same way as enterprise software:

[Attached image: bell curve of the gaming hardware market]


You have the occasional game that deliberately targets only the top 50%, or even the top 25% of the market (note that Crytek nearly went broke targeting the top ~2% and had to abandon their founding vision).

That middle 90% is where it's at if you want to make big money. Right now, that's gamers with video cards ranging from 3 GB to 12 GB. I'm just eyeballing by clicking through new games on Steam, but most of them require 2 GB to 4 GB VRAM, and a few require 8 GB. So if we want to kind of guesstimate when 16 GB will be the min spec for so many games that it's basically the lowest you can get away with (i.e. you might miss out on some AAA titles, but there's still plenty for you to enjoy), given that 3 GB was high end in 2012, and 16 GB was high end in 2022 (24 GB is more like "ultra"), I'm going to guess it becomes the minimum some time in 2030 or even 2032.
That bell curve graph shows usage, but if it were to show profit vs. usage it would have a weird but not insignificant little secondary hump over to the right. The reason is halo products. Pricing models follow cost vs. benefit for the customers: find the points where people will pay a bit extra to get a bit extra, or settle for a little less in return for paying a little less - those are your price points, and they follow a sensible(ish) correlation between value and cost. But there is a segment of the market that sits outside the ordinary level of disposable income / willingness to buy this specific type of product, which the normal cost-benefit analysis shears off. These people will just buy the best regardless (more or less). So you get these halo products where the diminishing-returns point has been shot right past; they're calculated on a whole different curve.

And we know it's significant because companies find it worth their while to target this demographic with specifically tailored products.

I say all this because your graph shows profit from a game producer's point of view. But we were originally talking about the card manufacturer's point of view, and that one looks a little different. It's more of a Bactrian camel than a dromedary.
 