GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

You are obviously overflowing with knowledge on this stuff. Grant me your wisdom, oh beneficent one -- If I wanted to store the keys on a separate device and require it for starting up my planned home-built NAS (but obviously not leave it plugged in the whole time, or it's pointless in the case of theft), what should I be looking at? Just a regular old USB drive, or something more sophisticated? I believe there are YubiKeys and presumably competitors. They'd let me lock the drive the keys are on, plug it in on the occasions I need to reboot the NAS, and hide it in a sock the rest of the time, right?
Please don't overstate my abilities. The only reason I know this is because my agricultural job also requires encryption, since I do basically everything (including handling customer info) and I don't trust any vendor (they're all cloud and SaaS these days) nor other employees...

I don't like biometrics since they are not precise enough, and in the USA the 4th/5th Amendments do not apply to biometric information (retinal, facial, fingerprint, etc.).

For the media it can be any removable media. I've done it for shits and giggles once with a VHS tape... I think you are looking at a YubiKey-over-SSH solution and a keyfile. You *can* use YubiKey support for LUKS, but only with systemd (to my knowledge) on any base system, or this project, but I do not know much about it. Otherwise you would have to write your own code.

You can also use a one-time key; that is how encrypted swap is set up. But I would be careful, since that would not have a backup... The other keys would also be able to unlock the files, I think, if they had the (older) header...
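For the encrypted-swap case, a hedged sketch of what the /etc/crypttab entry usually looks like (the device path is a placeholder, adjust cipher/size to taste):

```
# /etc/crypttab: swap keyed from /dev/urandom, so the key is discarded on every boot
cryptswap  /dev/sdX3  /dev/urandom  swap,cipher=aes-xts-plain64,size=256
```

Then /etc/fstab points at /dev/mapper/cryptswap instead of the raw partition.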

For your choice you have a few options:
GnuPG-encrypted keyfile on removable media
Keyfile on removable media
Plain* format on removable media
Password
A detached LUKS header is also possible for the above.

Hint: Make backups of the keyfile/header, since fucking bitrot.

*See this for explanation on plain vs LUKS:
2.4 What is the difference between "plain" and LUKS format?
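A rough sketch of the keyfile-on-removable-media option. The keyfile name and the LUKS partition /dev/sdX2 are placeholders, and the cryptsetup calls need root, so they're shown commented:

```shell
# Generate a 4 KiB random keyfile (in practice, write it to the mounted USB stick)
dd if=/dev/urandom of=nas.key bs=512 count=8 status=none
chmod 0400 nas.key

# Enroll it as an additional LUKS key slot (requires root; /dev/sdX2 is hypothetical):
# cryptsetup luksAddKey /dev/sdX2 nas.key

# At boot: plug the stick in, unlock, then put it back in the sock.
# cryptsetup open /dev/sdX2 nasdata --key-file nas.key
```

Since luksAddKey enrolls it as an *additional* slot, you keep a passphrase slot as a fallback if the stick dies.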
 
From both the replies to me on this I wonder if perhaps people are reading into my posts something I don't intend.
I took it to mean this is marketing by AMD, something I didn't know.

The issue is that requirements aren't keeping up with performance, so cards get more expensive and more powerful for no reason. To the point where people are fine buying used cards from 8 years ago and still getting decent frames at 1080p, which is the most common resolution.
I agree, but disagree at the same time. I had a GPU die on me, so I bought a 1060 3GB to serve as a stopgap until I could afford something better. ...then crypto happened and I was stuck with it for years. The 3GB of VRAM was a bottleneck, especially in Resident Evil 2 Remake, where my options were minimum textures (smear town) or stuttering whenever new textures streamed in, which was all the time.

Most games are made to run on consoles, so as long as your specs (vram included) match or slightly exceed a console, you should be good.

I just don't feel any new release justifies the expense.
A 16GB card is highly overkill for my use case of light internet/retro gaming/occasional AAA at 1080p.
This has always been the case. You had sane graphics cards, and then you had your GTX Titans of the world. There was no real use for all that power, and by the time you could make use of it the shader models and that kind of stuff had changed.

My interest in a beefier card is that I plan to get into VR at some point, which is close to 4K at 90fps. However, most of the games are designed to run on a Snapdragon XR2, i.e. a mobile phone chip, so my current 6600 is good enough, supposedly.
 
@♂CANAM productions♂ Thanks for that. Didn't mean to oversell you, but your replies to me have shown a level of knowledge I don't have, so I appreciate the pointers.

I think what I'm going to do is I'm going to focus on my NAS as originally planned. If I do get into AI art I'll need a good storage solution anyway. And in the meantime I'll tinker around with Shark and DirectML and a cloud-based solution. That third one might be simplest for now.

Thanks all.

 
@Overly Serious Quotes/replies not working for me atm. For your last point, that has always been the case. It's only recently that I've seen people treat extra VRAM as "not worth anything" or even "fluffed up stats"... which is really weird behavior. I'm assuming it's just cope from people who massively overspent on a slew of 8-12GB cards during the past 2 years.

VRAM has always been a major factor in GPU longevity. No way would the 1080 Ti have held on for so long if it had 8GB instead of 11GB. I'll always take beefier hardware over software solutions that are sometimes not implemented.

The 1050 has 2 GB and is more popular than the 1080 Ti right now. Sure, buying a computer that overshoots everything currently out means it's going to be a long time before you have to fiddle with settings to achieve a desired frame rate & resolution, but broader industry trends indicate that the vast majority of people do not care if they can buy a brand new game and flip all the settings to maximum on three-year-old hardware.
 
The 1050 has 2 GB and is more popular than the 1080 Ti right now. Sure, buying a computer that overshoots everything currently out means it's going to be a long time before you have to fiddle with settings to achieve a desired frame rate & resolution, but broader industry trends indicate that the vast majority of people do not care if they can buy a brand new game and flip all the settings to maximum on three-year-old hardware.
What does any of that have to do with longevity? Great. People need a bare minimum gpu to render a desktop. If a person wants 1080ti performance, they sure as hell aren't going to care about a 1050, that card is garbage by now.

If I buy a card today and care about it performing a steady amount of work for the next 4 years, I'm going to care about that vram.

Anyone who passes up extra vram, all else being equal, as "don't need that much" is silly. Same as people saying "all I need is a quad core".
 
What does any of that have to do with longevity? Great. People need a bare minimum gpu to render a desktop.

If I buy a card today and care about it performing a steady amount of work for the next 4 years, I'm going to care about that vram.

Anyone who passes up extra vram, all else being equal, as "don't need that much" is silly. Same as people saying "all I need is a quad core".
You fail to realize that a certain segment of the gamer population has given up on modern day AAA. Fallout NV, Oblivion and Skyrim run fine on 2GB VRAM with a quad core. Plus indie games aren't that graphically intensive most of the time.

Oh yeah plus you can emulate most PS2/GC games with a 1660, and Switches cost less than a modern day GPU.
 
You fail to realize that a certain segment of the gamer population has given up on modern day AAA. Fallout NV, Oblivion and Skyrim run fine on 2GB VRAM with a quad core. Plus indie games aren't that graphically intensive most of the time.
Lol, no, I don't fail to realize it. 2gb and a quad core are relative crap nowadays. There's nothing wrong with using it if it's what gets you by. But that's the honest truth of it. It's cool if that's what you have or you got it for like $100. But now when a 5600X is a tad over $100 and a 1080ti is in the $200s? Nope.

*Edit* Correction, eBay 1080tis are now under $200. Crazy. 1050 2GB cards usually sell for $60-$100. If I need it for a special SFF solution... maybe. For an actual PC? No-brainer. 1080ti all day. Even for just a "renders desktop" card, I'd much rather just run a CPU with an iGPU. I think the newer iGPUs actually beat 1050 Tis.
 
What does any of that have to do with longevity?

The GTX 1050 came out in 2016, a year before the 1080 Ti. Does that not count as longevity?

Great. People need a bare minimum gpu to render a desktop. If a person wants 1080ti performance, they sure as hell aren't going to care about a 1050, that card is garbage by now.

Steam usage statistics show us what people are gaming with. The ~1% of gamers who have a 1080 Ti are happy with it, sure. But the fact that about 10x as many people are gaming with 1050 & 1060 variants as 1080s shows that a product can survive a long time without being high end these days.

If I buy a card today and care about it performing a steady amount of work for the next 4 years, I'm going to care about that vram.

Silicon doesn't wear out. Unless you develop a thermal problem, it can do exactly as much work in 10 years as it can today.

Anyone who passes up extra vram, all else being equal, as "don't need that much" is silly. Same as people saying "all I need is a quad core".

No vendors add extra VRAM for free. And FWIW, more people game on 4 cores or less than 8 cores or more. Point is, the average gamer has much shittier hardware than anyone who's mating a 4090 to a 7800 X3D realizes.
 
You fail to realize that a certain segment of the gamer population has given up on modern day AAA. Fallout NV, Oblivion and Skyrim run fine on 2GB VRAM with a quad core. Plus indie games aren't that graphically intensive most of the time.

Oh yeah plus you can emulate most PS2/GC games with a 1660, and Switches cost less than a modern day GPU.
That and a lot of people can't afford the latest and greatest but still enjoy playing games with older or lower end hardware. There is a lot of marketing hype in PC gaming and a lot of people have been duped into chasing "the best". If you want the best thing money can buy and can afford it that's fine, if you are content gaming on lower end hardware that's also fine.

I think I already posted in this thread about it but for a while the only thing I had that ran games halfway well was a laptop rocking a 950m with 2GB VRAM and I had a blast. I didn't even realise a lot of games were running mid 30s FPS wise until I decided to mess around with afterburner. On the flip side, playing some games on my new desktop with a decent GPU has been a miserable experience. No amount of ultra textures and graphical bells and whistles is going to turn pig slop like the nu Raider trilogy into good games.
 
Gonna call bullshit on this. I can maybe see Dolphin running on those specs but PCSX2? No chance.
It ran good enough for me to play a couple games from start to finish. Shadow of the Colossus was the only one I couldn't make work due to stutter and frame dips. Dolphin ran great at 720p. No problem there. The CPU was an E8600 and the computer had 8gigs of DDR3 ram. GT 710 shat all over the motherboard integrated graphics. (This was all about 7 years ago.)
 
I cannot find concrete answers on the web, so I will ask here.
How bad are Nvidia Quadro cards on Linux compared to Windows? By this I mean feature-wise, performance-wise, and stability-wise.
I know Nvidia has a bad rep on Linux gaming-wise, but compute-wise I know nothing, and I am having a hard time finding shit that is not related to gaming.
I do not yet have the full machine at my disposal, so I cannot test it myself. My workload specifically is architectural renderings, meaning VRAM-intense scenes.
 
I cannot find concrete answers on the web, so I will ask here.
How bad are Nvidia Quadro cards on Linux compared to Windows? By this I mean feature-wise, performance-wise, and stability-wise.
I know Nvidia has a bad rep on Linux gaming-wise, but compute-wise I know nothing, and I am having a hard time finding shit that is not related to gaming.
I do not yet have the full machine at my disposal, so I cannot test it myself. My workload specifically is architectural renderings, meaning VRAM-intense scenes.
It works alright, especially on LTS kernels. The issue is with either very new kernels (Nvidia's driver comes in the form of a DKMS module, which can sometimes be problematic on rolling-release distros) or with using Wayland (like, at all; it's a major pain). X11, rendering, and CUDA all work fine.
They're remaking their driver in a more Linux-friendly form, so anything 30-series or newer, including the Quadro versions, is eventually going to run Wayland as smoothly as AMD's products.
 
Unless they improved it A LOT. I used to run PCSX2 on a Q6600 and a 560 Ti. It was... not great, and Core 2 Duos were much shittier than the quads.
It ran good enough for me to play a couple games from start to finish. Shadow of the Colossus was the only one I couldn't make work due to stutter and frame dips. Dolphin ran great at 720p. No problem there. The CPU was an E8600 and the computer had 8gigs of DDR3 ram. GT 710 shat all over the motherboard integrated graphics. (This was all about 7 years ago.)
PCSX2 has improved a lot, but this raises the minimum specs due to the increased accuracy. From the GitHub page:

Minimum:
CPU:
- Supports SSE4.1
- PassMark Single Thread Performance rating near or greater than 1800
- Two physical cores, with hyperthreading
GPU:
- Direct3D10 support
- OpenGL 3.x support
- Vulkan 1.1 support
- Metal support
- PassMark G3D Mark rating around 3000 (GeForce GTX 750, Radeon RX 560, Intel Arc A380)
- 2 GB video memory

Recommended:
CPU:
- Supports AVX2
- PassMark Single Thread Performance rating near or greater than 2600
- Four physical cores, with or without hyperthreading
GPU:
- Direct3D12 support
- OpenGL 4.6 support
- Vulkan 1.3 support
- Metal support
- PassMark G3D Mark rating around 6000 (GeForce GTX 1650, Radeon RX 570)
- 4 GB video memory

For reference, the E8600 has a PassMark single-thread score of 1,401, and the GT 710's G3D score is 634. I can absolutely believe it ran at the time, but not in current year.
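As a sketch of how you could sanity-check the CPU side of those minimums, here's a small Python snippet that parses /proc/cpuinfo-style text for the SSE4.1 flag and approximates "two physical cores with hyperthreading" as at least four logical cores (the PassMark rating you'd still have to look up manually):

```python
def cpu_flags(cpuinfo_text: str) -> set[str]:
    """Return the CPU feature flags from /proc/cpuinfo-style text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return set(line.split(":", 1)[1].split())
    return set()

def meets_cpu_minimum(cpuinfo_text: str, logical_cores: int) -> bool:
    # PCSX2 minimum: SSE4.1 plus two physical cores with hyperthreading,
    # approximated here as at least 4 logical cores.
    return "sse4_1" in cpu_flags(cpuinfo_text) and logical_cores >= 4

# Demo with synthetic flag lines: a dual-core chip with SSE4.1 (like the E8600,
# which is Wolfdale and does have SSE4.1) still fails on core count.
print(meets_cpu_minimum("flags\t\t: fpu sse sse2 ssse3 sse4_1", 2))       # False
print(meets_cpu_minimum("flags\t\t: fpu sse sse2 ssse3 sse4_1 avx2", 8))  # True
```

On a real Linux box you'd feed it `open("/proc/cpuinfo").read()` and `os.cpu_count()`.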
 
Honestly, I don't think 10+GB of VRAM matters for your average gamer. However, having a metric fuckton of VRAM will be beneficial for people who are getting into machine learning, as that's where you benefit the most from extra VRAM, and that's something that's very much used in the consumer market. Obviously it makes no sense to market your card to cater only to that, but this is simply something a card manufacturer might add to their product just to be competitive in this single niche.

I have a GTX 1060 6GB, and it's not the 6GB of VRAM that's dragging my performance down, but rather the GPU core itself which is like 8 years old by this point.
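To put a rough number on the machine-learning point, here's a back-of-the-envelope sketch of how much VRAM just the weights of a model take at a given precision (ignoring activations, optimizer state, and so on):

```python
def weight_vram_gib(n_params: float, bytes_per_param: float) -> float:
    """GiB of VRAM needed to hold model weights alone."""
    return n_params * bytes_per_param / 2**30

# A 7-billion-parameter model at fp16 (2 bytes per parameter) already needs
# about 13 GiB for the weights, which is why 10+GB cards matter in this niche.
print(round(weight_vram_gib(7e9, 2), 1))  # 13.0
```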
 
Honestly, I don't think 10+GB of VRAM matters for your average gamer. However, having a metric fuckton of VRAM will be beneficial for people who are getting into machine learning, as that's where you benefit the most from extra VRAM, and that's something that's very much used in the consumer market. Obviously it makes no sense to market your card to cater only to that, but this is simply something a card manufacturer might add to their product just to be competitive in this single niche.

I have a GTX 1060 6GB, and it's not the 6GB of VRAM that's dragging my performance down, but rather the GPU core itself which is like 8 years old by this point.
I've used all 8GB of my VRAM doing hair particle simulation in Blender. There's 3D modeling work too that can benefit.
 
Honestly, I don't think 10+GB of VRAM matters for your average gamer. However, having a metric fuckton of VRAM will be beneficial for people who are getting into machine learning, as that's where you benefit the most from extra VRAM, and that's something that's very much used in the consumer market. Obviously it makes no sense to market your card to cater only to that, but this is simply something a card manufacturer might add to their product just to be competitive in this single niche.

I have a GTX 1060 6GB, and it's not the 6GB of VRAM that's dragging my performance down, but rather the GPU core itself which is like 8 years old by this point.
Regarding the VRAM debate, I still believe 8GB should be enough for gaming for a while. I place the blame on developers not optimizing and compressing their materials (or their games at all, lately), and you don't need to run max graphics all the time. Of course, if you spend serious money on a GPU it had better be capable of doing that, but having huge amounts of VRAM won't stop some soydev cramming in 8K materials.
 