GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Swapping my PC to a new case and PSU made me thoroughly hate the 24-pin. 12VO is a meh thing, IDK. No strong feelings about it.
I think I like it, but it's been out for years and there are no retail motherboards for it. Hopefully MSI is successful at making it a thing.
 
So you're saying it's time to stock up on some NVMe drives now while prices are relatively low.
Yes, though the best deals peaked in summer/fall of last year. I bought a bunch of 1TB MSI Spatium drives for $25 last summer. Prices should keep going up through the first half of 2024.
 
A NAS like that is thousands. Lol, I'm not rich. My setup is fine with the parts I have; I'm fine with a bit of compromise.
My NAS is an old prebuilt I bought used for pennies and then swapped some parts. I bought all the drives on sale over the last 3 years and have a couple PB of storage now. All in, I'm at less than 800 USD. I was sick of all my drives being 90% full all the time and having to delete games just to edit video. Now I have my own Plex server so my family and I can watch whatever, whenever, wherever, for free with no ads, and whenever I want to play a new game I just have to drag over a folder. You don't have to go crazy; you can use any old piece of shit for cold storage.
 
My NAS is an old prebuilt I bought used for pennies and then swapped some parts. I bought all the drives on sale over the last 3 years and have a couple PB of storage now. All in, I'm at less than 800 USD. I was sick of all my drives being 90% full all the time and having to delete games just to edit video. Now I have my own Plex server so my family and I can watch whatever, whenever, wherever, for free with no ads, and whenever I want to play a new game I just have to drag over a folder. You don't have to go crazy; you can use any old piece of shit for cold storage.
That sounds like a fun FUTURE project. I just want an all-in-one rig rn, man. Something with a few TB of space for games and files that will generally last until I'm a bit more set up life-wise.

And really, I want to have FUN with my build. Multiple internal storage drives and formats is just different, and I'm all about that.
 
What's a recommended video card for doing AI and light gaming without breaking the bank? I know Nvidia is the best brand, but I understand I'll need a lot of video memory - is 12GB enough, or should I look for a 16GB card?
 
Once again I am justified in getting a 12th gen Intel Core (the highest my board can technically go is 13th gen, but it might have issues).

At least I'll be getting good single-core performance for games that aren't multi-threaded (as in, most of them).

I did notice, however, that even the article had concerns that the rig they tested the 14th gen chip in wasn't fully optimized. To be fair, most builds aren't. Take that as you will
 
I've been holding back to avoid making you guys super jelly ... but my tricked-out, quad-core Mac Pro with high-powered INTEL XEON PROCESSOR (v2, no less!) has now attained levels of performance that rival a single core of an i7 processor! Achieving a score of almost 2900 on Geekbench, this bad baby's going to keep puttering along for at least another decade. The graphics card has 2GB of memory, and there's a second graphics card just sitting there, unusable by any extant Macintosh applications.

Not content with this nearly unlimited power, I have ordered the 12-core Intel Xeon UPGRADE for the low price of $25 on eBay. If you buy the same processor as an "upgrade kit" from one of the hand-holding boomer ripoff companies, you will pay $299 for the same processor and they'll throw in some thermal paste. The upgrade kit CPUs aren't even new, they're "renewed"!

That's a 10x price differential because Mac users 1) fear change, 2) fear computers, 3) will not independently research things that sound "hard," and 4) have a warped sense of what's "normal" to pay for computing equipment. There are boomers RIGHT NOW with the same trashcan Mac Pro as mine ordering a $299 "upgrade kit" for a fucking 10-year-old processor worth $25. This is Peak Grift. I should have ordered the damn Mac Mini with its 8GB of RAM. I will say, the 64GB of cheap old RAM in this Mac Pro has been absolutely wonderful. I can have thirty Chrome tabs and half a dozen hundred-page PDFs open, and she handles everything I throw at her and more. Yeah!
 
What's a recommended video card for doing AI and light gaming without breaking the bank? I know NVidia is the best brand, but I understand I'll need a lot of video memory - is 12G enough, or should I look for a 16G card?
I'd say 12GB is fine. I have mostly 8GB cards and they work for most stuff; the problem occurs when I try to do AI and run Chrome at the same time. 12GB would give enough headroom to run other apps simultaneously. Obviously more memory, more better.
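If you want to check how much headroom you've actually got with Chrome and everything else running, PyTorch can ask the driver directly. A minimal sketch, assuming an Nvidia card and a CUDA build of PyTorch:

```python
# Report free vs. total VRAM as the driver sees it, so other apps
# (Chrome included) count against the number you get back.
import torch

if torch.cuda.is_available():
    free_b, total_b = torch.cuda.mem_get_info(0)  # wraps cudaMemGetInfo
    print(f"{torch.cuda.get_device_name(0)}: "
          f"{free_b / 1024**3:.1f} GB free of {total_b / 1024**3:.1f} GB")
else:
    print("No CUDA-capable GPU visible to PyTorch")
```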

I do find Nvidia's current lineup a bit odd.
8GB for the 4060, 8 or 16GB for the 4060 Ti, 12GB for the 4070s, back to 16GB for the 4080, and then 24GB for the 4090. Obviously the 4060 Ti has slower memory, but whatever.

I'm guessing the product line was locked in before home AI exploded.
 
I'm guessing the product line was locked in before home AI exploded.
To the extent that Nvidia supports home AI, they still want you using Quadros for it. It's to their credit that they support CUDA across the lineup when AMD only does the same for ROCm at great delay, but if the option existed to revoke CUDA from the consumer RTX lineup, they probably would. When they first added it, the idea was to lock people into their platform by making it the most accessible and user-friendly option, which is a great approach: you didn't need a Quadro, any consumer GPU would do. But CUDA is now established and they could make a lot of money from the home AI community if they forced them to move to Quadros, and there's not much Apple, AMD, or Intel could do to stop that; none have strong counterparts to CUDA, and even when you do go through the hassle of setting up something like ROCm PyTorch, the performance isn't impressive.
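Funny enough, the ROCm builds of PyTorch reuse the torch.cuda namespace, so "CUDA" code often runs unmodified on AMD; here's a quick sketch (using only the documented torch.version fields) to see which backend an install actually got:

```python
# Identify the GPU backend behind a PyTorch install. ROCm builds
# masquerade as torch.cuda, so torch.version is the reliable tell:
# torch.version.hip is set on ROCm builds, torch.version.cuda on CUDA ones.
import torch

if torch.cuda.is_available():
    if torch.version.hip is not None:
        print(f"ROCm/HIP backend: {torch.version.hip}")
    else:
        print(f"CUDA backend: {torch.version.cuda}")
    print("Device:", torch.cuda.get_device_name(0))
else:
    print("CPU only - no CUDA or ROCm device found")
```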

The gaming cards are allocated memory entirely according to the market they're targeted at: the minimum to meet the recommended specs for the current lineup of AAA games is 8GB, so that's what the 4060 gets. 12GB lets you max out texture resolution in almost any game, and 16GB future-proofs you for a few years. Why the 4090 got 24GB is actually a bit confusing to me: no current titles require that much, and it seems reasonable to predict that by the time future titles do, the 4090 won't be a top performer. It strikes me as a sort of A6000-lite, and it probably did steal a fair bit of market share from that product. A miscalculated attempt to compete with what the 7900 XTX should have been?
 
To the extent that Nvidia supports home AI, they still want you using Quadros for it.
They discontinued the Quadro line in 2020; it's all RTX cards now. If you want more VRAM, you go into server GPU territory like the A100, but that thing is extremely expensive. Many thousands of dollars.
 
They discontinued the Quadro line in 2020; it's all RTX cards now. If you want more VRAM, you go into server GPU territory like the A100, but that thing is extremely expensive. Many thousands of dollars.
You're right, I meant the RTX 4500, 5000, and 6000. The workstation cards. Things were so much more convenient when you could just say Quadro instead of having to list a bunch of cards that sound like they're in the gaming segment.
 
Once again I am justified in getting a 12th gen Intel Core (the highest my board can technically go is 13th gen, but it might have issues).

At least I'll be getting good single-core performance for games that aren't multi-threaded (as in, most of them).

I did notice, however, that even the article had concerns that the rig they tested the 14th gen chip in wasn't fully optimized. To be fair, most builds aren't. Take that as you will

This article's about the 14th gen laptop chip, Meteor Lake. The "14th gen" desktop is just 12th gen++.

But CUDA is now established and they could make a lot of money from the home AI community if they forced them to move to Quadros

No, they couldn't. The desktop GPU market is maybe 1/3 the size of the datacenter GPU market, and hobbyists are a rounding error on that. NVIDIA engages desktop AI enthusiasts for two reasons. One is PR & goodwill, and the other is to bootstrap their own ecosystem, same reason Microsoft included GW-BASIC with MS-DOS. Today's nerdy kid playing around with PyTorch is tomorrow's software engineer.

Why the 4090 got 24GB is actually a bit confusing to me: no current titles require that much, and it seems reasonable to predict that by the time future titles do, the 4090 won't be a top performer.

The lifetime of a high-end GPU is currently about 7 years, and most games still only require around 2GB of VRAM to run. With Moore's Law about to die, I wouldn't be surprised if a 4090 still meets min specs for games 10 years from now.
 
What are my options if I want to sell my video card somewhere other than eBay? I don't have an account there, so I think people wouldn't trust my listing. I've also heard that it's not uncommon for buyers to dispute purchases to scam the seller.
 
What are my options if I want to sell my video card somewhere other than eBay? I don't have an account there, so I think people wouldn't trust my listing. I've also heard that it's not uncommon for buyers to dispute purchases to scam the seller.
Facebook Marketplace? That way you can just meet near a local police station or somewhere public and deal in cash.
 
No, they couldn't. The desktop GPU market is maybe 1/3 the size of the datacenter GPU market, and hobbyists are a rounding error on that.
Well, half, more specifically. Nvidia does segment its revenue into Data Center and Gaming, and last quarter Gaming was half the revenue of Data Center. I would expect profits to be weighted even more heavily toward Data Center, given that higher-end cards have far higher margins.
 
I'd say 12GB is fine. I have mostly 8GB cards and they work for most stuff; the problem occurs when I try to do AI and run Chrome at the same time. 12GB would give enough headroom to run other apps simultaneously. Obviously more memory, more better.

I do find Nvidia's current lineup a bit odd.
8GB for the 4060, 8 or 16GB for the 4060 Ti, 12GB for the 4070s, back to 16GB for the 4080, and then 24GB for the 4090. Obviously the 4060 Ti has slower memory, but whatever.

I'm guessing the product line was locked in before home AI exploded.
I noticed that when mulling over the 4070 for my PC build: the 12 gigs felt odd considering there's a 16 gig option for the 4060 Ti. Personally I'm debating a 4070, if only for AI processing ability; even if it's minimal, I still want to make art and stuff.
This article's about the 14th gen laptop chip, Meteor Lake. The "14th gen" desktop is just 12th gen++.
To be fair, I can't blame them. I haven't heard of many problems with 12th gen. It's smart to build off it.
 
I noticed that when mulling over the 4070 for my PC build: the 12 gigs felt odd considering there's a 16 gig option for the 4060 Ti. Personally I'm debating a 4070, if only for AI processing ability; even if it's minimal, I still want to make art and stuff.

To be fair, I can't blame them. I haven't heard of many problems with 12th gen. It's smart to build off it.
If I were you I'd lean toward the 16GB 4060 Ti rather than the 12GB 4070 if AI is all you're planning to do. It'll be marginally slower, but I've always found speed to be less limiting than capacity. More memory means I can generate bigger frames (or batches), while more compute only speeds up the generation time of individual frames. Most of the time your workflow will be "tinker with the prompt by generating single pictures until confident it works, then generate a tonne of pictures while you step away to do whatever else, then return to find the two or three good ones or to refine with img2img". The speed difference just means you have time to read a whole chapter rather than half of one, or something like that: a big deal if you're making money off AI, but unimportant for a hobbyist. Being stuck generating sad 512x512 and then upscaling to get something less useless is pretty depressing when a cheaper GPU can do 1024x1024 with latent upscaling to 2048x2048, if a bit slower. That 4060 Ti is basically the ideal hobby AI card as far as I can tell, unless you have the budget for a 4090.
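If you're curious what that loop looks like outside a web UI, here's a rough sketch with Hugging Face's diffusers library; the SDXL checkpoint, step counts, and batch size are placeholder choices, not recommendations:

```python
# Tinker-then-batch workflow sketch: one fast preview to iterate on the
# prompt, then a long unattended run you pick the keepers from later.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",  # assumed model; any SDXL checkpoint works
    torch_dtype=torch.float16,                   # fp16 keeps 1024x1024 inside a 16GB budget
).to("cuda")

prompt = "a lighthouse at dusk, oil painting, dramatic clouds"

# Tinker phase: quick single image while you refine the prompt.
pipe(prompt, num_inference_steps=20).images[0].save("preview.png")

# Batch phase: walk away, come back, keep the two or three good ones.
for i in range(24):
    pipe(prompt, num_inference_steps=40).images[0].save(f"batch_{i:03d}.png")
```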
 
If I were you I'd lean toward the 16GB 4060 Ti rather than the 12GB 4070 if AI is all you're planning to do. It'll be marginally slower, but I've always found speed to be less limiting than capacity. More memory means I can generate bigger frames (or batches), while more compute only speeds up the generation time of individual frames. Most of the time your workflow will be "tinker with the prompt by generating single pictures until confident it works, then generate a tonne of pictures while you step away to do whatever else, then return to find the two or three good ones or to refine with img2img". The speed difference just means you have time to read a whole chapter rather than half of one, or something like that: a big deal if you're making money off AI, but unimportant for a hobbyist. Being stuck generating sad 512x512 and then upscaling to get something less useless is pretty depressing when a cheaper GPU can do 1024x1024 with latent upscaling to 2048x2048, if a bit slower. That 4060 Ti is basically the ideal hobby AI card as far as I can tell, unless you have the budget for a 4090.
The thing is, the 4070 can do more generally. I'm OK with it being a middle ground on the AI front, as long as it does its other tasks well.
 