GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Use a Lambda instance. It's more cost-efficient and they are just better machines. Trying to keep up with VRAM requirements using local hardware is a crapshoot. Nvidia has made it abundantly clear that high VRAM is a premium feature reserved for their most expensive cards.
If you must have an onsite card for small models, bang for buck I would recommend a 3060 12GB or a 3080 12GB. If you don't care about tensor cores, maybe check out the Tesla P40. For the uninitiated, it's essentially a Titan Xp with no video outputs and 24GB of VRAM. The only problem is that it's slower than a 3060 and requires fan and PSU mods that usually make it take up two extra PCIe slots or extra room in the front of the case. It will still come in at around the price of a 3060, though.
Given the plummeting costs of last-gen GPUs, I would prefer to run locally rather than support Amazon, especially given that a GPU has multiple uses. Otherwise, 12GB is becoming the low end when it comes to models, hence my initial question on the condition of used 3090s.
 
Given the plummeting costs of last-gen GPUs, I would prefer to run locally rather than support Amazon, especially given that a GPU has multiple uses. Otherwise, 12GB is becoming the low end when it comes to models, hence my initial question on the condition of used 3090s.
I was talking about Lambda Labs, not AWS. AWS is prohibitively expensive. Buying GPUs on eBay isn't that bad. A 3090 is going to run you in the $800-900 range. Compare that to 900 or so hours of ML time on a rented card with 40GB of VRAM and you should have your answer.
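For anyone wanting to sanity-check that trade-off with their own numbers, here is a rough break-even sketch in Python; the card price, the hourly rental rate, and the electricity cost below are illustrative assumptions, not quotes from Lambda or eBay:

```python
# Rough break-even: buying a used 3090 vs. renting a 40GB cloud GPU.
# All three figures below are assumptions for illustration only.
USED_3090_PRICE = 850.00      # USD, middle of the used-market range mentioned above
CLOUD_RATE_PER_HOUR = 1.10    # USD/hr, assumed on-demand rate for a 40GB card
POWER_COST_PER_HOUR = 0.05    # USD/hr, ~350 W at ~$0.14/kWh, rough estimate

break_even_hours = USED_3090_PRICE / (CLOUD_RATE_PER_HOUR - POWER_COST_PER_HOUR)
print(f"The local card pays for itself after ~{break_even_hours:.0f} GPU-hours")
# This ignores resale value, and the rented card is faster with more VRAM,
# so a GPU-hour isn't worth the same on both sides.
```

Run it with your own hours-per-month estimate; the answer flips depending on how much you actually train.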
 
So, does anyone have any experience with used 3090s?
I bought one used some time ago to benchmark my workloads and see if it was worth keeping. The card ran fine but extremely hot and power-hungry, which is what I expected. I saw no memory issues during a week or so of testing, but I did return it because I did not like the memory temps staying around 95-100°C during sustained workloads, even with the fan speed cranked up.
My opinion is: if the price is good and you have the option of returning it, take the gamble. Most likely it is fine.
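If you end up running the same kind of burn-in test on a used card, here is a minimal logging sketch using the nvidia-ml-py (pynvml) bindings, assuming the card sits at GPU index 0. Note that NVML generally does not expose GDDR6X memory-junction temperature on consumer cards, so for that specific reading you are still stuck with something like HWiNFO or GPU-Z on Windows:

```python
# Minimal burn-in logger: core temperature, power draw, and utilization every 5 s.
# Requires nvidia-ml-py (pip install nvidia-ml-py); run the stress workload
# in another terminal while this prints.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the 3090 is GPU 0

try:
    while True:
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # reported in mW
        util_pct = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"core {temp_c} C | {power_w:.0f} W | {util_pct}% util")
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```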
 
I know everyone is joking, but Bulldozer was just a piece of shit. Bulldozer-based Opterons were garbage compared to Xeons of the same era.

Among other things, DL projects. I have access to good hardware for research but not for personal use. 24GB is a bit overkill, but I would rather have too much than too little.

If you want 24GB, P40s and K80s are really cheap on eBay now, about $75 to $200 (note the K80 presents as two 12GB GPUs rather than a single 24GB card). The 3090 is about 3x faster, though.
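One caveat if you go the old-Tesla route: check that your framework's prebuilt binaries still ship kernels for the card's compute capability, since Kepler parts like the K80 appear to have been dropped from recent CUDA releases and PyTorch wheels. A quick sketch, assuming a PyTorch install:

```python
# Check whether this PyTorch build still supports the installed GPU.
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{name}: compute capability sm_{major}{minor}")
    # Architectures this particular binary was compiled for:
    print("Supported by this build:", torch.cuda.get_arch_list())
else:
    print("No usable CUDA device detected by this PyTorch build")
```

If the card's sm_XY doesn't show up in that list, you're looking at building from source or pinning an older release.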
 
I would like to milk my EVGA 3090 FTW3, any ideas other than the usual retard-level shit? I know about pads and paste; I'm talking about how to keep this fucking pile of shit thermally managed via software. Precision X1 is alright, but it leaves more to be desired...
Play with reducing the power limit in X1; I set mine at 70%. You might be surprised at how little you notice the difference outside of the reduced thermal load, since that thing is still a fucking beast of a card.
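If you would rather script it than click through X1 (or you are on Linux where X1 is not an option), here is a sketch of the same ~70% cap through NVML via the nvidia-ml-py bindings, assuming GPU index 0; setting the limit needs admin/root, and `nvidia-smi -pl <watts>` does the equivalent from a terminal:

```python
# Cap the board power limit at ~70% of the default via NVML.
# Needs admin/root; the allowed range depends on the card's VBIOS.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the 3090 is GPU 0

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = max(min_mw, int(default_mw * 0.70))  # clamp to the VBIOS floor

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"Power limit now {target_mw / 1000:.0f} W (default {default_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```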
So, does anyone have any experience with used 3090s? I've been considering picking one up given that, on the used market, they are probably the cheapest way to get 24GB of VRAM and CUDA. While I generally don't have an issue with used cards, I know that Nvidia fucked up the memory module cooling, and given that this was the era of crypto mining, which stressed the already hot GDDR6X the most through memory overclocking, I would be concerned about longevity. Did any AIB designs fix it? Or am I stuck looking for a 3090 Ti, which fixed the problem?
This is a bad answer, but if you can find one with a waterblock and an active backplate (ABP) installed, that's usually a good sign it wasn't used for mining. That setup cost nearly $400 extra during the mining craze, and I don't think most miners were about that life given the margins they were working with, even if it was absolutely necessary given how miners tune their cards (power limit as low as is feasible, core clock as low as it goes, big fucking memory overclock). I'm not sure how much that config will actually cost over the standard card, given that all those blocks went on sale at 50% off or so right around the time people started dumping their 3090s.

The only problem is that unless they include the original cooler (a question worth asking before the listing ends; some will throw it in for free just to be rid of it), you're stuck getting into open-loop cooling to keep it chilly. That said, the ABPs do a REALLY good job of keeping that memory cool.

The potentially better answer: if you can find an LHR version, that's also going to be a good sign it wasn't used for mining.

As far as I know, the closest we got to AIBs fixing the problem was designs that made sure the backplate wasn't cooking the modules by sealing in hot air, had good pad contact between the memory and the backplate, and had enough blow-through airflow onto the plate to actually dissipate the heat it collects from the padded-up memory modules. Frankly, I don't know if I trust that to be sufficient under assault from a miner's memory misadventures, but that seems to have been EVGA's and Dell's solution, off the top of my head.
 
How is the RTX 3060? Found a cheap one with 12GB of GDDR6 on Newegg, and I'm thinking of a future build soon. It doesn't need to be a supercomputer, I'm not looking to run Crysis; is it good, or would a 2070 be a better use of funds?
 
How is the RTX 3060? Found a cheap one with 12GB of GDDR6 on Newegg, and I'm thinking of a future build soon. It doesn't need to be a supercomputer, I'm not looking to run Crysis; is it good, or would a 2070 be a better use of funds?
I've got one since I wanted it for AI image shit initially. It's technically overkill for the games I'm playing at the moment; however, since I do play some recent titles now and then that utilize this hardware, it was probably worth it. The gay tracing thing is just a bonus.
 
I've got one since I wanted it for AI image shit initially. It's technically overkill for the games I'm playing at the moment; however, since I do play some recent titles now and then that utilize this hardware, it was probably worth it. The gay tracing thing is just a bonus.
Alright. As long as it can keep pace with recent shit, even if on the low end, I can fuck with that. My requirements aren't stressful by any stretch, and I'll be putting in 32GB of RAM if it needs the extra headroom (all but one build I've done has had 32 standard; I've found it a good middle ground). The most I might ever do is pick Blender back up and play a recent game.
 
If you aren't doing modern 4K gaming, a 3060 should work pretty decently, especially if you're willing to optimize your settings. I don't know if you'll see much benefit from the extra VRAM, though, so price being equal (and it might not be), if you can find a 3060 Ti, that might serve you better.

One less obvious thing about the comparison between the 3060 and the 2070 is that the 3060 supports Super Resolution right now; that shit's pretty neat and you'll probably find yourself using it a lot more than you think.
 
A 3060 will still game in 4K just fine, thanks to DLSS. In my experience, a higher-resolution monitor with AI upscaling will look better than a lower-resolution monitor displaying natively, every time.
 
Yeah, when you conveniently ignore the half of the post where I justify that statement.
Nothing was ignored, I just don't agree. It doesn't address the cases where DLSS isn't an option or where more VRAM is needed. The 3060 is not a 4K card for recent titles.

Supposed PC enthusiasts are all about celebrating weak hardware releases and upscaling, and now blaming devs for using too many resources or not including their preferred upscaler.

All while costs are increasing.

Maybe you didn't specifically mean that, but it's a growing sentiment.
 
Nothing was ignored, I just don't agree. It doesn't address the cases where DLSS isn't an option or where more VRAM is needed. The 3060 is not a 4K card for recent titles.

Supposed PC enthusiasts are all about celebrating weak hardware releases and upscaling, and now blaming devs for using too many resources or not including their preferred upscaler.

All while costs are increasing.

Maybe you didn't specifically mean that, but it's a growing sentiment.
The 3060 12GB is not an enthusiast's card. It was designed to compete with AMD's midrange at the time, and it has a strange memory configuration as a marketing gimmick to do so. Most people buying a 3060 are probably looking at either 1080p or 1440p monitors running in the 144Hz range or locked at 60FPS.
 
The 3060 12GB is not an enthusiast's card. It was designed to compete with AMD's midrange at the time, and it has a strange memory configuration as a marketing gimmick to do so. Most people buying a 3060 are probably looking at either 1080p or 1440p monitors running in the 144Hz range or locked at 60FPS.
Yes. The post I was replying to was claiming that the 3060 will work fine at 4K for "modern gaming" because of DLSS.
 
Yes. The post I was replying to was claiming that the 3060 will work fine at 4K for "modern gaming" because of DLSS.
Because it will. Yeah, it'll look worse than native 4K, of course it will, but it will still look better than native 1080p, which realistically is the best it can do without the DLSS crutch. It just doesn't have the shader units for 1440p.
 
I just hate the rising costs in general, and that was waaaaay before any of the Corona bullshit. And Nvidia's fucking ching chong in chief just goes "prices are only going up! buy more save more!"
 
I just hate the rising costs in general, and that was waaaaay before any of the Corona bullshit. And Nvidia's fucking ching chong in chief just goes "prices are only going up! buy more save more!"
So long as AI and the server business are booming for Nvidia, they just will not care about the consumer market. Case in point:

This A100 costs around $10,000, and Nvidia sold over 10,000 of them for just a single supercomputer; that is over $100 million from one order. Compare that profit to your average customer buying one $300 card.
 
So long as AI and the server business are booming for Nvidia, they just will not care about the consumer market. Case in point:
This A100 costs around $10,000, and Nvidia sold over 10,000 of them for just a single supercomputer; that is over $100 million from one order. Compare that profit to your average customer buying one $300 card.
Which is fair. I just wish they'd finally tell gamers to fuck off (they kind of already do) instead of rubbing everyone's noses in it. Then those same people go on to make excuses to cover for it.
 