GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Part of me wonders if this is to cover power spikes, or if they are this demanding only under load.
Reference models seem to only run 2x8-pin PCIe cables, so theoretically wouldn't the draw max out at around 325W? That's the same amount of power my old 390X needed. I'd guess that AMD is setting the recommended PSUs that high because if you're dropping enthusiast cards into a computer, you're probably also putting in an enthusiast CPU and playing with overclocking.

3x8-pin custom models will probably need another 200W of headroom; wouldn't be surprised to see 850W or 950W recommended.

ETA: saw the Hellhound model only has 2x8-pin. Surprised as hell. Maybe AMD minimized the excess power draw (I/O, memory interface) to leave more for the real glutton (logic).
 
Reference models seem to only run 2x8-pin PCIe cables, so theoretically wouldn't the draw max out at around 325W?
PCIe slot gets you 75 Watts, 2x8 pin gets you 300 Watts, so 375 Watts total.
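If you want to sanity-check the connector math yourself, here's a rough sketch using the standard spec limits (75W from the slot, 75W per 6-pin, 150W per 8-pin). Real cards can spike past these transiently, so treat it as a ceiling on sustained draw, not spikes:

```python
# Spec-limit sustained power for a card, based on its connector loadout.
PCIE_SLOT_W = 75   # power available through the PCIe slot itself
SIX_PIN_W = 75     # per 6-pin PCIe power connector
EIGHT_PIN_W = 150  # per 8-pin PCIe power connector

def max_board_power(eight_pin: int = 0, six_pin: int = 0) -> int:
    """Upper bound on sustained board power for the given connectors."""
    return PCIE_SLOT_W + eight_pin * EIGHT_PIN_W + six_pin * SIX_PIN_W

print(max_board_power(eight_pin=2))  # reference 2x8-pin card -> 375
print(max_board_power(eight_pin=3))  # 3x8-pin custom card    -> 525
```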
 
Currently in the States, and the amerimutt consoomer holiday of Black Friday was today. As I was leaving a barber shop I noticed a small PC shop with a big sign advertising "great sales". Most of the sales were on prebuilt, still-overpriced garbage, but they did have a Noctua NH-D15 for $59 and some WD Green 1TB NVMe drives for $34, so I grabbed them to throw in my streaming/storage rig. Was a pretty good deal, thanks America.

Edit: oh yeah, they were selling Arc cards for like $50-85 cheaper than Newegg, but lmao. Garbage GPUs.
 
Supposedly they work wonders if you use them in a 2-PC streamer configuration
My little home setup is my main PC / streaming rig / ghetto NAS, and I do want to replace my weakest GPU (2080 Ti), but a lot of games I play would be fucked because of the DirectX translation layer, and the machine I would be replacing runs Linux, which as far as I know is a no-go with the Intel cards. I haven't really been up to date with them, so maybe they've gotten some driver updates, but last I read up on them they were still pretty garbage, unfortunately. 3060 Tis are pretty cheap right now, so that's probably what I'm going to grab.

However, I can say the Intel Iris Xe architecture surprised me; it performs surprisingly well for the cheap little $200 laptop I got last year. It can handle Wii U and Switch emulation and most games I throw at it at mid-high settings, no problem.
 
Bathtub Graphs.
It's kind of an engineering logistics thing. Depending on where a product sits on that curve, it gives you hints about the spare parts/warranty situation you have on your hands, and you do the math accordingly. As is often the case with graphs, without context they're essentially meaningless.
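For anyone who hasn't run into it, here's a minimal sketch of what the bathtub curve actually is, assuming a competing-risks model with made-up Weibull parameters (an infant-mortality mode, a constant random-failure floor, and a wear-out mode). The combined failure rate drops, flattens, then climbs again:

```python
# Toy bathtub hazard: sum of three Weibull failure rates (hours are arbitrary).
def weibull_hazard(t: float, beta: float, eta: float) -> float:
    """Instantaneous failure rate of a Weibull(beta, eta) distribution at time t."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def bathtub_hazard(t: float) -> float:
    infant  = weibull_hazard(t, beta=0.5, eta=2_000)   # early failures, drops fast
    random  = weibull_hazard(t, beta=1.0, eta=50_000)  # flat floor during useful life
    wearout = weibull_hazard(t, beta=5.0, eta=60_000)  # ramps up late in life
    return infant + random + wearout

for hours in (100, 1_000, 10_000, 40_000, 60_000):
    print(f"{hours:>6} h: {bathtub_hazard(hours):.6f} failures/hour")
```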

I guess some marketing bozo heard of it, misused it, and then they all started copying off each other. My favorite bit of marketing wankery is still "built to military specifications". Now, I don't know about your military, but with my military that'd mean done by the cheapest bidder in the cheapest humanly possible way, so that it only teeters on the border of being dangerous to use. Maybe truth in advertising? More likely they just marketed it to teenage boys who think it sounds cool. Do teenagers even still do this? Put their own computers together? Play Quake and Unreal Tournament? I have no idea. I have the feeling this kind of disposable income in families probably doesn't exist anymore, and like many other things that used to be done by kids, it's now the realm of balding, joyless middle-aged nerds. (who sometimes wear dresses)
 
Part of me wonders if this is to cover power spikes, or if they are this demanding only under load.
Power spikes, plus CPUs that draw 200-300W under full load like a 13900K or w/e, plus an RGB setup with 5 case fans and a 360mm AIO or something.
If your CPU caps out at more sane figures like 200W, you should be fine with a good 650W Gold/Platinum PSU, especially if you go into the BIOS to cap power usage, or do it in MSI Afterburner for the GPU. Keep in mind that basically no game will fully utilize something like a 13700K, and for the few threads that do get utilized, such CPUs (or something like a 7900X from AMD) should stay well below 200W in games. I see the PSU requirements as safety measures against idiots who would try to run a 4090 on a 600W Bronze PSU from 10 years ago.
TL;DR: you should be fine with a good 700W PSU for most of these cards, depending on setup, and especially if you're willing to cap power usage for CPU/GPU until you buy a new PSU.
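A back-of-the-envelope way to put numbers on that, assuming you know rough peak draws for your parts; the 1.3x margin and the example wattages below are assumptions for illustration, not vendor guidance:

```python
# Rough PSU sizing: peak system draw times a safety margin, rounded up to 50W.
def recommended_psu_watts(gpu_w: float, cpu_w: float, rest_w: float = 75,
                          margin: float = 1.3) -> int:
    """Suggested PSU rating for the given peak component draws (all in watts)."""
    peak = gpu_w + cpu_w + rest_w               # GPU + CPU + fans/drives/board
    return int(50 * -(-(peak * margin) // 50))  # ceiling to the nearest 50W

# Example: ~300W GPU plus a CPU power-capped to 150W in games
print(recommended_psu_watts(gpu_w=300, cpu_w=150))  # -> 700
```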
 
Upgrading my 1080 Ti to a 3080 Ti. Got a good deal and picked up the block and backplate. Also moving into a bigger case so I can fit my 3rd 360mm radiator. Going from the Fractal Define S2 to a Lian Li O11 Dynamic XL. Considering getting a new pump as well; I've had this one for 5 years.

Flashed the 3080 Ti with a higher-wattage BIOS. Have an EVGA 850W Platinum, so I'm comfortable with the higher power draw.

Upgraded build will be:
MSI Z690 PRO DDR4
12600K @ 5.2GHz
32GB (2x16) 3600MHz, OC'd to 4000MHz
850W EVGA Platinum
3x 360mm rads (2x 60mm and one slim 30mm)
D5 pump
A few M.2 and SATA 1TB SSDs
Lian Li O11 Dynamic XL
 
Upgrading my 1080 Ti to a 3080 Ti. Got a good deal and picked up the block and backplate.
That’s pretty similar to my build except I went with a 3070 and no overclocking. It plays everything I want at 3440x1440@100 at high quality without breaking a sweat. I don’t see myself upgrading again for many years.
 
So after half a decade my rig is starting to show its age, and so am I; I feel like an old man looking at all this new stuff.
Are AMD cards worth getting? I know NVIDIA holds a large market share and everyone seems to have them, but goddamn are the prices high; at least AMD has some nice sales going on.
I'm seeing DDR5 RAM starting to pop up; is it worth getting over DDR4 for a gaming rig?
 
Are AMD cards worth getting?
I got a 6600 recently and it's doing well. Runs anything I throw at it at 1080p high. No major issues so far. Unless you have a specific reason to get Nvidia, AMD seems to work well. I can't say for sure about long-term ownership.

I'm seeing DDR5 RAM starting to pop up; is it worth getting over DDR4 for a gaming rig?
Unless you're getting AM5, which needs it, then no. I've seen arguments that DDR5 is more "future-proof", but unless you plan on replacing your motherboard I fail to see the point. I have DDR4, and after upping the speed from the default (2000-something) to the rated 3200MHz, I didn't notice any performance difference.
 
Are AMD cards worth getting? I know NVIDIA holds a large market share and everyone seems to have them, but goddamn are the prices high; at least AMD has some nice sales going on.
Yes. Nvidia has market share and market penetration with proprietary technologies such as DLSS. However, AMD is playing catch-up, and it may just be a matter of time before it turns into another G-Sync vs. FreeSync situation; FreeSync ended up "winning" since developers didn't have to pay to license the technology.

I'm seeing DDR5 RAM starting to pop up; is it worth getting over DDR4 for a gaming rig?
If you are going to be upgrading again sometime in the next five years, I'd consider DDR5. DDR4 is considerably cheaper, but you're limiting yourself to boards that accept DDR4.

Most reviewers like AMD's 5800X3D for its large cache, paired with a 3080. I believe a 12600 and a 6700 XT are good enough and far cheaper.
 
Thanks for the info, guys. I think I'll build a new rig. This year's Black Friday and Cyber Monday have turned out pretty disappointing, with not a lot of deals, so I'll take my time, familiarize myself with all the new tech, and pick up parts here and there when they're in stock. I'm planning on making something that'll last another 5+ years, like my current build.
Hell, I remember when you could buy all your parts and save 30%+ during those deals; this year I'd be lucky to break 10%.
 
This seems like a big deal:


It's 3D GDDR6 with doubled capacity, and I assume better performance than GDDR6X and lower cost than HBM. So we could see smaller cards with half the memory chips for the same capacity, or consumer cards with up to 32-48 GB in a generation or two.
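Rough math on why doubled-density chips matter, assuming a 256-bit card with 32-bit GDDR6 channels (so 8 chips in normal mode, 16 in clamshell); the densities below are per-chip in gigabits and the configurations are hypothetical:

```python
# Card VRAM capacity from bus width and per-chip density (GDDR6 chips are 32-bit).
def card_capacity_gb(bus_width_bits: int, gbit_per_chip: int,
                     clamshell: bool = False) -> float:
    chips = bus_width_bits // 32 * (2 if clamshell else 1)
    return chips * gbit_per_chip / 8  # gigabits -> gigabytes

print(card_capacity_gb(256, 16))                  # current 16 Gbit chips -> 16.0 GB
print(card_capacity_gb(256, 32))                  # doubled density       -> 32.0 GB
print(card_capacity_gb(256, 32, clamshell=True))  # clamshell on top      -> 64.0 GB
```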
 
Good case, but note that it has some retarded "ROG Certified" sticker on the side glass (pictured here). It is possible to remove it with some effort, but it left a mark behind on mine.
Damn, I didn't catch that. Built a new PC for a friend with a black O11 XL and didn't even notice it. Hate all logos on parts. I'll see how it looks when I'm done and decide whether to try and remove it. There's not really another option for what I need. The Phanteks Enthoos are OK, but I like the O11 way better.
 
Thoughts/flamewars on the i7-12700K vs. the i9-9900K? What major factors are there besides price?
 
Thoughts/flamewars on the i7-12700K vs. the i9-9900K? What major factors are there besides price?
Performance: the 12th gen should perform better. Efficiency: the 12th gen has dedicated efficiency cores for background tasks and other tasks that don't require a lot of compute power. It also gets you DDR5, which is faster, and PCIe Gen 5 support on some boards.
 
Assembled my Ryzen 9 7900X workhorse build, but didn't have money left over for a GPU. Thought I could run a 4K monitor off the meager integrated graphics for desktop use for a bit, but whenever there are multiple videos on-screen it seems to overwhelm the poor thing and the drivers crash. Looks like I'll need a discrete card sooner rather than later.

I'm toying with the idea of taking the GTX 1080 out of my 2017-ish gaming PC (i7-7700K, 16GB DDR4) to put in the workhorse and buying something else for the gaming rig. The gaming PC is hooked up to a 1080p TV right now, which I might replace with a newer 4K one within a year or so, at which point I'd need something beefy to run the graphics, but also small because of the cramped cube case, which sounds like it might be a no-go.

So my dilemma, I guess, is whether there's any sense in trying to prolong the i7's suffering with something like a shorty RTX 3060 after I harvest the 1080 and be stuck with an aging sub-par rig, or if I should just suck it up, buy something new for the workhorse, and junk the i7 when I'm ready to completely replace it and eat cardboard and drywall for a few months.
 