GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Assembled my Ryzen 9 7900X workhorse build, but didn't have money left over for a GPU. Thought I could run a 4K monitor off it for desktop use with the meager integrated graphics for a bit, but whenever there are multiple videos on-screen, it seems to overwhelm the poor thing and the drivers crash. Looks like I'll need a discrete card sooner rather than later.
That's too bad. From what I've heard, Intel has always had the superior video decode/encode capabilities. It's hard to find the exact capabilities of VCN, but if playing back multiple 2K/4K videos on 2+ 4K monitors doesn't just work, that's lame. Maybe driver updates will help.
 
Assembled my Ryzen 9 7900X workhorse build, but didn't have money left over for a GPU. Thought I could run a 4K monitor off it for desktop use with the meager integrated graphics for a bit, but whenever there are multiple videos on-screen, it seems to overwhelm the poor thing and the drivers crash. Looks like I'll need a discrete card sooner rather than later.
Switch from hardware to software decoding; you'll burn more electricity, but it solves your problem.
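
For what it's worth, a minimal sketch of forcing software decoding, assuming the videos are playing in mpv (the filename is just a placeholder; browsers have their own "use hardware acceleration" toggle instead):

# Minimal sketch: play a file in mpv with hardware decoding disabled, so the
# CPU does the decode instead of the iGPU's video block. Assumes mpv is
# installed and on PATH; "video.mp4" is a placeholder path.
import subprocess

subprocess.run(["mpv", "--hwdec=no", "video.mp4"])

Putting hwdec=no in mpv.conf does the same thing permanently.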
 
  • Like
Reactions: Brain Problems
So after half a decade my rig is starting to show its age, and so am I; I feel like an old man looking at all this new stuff.
Are AMD cards worth getting? I know NVIDIA holds a large market share and everyone seems to have them, but goddamn are the prices high; at least AMD has some nice sales going on.
I'm seeing DDR5 RAM starting to pop up. Is it worth getting over DDR4 for a gaming rig?

Yes, AMD cards are worth getting and are very competitive in the price band they've chosen to stick to. If what you want is to win bragging rights for achieving the highest fps in 4K until next year's video cards come out, get NVIDIA. If you aren't interested in spending over $1000 on a card, get whatever's on sale. In a given price band, they're pretty much all equivalent (even the Intel Arc is...about what you'd expect for the price). Game developers target consoles and midrange PCs, plus we're very close to an absolute power ceiling on what users will tolerate, so you'll be fine for the next 5-7 years.

Regarding DDR5, video game software is nowhere close to saturating the bandwidth of DDR4. Consequently, adding more bandwidth doesn't add much real game performance. Gaming benchmarks show, at best, around a 2%-3% performance improvement from going from the slowest DDR4-2133 to the fastest DDR5-6400. You will be absolutely fine with DDR4.
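
To put rough numbers on that, here's a back-of-the-envelope look at theoretical peak bandwidth (my own quick sketch, assuming a typical dual-channel setup with a 64-bit data path per channel):

# Back-of-the-envelope peak memory bandwidth: transfer rate (MT/s) x 8 bytes
# per transfer x number of channels. Assumes a typical dual-channel desktop.
def peak_bandwidth_gb_s(transfer_rate_mts, channels=2, bytes_per_transfer=8):
    return transfer_rate_mts * 1e6 * bytes_per_transfer * channels / 1e9

print(f"DDR4-2133: {peak_bandwidth_gb_s(2133):.1f} GB/s")   # ~34.1 GB/s
print(f"DDR5-6400: {peak_bandwidth_gb_s(6400):.1f} GB/s")   # ~102.4 GB/s

Roughly 3x the theoretical bandwidth on tap, and games still only pick up a couple of percent. That's the point.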

What is your current machine?
 
Yes, AMD cards are worth getting and are very competitive in the price band they've chosen to stick to. If what you want is to win bragging rights for achieving the highest fps in 4K until next year's video cards come out, get NVIDIA. If you aren't interested in spending over $1000 on a card, get whatever's on sale. In a given price band, they're pretty much all equivalent (even the Intel Arc is...about what you'd expect for the price). Game developers target consoles and midrange PCs, plus we're very close to an absolute power ceiling on what users will tolerate, so you'll be fine for the next 5-7 years.

Regarding DDR5, video game software is nowhere close to saturating the bandwidth of DDR4. Consequently, adding more bandwidth doesn't add much real game performance. Gaming benchmarks show, at best, around a 2%-3% performance improvement from going from the slowest DDR4-2133 to the fastest DDR5-6400. You will be absolutely fine with DDR4.

What is your current machine?
My current machine has:
Intel i5-6600K
16GB RAM
GeForce GTX 1650

Aside from a GPU upgrade I did two years ago, everything is 7 years old. It runs most of the games I play just fine, but it's been getting harder and harder to play games even on the lowest settings. I have a nice 1080p monitor that's just as old, and I have no plans to upgrade it anytime soon.
 
Assembled my Ryzen 9 7900X workhorse build, but didn't have money left over for a GPU. Thought I could run a 4K monitor off it for desktop use with the meager integrated graphics for a bit, but whenever there are multiple videos on-screen, it seems to overwhelm the poor thing and the drivers crash. Looks like I'll need a discrete card sooner rather than later.

I'm toying around with the idea of taking the GTX 1080 out of my 2017ish gaming PC (i7-7700K, 16GB DDR4) to put in the workhorse and buy something else for the gaming rig. It's hooked up to a 1080p TV right now, which I might replace with a newer 4K one within a year or so, at which point I'd need something beefy to run the graphics, but also small because cramped cube case, which sounds like it might be a no-go.

So my dilemma, I guess, is whether there's any sense in trying to prolong the i7's suffering with something like a shorty RTX 3060 after I harvest the 1080 and be stuck with an aging sub-par rig, or if I should just suck it up, buy something new for the workhorse, and junk the i7 when I'm ready to completely replace it and eat cardboard and drywall for a few months.
Radeon 6600s are $200 or so, if you just need something to stuff in the workstation and not worry about.
 
  • Informative
Reactions: spastic_bag
My current machine has:
Intel i5-6600K
16GB RAM
GeForce GTX 1650

Aside from a GPU upgrade I did two years ago, everything is 7 years old. It runs most of the games I play just fine, but it's been getting harder and harder to play games even on the lowest settings. I have a nice 1080p monitor that's just as old, and I have no plans to upgrade it anytime soon.

Yeah, that is fairly dated hardware all around. In fact, that's impressive - your hardware was low-end when it was new, and it's still borderline viable 7 years later.

Here's something to think about. Imagine gaming with low-end, 7-year-old hardware in 2005. It's not even doable. The maximum age you can reasonably game with keeps going up. So you just don't need the biggest, hottest stuff. Any card you can pick up for around $400 and the midrange CPUs out now (i7/Ryzen 7) are going to serve you well for a very long time.

FWIW, my new machine:
i9-12900
32 GB DDR4-3200
Radeon 6700 XT

Thoughts/flamewars on the i7-12700K vs. the i9-9900K? What major factors are there besides price?

The i7-12700K is a significantly more powerful processor, and it's cheaper, too. The biggest factors are a lot more cache, more cores, and a new process.

More cache = better overall data throughput, even at the same clock speeds
More cores = better performance under load; everything is multithreaded these days
New process = able to achieve max clock speed under heavier loads

Can't really overemphasize the high core count. Not only are individual applications all multithreaded today, but your typical use case also has lots of processes running all the time. Most benchmark sites have found that for most real-world use cases, 4 E-cores outperform a single P-core while taking up about the same die space. I wouldn't go with anything older than Alder Lake now.
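
If you want to see the core-count effect for yourself, here's a toy sketch (my own illustration, nothing from a benchmark site) that runs the same CPU-bound work serially and then across every core with Python's multiprocessing:

# Toy sketch: time a CPU-bound task single-process vs. spread across all cores.
# Illustration only - real applications scale less cleanly than this.
import os
import time
from multiprocessing import Pool

def burn(n):
    # Dumb CPU-bound work: sum of squares.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * 16

    start = time.perf_counter()
    for j in jobs:
        burn(j)
    serial = time.perf_counter() - start

    with Pool(os.cpu_count()) as pool:
        start = time.perf_counter()
        pool.map(burn, jobs)
        parallel = time.perf_counter() - start

    print(f"serial: {serial:.2f}s  parallel: {parallel:.2f}s  speedup: {serial / parallel:.1f}x")

Real software doesn't scale this perfectly, but the shape of the result is the same: under load, the extra cores are doing real work.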
 
Is there anywhere to buy a 3080 Ti for less than 4 digits?

I'm reading that the 3090 is a power hog and I'm not even touching the 4000 series with a 35 & 1/2 foot pole.
 
  • Like
Reactions: Strayserval
Some possible leaks about upcoming Ryzen models with 3d cache. I think that's the guy that is often reliable with these leaks.
[attachment: compryzen.JPG]
 
What's the difference between an RTX 3070 8GB V2 and an RTX 3070 GAMING Z TRIO 8GB LHR?
If you're just planning to play games, there's no real practical difference. *Generally* the slightly more expensive cards in the same tier will have marginally better cooling or overclocking headroom. But unless you're worried about benchmark dick-waving it isn't something you're going to notice.

Some possible leaks about upcoming Ryzen models with 3d cache. I think that's the guy that is often reliable with these leaks.
This along with that article from TechPowerUp the other day gives me hope we'll see a 16-core X3D version. I would take it with more than a few grains of salt, but I'll remain optimistic.

 
If it's a bigger leap than the 5800X3D was, then there will be a lot of glorious screeching about it.
Sudden wailing and consumer regret is a good sign of sorts at this point; it means something exciting happened!

With current inflation and such, I wonder what the non-X Ryzen 7000s will end up being priced at a year from now.
 
With current inflation and such, I wonder what the non-X Ryzen 7000s will end up being priced at a year from now.

The non-X 5000 series CPUs might be a good indication of where the non-X 7000s will be in a year (or even less 🤞), since the "hardware shortage" isn't around to keep them sky-high for as long. I distinctly recall trying to get my hands on a 5950X in Jan 2021 and it being impossible to find one for a normal ~$800 before resorting to just making a script to continually watch for me (rough sketch of what I mean at the bottom of this post). That kind of ridiculous pricing, as you probably remember, went on for almost a full year after the 5000 series launch.

[attachment: 59.png]

Now, in a relatively short time after the 7000 series launch, thanks to competition among other factors, the 7950X for example is both readily available and slipping in price instead of hovering at $750+ until June 2023.

[attachment: 79.png]
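
A minimal sketch of that kind of stock watcher, in case anyone's curious (the URL and the out-of-stock marker below are placeholders, not the actual script; some retailers need a proper browser or an API rather than plain requests):

# Hypothetical sketch of a stock watcher: poll a product page and yell when
# the "out of stock" marker disappears. The URL and marker text are
# placeholders - every shop's page is different.
import time
import requests

URL = "https://example-shop.test/ryzen-9-5950x"   # placeholder URL
OUT_OF_STOCK_MARKER = "Out of stock"               # placeholder marker text

while True:
    page = requests.get(URL, timeout=10)
    if OUT_OF_STOCK_MARKER not in page.text:
        print("Might be in stock - go look!")
        break
    time.sleep(300)  # poll every 5 minutes; be polite to the shop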
 
If it's a bigger leap than the 5800X3D was, then there will be a lot of glorious screeching about it.
I just bought a 5800X3D, and it's my first time using an AIO. From what I've read, they are fairly safe these days but I'm still a little bit worried it'll leak at some point in the future. The 5800X3D can be had for $499 in Australia now, so I figured that should last me for a good 3-5 years (if not longer) and I can skip AM5.
 
What consumer workloads does cache affect? I know it does for what we do at work, but the applications we work with are memory bound, and we're finding massive gains going from DDR4 to DDR5, and similarly when AMD introduced the EPYC 7003X series CPUs. By contrast, if games aren't seeing much gain from bandwidth, do they see much gain from larger L3 cache?
 
do they see much gain from larger L3 cache?
From the results I've seen from the 5800X3D, yes. Some games might not see a big gain, but others get massive 20%+ increases literally just from swapping it in for another Zen 3 CPU. It means that to this day, the 5800X3D still beats Zen 4 and Raptor Lake in some games.

I'm very much looking forward to Zen 4 X3D chips.
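
A crude way to feel the cache effect at home is to stream through a big array in order vs. hopping through it in random order; the random walk defeats prefetching and keeps missing cache, which is roughly what branchy sim/game code does. Toy sketch only, not a real game workload:

# Toy locality demo: gather-and-sum the same array with a sequential index
# order vs. a random one. The data is far larger than any L3, so the random
# pass keeps missing cache while the sequential pass streams nicely.
import time
import numpy as np

N = 50_000_000                      # ~400 MB of float64
data = np.random.rand(N)

seq_idx = np.arange(N)
rnd_idx = np.random.permutation(N)

t0 = time.perf_counter()
data[seq_idx].sum()
print(f"sequential gather: {time.perf_counter() - t0:.2f} s")

t0 = time.perf_counter()
data[rnd_idx].sum()
print(f"random gather:     {time.perf_counter() - t0:.2f} s")

The more of the working set that fits in L3, the less that random-access penalty hurts, which is the whole pitch of the X3D parts.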
 
What consumer workloads does cache affect? I know it does for what we do at work, but the applications we work with are memory bound, and we're finding massive gains going from DDR4 to DDR5, and similarly when AMD introduced the EPYC 7003X series CPUs. By contrast, if games aren't seeing much gain from bandwidth, do they see much gain from larger L3 cache?
5800X vs. 5800X3D with the RTX 4090, in 53 games. The 5800X3D is running at a 200-300 MHz lower clock speed:


+18.5% in 1080p
+15.4% in 1440p
+6.8% in 4K, 5.8-7.5% with some different subsets of the games.

+30-50% FPS gains are seen in some games, while other games are +0%.

Autism simulation games like Dwarf Fortress and Factorio benefit bigly. You might see up to a +85% performance gain, which I believe is near the limit of how much benefit 3D cache can give on Zen 3:


Other than games, it's hard to find "consoomer workloads" that benefit. Maybe a slight benefit in Handbrake, 7-Zip, WinRAR:


Phoronix found a bunch of workloads where the extra cache helps:

 