GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Been using NVidia and Intel all my life, but I'm thinking of dipping my toes into AMD but I don't know much about their lineup. For a frame of reference, what are some AMD equivalents of NVidia and Intel parts?
 
Their datacenter cards last 1-2 years under load. The really high end datacenter servers last 20+ years under load.
No one is running a 20 year old server. 10 years is pushing it.
Anything it could do, you could probably do with the same or better performance on a Raspberry Pi.

Been using NVidia and Intel all my life, but I'm thinking of dipping my toes into AMD but I don't know much about their lineup. For a frame of reference, what are some AMD equivalents of NVidia and Intel parts?

AMD kinda copied the numbering scheme from both. Core i3/5/7/9 roughly compare to Ryzen 3/5/7/9, and going up in number generally gets you a better CPU with more cores. On the AMD side, the X3D chips are generally better for gaming.
It's similar in the GPU space, with the numbers matching closely to NVIDIA's; the 9070 XT is compared closely with the 5070 Ti right now. AMD doesn't have any GPUs that compete with the highest-end NVIDIA cards [e.g. 5080/5090].
 
It's similar in the GPU space, with the numbers matching closely to NVIDIA's; the 9070 XT is compared closely with the 5070 Ti right now. AMD doesn't have any GPUs that compete with the highest-end NVIDIA cards [e.g. 5080/5090].
Is it really as simple as subtracting 4000? What about the series naming conventions for Ryzen CPUs? I know that there are some more modern i3's (I think they've reached 13000 now?) that outclass some dated i5's and i7's in terms of GHz so I'm wondering if the same is true for Ryzens.
 
I don't have to imagine this. A quarter pounder does, in fact, cost what a double quarter pounder used to cost in 2017 or so. Welcome to the world of double digit inflation. Angry yet? Furious that McDonald's leaves the single QP on the menu?
the point is that they are charging more for less, man
I really don't. I mean, "a GPU that has enough fill rate to color 500 million pixels a second (i.e. 4K @ 60 fps) with coloring operations typical of the games of its day." If you can choose the right balance of settings for your card to run at a consistent 60 fps at a 4K resolution, I would consider that a "4K capable card." You clearly wouldn't, for whatever reason.
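For anyone who wants to sanity-check that number, it's just pixels per frame times frames per second (a quick sketch, nothing more):

```python
# 4K is 3840x2160; one coloring operation per pixel per frame
pixels_per_frame = 3840 * 2160              # 8,294,400 pixels
pixels_per_second = pixels_per_frame * 60   # at 60 fps
print(f"{pixels_per_second:,} pixels/s")    # 497,664,000 -- roughly 500 million
```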

I just think that if you're willing to accept half the VRAM to save $50, then you need to be willing to accept you can't run games with those super-resolution ultra texture packs.
you're a retard
 
Thinking of Steve here, it feels like computer components besides the main circuit boards and silicon (AIOs, fans, coolers, cases) are all made in the exact same Chinese factories with a different label slapped on them. What's stopping the factories from selling directly to the customer, similar to what's happening with clothing right now?
I keep thinking of the $50 the Chinese factory charged him to make a GN "gamer chair".
The big board brands are glorified financiers, marketers, and software people at this point. They put the money down to produce X board of each SKU, determine the SKU, slightly tweak what each factory is making to be something they can sell, and provide end-user guarantees and support.

This makes it sound like I'm underselling them but they really do serve an important purpose in this hierarchy. They have the cash on hand and expertise to sell shit to customers and fulfill contracts. Most of these factories don't. Hence why when you order these sketchy boards off temu or aliexpress, they come with basically no warranty even if they're often the same basic components you're getting from an ASUS or MSI.

Most of the companies actually making these boards have no interest in doing direct-to-customer shit. Doing that would require them to massively expand their headcount until their prices are indistinguishable from the other big brands.

Been using NVidia and Intel all my life, but I'm thinking of dipping my toes into AMD but I don't know much about their lineup. For a frame of reference, what are some AMD equivalents of NVidia and Intel parts?
Same general numbering scheme on CPUs - Ryzen 3, 5, 7, and 9. A cheatsheet for what you should be looking at:

Ryzen 7 - 7700X and 9700X - these are their current basic bitch 8-core CPUs without any bells and whistles. They can clock >5 GHz and provide more performance than most people need despite being lower on the stack.

Ryzen 7 (X3D) - 7800X3D and 9800X3D - these are 8-core CPUs that are basically equivalent to the above, except they have additional L3 cache which can improve performance in some applications (mostly games). They come with the downside of being somewhat lower-clocked than the non-X3D equivalents.

Ryzen 9 - 7950X and 9950X - these are their flagship 16-core desktop CPUs. They typically clock >5 GHz and are ideal if you're not simply building a gaming PC and need a lot of cores for other kinds of things (particularly productivity).

Ryzen 9 (X3D) - 7950X3D and 9950X3D - these are X3D variants of the above. Only half the cores have the additional L3 cache, and as with the R7 X3D they don't clock as high. These are intended for people who want high-end gaming performance but also want to use their PC for applications that need a lot of cores. The caveat is that they require special software on Windows to load things onto the correct CPU cores, and this can be kinda fiddly (see the sketch at the end of this post for the general idea).

There are other CPUs, but they all kinda have issues. The 7600X/9600X are six-core budget CPUs, and the 7900X/9900X (plus their X3D variants) have two sets of six cores, which can make them inelegant for gaming workloads (but they do frequently go on sale, so maybe keep an eye out).

For GPUs it's much simpler: there are two models out right now. The 9070 is a $550 midrange GPU and the 9070 XT is a $650 upper-midrange GPU. You're unlikely to find either at these prices, and the GPU market sucks ass right now. There are 7000-series GPUs still available on the market, but the 9070/XT are such improvements that I wouldn't bother chasing one down unless you get a really good deal.
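On the fiddly core-loading point above: here's a rough sketch of the underlying idea using the psutil package. Everything here is an assumption for illustration (the process name is a placeholder, and I'm assuming the V-Cache CCD is the first eight cores, i.e. logical CPUs 0-15 with SMT); AMD's actual driver/Game Bar solution is more involved:

```python
import psutil

GAME_EXE = "game.exe"          # hypothetical process name, placeholder only
VCACHE_CPUS = list(range(16))  # ASSUMED: cores 0-7 (+SMT) sit under the V-Cache

# Restrict a running game to the cache-heavy CCD so its hot data
# stays in the big L3 instead of bouncing between chiplets.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(VCACHE_CPUS)
        print(f"Pinned PID {proc.pid} to the V-Cache CCD")
```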
 
Is it really as simple as subtracting 4000? What about the series naming conventions for Ryzen CPUs? I know that there are some more modern i3's (I think they've reached 13000 now?) that outclass some dated i5's and i7's in terms of GHz so I'm wondering if the same is true for Ryzens.
AMD changed their naming scheme to copy Nvidia's and line everything up "properly". 9070 XT competes with 5070 Ti, 9070 non-XT with the 5070, and it's assumed that the upcoming 9060 XT 16 GB (launching in May) will be a bit slower but cheaper than the 5060 Ti 16 GB. Unfortunately, most cards are above MSRP right now in the US.

As far as CPUs go for gaming, you only really need 6-8 fast cores which almost every CPU offers. Going up to Ryzen 9 doesn't get you much.

[Attached chart: average gaming framerates across Ryzen CPUs]
The 6-core 9600X is only 5% behind the 8-core 9700X and 16-core 9950X (which are tied) on this chart, probably because of clocks. You won't notice that difference, particularly if you are gaming at 120 FPS.

X3D models have additional L3 cache that allows them to top the chart, but you don't need to spend $480 on your CPU. The 1% lows are higher, which may translate into a smoother, more consistent framerate.

Intel CPUs have historically gained more performance as you go from i3 to i9, because the higher models have more L3 cache in addition to higher clocks, whereas most AMD models have the same L3 per core. Intel's Arrow Lake has fallen behind previous-generation Raptor Lake due to a move from monolithic dies to chiplets, and lowered voltages.
 
Is it really as simple as subtracting 4000? What about the series naming conventions for Ryzen CPUs? I know that there are some more modern i3's (I think they've reached 13000 now?) that outclass some dated i5's and i7's in terms of GHz so I'm wondering if the same is true for Ryzens.

Nope lol. The last AMD GPU generation was the 7000 series; they skipped 8000. Ryzen CPUs are also on the 9000 generation. For Intel, I forgot that they changed the naming slightly in the last couple of generations: it's no longer 12th-gen Core i5 etc., it's Core 5, Series 2.
 
The only issue I was routinely encountering was the inability of the GPU to wake the monitor after extended sleep. I dunno if this has fixed it as I haven't put my computer into an extended sleep yet after the install. Guess we'll see tomorrow if I fall asleep with my PC still on.
If you have both HDMI and DisplayPort outputs, try switching.
 
No one is running a 20 year old server.

There are more than 100,000 AS/400s still in use. IBM hasn't made a new one in 22 years.
sauce: https://informdecisions.com/as400-remains-a-power-to-reckon-with/

Is it because literally nobody was buying the Windows laptops that had an AI which spies on you and couldn't be disabled?

What @The Mass Shooter Ron Soye said, and nobody's buying Gaudi, either, and they've lost half their high-margin Xeon business to EPYC. Everyone's focused on Ryzen grabbing a decent bit of desktop share, but EPYC has absolutely gutted Xeon.

That requires less than 32MB of bandwidth with clever coding and compressed audio for ALL languages!

If lazy devs would just optimize their fucking software, I would not have had to replace my Geforce 4 MX.
 
X3D models have additional L3 cache that allows them to top the chart, but you don't need to spend $480 on your CPU. The 1% lows are higher, which may translate into a smoother, more consistent framerate.
Adding to this - there's a >$200 difference between the 9700X and 9800X3D price-wise right now. That's enough money to go up an entire tier of GPU, which gets you much better value than the jump to X3D.

X3D is for sick-ass gamer cred when money's no object. I'd argue that $200 is better spent on a better GPU or a better monitor (way too many people spend thousands on a gaming PC and then play games on shitty 1080p60 office monitors).
 
Is it really as simple as subtracting 4000? What about the series naming conventions for Ryzen CPUs? I know that there are some more modern i3's (I think they've reached 13000 now?) that outclass some dated i5's and i7's in terms of GHz so I'm wondering if the same is true for Ryzens.

They have a convention right up until marketing decides continuing with that convention makes them sound stuffy and boring, so they make up a new one to sound cool and game-changing. They and Intel both do this, both in server and desktop CPUs.

X3D is for sick-ass gamer cred when money's no object.

Really smokes engineering workloads compared to the standard model while not being nearly as expensive as a Threadripper.
 
Really smokes engineering workloads compared to the standard model while not being nearly as expensive as a Threadripper.
I just assume engineers and software developers are a class of people for whom money is generally no object when it comes to buying computer shit.

That might be my bias being a burger tho
 
Probably the best suggestion is to make sure the CPU and GPU are appropriately matched to each other here.
A 9600X and a 4090 probably isn't the best pairing, and a 9800X3D is probably wasted on an RX 7700 XT.
 
Big thanks to the people in this thread that mentioned Apollo/Artemis/Sunshine/Moonlight. I was nerding out with that all weekend and it's cool as hell - I'd never imagined you could stream over a network with so little latency.
Yea, it's magic shit. Also makes owning a Steam Deck or any other handheld pointless in my opinion. Just buy a controller for your phone and utilize a device you already carry around 24/7.
 
Really smokes engineering workloads compared to the standard model while not being nearly as expensive as a Threadripper.
Now this is news to me. I wasn't aware X3D chips were good at that workload. I mean, it made loading bloated saves of RPG/RTS/TBT games actually possible, but hey, good to know it's great for that purpose.
 
I just obtained an XFX GTS Radeon RX 580 8GB Triple X Edition.

It is a beast under GNU/Linux, especially under CachyOS, since it's a performance-focused distro.

Setup:
- Motherboard: H97 GAMING 3
- CPU: 4790K
- GPU: XFX GTS Radeon RX 580 8GB Triple X Edition
- RAM: 16 GB DDR3
- Screen: 1920x1080

I just can't understand why people keep buying overpriced NVIDIA GPUs.

Game performance:
- Hogwarts Legacy at Ultra — 70 fps, with FSR 3 and frame generation — 120 fps
- Just Cause 4 at Very High — 80 fps
- Warframe at Ludicrous — 65–140 fps

Advice: just buy anything, literally anything, secondhand and install a Linux distro, and you won't end up spending nonsense money on hardware for your entire life.

Just look at the release of the RTX 5060 family. These GPUs don't make any sense and will be obsolete after a few years.

Meanwhile, AMD Polaris even supports software-emulated ray tracing, thanks to the power of the open-source community and its Mesa drivers (RADV).
 
Now this is news to me. I wasn't aware X3D chips were good at that workload. I mean, it made loading bloated saves of RPG/RTS/TBT games actually possible, but hey, good to know it's great for that purpose.

Engineering computations tend to involve a lot of sparse matrix operations that constantly invalidate cache lines. A bigger L3 cache helps.
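To make that concrete, here's a toy CSR sparse matrix-vector multiply (my own sketch, not from any particular solver). The x[indices[k]] read jumps around memory according to the sparsity pattern, which is exactly the kind of access a big L3 soaks up:

```python
import numpy as np

def spmv_csr(data, indices, indptr, x):
    """y = A @ x for a matrix stored in CSR form."""
    y = np.zeros(len(indptr) - 1)
    for row in range(len(y)):
        for k in range(indptr[row], indptr[row + 1]):
            # x[indices[k]] is an effectively random read into x; small
            # caches thrash here, while a big L3 keeps more of x resident.
            y[row] += data[k] * x[indices[k]]
    return y

# 3x3 example: [[1,0,2],[0,3,0],[4,0,5]] @ [1,1,1] -> [3, 3, 9]
data    = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
indices = np.array([0, 2, 1, 0, 2])
indptr  = np.array([0, 2, 3, 5])
print(spmv_csr(data, indices, indptr, np.ones(3)))
```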
 
Now this is news to me. I wasn't aware X3D chips were good at that workload. I mean, it made loading bloated saves of RPG/RTS/TBT games actually possible, but hey, good to know it's great for that purpose.
It might also amaze you to learn that the 9800X3D can fit three whole 4K frame buffers in its 96 MB of L3 cache!
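The arithmetic behind that, assuming 4 bytes per pixel (RGBA8) and treating the 96 MB as MiB:

```python
bytes_per_fb = 3840 * 2160 * 4      # 33,177,600 bytes per 4K RGBA8 buffer
mib_per_fb = bytes_per_fb / 2**20   # ~31.6 MiB each
print(f"{3 * mib_per_fb:.1f} MiB")  # ~94.9 MiB -- three buffers just squeeze into 96
```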
 