GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

DLSS2, sure. In most games it looks fine, especially with a 4K monitor. It looks noticeably cleaner than just lowering the resolution.

DLSS3 is impressive technology that I fail to see the purpose of. I easily noticed latency issues when attempting to bump, say, ~30 fps to 60+.

DLSS3 seems like it would be more useful pushing an already good framerate to 100+ fps, not fixing a shitty frame rate. So like going from 75 fps to 120 fps.
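Quick napkin math on why the low-fps case feels so much worse. Big assumption on my part (this is how I understand frame generation, not an official number): input is still sampled at the real rendered frame rate, and the interpolation has to hold back roughly one extra rendered frame.

Code:
# Rough frame-time math for DLSS3 frame generation.
# Assumption (mine): input is sampled at the rendered frame rate, and
# interpolation buffers roughly one additional rendered frame.

def frame_time_ms(fps):
    return 1000.0 / fps

for base_fps in (30, 75):
    base_ms = frame_time_ms(base_fps)       # time per really-rendered frame
    shown_ms = frame_time_ms(base_fps * 2)  # what the monitor displays
    extra_ms = base_ms                      # ~1 buffered frame of added delay
    print(f"{base_fps} fps base: new frame on screen every {shown_ms:.1f} ms, "
          f"but input lag grows by ~{extra_ms:.0f} ms")

Same doubling on screen either way, but under those assumptions the latency penalty at a 30 fps base is about two and a half times the penalty at 75 fps, which lines up with it feeling fine at high frame rates and awful at low ones.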
 
That's really scary.
Alright, I should be more precise with my language. The actual Coral drivers are probably written in some lower-level language, but the PyCoral Python library, which is the required API for interfacing with the Coral, is written in Python.
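For anyone wondering what that Python layer looks like in practice, here's a minimal sketch modeled on the PyCoral classification examples; the model filename and the all-zeros dummy input are placeholders, not a real setup.

Code:
# Minimal PyCoral inference sketch (modeled on the pycoral classification examples).
# 'model_edgetpu.tflite' and the zeroed input are placeholders.
import numpy as np
from pycoral.utils.edgetpu import make_interpreter
from pycoral.adapters import common, classify

interpreter = make_interpreter('model_edgetpu.tflite')  # model compiled for the Edge TPU
interpreter.allocate_tensors()

width, height = common.input_size(interpreter)          # input tensor size as (width, height)
common.set_input(interpreter, np.zeros((height, width, 3), dtype=np.uint8))

interpreter.invoke()                                     # the call that actually runs on the Coral
for result in classify.get_classes(interpreter, top_k=3):
    print(result.id, result.score)

All the glue is Python; the heavy lifting happens inside the Edge TPU runtime that invoke() hands off to.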
 
  • Like
Reactions: Susanna
So Krzanich let their process node fall behind, and now they are in deep shit in their core business as AMD has basically taken their entire workstation business (Threadripper PRO has a 95% market share), is gobbling up their lucrative server business, is taking more and more of their desktop business, and is even starting to make inroads into their once-inviolable laptop business.
Don't forget that Krzanich also lost one of their biggest customers and margin makers - Apple. The 20-30% of the US desktop computing market that uses Macs basically vanished overnight as a revenue source for Intel. And it's even worse than it seems, since (from what I understand) Apple was always willing to pay top dollar to guarantee chip availability, so Intel made a higher margin per chip selling to Apple than to pretty much any other customer.

I'm long-term optimistic about them since they have a lot of cash on hand and infinite shitty deserts to build giant next-generation fabs on (which is something that might actually be useful for the upcoming process nodes and an advantage TSMC doesn't have being based on a tiny island), but they really need to get their shit together.
 
Apple and Intel had an interesting relationship. A fair few of the Intel CPUs Apple used were clocked slightly higher or, for the iGPUs, were on the higher end of what was available.

The problem is, Apple mostly ships thin, narrow laptops, and Intel's... not exactly the best option for that. There were signs of Apple being dissatisfied with Intel though, things like the T2 chip, and the 2016 CPU fiasco...

I'd say that Apple always had the plan to go with their own silicon, but Intel failing with their timeline probably sped things up by 2 or 3 years.
 
  • Informative
Reactions: Brain Problems
During the Krzanich era, it was just assumed that Intel's near-monopoly (90%+ market share) in server & desktop CPUs was untouchable, and they hurled money at countless initiatives and acquisitions that were largely unrelated to their core business and producing niche products that never had a big market. This includes the 2016 acquisition of Movidius and the NCS sticks. So Krzanich let their process node fall behind, and now they are in deep shit in their core business as AMD has basically taken their entire workstation business (Threadripper PRO has a 95% market share), is gobbling up their lucrative server business, is taking more and more of their desktop business, and is even starting to make inroads into their once-inviolable laptop business.

Intel is in big fucking trouble, and Pat Gelsinger is basically cutting everything that isn't going into either desktops or datacenters at blinding speed. Cold comfort to you, I know, but the lesson here is that any Intel product released up to about 2019 that isn't a mainstream device isn't something for which you should expect long-term support.
So how does modern Xeon perform, their big server chips? Heard the new ones come with 64 cores? Don't know about you, but that sounds like they're trying to catch back up (barring business-consumer relations). What's holding it back tech-wise?
 
So how does modern Xeon perform, their big server chips? Heard the new ones come with 64 cores? Don't know about you, but that sounds like they're trying to catch back up (barring business-consumer relations). What's holding it back tech-wise?

They're a generation behind, which means the only reason anybody is buying any at all is that EPYC 9004 is on back order. In the server space, heat matters way, way more than in desktops, because power draw & cooling fundamentally limit how many racks you can put in a given facility. Since 2019, the only reason I've seen anybody buy any Xeons at all is that EPYC was out of stock.
 
They're a generation behind, which means the only reason anybody is buying any at all is that EPYC 9004 is on back order. In the server space, heat matters way, way more than in desktops, because power draw & cooling fundamentally limit how many racks you can put in a given facility. Since 2019, the only reason I've seen anybody buy any Xeons at all is that EPYC was out of stock.
Ah. I mean, that's a concern I had with a much smaller chip. Can't imagine that a big server chip would make you worry less. I guess that's what they get, though, for taking so long to get to 7nm; the cores just have to be beefier to do the same amount of work.

Edit:
Found a WD Gold drive at a bargain. With a gift card I got for Christmas, it was less than $70. File storage, here I come!
 
@WelperHelper99 I mean this in the nicest possible way: you're going to have such a good time in 2028 when you build your second/third computer and really understand what you want.
Oh trust me, I get it lol. I'm assembling a shitbox with half the parts bought on sale. I'm expecting jank when this thing boots. Next one will be the nice one
 
Ah. I mean, that's a concern I had with a much smaller chip. Can't imagine that a big server chip would make you worry less. I guess that's what they get, though, for taking so long to get to 7nm; the cores just have to be beefier to do the same amount of work.

Sapphire Rapids was supposed to come out in 2019 and compete directly with the EPYC 7002 series. That tells you how far behind they are. On the desktop, it's not as big a deal. It's not that it's totally irrelevant so much as that your GPU is putting out 2x-3x more heat than your CPU anyway. In a server room, to keep all my nodes running to spec, I have to be able to pump out heat as fast as they generate it. See below.

              Xeon 8593Q     EPYC 9654
Cores         64             96
Base Freq     2.2 GHz        2.4 GHz
Max Freq      3.9 GHz        3.7 GHz
L3 Cache      320 MB         384 MB
RAM channels  8x DDR5-5600   12x DDR5-4800
TDP           385 W          360 W
List Price    $12,400        $11,805

Literally zero reason to buy a Xeon right now.
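And since heat is the actual constraint, some napkin math on rack density using the TDPs above. The 15 kW rack budget and the 400 W of per-node overhead (RAM, NICs, drives, fans) are made-up ballpark figures for illustration; only the CPU numbers come from the table.

Code:
# Napkin math: dual-socket nodes under a fixed rack power budget.
# The 15 kW budget and 400 W per-node overhead are assumptions for illustration;
# only the CPU TDPs and core counts come from the spec sheets above.
RACK_BUDGET_W = 15_000
NODE_OVERHEAD_W = 400

for name, tdp_w, cores in (("Xeon 8593Q", 385, 64), ("EPYC 9654", 360, 96)):
    node_w = 2 * tdp_w + NODE_OVERHEAD_W   # two sockets per node
    nodes = RACK_BUDGET_W // node_w
    print(f"{name}: {nodes} nodes/rack, {nodes * 2 * cores} cores/rack")

Under those assumptions that's roughly 1,500 cores per rack for the Xeon versus roughly 2,500 for the EPYC in the same power envelope, which is the whole ballgame when the facility's cooling is the hard limit.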
 
  • Horrifying
Reactions: WelperHelper99
Sapphire Rapids was supposed to come out in 2019 and compete directly with the EPYC 7002 series. That tells you how far behind they are. On the desktop, it's not as big a deal. It's not that it's totally irrelevant so much as that your GPU is putting out 2x-3x more heat than your CPU anyway. In a server room, to keep all my nodes running to spec, I have to be able to pump out heat as fast as they generate it. See below.

              Xeon 8593Q     EPYC 9654
Cores         64             96
Base Freq     2.2 GHz        2.4 GHz
Max Freq      3.9 GHz        3.7 GHz
L3 Cache      320 MB         384 MB
RAM channels  8x DDR5-5600   12x DDR5-4800
TDP           385 W          360 W
List Price    $12,400        $11,805

Literally zero reason to buy a Xeon right now.
Costs more, fewer cores, runs hotter. I guess they're always in stock, so if you need a server right now, it'll do it. But yeah, I get why it isn't selling. Catching up is going to take drastic steps.
 
Costs more, fewer cores, runs hotter. I guess they're always in stock, so if you need a server right now, it'll do it. But yeah, I get why it isn't selling. Catching up is going to take drastic steps.
Just go with the AM4 platform and call it a day. AMD just released some new CPUs and APUs and prices are reasonable. DDR4 RAM is pretty cheap too, and you can easily max it out at 64 GB. For your needs it will be good enough.
 
Just go with the AM4 platform and call it a day. AMD just released some new CPUs and APUs and prices are reasonable. DDR4 RAM is pretty cheap too, and you can easily max it out at 64 GB. For your needs it will be good enough.
Too bad I'm already stuck with an Intel motherboard lol. It's alright, I don't plan on going hard with it anyway.
 
Just go with the AM4 platform and call it a day. AMD just released some new CPUs and APUs and prices are reasonable. DDR4 RAM is pretty cheap too, and you can easily max it out at 64 GB. For your needs it will be good enough.

If what you need is a 64-core server CPU, a Ryzen won't cut it.

Is there a reason to upgrade to the new 7000 cards, or am I still good on my 6700 XT?

If you normally upgrade your GPU every year, then yes; if not, then no.
 
  • Feels
Reactions: WelperHelper99
I don't think that's going to be welper's use case as a simple desktop user.

The specific conversation you replied to was about how Intel's server tech is doing these days. Since we have a lot of parallel conversations in here, I'll just summarize everything I think about the subject, broken out by use case, and not think about how autistic this makes me.

Laptop
CPU: Apple, everything else is dog shit by comparison. If you must go x86, the latest Ryzens are good. Maybe the latest gen Intels are good at power management, but I don't really know.
GPU: As of Meteor Lake, it's really a toss-up between Intel and AMD on the iGPU front. Anything previous, AMD is a lot better. If you insist on a gaming laptop, just be aware that NVIDIA's laptop dGPUs will render you infertile.

Minis
CPU: No battery life considerations, so whatever you get the best deal on.
GPU: AMD for iGPUs, but some minis have dGPUs, in which case NVIDIA is best, but I'd take Intel over AMD if it came to that.

Desktop
CPU: For gaming, any current i7/Ryzen 7 or better, doesn't matter, and the rest of your system will be obsolete before the CPU can't keep up with your games. Gaming tech sites that say otherwise are basically lying. Buy whatever you got the best deal on, you'll be happy. For work, depends on the work.
GPU: If you're spending big boy money, NVIDIA. If you're budget constrained, Intel. DLSS puts NVIDIA GPUs well above any AMD GPU in the same price band. XeSS is almost as good as DLSS when running on Intel GPUs and somewhat better than FSR when running on AMD GPUs.

Workstation
CPU: AMD, no contest, expensive mistake to buy Intel.
GPU: Technology like DLSS is irrelevant, although NVIDIA support by ISVs is broadly better. Still, AMD's professional GPUs are fine.

Server
CPU: AMD, no contest, expensive mistake to buy Intel.
GPU: NVIDIA, no contest, expensive mistake to buy AMD or Intel (implying you can even find a Max GPU). Although maybe Gaudi is good for the specific thing it does, don't ask me for advice if you're building a $100m AI/ML server farm.
 