GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

AMD earnings call out today had revenue up only 2% yoy. All the money going into AI and data centers isn't flowing their way.
AI is still an experimental tech too. That car dealer that added AI to their chat support had to pull it because it was agreeing to wildly unrealistic loan terms and customers were trying to hold the company to them.

There was a joke going around TikTok where a guy's boss handed him a picture of a knob labeled "AI" and told him to implement it, and the more details he got, the more unfeasible the project turned out to be.

I don't think a lot of companies are deploying AI on a large scale unless they really know what they are doing (or are utterly convinced that they do)
 
Seems like Intel is catching some shit with reports of 13th/14th gen parts failing/degrading due to a combination of loose power guidelines from Intel and mobo makers pushing the chips hard out of the box.

I believe ryzen 7000 had a similar issue, though if I remember right it was mobo makers exceeding the limits set by AMD.

Should have learned from that, but gotta win those benchmarks I guess.
 
AMD earnings call out today had revenue up only 2% yoy. All the money going into AI and data centers isn't flowing their way.
I know they've been selling a lot of MI300s. I wonder if that is being offset by declines elsewhere, or if a lot of that revenue is coming in later quarters.
 
Seems like Intel is catching some shit with reports of 13th/14th gen parts failing/degrading due to a combination of loose power guidelines from Intel and mobo makers pushing the chips hard out of the box.

I believe ryzen 7000 had a similar issue, though if I remember right it was mobo makers exceeding the limits set by AMD.

Should have learned from that, but gotta win those benchmarks I guess.
Heard about that. My choice to stick with 12th gen is paying dividends.
 
Hey, is an EPYC 7282 a good CPU at $100?

At $100? It's not a bad deal. However, with a boost clock of only 3.2 GHz, you may find it struggles in games if you were thinking about sticking it in a desktop. I recently experimented with turbo boost off on my i9, capping the frequency at 2.4 GHz, and some games ran like ass.

Main thing going for this CPU is 8 channels of DDR4 and enough PCIe lanes to serve up to 4 GPUs, or a giant pile of storage, or whatever other devices you have in mind.
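
Napkin math on what that actually buys you (spec-sheet numbers only; it assumes DDR4-3200 in every channel, so double-check what the exact SKU officially supports before buying):

Code:
# Rough spec-sheet arithmetic, not a benchmark. Assumes DDR4-3200 in all
# 8 channels; verify the exact SKU's supported memory speed first.
channels = 8
transfer_rate_mts = 3200      # DDR4-3200, mega-transfers per second
bytes_per_transfer = 8        # 64-bit channel
peak_bw_gbs = channels * transfer_rate_mts * bytes_per_transfer / 1000
print(f"Theoretical peak memory bandwidth: {peak_bw_gbs:.1f} GB/s")   # ~204.8

# PCIe budget: a single-socket Rome board exposes 128 PCIe 4.0 lanes.
lanes_total = 128
gpus = 4
print(f"Lanes left after {gpus} x16 GPUs: {lanes_total - gpus * 16}")  # 64 for NVMe, NICs, etc.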
 
You're right, data center revenue was up 80%; it's the gaming segment that was down 48%, and embedded was down 46%.
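
Back-of-the-envelope on how an 80% data center jump can still wash out to low-single-digit growth overall (the segment shares below are purely illustrative, not AMD's actual revenue mix; only the three growth rates come from the call):

Code:
# Toy math only: segment shares are made up for illustration; blended growth
# is the prior-year-share-weighted sum of segment growth rates.
segments = {
    #              (share of prior-year revenue, YoY growth)
    "data_center": (0.28, +0.80),
    "client":      (0.27, +0.00),   # assumed roughly flat for this sketch
    "gaming":      (0.27, -0.48),
    "embedded":    (0.18, -0.46),
}

blended = sum(share * growth for share, growth in segments.values())
print(f"Blended YoY growth: {blended:+.1%}")   # ~ +1% with these made-up weights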
It feels like when you're buying an AMD CPU, they hide the embedded APU G-series option unless you specifically plug it into the search bar. Honestly, they do a terrible job promoting it.
 
It feels like when you're buying an AMD CPU, they hide the embedded APU G-series option unless you specifically plug it into the search bar. Honestly, they do a terrible job promoting it.
7000-series and up all have iGPUs. On the non-Gs it’s pretty basic and struggles with games even at 1080p low, but if all you need is something to browse the web or watch films, it’ll do fine.
 
7000-series and up all have iGPUs. On the non-Gs it’s pretty basic and struggles with games even at 1080p low, but if all you need is something to browse the web or watch films, it’ll do fine.
I'm talking stuff like the 5000 series and below, where, if you want to build a cheap little server out of last-gen chips, you might be fooled into thinking you also need a graphics card.
 
AMD earnings call out today had revenue up only 2% yoy. All the money going into AI and data centers isn't flowing their way.
Does AMD even have a foot in the AI door? (Genuinely asking.) ROCm is such a fucking joke. On Linux (and let's be real, that's where all the AI shit is running) you can literally just slap your Nvidia card in, and as long as it isn't 900 years old you're good to go with the standard driver install.
I gave up trying to get ROCm working; hell, even just finding out whether your card is supported is confusing.

I'm in this awkward spot where the PCIe on my mobo is kind of fucked and randomly breaks output whenever I insert a card, but I'm still on AM4, so it's going to be either a full upgrade or buying a mobo that's gonna be EOL soon. Kill me. These new CPUs don't really impress me for what the cost is going to be.
 
13th and 14th gen are fucking weird. They go to a smaller die size but voltage goes up. It's literally less efficient than the 10nm cores lmao.
"Voltage Go Up" was their only way to make the chips faster as, joking tone aside, the 13th and 14th gen chips are basically just the 12th gen again. Probably someone like @The Ugly One can point to some actual differences between the generations and doubtless an Intel diehard could argue why those differences matter to anyone. But to me, it's just the same architecture again and again, it seems.

So to sell it, as you said: "Voltage goes up".
 
Does AMD even have a foot in the AI door? (Genuinely asking.) ROCm is such a fucking joke. On Linux (and let's be real, that's where all the AI shit is running) you can literally just slap your Nvidia card in, and as long as it isn't 900 years old you're good to go with the standard driver install.
I gave up trying to get ROCm working; hell, even just finding out whether your card is supported is confusing.

I'm in this awkward spot where the PCIe on my mobo is kind of fucked and randomly breaks output whenever I insert a card, but I'm still on AM4, so it's going to be either a full upgrade or buying a mobo that's gonna be EOL soon. Kill me. These new CPUs don't really impress me for what the cost is going to be.
AI is their biggest source of revenue now, so yes. They're not at the level of NVIDIA, obviously, but ROCm works just fine on their enterprise hardware. While it's unreliable and tricky to get working on consumer cards, in a real deployment you would simply version-lock it to whatever works, and if you're a big enough customer AMD will assign engineers to help your team set it up.
Select consumer cards work just fine with ROCm, such as my old 6900 XT, which was just as plug-and-play as CUDA; you just install the ROCm PyTorch Docker image instead of the plain PyTorch one. That's what all the CDNA cards are like, since AI is basically their whole purpose, and they also perform much better at AI tasks than the RDNA consumer cards.
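
"Plug and play" looks roughly like this; a minimal sketch assuming a ROCm build of PyTorch (e.g. from one of the rocm/pytorch Docker images, whichever tag matches your card and ROCm version):

Code:
# Minimal sanity check on a ROCm build of PyTorch. The ROCm builds reuse the
# torch.cuda API, so this exact script also runs unchanged on an Nvidia box.
import torch

print("GPU visible:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    # Quick matmul on the GPU to confirm the whole stack works end to end.
    x = torch.randn(4096, 4096, device="cuda")
    y = x @ x
    torch.cuda.synchronize()
    print("Matmul OK:", tuple(y.shape))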
 
The cost of hobbies varies wildly. But one thing that's different about building PCs compared to traditional hobbies is that PC parts become obsolete much faster. For that reason I consider PC gaming to be a pretty expensive hobby even if you pirate everything. You can save a lot of money by being happy with lower resolutions and lower settings. There's no reason to run everything at 4K Ultra; it doesn't affect gameplay.
Once you get smart, you can buy a refurbished office PC, stick an RTX 3050 6GB or some other sub-75W card in it, and have a decent 1080p gaming experience for $300.

Luckily for me, modern gayming is such a turnoff that I can use an old Intel quad-core w/ iGPU and not care.
 
"Voltage Go Up" was their only way to make the chips faster as, joking tone aside, the 13th and 14th gen chips are basically just the 12th gen again. Probably someone like @The Ugly One can point to some actual differences between the generations and doubtless an Intel diehard could argue why those differences matter to anyone. But to me, it's just the same architecture again and again, it seems.

So to sell it, as you said: "Voltage goes up".
Yeah, saw some crazy values like 1.6-1.7 V being tossed around, but since Intel never said not to do that, mobos were just running with it.

Insanity.
 
Does AMD even have a foot in the AI door? (Genuinely asking.) ROCm is such a fucking joke. On Linux (and let's be real, that's where all the AI shit is running) you can literally just slap your Nvidia card in, and as long as it isn't 900 years old you're good to go with the standard driver install.
I gave up trying to get ROCm working; hell, even just finding out whether your card is supported is confusing.

I'm in this awkward spot where the PCIe on my mobo is kind of fucked and randomly breaks output whenever I insert a card, but I'm still on AM4, so it's going to be either a full upgrade or buying a mobo that's gonna be EOL soon. Kill me. These new CPUs don't really impress me for what the cost is going to be.

Gaming hardware is not important to the AI market. It is true that ROCm is a giant pile of shit. However, the people who buy $400,000 server nodes by the shipping container to ingest data by the petabyte do not really care. What they do care about is that the MI300 delivers very good price-to-performance, and the H100 is extremely hard to get hold of, so they're buying a lot of AMD hardware now. The fact that the software is hot garbage is just something developers have to cope with, because there's a gold rush going on.

but ROCm works just fine on their enterprise hardware

Well, not after the last update. I can't believe how much they managed to break.

Probably someone like @The Ugly One can point to some actual differences between the generations

12th -> 13th, a bit more cache, using higher yields to add E-cores & clock speed
13th -> 14th, using higher yields to add a bit more clock speed.

It's not much, but it's also not nothing. However, if you are building a PC, unless you are going to put a bomb-ass cooling solution on your board, there isn't that much merit to one "generation" over another; just get whatever gets you the best deal on cores and GHz per dollar.
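
Something like this crude score is all that decision really amounts to (the boost clocks and core counts are the spec-sheet figures, the prices are placeholders for whatever deals you actually find, and weighting an E-core as half a P-core is just my assumption):

Code:
# Crude "cores x GHz per dollar" comparison. Prices are placeholders, and the
# 0.5 weight on E-cores is an arbitrary assumption; tweak to taste.
listings = [
    # (name,       P-cores, E-cores, boost GHz, price USD)
    ("i5-12600K",  6, 4, 4.9, 180),
    ("i5-13600K",  6, 8, 5.1, 250),
    ("i5-14600K",  6, 8, 5.3, 280),
]

def score(p_cores, e_cores, boost_ghz, price_usd):
    return (p_cores + 0.5 * e_cores) * boost_ghz / price_usd

for name, p, e, boost, price in sorted(listings, key=lambda l: -score(*l[1:])):
    print(f"{name}: {score(p, e, boost, price):.3f} core-GHz per dollar")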
 
At $100? It's not a bad deal. However, with a boost clock of only 3.2 GHz, you may find it struggles in games if you were thinking about sticking it in a desktop. I recently experimented with turbo boost off on my i9, capping the frequency at 2.4 GHz, and some games ran like ass.

Main thing going for this CPU is 8 channels of DDR4 and enough PCIe lanes to serve up to 4 GPUs, or a giant pile of storage, or whatever other devices you have in mind.
It would be used in a home server, replacing one with a Xeon E3-1220 v2.
 
Does AMD even have a foot in the AI door? (Genuinely asking.) ROCm is such a fucking joke. On Linux (and let's be real, that's where all the AI shit is running) you can literally just slap your Nvidia card in, and as long as it isn't 900 years old you're good to go with the standard driver install.
I have a 7900 XTX and it works OK for the big-name AI stuff like LLMs and Stable Diffusion. A bit slower than I'd expect, though.

On the other hand, if you want anything remotely non-mainstream like WhisperX, or anything written purely for CUDA, you're just fucked. Luckily I have plenty of Nvidia for that.
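
The check I'd sketch before wasting an evening on a CUDA-only project on the AMD box looks something like this (torch.version.hip is set on ROCm builds and None on CUDA builds; the CPU fallback is just an illustration of the "you're fucked" path, not anything WhisperX itself provides):

Code:
# Rough sketch: pick a device for a workload, bailing to CPU on the ROCm box
# when a project ships hand-written CUDA kernels that won't run under HIP.
import torch

def pick_device(needs_cuda_only_kernels: bool = False) -> torch.device:
    if torch.cuda.is_available():
        on_rocm = torch.version.hip is not None   # ROCm builds expose the CUDA API but set version.hip
        if needs_cuda_only_kernels and on_rocm:
            return torch.device("cpu")            # better than crashing halfway through a run
        return torch.device("cuda")
    return torch.device("cpu")

print(pick_device())                                   # cuda on the 7900 XTX for mainstream PyTorch
print(pick_device(needs_cuda_only_kernels=True))       # cpu, i.e. the "you're fucked" path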
 