GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Questions for computer Kiwis.

Long story short, the basic AM4 5600 PC I wanted to build months ago didn't happen for real life reasons. I'm finally able to get it, but I wonder if it's worth waiting for AM5 at this point.

I'm also considering a mini-PC as an emulation PC or as a stopgap desktop. Thoughts?
 
Questions for computer Kiwis.

Long story short, the basic AM4 5600 PC I wanted to build months ago didn't happen for real life reasons. I'm finally able to get it, but I wonder if it's worth waiting for AM5 at this point.

I'm also considering a mini-PC as an emulation PC or as a stopgap desktop. Thoughts?
AM5 could be rough for early adopters, not least because everything is expensive at launch. I've written off Zen 4 entirely because I think Zen 5 is going to boost core counts at all price points, possibly add big.LITTLE and accelerators, and coincide with cheaper DDR5. Also, the AM5 APUs are coming later.

We could see great AM4 CPU prices and availability going forward, because AMD wants to keep it around for a while to address the budget market underneath the $300 7600X. The 5600G was as low as $132 lately. Black Friday is around the corner and there could be some good deals then for PCs or parts.

I say go for AM4 or some cheap PC.
 
Welcome back fellow faggots and trannyhaters. We missed some real madness in our slumber:

$900 for the '4070' is absolute madness. If AMD follows with these prices then PC gaming deserves to die.
It's milking time. They want to sell through a huge Ampere stockpile so they pulled this shit. When the time is right, prices will fall and you'll see a 4070, 4060, etc. Rebranding the 4070 into a 4080 is a nuisance for this generation but Nvidia will do anything to make a buck. Like the RTX 3060 with 12 GB.

AM5 socket: Don't buy now, motherboard makers are milking and DDR5 prices are elevated. Those should improve by the time 3D cache parts come out.

God I hope not.
The benefits are clear, even on desktop. That's why Intel is literally doubling down on it by adding more E-cores. By the time AMD is finally on board, they'll have dodged a lot of the issues that plagued Alder Lake, like DRM going haywire. AMD is also late to adding AVX-512 support, but their implementation doesn't have the frequency problems that Intel's had.
 
The new 7000 series having a 95°C "target" worries me. I'm a retard stuck in the ancient times when 95°C was the super danger zone; won't that degrade the chip or the cache in the long term?
The performance honestly is not impressive, at least to me, but the fucking power draw of 200W+ for the 7950x is just insane, same goes for the whole platform price.
As for nvidia gpus, I am thankful I managed to get a 2060ti for my work machine (rendering) for a decent price, AMD literally doing fuckall for computing does not help with the pricing of professional cards, or even software availability. It's very hard to find rendering software compatible with AMD cards nowadays, they all support CUDA.
 
The performance honestly is not impressive, at least to me, but the fucking power draw of 200W+ for the 7950x is just insane, same goes for the whole platform price.
I can just imagine what a Threadripper built on that might draw.

It's very hard to find rendering software compatible with AMD cards nowadays, they all support CUDA.
I'm no developer and can barely understand some of the more technical jargon and discussions, but it comes off as CUDA being easier to work with than OpenCL. No idea if that's actually true, and one does have to wonder if we have a Way It's Meant To Be Rendered situation on our hands.
 
I'm no developer and can barely understand some of the more technical jargon and discussions, but it comes off as CUDA being easier to work with than OpenCL. No idea if that's actually true, and one does have to wonder if we have a Way It's Meant To Be Rendered situation on our hands.
That might be part of it but the funny thing is that the only way to directly measure CUDA vs OpenCL effectiveness is on an Nvidia card that supports both, and it is said that Nvidia isn't that concerned with their OpenCL performance and focuses more time and effort on CUDA...
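For anyone curious why devs keep picking CUDA, a minimal vector add shows the gap. This is an illustrative sketch, not benchmark code (needs nvcc and an Nvidia GPU to actually run); the point is that the equivalent OpenCL program needs the kernel shipped as a string plus dozens of lines of host setup (clGetPlatformIDs, clCreateContext, clBuildProgram, clSetKernelArg, ...) before you can launch anything.

```cpp
// Minimal CUDA vector add: this is the ENTIRE program, host + device code.
// Illustrative sketch only; compile with nvcc, requires an Nvidia GPU.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified memory: no separate host/device buffers or explicit copies.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);  // one-line kernel launch
    cudaDeviceSynchronize();                        // wait for the GPU

    printf("c[0] = %f\n", c[0]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

None of the OpenCL boilerplate is *hard*, but it's friction CUDA simply doesn't have, and Nvidia's libraries and tooling on top of it (cuBLAS, cuDNN, Nsight) compound the gap.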

Vega 56 and 64 were OpenCL monsters though, so if you're doing productivity-oriented work and can get one cheap, they're still good.
 
Imagine a market where fucking INTEL becomes the budget-friendly option.

Power of a 3060 ti for $329

It would be so fuckin sweet if this really was on par with a 3060 Ti. We need competition in this market badly, and anything that shakes up Nvidia is fine with me.

(Missed you faggots while the site was regenerating from the tranny attack force)
 
The new 7000 series having a 95°C "target" worries me. I'm a retard stuck in the ancient times when 95°C was the super danger zone; won't that degrade the chip or the cache in the long term?
The performance honestly is not impressive, at least to me, but the fucking power draw of 200W+ for the 7950x is just insane, same goes for the whole platform price.
As for nvidia gpus, I am thankful I managed to get a 2060ti for my work machine (rendering) for a decent price, AMD literally doing fuckall for computing does not help with the pricing of professional cards, or even software availability. It's very hard to find rendering software compatible with AMD cards nowadays, they all support CUDA.
AMD says that it is safe up to 95C, and they said the same about Zen 2/3.


Cooling is the new "overclocking"; don't ever bother overclocking manually. Use an ECO mode or manual tweaks if you want to reduce power consumption and heat.

AMD has been fucked on GPU compute, and they are throwing everything they can at the wall to break out from under CUDA's domination. Eventually something will stick:
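The biggest piece of that "everything at the wall" is ROCm and its HIP API, which is close to a find-and-replace of CUDA (AMD even ships a `hipify` tool that does exactly that to existing CUDA source). A hedged sketch of what a port looks like, assuming a ROCm-supported card and the hipcc toolchain; the tell is how little actually changes:

```cpp
// The CUDA vector add ported to HIP. Nearly every change is a rename:
// cudaMalloc -> hipMalloc, cudaMemcpy -> hipMemcpy, and so on; even the
// <<< >>> launch syntax is kept. Illustrative sketch; needs ROCm + GPU.
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // identical to CUDA
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    float *da, *db, *dc;
    hipMalloc(&da, bytes);                                   // was cudaMalloc
    hipMalloc(&db, bytes);
    hipMalloc(&dc, bytes);
    hipMemcpy(da, ha.data(), bytes, hipMemcpyHostToDevice);  // was cudaMemcpy
    hipMemcpy(db, hb.data(), bytes, hipMemcpyHostToDevice);

    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);         // launch unchanged
    hipMemcpy(hc.data(), dc, bytes, hipMemcpyDeviceToHost);

    printf("c[0] = %f\n", hc[0]);
    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```

Mimicking your competitor's API one-for-one is about as clear an admission as you'll get of who owns the ecosystem, but it's also the only realistic way to get CUDA-only rendering software ported over.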

 