GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

My God, I really wanted to build a PC, but the prices and Nvidia being dogshit as usual have made it hard for me to be excited. Also, 95 degrees Celsius is normal for the Ryzen 7000 series? Fuck me, I have to wait another year or two just for a 5800X/X3D price decrease. PC gaming really got fucked right now.
 
It would be so fuckin sweet if this really was on par with a 3060ti. We need competition in this market badly and anything that shakes up nVidia is fine with me.

(Missed you faggots while the site was regenerating from the tranny attack force)
Did you manage to recover any data from that drive? The site going down turned that into a cliffhanger.

I would be really happy with 3060ti levels of performance if the price was fair (~€300), so I am rooting for Intel, which feels weird...
 
I don't particularly care about the next best new CPUs. Damn near everything I do is GPU bound. I have one program that is CPU dependent: a 3D model slicer for 3D printing. It can slice a model in several seconds. Waiting seconds is a nuisance, but it's not an "upgrade to a new motherboard and RAM and shit" kind of nuisance. I have a 10900K and the only game I try to play is Cyberpunk. My 3090 manages 57 fps at ray tracing ultra while my 10900K is hanging around 50 percent usage. As far as I'm concerned, until something changes in computing, CPU speeds are basically moot until they can't keep up with GPUs.
 
Checked GPU prices recently. Still atrocious.
AMD says that it is safe up to 95C, and they said the same about Zen 2/3.
I'm getting Fermi flashbacks from 12 years ago. FUCK I'm getting old.
That might be part of it, but the funny thing is that the only way to directly measure CUDA vs OpenCL effectiveness is on an Nvidia card that supports both, and supposedly Nvidia isn't that concerned with its OpenCL performance and focuses more time and effort on CUDA...
IIRC, if you wanted OpenCL performance on green, you needed to pay for the Quadros. No idea about that part, though.
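If anyone wants to sanity-check that themselves, here's a rough sketch of the idea: run the same kernel through both backends on the same Nvidia card and compare timings. (Assumes pycuda and pyopencl are installed; the vector-add kernel is just an illustration, not a real workload, so don't read too much into the exact numbers.)

```python
import time
import numpy as np

N = 1 << 24
a = np.random.rand(N).astype(np.float32)
b = np.random.rand(N).astype(np.float32)

# --- CUDA path (pycuda) ---
import pycuda.autoinit                      # importing this creates a CUDA context on the default GPU
import pycuda.gpuarray as gpuarray
from pycuda.compiler import SourceModule

cuda_src = """
__global__ void vadd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}
"""
vadd_cuda = SourceModule(cuda_src).get_function("vadd")
a_gpu = gpuarray.to_gpu(a)
b_gpu = gpuarray.to_gpu(b)
c_gpu = gpuarray.empty_like(a_gpu)

# warm-up launch so JIT/driver overhead isn't counted
vadd_cuda(a_gpu.gpudata, b_gpu.gpudata, c_gpu.gpudata, np.int32(N),
          block=(256, 1, 1), grid=((N + 255) // 256, 1))
pycuda.autoinit.context.synchronize()

start = time.perf_counter()
vadd_cuda(a_gpu.gpudata, b_gpu.gpudata, c_gpu.gpudata, np.int32(N),
          block=(256, 1, 1), grid=((N + 255) // 256, 1))
pycuda.autoinit.context.synchronize()
cuda_ms = (time.perf_counter() - start) * 1e3

# --- OpenCL path (pyopencl), pick the Nvidia platform when prompted ---
import pyopencl as cl
import pyopencl.array as cla

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

cl_src = """
__kernel void vadd(__global const float *a, __global const float *b,
                   __global float *c, int n)
{
    int i = get_global_id(0);
    if (i < n) c[i] = a[i] + b[i];
}
"""
prg = cl.Program(ctx, cl_src).build()
a_cl = cla.to_device(queue, a)
b_cl = cla.to_device(queue, b)
c_cl = cla.empty_like(a_cl)

# warm-up, same reason as above
prg.vadd(queue, (N,), None, a_cl.data, b_cl.data, c_cl.data, np.int32(N))
queue.finish()

start = time.perf_counter()
prg.vadd(queue, (N,), None, a_cl.data, b_cl.data, c_cl.data, np.int32(N))
queue.finish()
ocl_ms = (time.perf_counter() - start) * 1e3

print(f"CUDA:   {cuda_ms:.3f} ms")
print(f"OpenCL: {ocl_ms:.3f} ms")
```

A memory-bound toy kernel like this won't show much of a gap; the interesting comparison is compute-heavy kernels, where driver and compiler quality actually matter.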
 
ETA Prime did a 7950 build and the results are surprising.

Yes, boosting to 95 degrees is by design, but it appears to only do that when the system is under heavy load like benchmarking. In practice it seems to only get as hot as it needs to, with it hovering around 60-70 degrees or so while gaming at 4k ultra settings.

Here's what I suspect. The tech YouTubers and commenters are going to shit all over it because they're just looking at the behaviour while benchmarking and assuming that's how it is 24/7. It'll be interesting to see how the chip performs in more realistic situations.
 
About the Ryzen 7000 heat: turns out you can adjust settings in the BIOS to reduce power consumption significantly without losing performance. Delidding has also shown the IHS is at least 1mm too thick; delidding dropped 20 degrees off the chip.

It's AMD's fault for releasing them as power hogs that burn like an Intel, but apparently it's easy to get it under control. There was really no reason for the 95C talk.
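If you're A/B testing those BIOS limits on Linux, a quick and dirty way to watch what they actually do to temps is to poll the k10temp hwmon readings while you run your usual load. (Minimal sketch, assuming the k10temp driver is loaded; the polling interval and output format are arbitrary.)

```python
import glob
import time

def find_k10temp():
    # Each hwmon device exposes its driver name in .../name
    for name_file in glob.glob("/sys/class/hwmon/hwmon*/name"):
        with open(name_file) as f:
            if f.read().strip() == "k10temp":
                return name_file.rsplit("/", 1)[0]
    raise RuntimeError("k10temp hwmon device not found")

hwmon = find_k10temp()

# Ctrl+C to stop
while True:
    readings = []
    # temp*_input values are reported in millidegrees Celsius
    for temp_file in sorted(glob.glob(f"{hwmon}/temp*_input")):
        label_file = temp_file.replace("_input", "_label")
        try:
            with open(label_file) as f:
                label = f.read().strip()
        except FileNotFoundError:
            label = temp_file.rsplit("/", 1)[1]
        with open(temp_file) as f:
            readings.append(f"{label}: {int(f.read()) / 1000:.1f}°C")
    print("  ".join(readings))
    time.sleep(2)
```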

*Edit* More pissed about the boards though. X670 boards that cost nearly $300 have 4 SATA ports, minimal heatsinks, and just look cheap. Apparently some M.2 slots won't support SATA drives anymore, only PCIe. Yay for paying more for less!
 
ETA Prime did a 7950 build and the results are surprising.

Yes, boosting to 95 degrees is by design, but it appears to only do that when the system is under heavy load like benchmarking. In practice it seems to only get as hot as it needs to, with it hovering around 60-70 degrees or so while gaming at 4k ultra settings.

Here's what I suspect. The tech YouTubers and commenters are going to shit all over it because they're just looking at the behaviour while benchmarking and assuming that's how it is 24/7. It'll be interesting to see how the chip performs in more realistic situations.
It's designed to run at 95 degrees 24/7. It will only actually do that if given demanding workloads that aren't relevant to gaming. But the 7950X is a workstation-lite chip that can outclass older Threadrippers, so some people will run it hard.

There's no game in existence that needs 16 cores to be running at over 5 GHz simultaneously. I hope.
 
Any chance of getting a 3090 Ti at less than four-digit prices?

I badly want to abandon my third-hand 900 series card and not need another GPU for another decade, but the price of that one component is my whole build budget at the moment.
Why do you need that much power?
If your budget is limited, it would make sense to accept playing some heavier games without RTX (it gets old anyway after you enabled it and saw all the cool reflections), and at lower resolutions, like 1440p.
If you want 60-100fps at max settings @4k, you're just putting yourself on a path of frustration if you're not rich.
 
US? Recently saw a 6700xt for $370 on Amazon. Newegg had a 6800xt for I think $550 after a rebate. They were/are running a 15% off promo on orders over $350 if you use an alternate payment system. Brings a 6800xt under $500.

Nvidia prices are still whack, though.
Most users tend to only buy Nvidia due to mind share, which is retarded as fuck. I blame the current state of the market on Nvidiots, as usual
 
Why do you need that much power?
If your budget is limited, it would make sense to accept playing some heavier games without RTX (it gets old anyway after you enabled it and saw all the cool reflections), and at lower resolutions, like 1440p.
If you want 60-100fps at max settings @4k, you're just putting yourself on a path of frustration if you're not rich.
If the hashcat benchmarks I'm finding are right, it's literally ten times faster, which would be extremely convenient and much less of a headache than finding 3 other matching cards for my older one.
 
Most users tend to only buy Nvidia due to mind share, which is retarded as fuck. I blame the current state of the market on Nvidiots, as usual
And the old AMD reputation. Back in 2008 you really needed nVidia if you wanted your games to run right. I remember swearing off of Radeon for that reason. That was 14 years ago though.
 
And the old AMD reputation. Back in 2008 you really needed nVidia if you wanted your games to run right. I remember swearing off of Radeon for that reason. That was 14 years ago though.
I've said this before, but it's funny how many home PC builders act like they're pro streamers/3D modelers/machine learning devs that just have to have shit like CUDA/NVENC/muh tensor cores to live.

Oh, and they totally only play rtx games.

Oh, and AMD drivers killed their dog or something 10 years ago.

*Edit* It's also funny that despite AMD's low market share, everyone has totally owned an AMD card recently / seemingly knows people that have, and they've ALL had issues. Amazing.
 