Second Sun (kiwifarms.net), joined Jul 12, 2021
Unless you really need the last 2% of performance, you can always undervolt. Good for heat and with current electricity costs.
> It would be so fuckin sweet if this really was on par with a 3060ti. We need competition in this market badly and anything that shakes up nVidia is fine with me.

Did you manage to recover any data from that drive? The site going down turned that into a cliffhanger.
(Missed you faggots while the site was regenerating from the tranny attack force)
> AMD says that it is safe up to 95C, and they said the same about Zen 2/3.

I'm getting Fermi flashbacks from 12 years ago. FUCK I'm getting old.
> That might be part of it but the funny thing is that the only way to directly measure CUDA vs OpenCL effectiveness is on an Nvidia card that supports both, and it is said that Nvidia isn't that concerned with their OpenCL performance and focuses more time and effort on CUDA...

IIRC, if you wanted OpenCL performance on green, you needed to pay for the Quadros. No idea about that part, tho.
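Since the only direct CUDA-vs-OpenCL comparison is the same workload on the same Nvidia card, such a benchmark boils down to timing identical kernels launched through each API. A minimal best-of-N timing harness sketch, using pure-Python stand-in workloads (the `cuda_like`/`opencl_like` names are placeholders; a real run would launch the same kernel via each API and would need the respective runtimes installed):

```python
import timeit

def bench(fn, repeats=5, number=100):
    """Best-of-N wall-clock time for a workload; taking the minimum of
    several repeats reduces the impact of scheduler noise."""
    return min(timeit.repeat(fn, repeat=repeats, number=number))

# Stand-in workloads: in a real comparison these would dispatch the
# same kernel through the CUDA and OpenCL APIs on one Nvidia card.
cuda_like = lambda: sum(i * i for i in range(1000))
opencl_like = lambda: sum(i * i for i in range(1000))

speedup = bench(opencl_like) / bench(cuda_like)
print(f"CUDA-path speedup over OpenCL-path: {speedup:.2f}x")
```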
> Checked GPU prices recently. Still atrocious.

US? Recently saw a 6700xt for $370 on Amazon. Newegg had a 6800xt for, I think, $550 after a rebate. They were/are running 15% off orders over $350 if you use an alternate payment system, which brings a 6800xt under $500.
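The discount math in that post checks out; a quick sanity check (the $550 price and $350 threshold are the post's numbers, and it's assumed the 15% applies to the post-rebate price):

```python
price = 550.00             # Newegg 6800 XT after rebate, per the post
discount_threshold = 350.00  # 15% off orders over $350 (alt payment system)
discount_rate = 0.15

final = price * (1 - discount_rate) if price > discount_threshold else price
print(f"${final:.2f}")  # $467.50, comfortably under $500
```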
> ETA Prime did a 7950 build and the results are surprising.

It's designed to run at 95 degrees 24/7. It will only actually do that if given demanding workloads that aren't relevant to gaming. But the 7950X is a workstation-lite chip that can outclass older Threadrippers, so some people will run it hard.
Yes, boosting to 95 degrees is by design, but it appears to only do that when the system is under heavy load like benchmarking. In practice it seems to only get as hot as it needs to, with it hovering around 60-70 degrees or so while gaming at 4k ultra settings.
Here's what I suspect. The tech YouTubers and commenters are going to shit all over it because they're just looking at the behaviour while benchmarking and assuming that's how it is 24/7. It'll be interesting to see how the chip performs in more realistic situations.
> There's no game in existence that needs 16 cores to be running at over 5 GHz simultaneously. I hope.

Ah, I see you’ve never played Dwarf Fortress.
> Ah, I see you’ve never played Dwarf Fortress.

That's the game that needs a mountain of 3D cache.
> Any chance of getting a 3090TI at less than 4 digit prices?
> I'm badly wanting to abandon my thirdhand 900 series and not need another GPU for another decade but the price of that one component is my whole build budget at the moment.

Why do you need that much power?
> US? Recently saw a 6700xt for $370 on Amazon. Newegg had a 6800xt for I think $550 after a rebate. They were/are running a 15% off over $350 if you use an alternate payment system. Brings a 6800xt under $500.

Most users tend to only buy Nvidia due to mind share, which is retarded as fuck. I blame the current state of the market on Nvidiots, as usual.
Nvidia prices are still whack, though.
> Why do you need that much power?

If the hashcat benchmarks I'm finding are right, it's literally ten times faster, which would be extremely convenient and much less headache than finding 3 other matching cards for my older one.
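The trade-off above is easy to put in numbers. A quick sketch, normalizing the old card's hash rate to 1.0 and taking the post's "ten times faster" claim at face value (both figures are assumptions from the post, not measured benchmarks):

```python
old_card_rate = 1.0   # normalized hash rate of the current card
new_card_rate = 10.0  # ~10x, per the hashcat benchmarks cited above

# Option A: one new card. Option B: hunt down 3 more matching old cards.
single_new = new_card_rate
four_old_rig = 4 * old_card_rate

print(single_new / four_old_rig)  # 2.5: one new card out-hashes a 4-card rig
```

So even a full 4-way rig of the old cards would deliver less than half the throughput of the single new card, before counting the extra power draw and multi-GPU headaches.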
If your budget is limited, it would make sense to accept playing some heavier games without RTX (it gets old anyway once you've enabled it and seen all the cool reflections), and at lower resolutions, like 1440p.
If you want 60-100fps at max settings @4k, you're just putting yourself on a path of frustration if you're not rich.
> Most users tend to only buy Nvidia due to mind share, which is retarded as fuck. I blame the current state of the market on Nvidiots, as usual.

And the old AMD reputation. Back in 2008 you really needed nVidia if you wanted your games to run right. I remember swearing off of Radeon for that reason. That was 14 years ago though.
> And the old AMD reputation. Back in 2008 you really needed nVidia if you wanted your games to run right. I remember swearing off of Radeon for that reason. That was 14 years ago though.

I've said this before, but it's funny how many home PC builders act like they're pro streamers/3d modelers/machine learning devs that just have to have shit like CUDA/nvenc/muh tensor cores to live.