> UK Royal Navy builds 'esports suite' loaded with gaming PCs onboard its newest warship — eight high-powered PC battle stations added to war room
Oh god, and they're the nightmare Alienware prebuilds as well.
> I always wondered why they switched. I remember seeing GCN/Vega reviews and they were going toe to toe with the Titan-class cards in productivity when they came out. It was a genuine strong point of AMD at the time, but then they split into separate gaming and compute divisions for RDNA/CDNA. Just really odd choices.
RDNA absolutely sucked for general compute. Don't ask me why, I don't know the details. I just know that it did and couldn't really be made to work. I am not an EE, but my Dad works at Nintendo.
> RDNA absolutely sucked for general compute
GPU architectures in general tend not to be good at it. That's why they're GPUs and not CPUs.
> I wonder what the catches will be with UDNA. They'll have to nerf the consumer cards in some ways to avoid undercutting the pro cards (FP64 being an obvious one).
Probably just different VRAM amounts. I don't see them massively differentiating them again, as that would kill the goal of unifying compute and gaming.
> RDNA absolutely sucked for general compute
They probably completely stripped anything related to compute out.
> They probably completely stripped anything related to compute out.
Shaders wouldn't work if they did. All I know is that when it came to ingesting large arrays and executing arbitrary code, it was woefully inefficient and couldn't really be fixed. I don't understand nearly enough about electronics design to explain why.
> Don't you own a 5090?
It's a bit underwhelming. AI performance is nice, but the gaming performance uplift is a lot less impactful than I had anticipated coming from my 3090.
> Shaders wouldn't work if they did. All I know is that when it came to ingesting large arrays and executing arbitrary code, it was woefully inefficient and couldn't really be fixed. I don't understand nearly enough about electronics design to explain why.
From some quick reading, it seems like CDNA is like a really fast CPU (I am most likely wrong), while RDNA functions more like a typical GPU.
Some differences I can find online:
- CDNA has higher compute density, i.e. a greater % of the die area is for doing math
- CDNA has smaller warps (16 vs 32). A lot of HPC & AI code has branching and other exceptional behavior that cause warps to stall, so smaller warps execute more efficiently (there's a rough divergence sketch after this list).
- RDNA is designed for low latency, while CDNA is designed for high bandwidth. Latency doesn't matter much for HPC or AI.
- CDNA cache was more optimized for bandwidth than capacity
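To illustrate the branching point above: when lanes in the same warp take different sides of a branch, the hardware runs both paths with lanes masked off, so one slow lane drags the whole warp. This is only a minimal sketch, written in CUDA (where warps are 32 wide) because that's the most familiar syntax; the kernel name, the data pattern, and the "one slow element per 32" setup are all invented for the example, and the same idea applies to AMD wavefronts.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Threads whose input is <= 0 take a much more expensive path.
// Every warp that mixes both cases executes both paths with lanes
// masked off, which is the stall/efficiency issue described above.
__global__ void divergent_kernel(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    if (in[i] > 0.0f) {
        out[i] = in[i] * 2.0f;              // cheap path
    } else {
        float acc = in[i];                  // expensive path
        for (int k = 0; k < 256; ++k)
            acc = acc * 0.5f + 1.0f;
        out[i] = acc;
    }
}

int main()
{
    const int n = 1 << 20;
    float *in, *out;
    cudaMallocManaged(&in,  n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i)
        in[i] = (i % 32 == 0) ? -1.0f : 1.0f;  // one slow lane per 32 threads

    divergent_kernel<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();
    printf("out[0] = %f\n", out[0]);

    cudaFree(in); cudaFree(out);
    return 0;
}
```

The wider the warp or wavefront, the more lanes can end up idle on a rare branch, which is the intuition behind the "smaller warps execute more efficiently" claim.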
> GPU architectures in general tend not to be good at it. That's why they're GPUs and not CPUs.
NVIDIA GPUs are very good at it, and by "general compute," I mean generic computations on large arrays of data, or what used to be called GPGPU.
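To make "generic computations on large arrays" concrete, here is a minimal sketch of the classic GPGPU pattern: one simple operation applied to every element of a big array, one thread per element. It's written as a CUDA SAXPY example purely for illustration (the sizes and values are made up); the equivalent HIP code for AMD cards is nearly identical.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// SAXPY: y = a*x + y over a large array, one thread per element.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 24;                  // ~16M elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f (expect 4.0)\n", y[0]);
    cudaFree(x); cudaFree(y);
    return 0;
}
```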
> Is it that by default, when GPUs constantly whiplash between boosting to the highest clock possible, then overcorrect to a safer one, they end up being less performant than if you were to force them at a given nominal frequency and not let them exceed it? Is that what I'm noticing?
Yep. I think the idea is that games are so badly optimized that boosting high during a bottleneck and dropping lower during light utilization gives a better framerate. But in practice it doesn't work out.
> Don't you own a 5090?
Yes, but even a 5090 is cheap compared to $10,000+ workstation cards.
> Close, but it's not an overcorrection. They boost to the highest clock possible, which gets the GPU absolutely blazing hot, which then requires throttling it back so it can cool down. This results in the average speed being less than if you just held it at a higher, but thermally safe, setting. CPUs do the same thing with even more extreme clock variation.
Prob why undervolting is all the rage in the overclocking scene. Helps cull thermals and power draw a little.
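To put rough, made-up numbers on that boost-then-throttle explanation: a card that boosts to 2.9 GHz for 10 seconds, overheats, and then spends 20 seconds throttled to 2.2 GHz averages (10×2.9 + 20×2.2) / 30 ≈ 2.43 GHz, whereas simply holding a sustainable 2.6 GHz the whole time is both faster on average and steadier frame-to-frame.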
> Prob why undervolting is all the rage in the overclocking scene. Helps cull thermals and power draw a little.
Even just dropping the power limit from 100% to 80% can save a ton of power while barely hitting the framerate; a lot of cards are totally juiced out of the box.
Doesn't help that Nvidia has been pushing the housefire connector while jacking the hell out of the power draw each gen.
> Even just dropping the power limit from 100% to 80% can save a ton of power while barely hitting the framerate; a lot of cards are totally juiced out of the box.
Yeah, I dropped the power limit to 450W since I'm a bit paranoid when it comes to the power connector. The card also isn't as much of a furnace compared to stock.
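For anyone who'd rather script that kind of cap than set it in Afterburner, here's a minimal sketch using NVML (the management library behind nvidia-smi). The device index 0 and the 450 W value are just placeholders, the set call needs admin rights, and whether the driver accepts a given cap depends on the card's allowed range, which the constraints query reports.

```cuda
#include <cstdio>
#include <nvml.h>   // build with: nvcc (or g++) power_cap.cpp -lnvidia-ml

int main()
{
    nvmlReturn_t rc = nvmlInit();
    if (rc != NVML_SUCCESS) {
        fprintf(stderr, "nvmlInit failed: %s\n", nvmlErrorString(rc));
        return 1;
    }

    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);          // GPU 0, assumed

    // Report the range of power limits the card will accept (milliwatts).
    unsigned int min_mw = 0, max_mw = 0;
    nvmlDeviceGetPowerManagementLimitConstraints(dev, &min_mw, &max_mw);
    printf("allowed power limit: %u..%u mW\n", min_mw, max_mw);

    // Cap the card at 450 W (value in milliwatts); requires admin rights.
    rc = nvmlDeviceSetPowerManagementLimit(dev, 450000);
    printf("set 450 W cap: %s\n", nvmlErrorString(rc));

    nvmlShutdown();
    return 0;
}
```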
> Yeah, but then you don't get your card at the top of Tom's Hardware's next bar chart, and no YouTuber will soyface for you.
Even at 450W the 5090 still smokes every other card on the market.
> Even at 450W the 5090 still smokes every other card on the market.
Listen, when it comes to getting the coveted Soyface Award, you can't be too careful. One day you're running your card at a wattage that doesn't melt connectors, and the next your competitor releases a card juiced so hot that all the YT thumbnails with your card are getting the dreaded scowling head smack.
> Listen, when it comes to getting the coveted Soyface Award, you can't be too careful. One day you're running your card at a wattage that doesn't melt connectors, and the next your competitor releases a card juiced so hot that all the YT thumbnails with your card are getting the dreaded scowling head smack.
Awww man! I wish I had done a shunt mod to my $2000+ graphics card to get an extra 5 fps in UE5 slop. Clearly that Cobson lookalike called Frame Chasers has the right idea.
A lawsuit against the Taiwan Semiconductor Manufacturing Company (TSMC), initially filed last year, has been expanded to include 17 plaintiffs who allege a host of discriminatory and unsafe practices at the company. The suit centers on TSMC's much-hyped Arizona plant, with the plaintiffs, all of whom are American citizens, alleging that the firm discriminated against them during their time at TSMC Arizona. The allegations include a preference for Mandarin or Chinese speakers during the hiring process, routine bias against non-Taiwanese employees, and unsafe working conditions.
> Here's your driver problem, Nvidia.
I swear I've posted here that their drivers suck because they moved all their competent staff over to the AI side and left the GeForce team with the leftovers.