GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I don't see any reason to distrust their benchmarks, I doubt they have time to hand-sort every benchmark and cherry-pick from thousands of runs to add a -3% anti-AMD bias.
They kind of arbitrarily decide what's "effective" or not. When they mention "effective FPS" or "effective speed", they're referring to a proprietary system they came up with that they haven't explained to anyone. They say high average FPS isn't indicative of a realistic experience, which is true, but then they don't explain what they actually mean with "effective FPS". They explain what 0.1% and 1% lows are, and they even have math for that, but no such thing exists for their "effective fps" metric. There are also things like this from their "effective CPU speeds" page:

Updates

Our indices are based on today’s performance requirements, we don't predict the future.

July 2019

We reduced the contribution from thread counts higher than eight. The 32-core AMD 2990WX moved from first position to 48th. Meanwhile the 8-core Intel 9900K moved from 7th to first position.

Smear campaign

Within hours of the July 2019 changes, userbenchmark was subjected to an intense and coordinated smear campaign, increased cyber-attacks and personal threats by parties claiming to be AMD fans (for the record, no other brand “fans” exhibit this behaviour). Their very specific demand was to rebalance our CPU Effective Speed in AMD's favour.

November 2020

During the Ryzen 5000 release event, as well as discussing the importance of single core performance and CPU latency, AMD provided benchmarks for 10 games of their choice. According to AMD's official figures, UserBenchmark overestimates Ryzen 3000 by ≈ 5%. Meanwhile, AMD "fans" continue to smear UserBenchmark via an army of anonymous accounts on reddit, youtube, forums and deal sites.

The link about them saying they overestimated AMD is funny because when you go to it, it's actually them saying they underestimated Intel by 5%, but maybe they thought it would sound like they're being generous to AMD by phrasing it like that.

Links to their effective FPS and effective speed pages:
 
56k is just the earliest point most Americans got online because the release of the first mass-market 56k modems in 1997 almost coincided with AOL's shift to a flat $20/mo fee (as opposed to pay-per-minute) which caused the population of internet users to 10x almost overnight. Even most boomers would have first gotten online right around then just because it was quite expensive prior to 1996/1997.
yeah but the apple II GS came out in 1986 my guy, there's an entire decade of time between "my first computer was an apple II" and "i first connected to the internet using 56k"

i don't tell people my first computer was a computer my mom bought when i was a kid that they replaced with a newer computer lmao
 
They kind of arbitrarily decide what's "effective" or not. When they mention "effective FPS" or "effective speed", they're referring to a proprietary system they came up with that they haven't explained to anyone. They say high average FPS isn't indicative of a realistic experience, which is true, but then they don't explain what they actually mean with "effective FPS". They explain what 0.1% and 1% lows are, and they even have math for that, but no such thing exists for their "effective fps" metric.

You're acting like the formula is something like

measured_fps * (0.9 if gpu_vendor == "AMD" else 1.0)
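
For what it's worth, the 1% / 0.1% lows mentioned above are trivial to reproduce from raw frame times; here's a rough Python sketch of one common way to do it (my own illustration, not UserBenchmark's actual code, and outlets differ slightly on the exact method):

# Rough illustration of the 1% / 0.1% "lows" math: report the average FPS of the
# slowest 1% (or 0.1%) of frames, which is what makes stutter show up in one number.
# frame_times_ms is a list of per-frame render times in milliseconds.
def percentile_low_fps(frame_times_ms, percentile):
    worst = sorted(frame_times_ms, reverse=True)          # slowest frames first
    count = max(1, int(len(worst) * percentile / 100))    # e.g. the worst 1% of frames
    avg_ms = sum(worst[:count]) / count                   # mean frame time of that slice
    return 1000.0 / avg_ms                                # convert ms/frame back to FPS

frames = [16.7] * 990 + [50.0] * 10                       # mostly 60 FPS with a few stutters
print(round(percentile_low_fps(frames, 1), 1))            # 1% low: 20.0
print(round(1000.0 / (sum(frames) / len(frames)), 1))     # plain average FPS: ~58.7, hides the stutter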

AMD skimps on bandwidth (ironic given what they do with EPYC) and is perpetually behind on RT, which more and more games support. This means they lose big on a lot of benchmarks. Some of the more egregious numbers from Gamer Jesus:

Game               | 9070 XT vs 5070 Ti (Rasterization) | (Raytracing)
FF XIV             | -23%                               | n/a
Black Myth: Wukong | -12%                               | -43%
Dying Light 2      | -12%                               | -19%
Cyberpunk 2077     | +3%                                | -19%

Trying to claim they're "about even" is just some kind of brand bias. Maybe a certain set of very online gamers loves them because they're the underdog, or because they open-source their drivers, or because Jensen Huang is an annoying faggot. But every gen for over a decade, AMD's been lagging here, cutting corners there, and producing imbalanced GPUs that shit their pants in key scenarios and are behind on feature set. I bought into the AMD hype once and got a disappointing product with very uneven performance. They still haven't caught up. Sure, a 9070 XT is selling for about $250 less than a 5070 Ti, but there's a very good reason for that.

There are also things like this from their "effective CPU speeds" page:

Seems reasonable to me. The improvement to the typical user experience by adding more cores is more logarithmic than linear, and this ironically benefits AMD given that Intel's blowing out core counts and typically leading multi-thread performance with E-cores lately.
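
Something like the sketch below is all a diminishing-returns weighting takes; to be clear this is purely illustrative, the weights are made up, and it has nothing to do with their actual (unpublished) formula:

import math

# Toy "effective speed" index with diminishing returns on core count: full logarithmic
# credit up to 8 cores, heavily discounted credit beyond that, loosely mirroring the
# "reduced the contribution from thread counts higher than eight" change quoted above.
# The 0.35 / 0.05 weights are completely made up for illustration.
def toy_effective_speed(single_core_score, core_count):
    credited = min(core_count, 8)
    extra = max(core_count - 8, 0)
    multi_bonus = 1 + 0.35 * math.log2(credited) + 0.05 * math.log2(1 + extra)
    return single_core_score * multi_bonus

print(round(toy_effective_speed(100, 8)))    # strong 8-core: 205
print(round(toy_effective_speed(85, 32)))    # many weaker cores: 194, ends up ranked lower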
 
The entire userbenchmark website and benchmark experience is extremely deceptive and constructed to manipulate you into thinking the results they're showing you are data-driven when they aren't. The e-fps they list is NOT derived from user aggregate, it's actually just the test results THEY got from THEIR test bench. This is not clear based on their website design; in fact it's purposefully designed to lead you to the opposite conclusion. That doesn't mean you can't use the benchmarks at all, both are valid methodologies, but the fact that they're purposefully confusing you on one for the other is EXTREMELY SUSPECT.

Why are they trying to convince users that their fps results are the results of aggregate testing, rather than specific benchmarks? Why is the website using the appearance of being a collective aggregate user benchmark website to pre-emptively defend themselves against criticism of unclear data collection practices? Uhhh... probably because they're fucking with the numbers, chief
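
If the e-fps really were crowd-sourced, you'd expect something like a trimmed median over every user submission rather than one lab number; a toy contrast between the two (all values invented):

import statistics

# Toy contrast between the two methodologies being conflated: (a) an aggregate of many
# user-submitted runs with the junk trimmed off, vs (b) one number from a single test
# bench. All values are invented.
user_submitted_fps = [141, 138, 152, 95, 149, 144, 60, 147, 150, 143]

def trimmed_median(samples, trim_fraction=0.1):
    ordered = sorted(samples)
    k = int(len(ordered) * trim_fraction)              # drop the most extreme runs on each end
    trimmed = ordered[k:len(ordered) - k] or ordered
    return statistics.median(trimmed)

print(trimmed_median(user_submitted_fps))              # crowd-sourced figure: 143.5
print(151)                                             # what a single in-house bench run might report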
 
The entire userbenchmark website and benchmark experience is extremely deceptive and constructed to manipulate you into thinking the results they're showing you are data-driven when they aren't. The e-fps they list is NOT derived from user aggregate, it's actually just the test results THEY got from THEIR test bench. This is not clear based on their website design; in fact it's purposefully designed to lead you to the opposite conclusion. That doesn't mean you can't use the benchmarks at all, both are valid methodologies, but the fact that they're purposefully confusing you on one for the other is EXTREMELY SUSPECT.
Sounds like hearsay tbh.
 
You know the way men over forty talk about how "manly" it is to drive their muscle car with a V8 carbureted engine and manual transmission on the highway and that anything else just wouldn't be a "real" car?
That's probably how I look for preferring, not just out of comfort but out of principle at this point, to use a desktop with its big screen and replaceable parts that almost exclusively runs free and open source software, instead of using a horrid tablet and relying on cloud storage or SaaS like most people
you use open sauce programs for peacocking reasons from your midlife crisis.
i use open sauce programs to avoid being sued till i'm penniless because i used a cracked program, as i've learned in school how to dodge copyright troll cuckery.

we are not the same.
Depends on the CPU and the game. Some games are more CPU bound. The X3D CPUs can really shine in some of these instances. IIRC, Cyberpunk might be one of those games, but I can't remember for sure. The real crime there is recommending anyone buy Intel 13th gen when they're well known to suffer from horrible manufacturing issues that cause the CPUs to die over time
not just cyberbug, a lot of games benefit from it because of the L3 cache handling data. single/multi threading doesn't matter that much for how big the bonus is; cache size for the data being crunched does, and X3D chips have lots of L3. sim games benefit heavily due to the calculations needed, even if johnny zoomie can gayme better on his CS2/Bobox account... were it not for me intentionally going for a meme build i'd have picked the 5800X3D back then and would enjoy less retarded ai and economy in X4, but... retard gotta retard i guess :ow:
Anyone defending userbenchmark is retarded. They've been a joke for years.
precisely since Ryzen released; something buck broke in the guy once he saw Ryzen rawdogging Intel, and he's been a shadow of his former self ever since.
 
Sounds like hearsay tbh.


brother, i don't know what to tell you, it's literally just how the website is laid out. They are doing two separate things and motte-and-baileying between them. It doesn't compromise its ability to be used as a benchmark, but they're specifically manipulating first impressions for people who aren't autistic enough to do nitty gritty comparisons - which is the entire reason people get recommended websites like this in the first place. This shit is so jewish it's unreal
 
i have literally never seen that page before today, i just look at this one


amd btfo again
 
Trying to claim they're "about even" is just some kind of brand bias.
You picked 4 games, 3 of which are single player, and didn't show any of the fps numbers. for all we know you could have a situation where the 5070 Ti gets 100 and the 9070 XT gets 80. Fps differences like that don't make a huge difference in single player, and ffxiv is an mmo so fps isn't super important there either (60+ is fine). (ignore this part)

You can't say the 13600k is a better buy over the 9800x3d because "It can get you to 144 fps" but ignore the fact that for most games the 9070 XT is a 1440p 120+ / 4k60 card. By your own standard the AMD card is a better buy, and functionally it would be the same or better than the NVIDIA because it's much cheaper
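
The "much cheaper" point is easy to sanity-check as cost per frame; the sketch below uses placeholder prices and FPS figures, not real measurements, so plug in actual street prices and your own benchmark averages:

# Toy cost-per-frame sanity check. Prices and FPS figures below are placeholders, not
# measurements; the only point is that a large price gap can outweigh a modest fps gap.
cards = {
    "9070 XT": {"price": 600, "avg_fps": 110},
    "5070 Ti": {"price": 850, "avg_fps": 125},
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['avg_fps']:.2f} per average FPS")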
 
You picked 4 games, 3 of which are single player, and didn't show any of the fps numbers
"Those games are single-player" is a new cope for sure.

for all we know
You can click the link to the original article, is your mouse broken?

Fps differences like that don't make a huge difference in single player, and ffxiv is an mmo so fps isn't super important there either (60+ is fine).

AMD GPUs perform equal to NVIDIA at the same tier
I don't use raytracing because of the FPS hit
Actually FPS doesn't even matter
===== YOU ARE HERE =====
Nobody even plays those games
I prefer 1080p, really
AMD supports the Linux community
I use arch btw

Either AMD GPUs perform equally to their NVIDIA counterparts, or they don't. This isn't a matter of preference, what games you like or whether you think the $250 premium for the 5070 Ti's superior performance is worth it. It's a matter of objective numbers. No amount of coping changes the fact that a 9070 XT isn't equal to the 5070 Ti. Maybe AMD will finally get it right with RDNA6.
 
Either AMD GPUs perform equally to their NVIDIA counterparts, or they don't
They do in most games. There are RT titles where the performance is similar. They only fall short in PT and the odd game. There are many benchmarks that validate this. Just as easily as you picked those games that the 9070 XT lost badly, I could use CoD as proof that it's a 4090 competitor, but when you look at the average performance across most games the difference is negligible. The article you posted shows they are very similar cards performance wise.
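
For reference, "average performance across most games" is usually a geometric mean of per-game ratios, which is exactly why one CoD-sized outlier barely moves it; a quick sketch with invented numbers:

import math

# How an "average across most games" is typically computed: a geometric mean of
# per-game performance ratios (card A / card B, 1.0 = identical). Values are invented;
# the point is that one outlier in either direction barely moves the overall number.
per_game_ratio = [0.97, 1.03, 0.88, 1.00, 0.95, 1.25, 0.92, 1.01]

geo_mean = math.exp(sum(math.log(r) for r in per_game_ratio) / len(per_game_ratio))
print(f"overall ratio: {geo_mean:.3f}")   # ~0.996 despite the 0.88 and 1.25 entries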
 
Trannybenchmark has had a vendetta against AMD ever since 1st gen Ryzen, the fag who runs it is a total schizo. The ONLY value that website has is to see how your specific hardware stacks up against other users with the same hardware, to see if you have a bottleneck somewhere, or if a piece of your hardware is underperforming or broken. Their built-in synthetic benchmarks should NOT be used to compare apples to oranges.
 
They do in most games. There are RT titles where the performance is similar. They only fall short in PT and the odd game. There are many benchmarks that validate this. Just as easily as you picked those games that the 9070 XT lost badly, I could use CoD as proof that it's a 4090 competitor, but when you look at the average performance across most games the difference is negligible. The article you posted shows they are very similar cards performance wise.

"Average performance across most games" isn't equality. Equality is equality. Equality means the player's experience is the same. Not "statistically the same across most games." Not "the same unless the game uses cutting-edge rendering technology." Not "the same as long as you aren't expecting to play anything from this list of current, popular games." The same.

There are lots of popular, mainstream titles out right now where the 9070 XT simply isn't cutting it. Some games where its performance throttles down almost as low as the 5060 Ti. The market's discounted it accordingly.
 
"Average performance across most games" isn't equality. Equality is equality. Equality means the player's experience is the same. Not "statistically the same across most games." Not "the same unless the game uses cutting-edge rendering technology." Not "the same as long as you aren't expecting to play anything from this list of current, popular games." The same.

There are lots of popular, mainstream titles out right now where the 9070 XT simply isn't cutting it. Some games where its performance throttles down almost as low as the 5060 Ti. The market's discounted it accordingly.
Are these metrics accounting for the popularity of games? Nobody cares if Concord requires an i11-99999x, for example.
 
All I know is that my buddy who has a 7900 GRE had constant driver timeouts and game compat issues that required fiddling, whereas outside of last year when Nvidia shat the bed back to back with their GeForce drivers, I've rarely had to fiddle with my 4080 to get shit to work.
 
There are lots of popular, mainstream titles out right now where the 9070 XT simply isn't cutting it.
List 10 of them and they need to have come out in the last 2 years.
"Average performance across most games" isn't equality. Equality is equality. Equality means the player's experience is the same.
Then no gpus are equal. And no one views it that way; when we say equality we just mean similar enough. it'd be stupid if i said a 3080 is not equal to a 4070 because i can find 2 games where it runs better than it. That's retarded
 
Then no gpus are equal
It's true, nothing AMD has made has been the equal of an NVIDIA card since the DX9 days.
it'd be stupid if i said a 3080 is not equal to a 4070 because i can find 2 games where it runs better
I can find way more than 2 games where the 9070 is inferior to a 5070 Ti so this is kind of a red herring.

This would include 100% of games with DLSS4 support and 100% of games that support DLSS and not FSR4.
 
All I know is that my buddy who has a 7900 GRE had constant driver timeouts and game compat issues that required fiddling, whereas outside of last year when Nvidia shat the bed back to back with their GeForce drivers, I've rarely had to fiddle with my 4080 to get shit to work.
Not sure what they were doing. When I was using Windows, I just installed the AMD Catalyst drivers and played the games.

On Linux the drivers are built into the OS. I did have to set up and install corectrl to get the best performance in Helldivers 2. But that was more distro-specific stuff than anything wrong with the card/drivers itself.
 
I remember running some benchmark software on my computer. Noticed the next day there was over 60 GB of space on my C: drive taken up by crap left behind by it. It was testing the SSD read and write operations? IDK if it was userbenchmark but I'm still blaming them anyhow.
 
i don't know where to post this, but it's a sort-of GPU/AI related video from Vex.
my dude bought a shitty NV3, which doesn't even have DRAM, for $62 and now it's $143, more than double the price, insanity.
 