GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

How would they relay that information to the viewer, though? You either have to trust their opinion on it or hope that what they show you on screen is an accurate representation of what you would see in person... which it probably isn't going to be.

You summarize the results of a double-blind with a write-up after you do the study. They're pretty effective at sussing out whether perceived differences are real or psychosomatic.

A simple double-blind would be to have people play or view a series of games for a few minutes, then write down whether they think upscaling is on, and whether there's "a lot" or "a little." This would tell us:
  • Whether people can actually tell if upscaling is on
  • Whether there's a significant difference between DLSS & FSR when it comes to people realizing upscaling is on.
  • How much the level matters.
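For concreteness, here's a hypothetical sketch of how one participant's session could be randomized for such a test (game names and the settings list are placeholders, not from any real study):

```python
# Hypothetical sketch of randomizing one participant's session for the
# proposed double-blind: a series of clips, each with a hidden upscaling
# setting. Game names and the settings list are placeholders.
import random

GAMES = [f"game_{i}" for i in range(1, 11)]
SETTINGS = ["native", "dlss_quality", "dlss_performance",
            "fsr_quality", "fsr_performance"]

def make_session(seed: int) -> list[tuple[str, str]]:
    """Return a shuffled list of (game, setting) clips with no repeated game."""
    rng = random.Random(seed)
    games = rng.sample(GAMES, k=len(GAMES))      # no repeats
    return [(g, rng.choice(SETTINGS)) for g in games]

# The person running the session never sees this list (that's the
# double-blind part); only the analysis maps responses back to settings.
print(make_session(seed=42))
```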

Maybe they don't accept the results because they honestly see/hear/taste it differently than the people who did the test?

Psychosomatic effects are "real" but just establish that your senses are affected by your thoughts. The label on a bottle of wine affects what people actually taste, despite obviously not affecting a single molecule in the wine itself. When an audiophile thinks his music sounds better after buying a magical chip, he's not lying. But it doesn't mean those things actually do anything other than influence your mind.
 
A simple double-blind would be to have people play or view a series of games for a few minutes, then write down whether they think upscaling is on, and whether there's "a lot" or "a little." This would tell us:
  • Whether people can actually tell if upscaling is on
  • Whether there's a significant difference between DLSS & FSR when it comes to people realizing upscaling is on.
  • How much the level matters.
Idk, I could see it being more reliable if they get a number of people to establish some sort of average. No idea what that number of people would be. Then at least people could know, based on an average perception, this is what X will probably look like.
 
Part of it is also how much quality a reasonable person would be willing to sacrifice, which details, etc.

A blurry wine label on a bottle, if the bottle's like, 5 metres away, probably won't concern most people, since they could just walk closer to the bottle, and ideally have it get clearer.

Low-poly birds in the background, or trees that seem a bit basic, etc. People probably won't mind if it lets you get 10-15 extra fps.

Has to be within reason though.
 
"Lets fix subjective reviews by introducing even more subjective assessments." Yeah, no thank you.
It's very simple; what can your card achieve with normal rasterization? What can your card achieve with RTX? What is the price? What is the power usage?

All the rest is bullshit. You can already look at every DLSS and FSR comparison. And if you are already set on DLSS then there's no point even watching an AMD review.
Not to mention that those features are constantly being updated. If FSR5 comes out in 6 months, your review is suddenly pointless.
 
"Lets fix subjective reviews by introducing even more subjective assessments." Yeah, no thank you.
It's very simple; what can your card achieve with normal rasterization? What can your card achieve with RTX? What is the price? What is the power usage?

All the rest is bullshit. You can already look at every DLSS and FSR comparison. And if you are already set on DLSS then there's no point even watching an AMD review.
Not to mention that those features are constantly being updated. If FSR5 comes out in 6 months, your review is suddenly pointless.
Yep. Again places like DF already do comparisons so I really don't see why YouTubers would care.
 
We need ray tracing to accurately render moments like this.



Reviewer records 150°C temperature on 12V-2×6 PSU power cable connected to RTX 5090 graphics card



AMD’s Next-Gen Ryzen Zen 6 “Medusa Ridge” CPUs To Come In 12, 24 & 32 Core Flavors, Up To 128 MB L3 Cache (archive)

Chinaman says that there will be a 32-core Zen 6C on AM5, and that the regular 12-core Zen 6 CCD will include 48 MiB of L3 cache, up from 32 MiB, keeping the same 4 MiB-per-core ratio. Both of these are a little tough for me to believe. I'm going to assume the reference to 128 MiB of L3 on a Zen 6C CCD is a mistake, and that it's 32 cores in one CCD, not two as Wccftech incorrectly states. A 32-core Zen 6C CPU doesn't make much sense for most consumers, but embedded/enterprise CPUs have made it onto AM5 before, such as the Epyc 4004 series, which are just rebranded Ryzen 7000 parts with more features enabled.

Going to 48 MiB for a 12-core CCD is aggressive, since L3 cache doesn't scale well with node shrinks. But contrary to some reports, scaling isn't completely dead: TSMC N2 supposedly increases SRAM density by 19%. So if Zen 6 skips TSMC's 3nm nodes entirely (the Zen 5 CCD is on N4X), maybe 12 cores and 48 MiB is possible within a 75 mm² chiplet.
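Back-of-envelope with those same numbers (treating the 19% density uplift as applying to the whole L3 array, which is an assumption):

```python
# Rough SRAM area scaling for the rumored Zen 6 CCD.
# Assumption: TSMC N2's claimed 19% SRAM density gain applies uniformly.
zen5_l3_mib = 32
zen6_l3_mib = 48
n2_density_gain = 1.19

capacity_ratio = zen6_l3_mib / zen5_l3_mib       # 1.5x the cache
area_ratio = capacity_ratio / n2_density_gain    # ~1.26x the L3 area

print(f"{capacity_ratio:.2f}x capacity -> ~{area_ratio:.2f}x L3 area")
# Even on N2, the bigger L3 would cost ~26% more die area than Zen 5's,
# which is why 48 MiB in a ~75 mm^2 chiplet is an aggressive target.
```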

Zen 6 on N2 is plausible because there's a narrative that more Zen 5 products were supposed to use N3 but AMD held back because of early yield issues. In particular, Strix Point was originally planned to use chiplets, but ended up as a monolithic N4P die with mixed cores.

This also means X3D would change. It's 32+64 = 96 MiB on a 3D V-Cache-equipped chiplet now. With the same 64 MiB cache chiplet, you would get 112 MiB, and if they decide to triple the total again, 144 MiB. There should be diminishing returns from increasing L3 cache, but it will be highly dependent on how specific games utilize the cache. There could be some hard-to-quantify benefit for running games and other applications at the same time.
 
Idk, I could see it being more reliable if they get a number of people to establish some sort of average. No idea what that number of people would be. Then at least people could know, based on an average perception, this is what X will probably look like.
"On a scale from -4 to +7, to what extent do you think this DLSS/FRS screenshot deviates from the reference rasterizer. Thank you, next slide please..." - it's impossible, unless there's some serious errors happening.

iirc it was with D3D9 that Microsoft put their foot down and told GPU manufacturers that the image they render had better be the exact same image their hardware was instructed to render, or else they wouldn't be DirectX certified. No deviations! DLSS and FSR kind of mess with that.
 
Idk, I could see it being more reliable if they get a number of people to establish some sort of average.

That's what a double blind study is. You can't do a double blind with one person. You need enough people to get meaningful statistics.

No idea what that number of people would be. Then at least people could know, based on an average perception, this is what X will probably look like.

A dozen is a good start. The outcome of the study would be something like this.

In a double-blind study, participants were asked to identify if the game was using upscaling. Participants were shown uncompressed game footage running at 60 fps and 1440p, with settings adjusted as needed to maintain this frame rate depending on the upscaling used. A control group was shown only games without upscaling and identified the games as using upscaling 15% of the time. For the rest of the study, each participant was shown a randomized series of game footage for ten games, with randomized settings, five minutes of footage each, no repeats. Correct identification of the presence of upscaling in the randomized test is as follows:

Setting            Identification rate
Native             23%
DLSS Quality       28%
DLSS Performance   44%
FSR Quality        55%
FSR Performance    92%

Conclusions:

There was no statistical difference in the participants' perception between DLSS Quality and Native. DLSS Performance and FSR Quality had similar outcomes: approximately half the time, users believed they were viewing footage at native resolution. FSR Performance was nearly always identified correctly. In interviews conducted after the study, participants noted that certain games had substantial amounts of shimmer, blurring, and poorly-resolved features, and these were in fact the games using FSR Performance.
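If someone actually ran this, the significance testing would be straightforward. A minimal sketch, with counts invented to match the made-up percentages above, assuming roughly 100 trials per condition:

```python
# Sketch: testing whether two conditions from the hypothetical table differ.
# Counts are invented to match the made-up rates above (~100 trials each).
from scipy.stats import fisher_exact

def compare(hits_a: int, n_a: int, hits_b: int, n_b: int) -> float:
    """Fisher's exact test on a 2x2 table of correct/incorrect identifications."""
    table = [[hits_a, n_a - hits_a],
             [hits_b, n_b - hits_b]]
    _, p = fisher_exact(table)
    return p

n = 100  # trials per condition (assumption)
print(compare(23, n, 28, n))  # Native vs DLSS Quality: large p, no real difference
print(compare(23, n, 92, n))  # Native vs FSR Performance: vanishingly small p
```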

LTT, GN, and TH all have the resources to do something like this. TH seems to have a moral objection to inferencing. Certain writers openly seethed about DLSS when it first launched on the 20 series.
 
Ah, blind testing, one of my favorite topics.

I know someone who was insistent on being able to tell the difference between 320 kbps MP3 files and FLACs. We did one test track, and it turns out the CD rip on that one was screwed up. When I nulled the two files, it produced a silent track - there was literally no difference between the two files.

"BUT I HEAR IT!!!!!!" No fucker, I inverted the wave on one of the files and summed it with the other one and it produced a completely silent track. It was a literal +1 + -1 = 0. "BUT I HEAAAAAAAAR IIIIIIIT!"

But the greatest blind test story is when French critics scored California wines better than French wines. Allegedly, one of the French tasters insulted a French wine, thinking it was Californian, and then praised a Californian wine, thinking it was French.
 
But the greatest blind test story is when French critics scored California wines better than French wines. Allegedly, one of the French tasters insulted a French wine, thinking it was Californian, and then praised a Californian wine, thinking it was French.
A classic story
 
AI upscaling and framegen is dogshit because neither Nvidia, nor AMD, nor Intel can implement it in the driver; instead it has to be implemented on a per-game basis by the game devs. You want to play an older title from before the 20 series? Well tough shit chucklefuck, you're not getting our new fancy schmancy toys! Even though some drunk Ukrainian managed to whip up his own upscaler and frame gen that tackles the game window. Obviously nowhere near as high quality as DLSS or FSR, but hey, he's proven you don't need it baked into the engine to be somewhat passable.

I'm sure Nvidia/AMD/Intel could do a better job since they can hook in below the final window render, so it could be of better quality, with fewer oddities and potentially less input lag/overhead, and still be applicable to any DirectX/OpenGL/Vulkan game. Nvidia has shown it can target the game render itself with RTX Remix, at least the DX9 render, so it would be feasible to make, to an extent, DLSS upscaling/frame gen as a driver setting akin to NIS. Something that would sit between the driver and the final window render and do the magic with all of the limitations of such a technique in mind, and potentially run on hardware as old as Turing/Ampere. It wouldn't be as impressive as DLSS/FSR baked into the engine, but that also means the hardware demands of such a technique would be lower.

Clearly, there is demand for at the very least 2x frame gen for emulators and old games that are locked to 30 FPS and can't be unlocked, or that end up with game-breaking bugs if unlocked. So if Nvidia/AMD could come up with something that does that at the driver level better than Lossless Scaling, it would be a great selling point for many. But I guess there's no money in improving older games run on older cards, and there's more money in upselling the newest titles to sell the newest cards. Have you noticed how Nvidia stopped talking about RTX Remix? Yeah, no money to be made on older titles, gotta keep the gravy train going.
 
I finally found a use case for FSR. With triple monitors in some games, it's better to use FSR than not. Just too many pixels on screen for the GPU to deliver 60+ fps.

AI upscaling and framegen is dogshit because neither Nvidia, nor AMD, nor Intel can implement it in the driver; instead it has to be implemented on a per-game basis by the game devs.

AMD can add FSR 2 and Fluid Motion Frames in the driver to any game using DX11 or newer. The reason it doesn't work in DX10 or earlier is that it requires features that aren't present in DX10. NVIDIA has an in-driver upscaler called Image Scaling that is similar to FSR 1.

I'm sure Nvidia/AMD/Intel could do a better job since they can go lower than the final window render, so it could be of better quality, with less oddities and potentially less input lag/overhead, and still be applicable in any DirectX/OpenGL/Vulkan game.

The reason nobody is trying to make a version of these that works on DirectX 10 is that DX10 is 18 years old, and games made during Obama's first term don't actually need help to run well on modern GPUs. Meanwhile, OpenGL is dying, and Vulkan is stillborn.

However, FSR 1 is open source. It's fully implemented in Lossless Scaling and works with anything. The problem with frame interpolation in LS is that it's purely interpolated, so it introduces a full frame of delay. DLSS 4 and FMF appear to extrapolate the next frame instead, so the delay is much, much smaller.
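That full frame of delay is easy to put a number on. A quick sketch (treating the extrapolator's delay as roughly zero is an idealization; real ones still add some overhead):

```python
# Added input latency from interpolated frame generation: the interpolator
# must wait for the *next* real frame before it can show the generated one,
# so it holds back one full real frame.
def interpolation_delay_ms(base_fps: float) -> float:
    return 1000.0 / base_fps  # one real frame held back

for fps in (30, 60, 120):
    print(f"{fps} fps base -> +{interpolation_delay_ms(fps):.1f} ms of added delay")
# 30 fps -> +33.3 ms, 60 fps -> +16.7 ms, 120 fps -> +8.3 ms.
# An extrapolator guesses the next frame instead of waiting for it,
# which is why its added delay can be much smaller.
```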

You want to play an older title from before the 20 series

If you're trying to run a 7-year-old game on new hardware, it probably runs at 200 fps and max settings without upscaling.
 
The recent debacle about 32-bit PhysX and people running second cards just for PhysX has me thinking... why doesn't Nvidia or AMD make a dual-card system for RT? They could devote full GPU dies to raster going forward, and users could also choose to opt in to RT with a separate purchase of a pure RT card. A pure die dedicated to RT cores should also seemingly last quite a while (depending on the die size/number of RT cores).
 
The recent debacle about 32-bit PhysX and people running second cards just for PhysX has me thinking... why doesn't Nvidia or AMD make a dual-card system for RT? They could devote full GPU dies to raster going forward, and users could also choose to opt in to RT with a separate purchase of a pure RT card. A pure die dedicated to RT cores should also seemingly last quite a while (depending on the die size/number of RT cores).

Dual-GPU rendering has always had latency & bandwidth problems. You'd need both your raytracing accelerator and your rasterization accelerator to work on the same triangles. The PCIe bus would probably choke to death.
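Back-of-envelope on why the bus chokes; a sketch under made-up but plausible assumptions (a 4K G-buffer with four render targets shipped across PCIe every frame):

```python
# Rough estimate of the PCIe traffic needed to hand scene data from a
# raster card to a hypothetical RT card every frame. All numbers are
# illustrative assumptions, not measurements.
width, height = 3840, 2160   # 4K
render_targets = 4           # assumed G-buffer layout
bytes_per_pixel = 8          # assumed per render target

per_frame_gb = width * height * render_targets * bytes_per_pixel / 1e9
for fps in (60, 120):
    print(f"{fps} fps -> {per_frame_gb * fps:.1f} GB/s one way")
# ~16 GB/s at 60 fps and ~32 GB/s at 120 fps, against roughly 32 GB/s
# usable on PCIe 4.0 x16 -- before BVH updates or results coming back.
```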
 
Ah, blind testing, one of my favorite topics.

I know someone who was insistent on being able to tell the difference between 320 kbps MP3 files and FLACs. We did one test track, and it turns out the CD rip on that one was screwed up. When I nulled the two files, it produced a silent track - there was literally no difference between the two files.

"BUT I HEAR IT!!!!!!" No fucker, I inverted the wave on one of the files and summed it with the other one and it produced a completely silent track. It was a literal +1 + -1 = 0. "BUT I HEAAAAAAAAR IIIIIIIT!"

But the greatest blind test story is when French critics scored California wines better than French wines. Allegedly, one of the French tasters insulted a French wine, thinking it was Californian, and then praised a Californian wine, thinking it was French.
There are still people who, despite having the Nyquist-Shannon sampling theorem explained to them and having irrefutable mathematical proof in front of their eyes, continue to believe that digital CD audio is an imperfect replica of the 'perfect' analog audio of vinyl. I've run into no shortage of people, even highly educated ones with engineering backgrounds (civil and mechanical, mind you, but still), who absolutely refuse to budge on this.
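The theorem is even easy to demonstrate numerically. A minimal sketch: sample a tone below Nyquist at the CD rate, then evaluate it between the samples with the Whittaker-Shannon formula (the tiny residual comes from truncating the infinite sinc sum, not from sampling):

```python
# Whittaker-Shannon reconstruction: x(t) = sum_n x[n] * sinc(fs*t - n).
# A band-limited signal is recovered between its samples; nothing
# "between the samples" is lost.
import numpy as np

fs = 44_100.0            # CD sample rate
f = 1_000.0              # tone well below Nyquist (22.05 kHz)
n = np.arange(4096)
samples = np.sin(2 * np.pi * f * n / fs)

def reconstruct(t: float) -> float:
    """Evaluate the sampled signal at an arbitrary time t (seconds)."""
    return float(np.sum(samples * np.sinc(fs * t - n)))

t = 2000.5 / fs          # halfway between two samples, away from the edges
err = abs(reconstruct(t) - np.sin(2 * np.pi * f * t))
print(err)               # tiny (~1e-4), shrinking as more samples are kept
```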

So it doesn't surprise me that there will always be holdouts when it comes to inferenced upscaling.
 
All this talk of quality ironically reminds me of the time I was a vidya tryhard. I deliberately turned my graphics settings down because at the time, I thought the shittier graphics made things easier to spot. Looking back, it's doubtful that helped me at all.
digital CD audio
I don't do vinyls cause I'm a pirate and couldn't care less where the rips came from, but the defense of vinyls nowadays is that the tracks are allegedly mastered differently.
 
I deliberately turned my graphics settings down because at the time, I thought the shittier graphics made things easier to spot.

For me, it was to maximize FPS and make them easier to spot.

Plus, I was too poor for anything actually good enough to have both the frames and the visuals.
 
Hello I don't know if this is an appropriate question for this thread but I'm going to shoot anyways

My 14-year-old motherboard got fried; it's giving me bluescreen errors every boot, and I'm worried about it destroying my HDD, which is also 14 years old. I've declared my PC officially dead and am planning to buy a laptop for the time being. I would like recommendations for one; budget is around 1000 USD. It doesn't have to be a hardcore rig, but it should at least have the GPU power of a 1050 or 1080, the CPU power of a 6th-gen i7 or greater, and 16 GB of RAM at minimum.

No apple products, goes without saying.

Thanks in advance
 
All this talk of quality ironically reminds me of the time I was a vidya tryhard. I deliberately turned my graphics settings down because at the time, I thought the shittier graphics made things easier to spot. Looking back, it's doubtful that helped me at all.

There have been many games where turning down your graphics settings makes things like foliage disappear. I remember years ago getting absolutely destroyed in some game whenever I'd hide in bushes... found out some time later that you could make the bushes disappear beyond a distance of about 20 feet by minimizing some slider, so I was just hiding in the open.

Hello I don't know if this is an appropriate question for this thread but I'm going to shoot anyways

My 14-year-old motherboard got fried; it's giving me bluescreen errors every boot, and I'm worried about it destroying my HDD, which is also 14 years old. I've declared my PC officially dead and am planning to buy a laptop for the time being. I would like recommendations for one; budget is around 1000 USD. It doesn't have to be a hardcore rig, but it should at least have the GPU power of a 1050 or 1080, the CPU power of a 6th-gen i7 or greater, and 16 GB of RAM at minimum.

No apple products, goes without saying.

Thanks in advance

Virtually any new laptop will meet your requirements. A laptop with a dedicated GPU will exceed them but will have short battery life. I've been disappointed by the build quality of my ASUS TUF and don't recommend it.
 