GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Diablo IV (2023):
View attachment 7123776 View attachment 7123778 View attachment 7123779 View attachment 7123781

Because the reason shadow maps can't do penumbras right and SSR can't capture off-screen elements has nothing to do with camera angle.

Shadow maps on the left, RT shadows on the right
View attachment 7123786 View attachment 7123799

SSR on the left, RT on the right (this is the very top of the screen)

View attachment 7124006 View attachment 7123998
As we see, SSR can only reflect the stuff up on the edge there once it comes into view:
View attachment 7124015

My entry-level laptop GPU can handle this just fine. My desktop GPU only pretends to handle it because AMD sucks. This is where you say all that shit is gay and doesn't matter, and we should just never move on from some arbitrary fixed point in technology because it means sacrificing frames. Then I respond by pointing out that all technologies mean sacrificing frames. Why stop at RT shadows? Why not rewind the clock to Gouraud shaded polygons and static cube maps?
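To put numbers on the penumbra point: with an area light, penumbra width follows from similar triangles (the relation PCSS-style soft shadows estimate), so it grows as the receiver gets farther from the blocker. A fixed-kernel shadow-map filter bakes in one blur radius for everything. A minimal sketch with made-up distances, not any engine's actual code:

```python
# Similar-triangles penumbra estimate (the relation PCSS-style soft
# shadows are built on). All distances measured along the light direction.
def penumbra_width(light_size: float, blocker_dist: float,
                   receiver_dist: float) -> float:
    """Width of the soft shadow edge cast by a blocker between an
    area light and a receiver surface."""
    if blocker_dist <= 0 or receiver_dist <= blocker_dist:
        raise ValueError("need 0 < blocker_dist < receiver_dist")
    return light_size * (receiver_dist - blocker_dist) / blocker_dist

# The same blocker gives a much wider penumbra on a distant receiver:
near = penumbra_width(light_size=0.5, blocker_dist=2.0, receiver_dist=2.5)
far  = penumbra_width(light_size=0.5, blocker_dist=2.0, receiver_dist=8.0)
print(near, far)  # 0.125 vs 1.5: one fixed filter kernel can't do both
```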
Was there ever a time when new graphics tech came out that there wasn’t autistic screeching about it?
 
Was there ever a time when new graphics tech came out that there wasn’t autistic screeching about it?

The rule is that if a new game comes out that my GPU can't handle on max settings, it's lazy unoptimized crap, while if a new game comes out that a GPU significantly weaker than mine can handle at max settings, it's a lazy console port.
 
Diablo IV (2023): "Because the reason shadow maps can't do penumbras right and SSR can't capture off-screen elements has nothing to do with camera angle. [...] Why stop at RT shadows? Why not rewind the clock to Gouraud shaded polygons and static cube maps?"
This is just sloppy craftsmanship. Considering it's Blizzard, it's expected. They fucked up Overwatch 2 so badly it didn't work on X299 CPUs due to an off-spec implementation of AVX-512.

I don't even dislike RT as a concept; it just needs to actually, you know, do something. Showing me screenshots from a game designed to look bad if you don't have an RT card just makes me embarrassed to have given Nvidia money for an RT card.
 
Was there ever a time when new graphics tech came out that there wasn’t autistic screeching about it?
Idk, usually people are excited about it unless there's a serious performance regression or it makes older methods unusable. We've been using deferred rendering for like 10 years, though, and it's a really backwards and nonsensical methodology for real-time 3D graphics, which is why most indie games don't use it. So if you're young enough, you might not remember a world where new graphics weren't worse than the last generation's.
 
Diablo IV (2023): "Because the reason shadow maps can't do penumbras right and SSR can't capture off-screen elements has nothing to do with camera angle. [...] Why stop at RT shadows? Why not rewind the clock to Gouraud shaded polygons and static cube maps?"
I mean, to a non-developer these aren't great examples of why I should be excited for RT. Why take the FPS hit and the noisy image for maybe a softer shadow, or a reflection you won't see unless you stop everything you're doing in game, take a screenshot, and zoom in a bunch? Also, in the majority of games past 2015, developers have done a decent job of hiding the flaws of raster, to the point where this is even a discussion. It's very obvious why 3D from 2D was a big jump, and it also created new ways to play games; we just don't see that from ray tracing, so I don't get the comparisons there. To pretend that anyone critical of ray tracing is just scared of change or poor is kind of dumb, especially because you have games from 2017-18 that look better than many of the current RT games coming out. And if you're just going to enable the most basic RT effects in your game, why put them in at all? They barely look any better and you still get that performance hit.
 
I mean, to a non-developer these aren't great examples of why I should be excited for RT. Why take the FPS hit and the noisy image for maybe a softer shadow, or a reflection you won't see unless you stop everything you're doing in game, take a screenshot, and zoom in a bunch?

The same reason we take the FPS hit to have bumpy surfaces, or rippling water, or self-shadows, or anything that's changed about 3D graphics since the Nintendo 64. It looks better when reflections don't disappear as you move around.

It's very obvious why 3D from 2D was a big jump, and it also created new ways to play games; we just don't see that from ray tracing, so I don't get the comparisons there.

We haven't seen a rendering technology that enables a new way to play in over 20 years. The shadows in Splinter Cell were probably the last one.
 
Was there ever a time when new graphics tech came out that there wasn’t autistic screeching about it?
No need to screech when "the new tech" was obviously better. You could show someone Quake running on Glide and the difference was obvious; same for T&L, bump maps, and so forth.
And that "new tech" wasn't as expensive, and still had an obvious jump in performance ON TOP of the improved fidelity. Comparing RTX to the progress of the '90s through the 2010s is retarded as fuck.

Now you get people trying to sell you "new tech" with a game that looks a decade old and cuts your FPS in half, even if you spent 500 bucks on a mid-level GPU 5 years ago.
Whew, that reflection in the puddle most people won't even notice without a side-by-side comparison is totally noticeable in the midst of turning monsters into loot drops. But look at these pretty screenshots, what a great cost/benefit calculation, totally worth it. No one gives a fuck NOW that it "might" be great at some point in the future.

So what do you actually get besides what most people consider technobabble? Wait, let me take a step back - looks even more the same :story:

Worse, RT is not even "new tech"; it's just that the requirements are so high that we're only now getting to where it can be used in real time, barely, and with a lot of shortcuts and fake frames to hide the performance hit. "New tech" also isn't automatically better; otherwise prove me wrong and show me the NFT collection you bought with all the money you made in blockchain games.

When you literally fail the ebussy test, and the answer to "what's the use case" is extra cost, lower performance, and a hardly discernible benefit (especially for the fucking end user), that's when people really need to take a step back and reevaluate what they're shilling, let alone how retarded it makes them sound.
 
Screen space reflections can't reflect things that aren't in screen space, it's in the name. It has nothing to do with "craftsmanship."
Half-Life 2 did it, and that game is 20 years old. It doesn't use screen-space reflections. Screen-space reflections are shit for the same reason current ray tracing is shit: it's a poor approximation of an optical phenomenon everyone with eyes is familiar with.
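For what it's worth, the off-screen failure is baked into how SSR works: the "ray" is just a march across the depth buffer, so once it steps past the screen edge there is nothing left to sample. A toy 1-D version of that march (invented buffer values and step sizes, not any engine's actual implementation):

```python
# Toy screen-space reflection march over a 1-D "depth buffer": step the
# reflected ray in pixel space and return the first pixel it hits, or
# None once the ray leaves the screen (the off-screen failure case).
def ssr_march(depth_buffer, start_px, ray_depth, step_px, depth_step,
              max_steps=64):
    x, z = start_px, ray_depth
    for _ in range(max_steps):
        x += step_px
        z += depth_step
        if not (0 <= x < len(depth_buffer)):
            return None            # ray left screen space: no reflection
        if depth_buffer[x] <= z:   # ray passed behind the stored surface
            return x               # report the hit pixel
    return None

scene = [5, 5, 5, 2, 5, 5, 5, 5]   # a near object at pixel 3
print(ssr_march(scene, start_px=0, ray_depth=0, step_px=1, depth_step=1))
# -> 3: the object is on screen, so SSR finds it
print(ssr_march(scene, start_px=6, ray_depth=0, step_px=1, depth_step=1))
# -> None: the reflected ray points past the screen edge
```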
 
Half-Life 2 did it and that game is 20 years old
>b-but it ran like dogshit on top GPUs in 2004 when it released so it's a shit comparison ray tracing is the future
>n-no moore's law is dead you can't bring up today's transistor count as an argument ray tracing is the future
 
Reflections in a puddle and shadows that look like garbage with temporal smearing are "new technology," but 120 Hz displays are somehow shit that has already existed since the '90s?
 
Was there ever a time when new graphics tech came out that there wasn’t autistic screeching about it?

Given that there was autistic screeching about the Quake III engine not having a software renderer, I'm gonna say..."no."

no need to screech when "the new tech" was obviously better. you could show someone quake running on glide and the difference was obvious. same for TNL, bumbmaps asf.

You must have been in diapers in the 90s, because that's not how any of that went down. At every stage of advance, from Quake requiring a Pentium to display all of three monsters to Quake III not having a software renderer to Doom III running like absolute dog shit and having no outdoor areas, there was a cohort of gamers who autistically screeched about the new technology being unnecessary and developers just being lazy and/or greedy and possibly in cahoots with graphics accelerator companies. People bitched about moving from 2.5D to full 3D, from sprites to polygons, to requiring 3D accelerators, to requiring T&L, to requiring programmable pixel shaders, because at each jump, many of the new games ran like absolute dog shit and had gameplay regressions for the new tech to even be workable.

The autistic screeching I remember most vividly was Quake III not having a software renderer. That same year, Unreal Tournament's software renderer had colored lights, colored transparency, dynamic lights, texture filtering (albeit dithered), palettized textures, and procedural water effects, none of which even Quake II's software renderer had. The words "lazy" and "unoptimized" got thrown around a lot. This is what UT looked like on the Pentium III laptop I had at the time:

1742760549108.png 1742760191685.png


And there absolutely were gamers back then insisting they didn't see what the big deal was about Quake III's lighting engine compared to Unreal's, and certainly nothing justifying not having a software renderer.


and that "new tech" wasn't as expensive and then still had an obvious jump in performance ON TOP the improved fidelity. comparing RTX to the progress of the 90's till the 2010's is retarded as fuck.

No, new tech was insanely expensive back then. Cutting-edge games required cutting-edge CPUs in the 1990s. Quake (1996) was barely playable on the fastest Pentium of 1994, the P100. Quake II (1997) and Unreal (1998) were slide shows, even if you had a Voodoo. It wasn't like today, when a good CPU lasts you 10 years and you just upgrade your GPU every 3 to 5 years. Back then, a high-end CPU got you maybe 2 years before games came out that it couldn't handle. So your choices were to either buy a new $2,000 PC ($4K-$5K in today's money) every couple of years, or just accept there were going to be new games you couldn't play and mostly pick up older games at Circuit City or Babbage's or wherever. But that was fine, because hey, Doom didn't run on the 286 you replaced with that P100, so you finally got to see what the fuss was about.

half life 2 did it and that game is 20 years old.

Half-Life 2 didn't have real-time dynamic reflections. It used static environment maps and couldn't reflect dynamic objects. Note how the boat, trees, and those wooden beams aren't reflected (not sure if the buildings should be reflected).

1742742670376.png

"Vampire objects" isn't an issue with optimization, artistry, or processing power. It is a fundamental limitation of environment mapping. All the rasterization-based methods have fundamental limitations:

| Can do... \ Method | Planar | Static maps | Dynamic maps | SSR | Raytracing |
|---|---|---|---|---|---|
| Off-screen | ✅ | ✅ | ✅ | ❌ | ✅ |
| Dynamic objs/fx | ✅ | ❌ | ✅ | ✅ | ✅ |
| High res | ✅ | ✅ | ❌ | ✅ | ✅ |
| Curved surfaces | ❌ | ✅ | ✅ | ✅ | ✅ |
| Irregular surfaces | ❌ | ⚠️ | ⚠️ | ✅ | ✅ |
| Many surfaces | ❌ | ⚠️ | ⚠️ | ✅ | ✅ |
| Diffuse | ❌ | ❌ | ❌ | ✅ | ✅ |
| Multi/Recursive | ❌ | ❌ | ❌ | ❌ | ✅ |

There are lots of things that can't be rasterized at all, like anything with multiple moving, reflective objects reflecting off each other. Think something like multiple cars racing next to office buildings.
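The last table row is the clearest case: in a ray tracer, recursive reflection is literally the trace function calling itself at each mirror hit, while rasterization would need a pre-rendered map for every level of reflection. A toy 1-D sketch of that recursion (invented scene, not real renderer code):

```python
# Minimal sketch of why recursive reflection falls naturally out of ray
# tracing: each hit on a reflective surface just spawns another trace()
# call, up to a depth limit. Toy 1-D scene: mirrors are positions on a line.
def trace(pos, direction, mirrors, depth, max_depth=8):
    """Follow a ray between reflective planes; return bounce positions."""
    if depth >= max_depth:
        return []
    for m in sorted(mirrors, reverse=(direction < 0)):
        if (m - pos) * direction > 0:        # first mirror along the ray
            # reflect: flip direction, recurse from the hit point
            return [m] + trace(m, -direction, mirrors, depth + 1, max_depth)
    return []                                # ray escapes the scene

# Two facing mirrors at x=0 and x=10: the ray ping-pongs between them,
# one recursion level per inter-reflection.
print(trace(pos=3, direction=1, mirrors=[0, 10], depth=0, max_depth=4))
# -> [10, 0, 10, 0]
```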
 
Arm's ASR upscaler for mobile devices is finally available — Plugins planned for Unity and Unreal Engine
Months after announcement, Arm is finally making its ASR (Accuracy Super Resolution) upscaler open to developers to integrate into their games.
ASR is built on AMD's FidelityFX Super Resolution 2 technology. At its initial unveiling, Arm claimed 53% more FPS at 2x upscaling on the Immortalis-G720 GPU (Dimensity 8400 and 9300), along with power savings of 20% with ASR set to 2x upscaling using the quality preset.
What took you so long, bro?
As part of their promotional efforts, Arm has introduced a new chapter in its Mali Manga series featuring ASR, and to be honest, I could get used to reading PRs like this.
Yup, there's a 9-page manga for this mobile FSR2.
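Assuming "2x" here means the per-axis scale factor, as in FSR2's mode naming (an assumption on my part), the shading-cost arithmetic is simple: pixel count scales with the square of the axis ratio, so 2x upscaling shades a quarter of the output pixels:

```python
# Back-of-envelope upscaling cost, assuming "2x" is the per-axis scale
# factor (FSR2-style naming; an assumption, not Arm's stated definition).
def render_resolution(out_w: int, out_h: int, scale: float):
    """Internal resolution the GPU actually shades for a given output."""
    return int(out_w / scale), int(out_h / scale)

def pixel_fraction(scale: float) -> float:
    """Fraction of output pixels shaded: cost scales with scale^-2."""
    return 1.0 / (scale * scale)

# A 2400x1080 phone display at 2x upscaling:
print(render_resolution(2400, 1080, 2.0))  # (1200, 540)
print(pixel_fraction(2.0))                 # 0.25: shade a quarter of the pixels
```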
 

