NVIDIA DLSS 5 - Seeing salt flow with much more clarity

We live in a society where people cannot even WANT to age properly. I have countless pictures of older women that actually look and age better than whatever AI spits out.
Yeah rubbing vaseline on the camera lens any time a lady of a certain vintage is in front of it will do that. Unless you're talking about older women now, and not the pics you have of grannies from the analog era.
That witch with the wand got hit mega hard though, they added shadows so extreme it looked less like she had wrinkles and more like she was partially mummified.
 
I think it looks pretty good in the clip with Leon Kennedy and looks really bad in the rest. But games already look pretty good and graphics are rarely the problem now. I want better performance, and more importantly I want more games that don't suck and aren't boring as shit (more Mewgenics and fewer Concords).
 
I never thought I'd be thankful for modern games being so dogshit.
Seeing DLSS5 and current prices, I don't feel the need to upgrade, since nothing is worth it.
I will not own new parts or new games and I'll be happy.
 
I thought this thread would be about sharing the memes. Oh well.

On topic, seeing Digital Foundry eat shit for running an undisclosed ad and failing at damage control was fun, at least.
From Know Your Meme
Digital Foundry also uploaded a video on DLSS 5 to its YouTube channel around the same timeframe, which received over 795,100 views, 23,200 comments, 24,000 likes and 59,000 dislikes (estimated by the Return YouTube Dislike browser extension) in 21 hours.
At the time of posting: 27k likes, 72k dislikes.
 
I never thought I'd be thankful for modern games being so dogshit.
Seeing DLSS5 and current prices, I don't feel the need to upgrade, since nothing is worth it.
I will not own new parts or new games and I'll be happy.
Truly. This technology has made such leaps over the past decade and is now the most powerful it's ever been, and for what? For the shittiest AAA games ever, which in some cases run poorly even on such high-end rigs. What's even the point of investing in new hardware anymore (:_(
 
[Rant incoming due to turd-world takes expressed ITT]
(qualifications - over a decade of making and rendering animated 3d scenes in blender, starting with literal pentiums and gpus from 2003 or so, made a few homebrew games for uni, and am actively developing shit without engines - not much, but more than whichever videoessayist you copied your take from)
___________________________________________________________________________________________________
Every fucking time someone whines about dlss or similar tech that is meant to undo the fuckups of expensive, brute-force ("muh ground truth") methods like ray tracing and its derivatives, they usually show their ass not long after and expose themselves as either:
>a nigger whose idea to fix performance is just to tell nvidia to make an infinite core, infinite ram, infinite everything gpu
or
>a jeet who used his scambux to buy some crypto miner's sloppy seconds 3060, turned everything to max, runs his games at 15 fps native, and refuses the loss of izzat that would come if he admits that his "new" gpu has limitations
___________________________________________________________________________________________________
In both cases their solution is to whine and proclaim "optimize your gaymz benchod reee", typically at the few remaining un-laid-off programmers, the only competent (and white) members of the team, whose code is already optimized to 99.999% of the theoretical maximum. Nobody ever directs this to the niggerlicious digital "artists" who think a background asset should have 6 gorillion polygons with no LOD because "muh artistic vision", or any other "creative" role. Everything they do is 100% kosher. People who do it will then, of course, never use the actual optimizations due to minor tradeoffs in appearance, or even because they just dislike the fact that its an approximation.
___________________________________________________________________________________________________
There is no magical lossles compression method that will both keep things small and be fast to decode. The pre-rtx options for lighting could get close to photorealism but had limitations, and implementing the dynamic ones often meant refactoring the rendering pipeline. Not to mention the mathematical chops one would need to be able to churn out most of them. Ray tracing was in reality implemented because the industry is too jeet-saturated for most teams to be able to make something that would justify selling the next xbox, or gpu. Unfortunately, niggerlicious marketing made people adopt the simplistic view that "rt=good and no rt=lmao ps2 looking ahh 67". The new moore's law, which states "goycattle will find a reason to buy a product with double the transistors for no performance gain every 2 years" has since applied.
___________________________________________________________________________________________________
While this does run at 2 5090s (which does not mean 2x the performance, due to overhead - and I doubt that 2 would be needed for titles that werent already using pt such as starfield), by the time this shit is out for a few years and is beyond the tech demo stage it will likely outperform rt that looks remotely as detailed while also giving you post processing (tho given the backlash, that part will likely be optional). Lighting approximation via AI alone is going to work better and also usually look better since the rt you see in games is a fraction of what cgi uses. AI is already used to denoise the grainy output game-levels of samples for rt typically produce.
And for those of you that are whining about her looking "30 years older" (as if she looked like anything other than a anime goonerbait fetish girl before), I am sure they will let you train your own model on your fave Masha Babko porn collection.
 
Grace vasoline'd just to appease the pajeets.
 
SMAA died so we could have Instagram filters in games. It was supposedly too expensive to run, but requiring a 2nd GPU to run an AI for some hallucinated frames is OK.
 
Every fucking time someone whines about dlss or similar tech that is meant to undo the fuckups of expensive, brute-force ("muh ground truth") methods like ray tracing and its derivatives, they usually show their ass not long after and expose themselves as either:
>a nigger whose idea to fix performance is just to tell nvidia to make an infinite core, infinite ram, infinite everything gpu
or
>a jeet who used his scambux to buy some crypto miner's sloppy seconds 3060, turned everything to max, runs his games at 15 fps native, and refuses the loss of izzat that would come if he admits that his "new" gpu has limitations
Good observation. Those faggots ranting either do not own the hardware to appreciate that kinda stuff in the first place, or do not have the brains to understand the simple fact that it's a work in progress and not the final product. The aging effect is definitely overdone right now, but that doesn't mean it will stay that way.
 
The pre-rtx options for lighting could get close to photorealism but had limitations, and implementing the dynamic ones often meant refactoring the rendering pipeline.
They looked better though. Nvidia has been pushing real-time ray tracing since 2019, and after seven years it still kills performance and makes games look worse than games from 2014. It has also led to shit like the reduction of physics in games, since even the mighty 5090 chokes and dies when it has to calculate shadows cast by a moving object while it's lit by another moving object.
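For a sense of scale (my own back-of-envelope numbers, not anything from the post above): even a single shadow ray per pixel for one dynamic light is a huge amount of work at native 4K.

# Rough ray budget with assumed numbers: 4K target, 60 fps,
# 1 primary + 1 shadow ray per pixel for a single dynamic light.
pixels = 3840 * 2160              # ~8.3 million pixels at 4K
fps = 60
rays_per_pixel = 2                # assumption: 1 primary + 1 shadow ray per light
rays_per_second = pixels * fps * rays_per_pixel
print(f"{rays_per_second / 1e9:.2f} billion rays per second")  # prints about 1.00

And that is before multiple lights, extra bounces, or higher sample counts; moving objects also force the ray tracing acceleration structure to be updated every frame, which is roughly the moving-light, moving-shadow scenario being complained about here.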
 
>a jeet who used his scambux to buy some crypto miner's sloppy seconds 3060, turned everything to max, runs his games at 15 fps native, and refuses the loss of izzat that would come if he admits that his "new" gpu has limitations
One of the most eye-opening incidents for this was the release of Starfield. People were complaining about long load times, only for it to turn out they didn't even own an SSD. You don't have to own a super mega M.2 or whatever, but using a spinning platter drive on an 8-year-old PC and then complaining about load times was insane to me.
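If anyone wants to measure the gap themselves, here's a minimal sketch (Python; the paths and file names are hypothetical, point it at any large asset file) that times a plain sequential read, which is a crude proxy for what a loading screen is waiting on:

import time
from pathlib import Path

def time_read(path: Path, chunk_size: int = 4 * 1024 * 1024) -> float:
    # Read the file front to back in 4 MB chunks and report throughput.
    start = time.perf_counter()
    total = 0
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            total += len(chunk)
    elapsed = time.perf_counter() - start
    print(f"{path}: {total / 1e6:.0f} MB in {elapsed:.2f} s ({total / 1e6 / elapsed:.0f} MB/s)")
    return elapsed

# Hypothetical paths: run once against a copy on the SSD, once against the HDD.
# time_read(Path("D:/hdd_games/SomeGame/textures.pak"))
# time_read(Path("C:/ssd_games/SomeGame/textures.pak"))

Caveat: the OS caches files after the first read, so use a fresh file (or reboot) if the second run looks suspiciously fast.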

Truly. This technology has made such leaps over the past decade and is now the most powerful it's ever been, and for what? For the shittiest AAA games ever, which in some cases run poorly even on such high-end rigs. What's even the point of investing in new hardware anymore (:_(
Enjoy buying an rtx6090 in a few months when your current 5090 can't run the next ue5slop, i guess, tyrone.

This has been the case for the last decade. Only Resident Evil games seem to break this trend.

Borderlands 4 was a great example of this. The game launched with horrible performance. The answer? "Just use upscaling and frame gen", but the problem was that the only cards at the time that supported those features were the newest, top-tier models.

So, either you
  1. had a top-tier card, and the game ran like shit
  2. had a top-tier card, and the game looked like shit
  3. had a slightly older card, and the game ran like shit even on potato settings.
 
One of the most eye-opening incidents for this was the release of Starfield. People were complaining about long load times, only for it to turn out they didn't even own an SSD. You don't have to own a super mega M.2 or whatever, but using a spinning platter drive on an 8-year-old PC and then complaining about load times was insane to me.




This has been the case for the last decade. Only Resident Evil games seem to break this trend.

Borderlands 4 was a great example of this. The game launched with horrible performance. The answer? "Just use upscaling and frame gen", but the problem was that the only cards at the time that supported those features were the newest, top-tier models.

So, either you
  1. had a top-tier card, and the game ran like shit
  2. had a top-tier card, and the game looked like shit
  3. had a slightly older card, and the game ran like shit even on potato settings.
Borderlands 3 had the same problem when it came out earlier.
 