GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

So, the Half-Life 2 RTX demo is out, and it's a great glimpse into the future, and it's fucking grim...

1742383823873.png
1742383840078.png
1742383855043.png
You don't suck off RTX as the second coming of Christ? Well that's a Jester award dump for you bucko.
1742383913900.png
>dude it's a free mod stop complaining lmfao
Yeah, and ignore how Nvidia keeps promoting it as a benchmark for how great the 50 series is.
1742383971918.png
You can't have fun playing a game if it runs like shit. And it just so happens that this tech demo is an existing game. Another pointless deflection.
1742384011877.png
1742384420363.png
1742384432442.png
>lol amd toddler stop being poor
Yep. Just buy a $3000 GPU to play a mod for a 20-year-old game. If you complain about it running like shit on that $3000 GPU, then you're just poor.
1742384307343.png
40fps in a 20-year-old game is horrendous. Under no circumstances should 40fps ever be considered "playable", and with the added context that it's 40fps in Half-Life 2, calling it that is completely delusional.

For reference, here are Nvidia© certified benchmark charts for the 50 Series™ GPUs:
1742384154141.png
1742384168392.png
1742384177016.png
Yeah, you need a 5090 to push out the minimum 60fps raw at 1440p for frame gen not to suffer from unplayable input lag. Everything else is just cope. And at 4K it ends with the 5080 pushing out a whopping 15fps raw and the 5090 pushing out 27fps, which means that even with a top-end GPU and monitor you will have awful input lag. But look at the framerate, bigger number better! Truly the future of gaming.
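To put rough numbers on it (back-of-the-envelope only, and assuming interpolation-style frame generation that has to buffer one extra rendered frame before it can show anything, which is where the latency floor comes from):

```python
# Back-of-the-envelope frame generation math (illustrative numbers only).
# Assumption: interpolation-style frame gen buffers one extra rendered frame,
# so input-to-photon latency is bounded below by roughly two base frame times.

def framegen_numbers(base_fps: float, multiplier: int) -> None:
    base_frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * multiplier
    latency_floor_ms = 2 * base_frame_ms
    print(f"base {base_fps:5.1f} fps -> shown {displayed_fps:6.1f} fps, "
          f"latency floor ~{latency_floor_ms:.0f} ms")

# 60 fps (the 1440p target), 27 fps (5090 at 4K), 15 fps (5080 at 4K)
for fps in (60, 27, 15):
    framegen_numbers(fps, 4)   # 4x multi-frame generation
```

The big green bar gets four times taller either way; the latency floor only cares about the base framerate.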
1742384564423.png
Yeah, let's ignore the fact that we're hitting the upper limits of how far we can push GPU technology as we get closer and closer to the physical limits of EUV lithography, and that we can't hit native 60fps with this amazing tech demo. Crysis came out in 2007 and the top-end GPUs of the time were completely outclassed within a few years of release. The same exponential performance growth won't happen again.

This is the future of gaming. Gamers are complete and utter nigger cattle. They will gobble up every little bit of garbage marketing that Jensen shows on stage. Nvidia's graphs are so misleading because they know consumers are nigger cattle, they know they won't notice how low the base framerate is and what it implies, they will only see the big green bar with bigger number better and they will run out to buy the 5090 for any price just to keep consuming. This will send a clear signal to developers that they'll have to care less and less about actually making their game run good because Nvidia will just make the number bigger and the nigger cattle will happily buy it all. And if anyone dares to speak against this trend, the same herd of nigger cattle will come out of the woodwork to silence all the naysayers. Nvidia is great, Jensen is great, the more you buy the more you save, if you can't run the game you must be poor. RTX™ is the future of game technology and it's the only way forward.

It will only get worse. And the only reason that it will get worse is because the average consumer is a sub-100 IQ drooling mongoloid retard that will buy everything, no matter how shit it is.
 
This is the future of gaming. [...] RTX™ is the future of game technology and it's the only way forward.
Again, you are completely fucking retarded on this subject. Do you even understand that this is a tech demo for Path tracing? So go ahead and whine about "MUH optimization" here. It tells you it's Path traced and that's about the hardest tech to run. Again, read before raging.
I know you are pulling sour grapes as you can't afford new tech.
 
It tells you it's Path traced and that's about the hardest tech to run.
This is a 20-year-old game running at the same framerate as Cyberpunk, which has infinitely more going on. It's just embarrassing.
I know you are pulling sour grapes as you can't afford new tech.
It goes both ways. You are trying very hard to justify your $3000 lemon.
 
Again, you are completely fucking retarded on this subject.
Educate me then. Show me where I'm wrong. So far I only see the same deflections I've pointed out in the Steam comments, "you're stupid" and "you're poor". You've clearly stated numerous times you're an expert on optimization and graphics tech, yet you still refuse to actually explain the technical details of why my assessments are wrong.
 
Current gamers have the same mentality as fashion-obsessed crazies who will spend $4,000 on a pair of boots that cost $10 to make in a sweatshop in Indonesia, that are super uncomfortable and ugly, and that will fall apart after a week of wearing them, but they have a name brand and a famous celebrity wore them to an event.
 
You know, I was willing to give Half-Life 2 RTX a try on my own machine just to see how badly it would run.
1742388804302.png
But guess what, if I try to run it, it throws a "Failed to create a D3D device" error, and with every other attempt to launch it, NvRemixBridge.exe just hangs in limbo.

Now, before you try to deflect this by saying I'm using a GTX 560 and kvetching over not being able to afford modern hardware: I'm not. My GPU is on the official, Nvidia© branded optimal settings chart.
1742388962902.png
I also updated my drivers to the newest version to make sure it's Game Ready™. And I still can't even launch the damn thing. With how hard Nvidia is pushing this tech demo as the pinnacle of what their hardware can do, this really inspires confidence in the future of gaming. Witcher 3 on Full RT runs at mostly playable framerates on my rig. Quake II RTX flies on my machine. Yet this tech demo can't even launch. Fucking baffling.
 
The game looks worse with this path-tracing demo. It's so ridiculous how bad it all looks. There's so much temporal bullshit involved in the rendering that the lighting effects work in slow motion.

Every time a muzzle flash happens (you know, something extremely rare in an FPS) the game takes almost a second to dim the room back to dark. It's laughable how terrible it is. We have reached a point in time where technology is being used to tank the performance of a 20-year-old game to unplayable levels while making it look objectively worse in every possible way.


At 10:39 you can clearly see the muzzle flash illuminating the room and it never gets dark again. Fucking retarded shit.
 
I'd be perfectly happy with a game that only used cel shading and instead focused on actually being fun to play.
 
The game looks worse with this path-tracing demo. [...] At 10:39 you can clearly see the muzzle flash illuminating the room and it never gets dark again.
This bit in 2kliksphilip's video is especially egregious.
Of course he glazes Nvidia; he's not exactly a millionaire and he lives in the shithole that is the UK, so he's not going to torpedo his free review samples.

But holy fuck, it took an entire second for the room to light up after a light change. And people glorify this as the future of video game graphics. STALKER: Shadow of Chernobyl did dynamic lighting better than this, and that was a game that went through development hell and ran on a custom slavjank engine. Hell, I remember noticing the same issue when playing Serious Sam: TFE RTX. After opening up the Croteam heads secret and going down to the rocket pickup, there was glaring temporal smearing, and it took a good second for the tunnel to turn pitch black after collecting the only light source.

Though I'd love to hear from our local adamant Nvidia whiteknights why my complaining about this is wrong. Is it because I don't own a 5090 so I'm poor? Is it because I don't understand graphics technology so everything I say is meaningless, even though no one else can explain why? Is it because in some theoretical timeline this will become instantaneous despite the assertion that Moore's Law is dead? Or maybe I shouldn't trust my lying eyes and drink Jensen's Kool-Aid?
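For what it's worth, the "slow motion" lighting is exactly what temporal accumulation produces: the denoiser blends each new noisy frame into a running history, so a sudden change (the muzzle flash going out) takes dozens of frames to fully settle. A toy sketch of the idea, with a made-up blend weight and a plain exponential moving average standing in for whatever Nvidia actually ships:

```python
# Why a light change takes ~a second to settle with temporal accumulation:
# the denoiser blends each new noisy frame into a history buffer, so the
# displayed lighting only creeps toward the new value frame by frame.

alpha = 0.08      # blend weight for the new frame (made-up but plausible)
history = 1.0     # room lit by the muzzle flash
target = 0.0      # flash is over; the true value is now dark

frames = 0
while abs(history - target) > 0.05:    # run until ~95% converged
    history = (1 - alpha) * history + alpha * target
    frames += 1

print(f"{frames} frames to settle")        # 36 frames
print(f"~{frames / 40:.1f} s at 40 fps")   # ~0.9 seconds at the demo's framerate
```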
 
Educate me then. Show me where I'm wrong. So far I only see the same deflections I've pointed out in the Steam comments, "you're stupid" and "you're poor". You've clearly stated numerous times you're an expert on optimization and graphics tech, yet you still refuse to actually explain the technical details of why my assessments are wrong.

Have you ever read a single article explaining the difference between rasterization and raytracing to understand why raytracing is so much more computationally expensive? Because I could write one for you, but it's a lot of work.

I will make it very brief. Rasterization does two separate steps. First it processes all the vertices and primitives into per-pixel values (e.g. color & z), then it processes the pixels. The problem with separating things out like this is that the pixel step has no access to off-screen information, while the primitive step has no access to information that is generated during the pixel step. Thus, in situations where you really need to process them as a unified whole, you get extremely nonphysical, unrealistic scenes like this image of Skyrim below:

1742391970019.png

Does this look better than a PS2 game? Sure. But it still looks fundamentally wrong. There are countless examples where rasterization is incapable of generating a physically correct result. It is a fundamental limitation of the rasterization approach, and workarounds (like screen-space depth of field) amount to hacks that are intrinsically prone to various artifacts (like DOF "bleeding").
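If you want the two-step structure spelled out, here it is as a toy sketch (illustrative Python, not any real graphics API; the Fragment type and the single-fragment "scene" are made up for the example):

```python
from dataclasses import dataclass

# Toy version of the two separated steps:
#   step 1 (primitive processing): triangles become per-pixel fragments
#   step 2 (pixel processing): pixels are shaded from screen-space buffers ONLY
# Anything off-screen never reaches step 2, which is why screen-space
# reflections, DOF, etc. are hacks that can't see around corners.

@dataclass
class Fragment:
    x: int
    y: int
    z: float
    albedo: float
    light: float   # whatever lighting value got computed for this fragment

W, H = 4, 3

def rasterize(fragments):
    depth = [[float("inf")] * W for _ in range(H)]
    gbuf = [[None] * W for _ in range(H)]
    for f in fragments:                 # step 1: fill the screen-space buffers
        if f.z < depth[f.y][f.x]:       # depth test
            depth[f.y][f.x] = f.z
            gbuf[f.y][f.x] = f
    # step 2: shade pixels using only what landed in the buffers
    return [[gbuf[y][x].albedo * gbuf[y][x].light if gbuf[y][x] else 0.0
             for x in range(W)] for y in range(H)]

# One visible fragment; an off-screen mirror or light has no representation
# here at all, so the shaded result can't possibly account for it.
print(rasterize([Fragment(x=1, y=1, z=0.5, albedo=0.8, light=1.0)]))
```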

In order to avoid the artifacts resulting from segregating these two steps and using nonphysical methods, you have to do a unified step using a physical method. This is what raytracing is.

Raytracing involves tracing rays from each pixel and bouncing them around the environment. This can involve many, many intersection and shading operations per pixel to get the final color. With path tracing, every time we hit a surface, we scatter off multiple new rays per bounce and trace them. This is necessary to accurately capture light scattering off surfaces.
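Here's roughly what that loop looks like, stripped to the bone (a toy sketch with a one-plane "scene" under a uniform sky; real path tracers add BVH traversal, importance sampling, Russian roulette and so on):

```python
import math, random

# Toy path tracer: one diffuse floor plane (y = 0) under a uniform bright sky.
SKY_RADIANCE = 1.0
FLOOR_ALBEDO = 0.5
MAX_DEPTH = 3     # bounces per ray (the example below uses 5)
SCATTER = 4       # rays spawned per bounce (the example below uses 20)

def random_up_dir():
    """Random direction on the upper hemisphere (around +y)."""
    y = random.random()
    phi = 2 * math.pi * random.random()
    r = math.sqrt(max(0.0, 1.0 - y * y))
    return (r * math.cos(phi), y, r * math.sin(phi))

def trace(origin, direction, depth=0):
    if depth >= MAX_DEPTH:
        return 0.0
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:                        # ray heads up and escapes to the sky
        return SKY_RADIANCE
    t = -oy / dy                       # intersect the floor plane y = 0
    hit = (ox + t * dx, 0.0, oz + t * dz)
    # Scatter several new rays off the surface and average them; this
    # branching is exactly what makes the cost explode with depth.
    gathered = sum(trace(hit, random_up_dir(), depth + 1) for _ in range(SCATTER))
    return FLOOR_ALBEDO * gathered / SCATTER

random.seed(0)
print(trace((0.0, 1.0, 0.0), (0.3, -1.0, 0.0)))   # one camera ray aimed at the floor
```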

But then why does it run like shit in Half-Life 2? Half-Life 2 is old!

Because, and this is really important, unlike rasterization, which is heavily dominated by polygon count and map resolution, the computational complexity of path tracing is dominated instead by the number of rays cast per pixel, the number of bounces per ray, and the number of rays scattered per bounce. Note this has little to do with asset fidelity: if you have a maximum of 5 bounces per ray and generate 20 rays per scatter, it really doesn't matter if you are running Tomb Raider 2 or Battlefield 4, you're going to incur about the same expense for your lighting either way.
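Plug those example numbers in and you can see why (a rough sketch that ignores every cap and heuristic a real engine would apply):

```python
# How the ray count blows up with bounces and scattering, regardless of
# how detailed the scene's assets are.
def rays_per_camera_ray(scatter: int, depth: int) -> int:
    # one camera ray fans out into scatter + scatter^2 + ... + scatter^depth rays
    return sum(scatter ** d for d in range(1, depth + 1))

pixels_1080p = 1920 * 1080
fanout = rays_per_camera_ray(scatter=20, depth=5)        # the numbers from above
print(f"{fanout:,} secondary rays per camera ray")       # 3,368,420
print(f"{fanout * pixels_1080p:,} rays per 1080p frame") # ~7 trillion, done naively
```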

Why don't they just optimize the raytracing, then?

Because they do. Full, physically realistic raytracing for a 4K image would take thousands of 4090s to do at 60 fps. They already do every trick in the book to get the cost of raytracing down to something consumer hardware can manage and still get something aesthetically pleasing.

To do fully realistic raytracing to correctly capture global illumination, i.e. path tracing, you need hundreds of thousands or even millions of rays per pixel (when accounting for all the bounces & scatters) to get a nice result with no obvious artifacts. To put that in perspective, a 1080 can generate about 250M rays per second. Total. So at 30 fps and 1080p, that's 4 rays per pixel, which is not nearly enough to do this in real time:

1742399061074.png
source (pdf)
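That 4 rays per pixel figure is just the arithmetic:

```python
# ~250M rays/s on a GTX 1080 spread across 1080p at 30 fps:
rays_per_second = 250e6
pixels = 1920 * 1080
fps = 30
print(rays_per_second / (pixels * fps))   # ~4.0 rays per pixel per frame
```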

Real-time pathtraced games are obviously not doing a million rays per pixel. They are doing a tiny fraction of the necessary rays to achieve high-quality output, and using various tricks, one of the most important of which is AI inferencing, to get the final image result to an acceptable level of quality. Recall that a 1080 can do about 4 samples per pixel at 1080p & 30 fps. This is what you get without any kind of inferencing or denoising:

1742401915862.png
Source: https://developer.nvidia.com/blog/ray-tracing-essentials-part-7-denoising-for-ray-tracing/
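And the reason those low-sample images look like TV static is plain Monte Carlo behavior: the noise falls off as 1/sqrt(samples), so every halving of the grain costs four times the rays. A toy illustration (a coin-flip estimator standing in for a real pixel integral):

```python
import random, statistics

def noisy_pixel(samples: int) -> float:
    # Toy Monte Carlo pixel whose true value is 0.5: each "ray" randomly
    # lands on a bright (1.0) or dark (0.0) light path.
    return sum(random.random() < 0.5 for _ in range(samples)) / samples

random.seed(0)
for spp in (4, 64, 1024, 16384):
    estimates = [noisy_pixel(spp) for _ in range(200)]
    print(f"{spp:>5} spp: noise (std dev) ~{statistics.pstdev(estimates):.3f}")
# Noise shrinks roughly as 1/sqrt(spp): 4x the rays only halves the grain.
```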

Getting to a non-shit level of quality, i.e. the 3rd image, requires 250 1080s running in parallel to do 60 fps at 1080p. Getting a nearly noise-free image, like at the end there, would require 2,500 1080s running in parallel for realtime raytracing.
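Those GPU counts are the same arithmetic run backwards (assuming ray throughput scales perfectly linearly across cards, which it wouldn't; the samples-per-pixel targets here are just inferred from the 250 and 2,500 figures above):

```python
# How many GTX 1080s you'd need for a given samples-per-pixel target
# at 1080p / 60 fps, assuming linear scaling across cards.
RAYS_PER_GPU = 250e6                      # GTX 1080 figure from above
PIXELS_PER_SECOND = 1920 * 1080 * 60      # 1080p at 60 fps

def gpus_needed(samples_per_pixel: int) -> float:
    return PIXELS_PER_SECOND * samples_per_pixel / RAYS_PER_GPU

print(gpus_needed(500))    # ~249  -> usable after denoising
print(gpus_needed(5000))   # ~2489 -> nearly noise-free
```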

All these games you point to as being "optimized" are doing basically the same thing. They are aggressively economizing on asset & effect fidelity in order to fit the game onto the target platform and use various tricks to get the final image to look as nice as possible, going all the way back to using bilinear filtering to interpolate texture colors rather than just using super high-res textures.
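Since bilinear filtering got name-dropped as the oldest trick in that book, this is all it is (a minimal sketch):

```python
def bilinear_sample(texture, u, v):
    """Sample a texture (list of rows of floats) at fractional coordinates by
    blending the four nearest texels: smoothness on the cheap, instead of
    shipping a much higher-resolution texture."""
    h, w = len(texture), len(texture[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bot = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bot * fy

# Reading a 2x2 checker texture at its center blends all four texels equally.
print(bilinear_sample([[0.0, 1.0], [1.0, 0.0]], 0.5, 0.5))   # 0.5
```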

So when you look at Half-Life 2 running at 40 fps with path tracing and wonder where the optimization is... you're looking at it. Without all the optimization tricks, it would run at about 1 frame per minute, or maybe worse.

Is it because I don't understand graphics technology so everything I say is meaningless, even though no one else can explain why?

Why raytracing is orders of magnitude more expensive than rasterization is easy to explain; you just don't like any explanation other than "a smart guy could totally do this at 100 fps and 4K and make it look 100x better; everyone writing raytracing engines right now is just lazy and stupid." But you can look at all those links I posted and learn that's not the case.

I would put it this way: a 4090 is to raytracing what an N64 was to rasterization.
 
Have you ever read a single article explaining the difference between rasterization and raytracing...
All these words just to say that technology isn't anywhere close to running this shit competently.

You can put wheels on a boat and come up with all the cope in the world about how it performs better than you'd expect. But sometimes a car would suffice.
 
I still have yet to find anyone who can justify why people should be this mad about raytracing.

"I don't like raytracing, the games don't look good and the performance sucks."
Then don't play raytraced games. There are three entire decades of PC games at your fingertips that you can play right now that don't require this. Hell, there are still new AAA games coming out that don't use raytracing. Why is this something to get upset about? Are you really particularly bothered that you're not able to play modern slop like Alan Woke at 100+ fps? By the time any of this matters enough to impact a game you'll want to play, 4090-level raytracing will be available on $200 cards.

I don't understand why there's this persistent pool of people who seethe uncontrollably at every RTX demo.
 
Then don't play raytraced games.
Oh, it's this argument again. Where have I heard it before? "Don't play microtransaction games." "Don't play unfinished games." "Don't play..."
How's that working out?
It's almost as if developers are now relying on this technology as a shortcut for actual artistry. You are cucked even if you play without RTX because they no longer put the same amount of effort into lighting and other effects with rasterization.
 
Oh, it's this argument again. Where have I heard it before? "Don't play microtransaction games." "Don't play unfinished games." "Don't play..."
How's that working out?
It's almost as if developers are now relying on this technology as a shortcut for actual artistry. You are cucked even if you play without RTX because they no longer put the same amount of effort into lighting and other effects with rasterization.
At the end of the day, all you can do is not play the things you don't like. You don't really get control over what other people choose to make. I don't play games with mtx and get by just fine. You're the one who seems hopelessly addicted to slurping down FOTM even when it's full of stuff you hate.
 
At the end of the day, all you can do is not play the things you don't like. You don't really get control over what other people choose to make. I don't play games with mtx and get by just fine. You're the one who seems hopelessly addicted to slurping down FOTM even when it's full of stuff you hate.
The Pascal generation being so long has had some second-order effects.
 