GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The argument is that the ease of use for these features and the lack of those limitations will lead to better looking games.

It already has. Games look better when reflections don't disappear when you look down.

Stalker 2 has worse lighting than the original from 2007 - lights don't cast shadows.

"Lights" do cast shadows in Stalker. It's just one really prominent light, the flashlight, doesn't. Apparently, STALKER 2 has some janky software-based raytracing solution, and the code for flashlight shadows is in the binaries, it just fucks up when illuminating monsters. If I judged technology based on the ability of Slavs to not fuck it up, I would conclude we should never have any technology at all.
 
  • Agree
Reactions: Betonhaus
No you see these games look so bad because the developers don't want to use these fantastic new features properly, even though they will be the future of gaming whether you like it or not. If they did, you'd be in the gaming El Dorado right now. But who cares about such convoluted abstractions; let's hyperfixate on why ray tracing is the future by looking at it in an autistic spec-sheet vacuum and act like it's on topic, because the currently discussed topic was already discussed 50 pages ago and you should know it all by heart by now.
slav.jpg
 
The anti ray tracers have a point. These new games do get blurry with it on, the fps does dip, and the games do not look much better with it on. Ratchet and Clank, for example, is a very good-looking game; raytracing on vs. off makes no difference to me, but I lose half the fps. In the RE4 remake with raytracing on there's no difference, but I lose fps and the game has a weird blur to it. If I have to zoom in 600x to see a marginal difference in the lighting, is that really worth the performance loss?
 
No you see these games look so bad because the developers don't want to use these fantastic new features properly

This literally is the problem with STALKER 2. Modders have already fixed all this shit that the Ukrainian shitware developers were unable to do correctly. This should surprise nobody, since STALKER 1 was an absolute shitpile.

Modder fixes the jank-ass reflections by using NVIDIA's out-of-the-box denoising:

Flashlight fix:

The problem isn't raytracing; it's Ukrainians.
 
The anti ray tracers have a point. These new games do get blurry with it on, the fps does dip, and the games do not look much better with it on. Ratchet and Clank, for example, is a very good-looking game; raytracing on vs. off makes no difference to me, but I lose half the fps. In the RE4 remake with raytracing on there's no difference, but I lose fps and the game has a weird blur to it. If I have to zoom in 600x to see a marginal difference in the lighting, is that really worth the performance loss?
I'm not even anti-raytracing; I just tried out some games and it just looks bad. The entire tech is hanging on noise reduction algorithms and temporal sampling, which is just a terrible approach to any sort of interactive real-time graphics. If you can trace enough rays to make a frame look good in the time it takes to render a frame, then fuck yeah, go for it, but I'm not going to pretend it's OK to make all of your dynamic lights and reflections slowly fade in and out because it's taking a rolling average of past frames. Light doesn't do that.
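For what it's worth, the "rolling average" here is essentially an exponential moving average over frames. A toy sketch of that idea (not any particular engine's denoiser, just the general temporal accumulation scheme) shows why a light that flips on takes many frames to reach full brightness:

Python:
# Toy temporal accumulation: each frame blends the new noisy sample into a
# history buffer. Real denoisers pick the blend weight adaptively; a fixed
# alpha is used here purely for illustration.
def accumulate(history, new_sample, alpha=0.1):
    return (1 - alpha) * history + alpha * new_sample

# A light switches on: the "true" value jumps from 0.0 to 1.0, but the
# accumulated result only creeps toward it frame by frame.
value = 0.0
for frame in range(20):
    value = accumulate(value, 1.0)
    print(f"frame {frame:2d}: accumulated brightness = {value:.3f}")
# After 20 frames the history is still only ~0.88 of the true brightness,
# which is the "lights slowly fade in and out" effect described above.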
 
I'm not even anti-raytracing; I just tried out some games and it just looks bad. The entire tech is hanging on noise reduction algorithms and temporal sampling, which is just a terrible approach to any sort of interactive real-time graphics. If you can trace enough rays to make a frame look good in the time it takes to render a frame, then fuck yeah, go for it, but I'm not going to pretend it's OK to make all of your dynamic lights and reflections slowly fade in and out because it's taking a rolling average of past frames. Light doesn't do that.
Currently, partial raytracing (reflections and shadows only) looks good and runs well on 20 series and above GeForces. It pretends to sort of run well on 6000 series Radeon before periodically puking everywhere, like I get a 60 fps average in Diablo IV with RT on Low or Medium with lows of 12 fps, which is stupid. Can't speak to the 7000 series, because I don't have one. Full raytracing seems to need a 40 series to really look good, and I feel like path tracing isn't really quite there yet. I only use raytracing at all when I'm on my laptop (3050 Ti Laptop). It's bad enough on my desktop (6700 XT) to never use it.
 
This literally is the problem with STALKER 2. Modders have already fixed all this shit that the Ukrainian shitware developers were unable to do correctly. This should surprise nobody, since STALKER 1 was an absolute shitpile.

Modder fixes the jank-ass reflections by using NVIDIA's out-of-the-box denoising:

Flashlight fix:

The problem isn't raytracing; it's Ukrainians.
Definitely needs hardware ray tracing. There's a mod for RoboCop: Rogue City that activates hardware Lumen, and it looks so much better.
 
  • Informative
Reactions: Brain Problems
Currently, partial raytracing (reflections and shadows only) looks good and runs well on 20 series and above GeForces. It pretends to sort of run well on 6000 series Radeon before periodically puking everywhere, like I get a 60 fps average in Diablo IV with RT on Low or Medium with lows of 12 fps, which is stupid. Can't speak to the 7000 series, because I don't have one. Full raytracing seems to need a 40 series to really look good, and I feel like path tracing isn't really quite there yet. I only use raytracing at all when I'm on my laptop (3050 Ti Laptop). It's bad enough on my desktop (6700 XT) to never use it.
I have a 3060 Ti and I'm playing at 1080p. Every ray tracing game I've tried has the same problem: it's taking a rolling average of frames with a noise reduction algorithm. You cannot fix that with a more powerful GPU. The fact that we're talking about sub-60 fps is fucking ridiculous, dude. It's Diablo! THE GAME HAS A FIXED CAMERA ANGLE AND YOU'RE ONLY GETTING 60 FPS?
 
I have a 3060 Ti and I'm playing at 1080p. Every ray tracing game I've tried has the same problem: it's taking a rolling average of frames with a noise reduction algorithm. You cannot fix that with a more powerful GPU.

You actually can, since all these methods are ways of dealing with a low sample rate. The higher the sample rate, the less you have to lean on post to fix the image.

A 3060 Ti can do, if my Google-fu is any good, about 4 billion rays per second. At 60 fps, that's 66.7 million rays per frame. At 1080p, that's 32 rays per pixel. I found this image to illustrate what raw raytraced data looks like at different sample levels prior to any postprocessing:

1742648948380.png

At 32 samples per pixel, post has to do a lot of heavy lifting. Post tech is also very primitive right now. Cinematic raytracing didn't really need it, since each frame is computed offline. So between samples per pixel going up over the next couple years and post tech improving, expect the overall quality of raytraced games to improve.
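To make that back-of-the-envelope reproducible (the ~4 billion rays per second figure is the rough Google estimate cited above, not a measured number):

Python:
# rays/second -> rays/frame -> rays/pixel for the 3060 Ti example above
rays_per_second = 4e9          # rough figure from the post, an assumption
fps = 60
width, height = 1920, 1080     # 1080p

rays_per_frame = rays_per_second / fps              # ~66.7 million
rays_per_pixel = rays_per_frame / (width * height)  # ~32

print(f"rays per frame: {rays_per_frame:,.0f}")
print(f"rays per pixel: {rays_per_pixel:.1f}")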

Remember how shitty perspective-correct, bilinear-filtered, mip-mapped, z-buffered, anti-aliased rasterization looked on the N64 compared to affine-mapped, nearest-neighbored, sorted, aliased rasterization on the PS1? Lower res textures, worse frame rates, smeary graphics. Yet a few years later, that's how everything was done.

THE GAME HAS A FIXED CAMERA ANGLE AND YOU'RE ONLY GETTING 60 FPS?

The cost of raytracing really has little to do with the camera angle. It's basically (Number of Rays per Pixel) x (Number of Bounces) x (Bounce Calculation Complexity). I like how suddenly 60 fps is intolerably shitty now that we're talking about raytracing, whereas before it showed up, 60 fps was totally fine for gaming; it was 30 fps console games that were unplayable.
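Written out, the cost model above looks like this (the three terms are the ones named in the post; scaling by resolution is implied, and the numbers are placeholders):

Python:
# Per-pixel ray tracing cost as described above; note that camera angle
# appears nowhere in it.
def rt_cost_per_pixel(rays_per_pixel, bounces, bounce_cost):
    return rays_per_pixel * bounces * bounce_cost

# Whole-frame cost just scales with resolution (placeholder numbers:
# 1080p, 1 ray per pixel, 2 bounces, unit bounce cost).
pixels = 1920 * 1080
print(pixels * rt_cost_per_pixel(rays_per_pixel=1, bounces=2, bounce_cost=1))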
 
You actually can, since all these methods are ways of dealing with a low sample rate. The higher the sample rate, the less you have to lean on post to fix the image.

A 3060 Ti can do, if my Google-fu is any good, about 4 billion rays per second. At 60 fps, that's 66.7 million rays per frame. At 1080p, that's 32 rays per pixel. I found this image to illustrate what raw raytraced data looks like at different sample levels prior to any postprocessing:

View attachment 7122226

At 32 samples per pixel, post has to do a lot of heavy lifting. Post tech is also very primitive right now. Cinematic raytracing didn't really need it, since each frame is computed offline. So between samples per pixel going up over the next couple years and post tech improving, expect the overall quality of raytraced games to improve.

Remember how shitty perspective-correct, bilinear-filtered, mip-mapped, z-buffered, anti-aliased rasterization looked on the N64 compared to affine-mapped, nearest-neighbored, sorted, aliased rasterization on the PS1? Lower res textures, worse frame rates, smeary graphics. Yet a few years later, that's how everything was done.



The cost of raytracing really has little to do with the camera angle. It's basically (Number of Rays per Pixel) x (Number of Bounces) x (Bounce Calculation Complexity). I like how suddenly 60 fps is intolerably shitty now that we're talking about raytracing, whereas before it showed up, 60 fps was totally fine for gaming; it was 30 fps console games that were unplayable.
Dude, you're literally sitting here telling me that the stuff I have is better than yours and easily capable of doing what you claim to be "good enough", and I'm telling you that it still sucks. Take a step back and reconsider what you're trying to do here.
 
My phone has a 120hz screen, my TV is 120hz, and even my toilet has a 120hz screen. But vydea Gaymes are still stuck in the world of sub-60 fps.

I repeat myself, are you fuckers still living in 2012?
What's funny is that mobile games usually run at an unlocked framerate and look super crisp, so every time I see some retard playing some stupid phone game it usually looks really good, but a $2000 video card using cutting-edge technology demonstrators just comes off as a sloppy Garry's Mod tech demo.
 
Dude, you're literally sitting here telling me that the stuff I have is better than yours and easily capable of doing what you claim to be "good enough", and I'm telling you that it still sucks. Take a step back and reconsider what you're trying to do here.

IDK man, all I can tell you is Diablo IV looks good on my 3050 Ti laptop with RT shadows & reflections enabled and doesn't have any of these problems. Hold on, let me take a step back, like you suggested.

...yep, still looks good.

What's funny is that mobile games usually run at an unlocked framerate and look super crisp, so every time I see some retard playing some stupid phone game it usually looks really good, but a $2000 video card using cutting-edge technology demonstrators just comes off as a sloppy Garry's Mod tech demo.

There's a really simple solution to this: PC games should do what phone games do and not give gamers access to any granular settings at all. What PC games should do is just detect what hardware you have and do whatever they need to in order to run at the frame rate and resolution the developer picks for you.
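A toy sketch of the auto-detect scheme being (sarcastically) proposed; every tier, threshold, and preset here is made up for illustration:

Python:
# Hypothetical "no settings menu" preset picker: map detected hardware to a
# developer-chosen resolution/framerate/RT preset.
def pick_preset(vram_gb, has_hw_raytracing):
    if vram_gb >= 12 and has_hw_raytracing:
        return {"resolution": (2560, 1440), "target_fps": 60, "rt": "full"}
    if vram_gb >= 8 and has_hw_raytracing:
        return {"resolution": (1920, 1080), "target_fps": 60, "rt": "shadows+reflections"}
    return {"resolution": (1920, 1080), "target_fps": 60, "rt": "off"}

print(pick_preset(vram_gb=8, has_hw_raytracing=True))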
 
Last edited:
  • Like
Reactions: Rololowlo
My phone has a 120hz screen, my TV is 120hz, and even my toilet has a 120hz screen. But vydea Gaymes are still stuck in the world of sub-60 fps.

I repeat myself, are you fuckers still living in 2012?
This is a very good point. @The Ugly One constantly focuses on the 60fps target whenever he defends the glory of ray tracing by pointing out how the 2060 can do that at 720p, which is a target that may have been the bee's knees 10 years ago. Right now there is an overabundance of 100Hz 1080p IPS office monitors, and 75Hz has been the target budget refresh rate for longer than that.

Right now the minimum requirement is: can it hold a stable 100 fps, as in never dip below that, at 1080p with ray tracing? Now that's trickier to defend, since you will have to admit that only the high-end cards from the 20 series, or maybe even only the 30 series, will be able to push that, and for the midrange hardware to hit that number you'll need frame generation, so 40 series and up, where a single GPU will cost you as much as an entire console that does all of that.

So unless an absolute miracle of optimization happens, and it works retroactively, PC gaming will be way more expensive than console gaming in the future for the same end result on screen. Apparently this is called "progress" and I'm a luddite if I dare to criticize the trajectory the market is heading for.
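Those refresh-rate targets translate directly into frame-time budgets, which is what makes the "never dip below 100 fps" bar so harsh:

Python:
# Frame-time budget at the refresh rates mentioned above
for hz in (60, 75, 100, 120):
    print(f"{hz:3d} Hz -> {1000 / hz:5.2f} ms per frame")
# 60 Hz gives 16.67 ms; holding 100 fps means the whole frame, ray tracing
# and denoising included, has to fit in 10 ms every single frame.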
 
  • Lunacy
Reactions: Betonhaus
ASUS’s Radeon RX 9060 XT Custom GPUs Surfaces Online, Featuring DUAL, PRIME & TUF Models With 8GB/16GB Configurations

Not content to buttfuck the 5070 and 5070 Ti, AMD now sets their sights on buttfucking the upcoming 5060 and 5060 Ti. The naming is a bit confusing considering that they could just call the 8 GB model '9060' without the XT, but when has AMD ever been good at naming things?
It's a pack of 128-bit cards. I assume the 9060 XT will fall short of the 7700 XT because the latter has 69% more CUs and 35% more bandwidth. But it gets 16 GB (optionally).

The 5060 Ti will have 5.9% more cores than the 4060 Ti, so I doubt it will impress. The 5060 on the other hand will have 25% more cores than the 4060. There was a big unfilled gap between the 4060 Ti and 4060 which is being narrowed this time, and then an RTX 5050 will reportedly be tossed into the low-end hole. 5060 sentiment can be repaired later with a 12 GB option using 3 GB modules.

GDDR7 bandwidth increases could affect performance more for the 60-tier 8 GB cards than they did elsewhere, with the 5060 probably getting 65% more bandwidth than the 4060, for example. If it launches on desktop, the 5050 gets GDDR6 to keep the cost down to a reasonable... $200-250.

So I think it will be harder for AMD to get a clean win down there using the Navi 44 die. High Nvidia pricing and low availability will be what makes AMD look good, not the performance or VRAM.
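For anyone checking the math, the percentages above fall out of the core counts and bandwidth figures being passed around; the 9060 XT and 50-series numbers are leaked/rumored specs, so treat the inputs as assumptions:

Python:
# Reproducing the deltas quoted above from the cited spec figures
def pct_more(new, old):
    return (new / old - 1) * 100

print(f"7700 XT vs 9060 XT CUs:       {pct_more(54, 32):.0f}% more")      # ~69%
print(f"7700 XT vs 9060 XT bandwidth: {pct_more(432, 320):.0f}% more")    # ~35%
print(f"5060 Ti vs 4060 Ti cores:     {pct_more(4608, 4352):.1f}% more")  # ~5.9%
print(f"5060 vs 4060 cores:           {pct_more(3840, 3072):.0f}% more")  # 25%
print(f"5060 vs 4060 bandwidth:       {pct_more(448, 272):.0f}% more")    # ~65%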
 
  • Informative
Reactions: The Ugly One
IDK man, all I can tell you is Diablo IV looks good on my 3050 Ti laptop with RT shadows & reflections enabled and doesn't have any of these problems. Hold on, let me take a step back, like you suggested.

...yep, still looks good.



There's a really simple solution to this: PC games should do what phone games do and not give gamers access to any granular settings at all. What PC games should do is just detect what hardware you have and do whatever they need to in order to run at the frame rate and resolution the developer picks for you.
Games haven't had granular settings since like 2007. There was a giant controversy about Crysis 2 not having any graphics options, and that game came out 15 years ago. Are you trying to argue against a position in your head that you remember to be a lot closer than it actually is?

Also... why is there raytracing in a game with a fixed camera angle? That's one of the few situations where you can say with certainty that you don't need to use raytracing. I'm not even trying to be petty here; that is the textbook application for nearly all of the graphics tricks that raytracing is presented as a way to break free from.
 
Games haven't had granular settings since like 2007.

Diablo IV (2023):
1742680238027.png 1742680269036.png 1742680292128.png 1742680317488.png


Also... why is there raytracing in a game with a fixed camera angle?

Because the reason shadow maps can't do penumbras right and SSR can't capture off-screen elements has nothing to do with camera angle.

Shadow maps on the left, RT shadows on the right
Screenshot 2025-03-21 204912.png Screenshot 2025-03-21 204637.png
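The penumbra difference isn't a tuning issue. A shadow map answers a single binary "is this point behind the depth the light saw?" question per pixel, so any softness has to be faked with a fixed-size blur; ray-traced shadows average visibility across the light's actual area, so the penumbra widens naturally with distance from the occluder. A minimal sketch of the two approaches, assuming a hypothetical occluded() visibility test and a set of sample points on the light:

Python:
import random

# occluded(point, light_point) -> bool is assumed to exist for illustration.
def shadow_map_style_visibility(point, light_center, occluded):
    # one sample toward the light's center: the answer is 0 or 1, hard edge
    return 0.0 if occluded(point, light_center) else 1.0

def rt_area_light_visibility(point, light_samples, occluded, n=16):
    # average visibility over n random points on the light's surface;
    # fractional results are exactly the soft penumbra in the right-hand shot
    hits = sum(0 if occluded(point, random.choice(light_samples)) else 1
               for _ in range(n))
    return hits / n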

SSR on the left, RT on the right (this is the very top of the screen)

1742682019536.png 1742681967283.png
As we see, SSR can only reflect the stuff up on the edge there once it comes into view:
1742682096823.png
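And the SSR failure in these shots is structural: screen-space reflections can only sample pixels that already exist in the rendered frame, so anything off-screen has no data to reflect. A heavily simplified sketch of the lookup (not any engine's actual implementation):

Python:
def ssr_reflect(color, depth, origin, step_px, ray_depth, depth_step, max_steps=64):
    # March the reflected ray in screen space, comparing against the depth
    # buffer. color/depth are 2D buffers, origin is (x, y) in pixels.
    x, y = origin
    h, w = len(depth), len(depth[0])
    for _ in range(max_steps):
        x, y, ray_depth = x + step_px[0], y + step_px[1], ray_depth + depth_step
        xi, yi = int(x), int(y)
        if not (0 <= xi < w and 0 <= yi < h):
            return None  # ray left the screen: nothing to reflect (the artifact above)
        if depth[yi][xi] <= ray_depth:
            return color[yi][xi]  # ray passed behind visible geometry: call it a hit
    return None

# A ray-traced reflection instead intersects the scene's acceleration
# structure, so geometry above or behind the camera still shows up.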

My entry-level laptop GPU can handle this just fine. My desktop GPU only pretends to handle it because AMD sucks. This is where you say all that shit is gay and doesn't matter, and we should just never move on from some arbitrary fixed point in technology because it means sacrificing frames. Then I respond by pointing out that all technologies mean sacrificing frames. Why stop at RT shadows? Why not rewind the clock to Gouraud-shaded polygons and static cube maps?
 