It should also be noted that rasterization at this point is hacks upon hacks upon hacks. RT could eventually offer a clean slate, and on top of that it will make it easier and cheaper for indies to build their own engines that implement the things people expect, like "decent lighting" in the form of radiosity and global illumination. There are probably some cool things that can be done beyond lighting and reflections, and we'll see what those are once developers start getting creative.

@ZMOT As long as complete photorealism hasn't been achieved, the industry should push graphics forward with techniques such as ray tracing. Game developers will want to use it because it makes lighting easier. Whether you buy into it is up to you, but your hand may be forced if new games start ditching rasterization entirely several years from now. If you don't care because new games suck, then you have no problem, but every new GPU, APU, and phone will ship with ray tracing acceleration anyway.
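To see why lighting gets "easier" with ray tracing, here is a minimal, purely illustrative sketch (not any engine's actual code; all names are made up): a hard shadow is just a ray cast toward the light, with no shadow maps or other rasterization workarounds.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Return distance to the nearest ray/sphere intersection, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is assumed normalized, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def lit(point, light_pos, blockers):
    """Shadow ray: the point is lit iff nothing sits between it and the light."""
    to_light = tuple(l - p for l, p in zip(light_pos, point))
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = tuple(x / dist for x in to_light)
    for center, radius in blockers:
        t = hit_sphere(point, direction, center, radius)
        if t is not None and t < dist:
            return False
    return True

# A unit sphere at the origin blocks the light for a point directly behind it.
blockers = [((0.0, 0.0, 0.0), 1.0)]
print(lit((0.0, 0.0, -3.0), (0.0, 0.0, 5.0), blockers))  # False (in shadow)
print(lit((0.0, 3.0, -3.0), (0.0, 0.0, 5.0), blockers))  # True (off-axis, lit)
```

The same one-ray-per-query pattern extends naturally to reflections and GI, which is exactly what rasterization has to approximate with stacked special-case techniques.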
It won't require several orders of magnitude more performance: AI denoising is what makes real-time ray tracing feasible in the first place. Rasterization and ray tracing performance will keep increasing in parallel across GPU generations until nobody cares about rasterization anymore.
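The reason denoising matters so much: a path tracer averages random samples per pixel, and the noise only shrinks as 1/sqrt(N), so brute-forcing a clean image costs enormous sample counts. A toy sketch of that scaling (illustrative only, with a made-up noise model, no real renderer involved):

```python
import random

random.seed(0)

def noisy_pixel(true_value, samples):
    """Average `samples` noisy measurements of a pixel's true radiance."""
    return sum(true_value + random.uniform(-0.5, 0.5)
               for _ in range(samples)) / samples

def error(samples, trials=2000, true_value=0.5):
    """Mean absolute error of the Monte Carlo estimate over many trials."""
    return sum(abs(noisy_pixel(true_value, samples) - true_value)
               for _ in range(trials)) / trials

# Quadrupling the sample count only roughly halves the noise.
print(error(1), error(4), error(16))
```

A real-time budget allows only a sample or two per pixel, so instead of paying for thousands of samples, a trained denoiser reconstructs a clean image from that noisy one-sample result.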
Right now RT hardware is arguably a poor investment, but the same was true of first-generation Transform & Lighting and of the GeForce 3 with its programmable shaders. A few games supported the latter at the time and absolutely nothing required it, putting it in the same situation RT is in now. T&L offloaded some work from the CPU onto the graphics card, but everything was still designed to run smoothly on the CPU, and the GeForce 256 itself was barely any faster than the high-end TNT2s it replaced. Both were kind of stupid at launch, but also necessary.
What is fascinating is that Lumen, Unreal Engine 5's global illumination system, defaults to software ray tracing rather than relying on hardware RT. At this point, at least.