Luddite behavior. Technology advances and so do techniques. It's not like ray tracing is taking over tomorrow. I remember watching some behind-the-scenes footage about the Halo 2 demo at E3. It was ray traced, but they had to scrap that because it wasn't viable for the Xbox's hardware. Now it actually kinda is. What's happened in just around 20 years is fucking awesome.
I don't think it's unreasonable to need to upgrade your hardware every 7-8 years. The 20 series GeForce is now 6 years old and is entirely capable of running games like Cyberpunk with RT on. Even the 2060 can handle it at over 60 fps at 720p native res. The 6000 series Radeon is four years old and inferior to the 20 series GeForce at raytracing (not to mention upscaling), but it handles partial raytracing (i.e. just shadows and reflections) just fine, and the 6700 XT can muddle by tolerably on full-scene raytracing at 1080p if you insist. It can't handle path tracing at all, though, so don't try.
At this point, the reason so many games still use cube-mapped reflections and stenciled shadow maps is likely because they still have to release on the PS4. The PS5 is now 5 years old and is still sharing many of its major releases with an 11-year-old system.
Sounds like AMD is really pushing Intel's shit in. The granularity of the AVX10 spec is quite retarded; maybe it would be fine if Intel were still leading in innovation.
AMD hasn't really made a significant impact on the ISA since x86-64, and that was 25 years ago. AVX is Intel's bailiwick, and it's a gigantic fucking mess that everyone's been complaining about for at least 10 years now. It's their mess to fix, and it looks like they are finally fixing it. Of course, given the longevity of hardware these days, the software situation won't be fully unfucked until 2035.