Years ago, I saw some forum posts by a game dev opining that raytracing was not all that, and never would be, because no matter how nice your raytracing is, you can get a lot more visual quality per clock cycle via rasterization.
Given that a lot of games are still being made for 10-year-old hardware, rasterization on that hardware might even be capable of that kind of quality.
I know that my i9-12900, which is 25% slower than the i9-12900K but uses only about half the energy, never goes above about 15% utilization in any game. On a 24-thread chip like that, 15% is only about three and a half threads' worth of work, so most of the silicon is sitting idle. It makes me wonder whether it's really necessary to blast as much power through chips as they do these days. Since the PS4/XB1 forced everyone to learn how to write multithreaded code, I wonder if 16 E-cores couldn't be a compelling gaming CPU.
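To make the E-core idea concrete, here's a minimal sketch of the kind of job fan-out those consoles pushed engines toward. Everything here is illustrative (the job count and the fake per-job workload are made up, not from any real engine); the point is just that once a frame is broken into lots of small independent jobs, 16 slower cores can chew through them as well as a few fast ones:

```cpp
// Minimal sketch of a job-system fan-out across all hardware threads.
// Hypothetical example; job counts and workload are placeholders.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const unsigned n = std::thread::hardware_concurrency(); // e.g. 24 on a 12900
    const int jobs = 1024;           // one frame's worth of small tasks
    std::atomic<int> next{0};        // shared counter: threads pull jobs from it

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        workers.emplace_back([&] {
            // Each thread grabs the next job until the queue drains. With
            // enough independent jobs, throughput scales with core count,
            // which is what makes a pile of E-cores interesting.
            for (int j; (j = next.fetch_add(1)) < jobs; ) {
                volatile double x = 0;               // stand-in for real work
                for (int i = 0; i < 100000; ++i) x += i * 0.5;
            }
        });
    }
    for (auto& w : workers) w.join();
    std::printf("ran %d jobs on %u threads\n", jobs, n);
    return 0;
}
```

The catch, of course, is the serial parts of a frame that can't be split into jobs; those still want a couple of fast cores, which is probably why nobody ships an all-E-core gaming chip.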