I cannot tell if this is a troll, but advancements have slowed to an absolute crawl compared to the last 20 years.
CPUs aren't getting faster. We're running into quantum tunneling issues with transistors being so close together; the nanoscale problem hasn't been solved, and there are microprocessors out there with >40% error rates in computational tasks due to interference and crosstalk. That means almost half the work your modern CPU does is discarded when error correction kicks in.
It's why we just put more cores into CPUs now instead of more transistors into a package.
I could sperg more but that one item has stymied all advancements.
AMD has improved single-threaded CPU performance by around 20-30% per major generation since Zen 1, from higher IPC and clocks, making the Ryzen 7 7700X about twice as fast as the Ryzen 7 1800X. GPUs are improving faster than that. I don't know what error rate you're referring to, but unless it's about to jump from e.g. 40% to 70%, a static error rate doesn't negate those generational gains.
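Back-of-the-envelope on how that compounds (a rough sketch; the 25% figure and the generation count are illustrative assumptions, not benchmarks):

```python
# Compound per-generation single-thread gains (illustrative numbers only).
per_gen_gain = 0.25   # assumed midpoint of the ~20-30% per major generation
major_gens = 3        # Zen 1 -> Zen 2 -> Zen 3 -> Zen 4 (skipping the Zen+ refresh)
total = (1 + per_gen_gain) ** major_gens
print(f"~{total:.2f}x cumulative")   # ~1.95x, i.e. "about twice as fast"
```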
"Nanoscale issues" are being dealt with by the foundries, it's just that there's always a new set of issues to solve and the results are less impressive than in the past. Now we are seeing the introduction of GAAFETs to replace FinFETs, backside power delivery, high-NA EUV, stacked SRAM to mitigate SRAM scaling stagnation, etc. Maybe we'll see a variation on
TFETs in the future to harness instead of resist quantum tunneling.
More cores can mean more performance; it's just not useful to everyone because of Amdahl's law and shitty programming. Most people don't "need" more than a quad-core from 10 years ago, but if you do, there are mainstream 16-24 core CPUs available.
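Here's a minimal Amdahl's law sketch of why piling on cores hits diminishing returns; the parallel fractions are made-up illustrations, not measurements of any real program:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p = parallelizable fraction of the work, n = core count.
def amdahl_speedup(p: float, cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / cores)

for p in (0.50, 0.90, 0.95):
    print(f"p={p:.2f}: 8 cores -> {amdahl_speedup(p, 8):.2f}x, "
          f"64 cores -> {amdahl_speedup(p, 64):.2f}x")
# Even at p=0.90, 8x more cores (8 -> 64) gives less than 2x more speedup.
```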
The free lunch of the 90s is gone, but requirements have plateaued because of the stagnation. You can pick up a refurbished or new PC for $100-200 that will be good enough for shitposting, videos, work, playing old games on iGPU, etc.
We might witness an exciting performance growth spurt if 3D computers take off. Need to break the memory wall? Put layers of memory nanometers away from the logic, and cool it somehow.