You say this looking at existing technology. There was a period when we thought integrated MMUs and FPUs and sound cards would never match the performance of discrete solutions, but those all got absorbed too. In the long term, deeper integration between the CPU, the GPU, and system RAM is probably going to pay real dividends in performance, to the point where a discrete GPU may no longer be the optimal choice. We saw exactly this with FPUs: the original discrete x87 chips were quite slow, because shuttling data between the FPU registers and the x86 registers incurred a real penalty. Moving the FPU onto the main CPU not only sped up x87-level floating point, it also enabled later extensions like MMX.

Discrete GPUs will never go away, there's just too big a gap in performance. But iGPUs are increasingly 'good enough' for many games.
Nintendo Switch: 393.2 GFLOPS
Intel UHD 770 iGPU: 793.6 GFLOPS
Intel Iris Xe iGPU: 2.2 TFLOPS
AMD 8700G iGPU: 4.5 TFLOPS
PS5: 10.3 TFLOPS
Nvidia 3060: 12.7 TFLOPS
Nvidia 4080: 48.7 TFLOPS
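For anyone wondering where these numbers come from: theoretical FP32 peak is just shader cores × clock × 2 (one fused multiply-add counts as 2 FLOPs). A quick sanity check in Python, using public spec-sheet core counts and boost clocks (treat the exact figures as approximate):

```python
# Theoretical FP32 peak = cores * clock (GHz) * 2 FLOPs per FMA.
# Core counts / clocks below are nominal spec-sheet values.

def peak_tflops(cores: int, clock_ghz: float) -> float:
    """Theoretical FP32 peak in TFLOPS."""
    return cores * clock_ghz * 2 / 1000

for name, cores, clock in [
    ("PS5 (36 CUs x 64 lanes)", 2304, 2.23),
    ("RTX 3060", 3584, 1.777),
    ("RTX 4080", 9728, 2.505),
]:
    print(f"{name}: {peak_tflops(cores, clock):.1f} TFLOPS")
```

Running that reproduces the 10.3 / 12.7 / 48.7 TFLOPS figures in the list above. Remember these are theoretical peaks; sustained throughput is always lower.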
PCIe is fast, but its bandwidth is still an order of magnitude or two below the on-package memory interconnects AMD, Apple, and even Intel are building toward for integrated graphics long-term. Our biggest issues right now are the added cost of larger dies and thermals, but the packaging and materials science to solve a lot of that is already in the pipeline, and I wouldn't be surprised if the discrete GPU as we understand it is gone by the early 2030s.
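To put rough numbers on the interconnect gap, here's a back-of-envelope comparison in Python. The PCIe figures are nominal per-direction link bandwidth; the unified-memory figures are published peak memory bandwidth for a couple of integrated designs (real-world throughput is lower in all cases):

```python
# Nominal per-direction PCIe x16 link bandwidth, GB/s.
pcie_gbps = {
    "PCIe 3.0 x16": 15.75,
    "PCIe 4.0 x16": 31.5,
    "PCIe 5.0 x16": 63.0,
}

# Published peak memory bandwidth of some integrated designs, GB/s.
unified_gbps = {
    "Apple M2 Ultra unified memory": 800.0,
    "PS5 GDDR6": 448.0,
}

for link, bw in pcie_gbps.items():
    for mem, mbw in unified_gbps.items():
        print(f"{link} ({bw} GB/s) vs {mem} ({mbw} GB/s): {mbw / bw:.0f}x")
```

Even against PCIe 5.0, the unified designs have roughly 7-13x the bandwidth, and against the PCIe 3.0 links a lot of cards still ride on, it's 28-50x, so "an order of magnitude or two" is about right.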