GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

lossy compression
Removes original detail and worsens quality to achieve a lower file size.
interpolation
Estimates via algorithms what should be put between A and B. Badly, at that.

Now the whole craze in the AI race in gaming is how to estimate, from a limited dataset, what the full-quality result should be, and how to do that with low overhead. For example: take a lossy DXT texture, run it through something like ESRGAN, and get a near-lossless result almost instantly in the background.
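Rough sketch of the basic idea in Python, using a plain linear blend between two known samples (naive on purpose; real frame generation and upscalers use motion vectors and trained models, and the names here are made up for illustration):
[CODE]
import numpy as np

def lerp_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    # Naive interpolation: guess what sits between A and B as a weighted blend.
    # Real frame generation replaces this with motion vectors and a learned model.
    return (1.0 - t) * frame_a + t * frame_b

# Two tiny grayscale "keyframes" standing in for fully rendered frames.
a = np.zeros((4, 4))
b = np.ones((4, 4))
print(lerp_frame(a, b, 0.5))  # the estimated in-between frame (all 0.5 here)
[/CODE]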

Still, the main issue is this: right now we're only just figuring out how to optimize ray tracing, while Nvidia constantly pushes it as something that's ready now when it clearly isn't. They've dropped the GTX branding for RTX and they're going all in on it. But the 20 series is already a dud even today, and the 30 series is juuust enough for ray tracing, but won't be getting any of those fancy frame gen features which are clearly meant to be the standard in the future.

So the question is: how badly will the current hardware age as the push towards ray tracing superseding rastering accelerates? Will all those optimizations get to the point where the 30 series will remain viable, or will they ultimately demand newer and newer hardware that's only getting more and more expensive due to Nvidia's stranglehold on the market and TSMC lacking any capacity for low-margin silicon like gaming GPUs? And how much will your run-of-the-mill gaming CPU cost by the time ray tracing is actually viable compared to the run-of-the-mill console?

The way I see it, it's gonna be a mess, and even the 40/50 series will fall off as being too weak. The 10 series is still viable for rastering, as is the entire RTX lineup. But the RTX lineup will age like milk, because the push for ray tracing is being done too early. Though I'm sure it's beneficial for Nvidia to release badly aging and increasingly expensive products on the market, as people will have to buy more and more and more to stay ahead of the curve. We're basically just getting into the fundamental computing power and optimization techniques for ray tracing, yet we're on the fourth generation of Nvidia GPUs, where Jensen boasted that "the age of ray tracing is here" with the 20 series that's now as good as dead for that application with how exponentially more demanding it's become.
 
 
and even the 40/50 series will fall off as being too weak
a decade out i think we can say Ray tracing was always a meme, it was a failed project and it will never be optimized and "worth it". a 5090 is 300% better than a 2080 and ray tracing is still way more of a hindrance for too little benefit, i doubt by the time we get to the Feynman-era GPUs that it will be "worth it" either, and by then nvidia will probably be fucked by tariffs or some other shit and AMD will catch up with ray tracing tech and the "new normal" will still not be where people thought it would be and you'll still be better off not "turning it on"
 
a decade out i think we can say Ray tracing was always a meme, it was a failed project and it will never be optimized and "worth it". a 5090 is 300% better than a 2080 and ray tracing is still way more of a hindrance for too little benefit, i doubt by the time we get to the Feynman-era GPUs that it will be "worth it" either, and by then nvidia will probably be fucked by tariffs or some other shit and AMD will catch up with ray tracing tech and the "new normal" will still not be where people thought it would be and you'll still be better off not "turning it on"
This is Doom: The Dark Ages, a game that only uses ray tracing, and these are the requirements without upscaling. I think you're wishcasting because you don't like ray tracing personally.
  • Minimum (1080p / 60 fps / low settings): Windows 10 or 11 (64-bit); AMD Ryzen 7 3700X, Intel Core i7-10700K or better; Nvidia RTX 2060 Super, AMD RX 6600 or better (ray tracing capable); 16 GB RAM; 512 GB or higher NVMe SSD (100 GB available)
  • Recommended (1440p / 60 fps / high settings): Windows 10 or 11 (64-bit); AMD Ryzen 7 5700X, Intel Core i7-12700K or better; Nvidia RTX 3080, AMD RX 6800 or better; 32 GB RAM; 512 GB or higher NVMe SSD (100 GB available)
  • Ultra 4K (2160p / 60 fps / ultra settings): Windows 10 or 11 (64-bit); AMD Ryzen 7 5700X, Intel Core i7-12700K or better; Nvidia RTX 4080, AMD RX 7900 XT or better; 32 GB RAM; 512 GB or higher NVMe SSD (100 GB available)
 
a decade out i think we can say Ray tracing was always a meme, it was a failed project and it will never be optimized and "worth it".

We are already at the point where any GPU made in the last 4 years, and any NVIDIA GPU from the last 6 years, can at minimum run raytraced reflections and shadows with the rest of the scene rasterized at 60 fps at native HD resolutions. Cube maps, SSR, stenciled shadow maps, and the like are technologies that debuted 25 years ago and have been pushed as far as they can go. Despite literally two and a half decades of iterating on them, the fundamental limitations of those methods and the associated artifacts are still obvious. Looking at reflections alone:
  • Cube maps aren't perspective-correct and are often pixelated
  • Rendering too many cube maps in real time becomes prohibitive
  • Planar reflections are prohibitive for more than a couple of surfaces and can't handle irregular ones
  • SSR can't capture anything off-screen
When the min spec on games is a 20-series GeForce, which it will be in nearly all games very soon, there's no reason to use these outdated methods at all. They're a pain in the ass to implement, they're extremely limited, they're fragile, and they're not necessary to achieve good frame rates. It's time for these old methods to be put out to pasture. The only reason to keep using them is if your game has to run on the PS4.
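To put the SSR bullet in concrete terms, here's a toy Python sketch of screen-space ray marching against a depth buffer (the buffer contents, step count, and coordinate handling are all made up for illustration; a real implementation lives in a shader): the moment the reflected ray leaves the screen there is simply nothing to sample, which is why SSR reflections cut off at the edges.
[CODE]
import numpy as np

def ssr_march(depth, origin, direction, steps=64):
    # March a reflected ray through screen space against a depth buffer.
    # Returns the hit pixel, or None if nothing is found. Anything that was
    # never rendered on screen cannot be hit, which is the core SSR artifact.
    h, w = depth.shape
    pos = np.array(origin, dtype=float)      # (x, y, view-space depth)
    step = np.array(direction, dtype=float)  # per-step increment, same layout
    for _ in range(steps):
        pos += step
        x, y = int(pos[0]), int(pos[1])
        if not (0 <= x < w and 0 <= y < h):
            return None                      # ray left the screen: no reflection
        if pos[2] >= depth[y, x]:
            return (x, y)                    # ray went behind a stored surface: hit
    return None

# Toy 8x8 depth buffer with a "wall" at depth 5 covering the right half.
depth = np.full((8, 8), 100.0)
depth[:, 4:] = 5.0
print(ssr_march(depth, origin=(1, 4, 0.0), direction=(1, 0, 1.0)))   # hits the wall
print(ssr_march(depth, origin=(1, 4, 0.0), direction=(-1, 0, 1.0)))  # walks off-screen: None
[/CODE]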
 

AMD's Arm-based "Sound Wave" APU:
videoframe_350426.png

Enhancements to RDNA 3.5 might be needed to (better?) support FSR4, and those could be coming to the Strix Point/Halo successors.

8 CUs of RDNA4 on Zen 6 desktop CPUs (up from the current 2 CUs of RDNA2) would knock out the 5700G, 8600G, and maybe the 8700G. I imagine some models would ship with 1-4 compute units disabled.

a decade out i think we can say Ray tracing was always a meme, it was a failed project and it will never be optimized and "worth it". a 5090 is 300% better than a 2080 and ray tracing is still way more of a hindrance for too little benefit, i doubt by the time we get to the Feynman-era GPUs that it will be "worth it" either, and by then nvidia will probably be fucked by tariffs or some other shit and AMD will catch up with ray tracing tech and the "new normal" will still not be where people thought it would be and you'll still be better off not "turning it on"
It will start to become "mandatory" 2-3 years after a PlayStation 6 comes out with significantly better raytracing capabilities, so maybe the early 2030s.

Transistor density is going to double, say, 3-5 times by 2035. Upscaling and multi fake frames will make it even easier to target 4K/8K at high frame rates by then. Nvidia needs to find something for gaymers to chase after, and rasterization isn't it. Unoptimized hindrance? Great, says Nvidia, buy our latest GPU.
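For anyone counting, "double 3-5 times" compounds like this (just the arithmetic; the 2035 date and the number of doublings are the guess above, not a roadmap):
[CODE]
# Doubling n times compounds to 2**n times the original transistor density.
for n in (3, 4, 5):
    print(f"{n} doublings -> {2 ** n}x density")
[/CODE]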
 
Nvidia needs to find something for gaymers to chase after, and rasterization isn't it. Unoptimized hindrance? Great, says Nvidia, buy our latest GPU.
Real-time dynamic fluid simulations advanced beyond what is currently possible on consumer GPUs and consoles would be super neat! Likely never gonna happen within our lifetimes but a man can dream. 🙃
 
We are already at the point where any GPU made in the last 4 years, and any NVIDIA GPU from the last 6 years, can at minimum run raytraced reflections and shadows with the rest of the scene rasterized at 60 fps at native HD resolutions. Cube maps, SSR, stenciled shadow maps, and the like are technologies that debuted 25 years ago and have been pushed as far as they can go. Despite literally two and a half decades of iterating on them, the fundamental limitations of those methods and the associated artifacts are still obvious. Looking at reflections alone:
  • Cube maps aren't perspective-correct and are often pixelated
  • Rendering too many cube maps in real time becomes prohibitive
  • Planar reflections are prohibitive for more than a couple of surfaces and can't handle irregular ones
  • SSR can't capture anything off-screen
When the min spec on games is a 20-series GeForce, which it will be in nearly all games very soon, there's no reason to use these outdated methods at all. They're a pain in the ass to implement, they're extremely limited, they're fragile, and they're not necessary to achieve good frame rates. It's time for these old methods to be put out to pasture. The only reason to keep using them is if your game has to run on the PS4.
The argument is that the ease of use of these features and the lack of those limitations will lead to better-looking games.

We're 3 generations in and they don't look good, they look really bad. Model and texture quality is extremely sharp and crisp, but the lighting quality in games now is straight up terrible. Stalker 2 has worse lighting than the original from 2007 - lights don't cast shadows. AND you expect me to be willing to deal with generated frames and upscaling? No way, dude. This stuff is bunk. It's whack. Stop pretending this isn't sloppy.
 
The argument is that the ease of use of these features and the lack of those limitations will lead to better-looking games.

We're 3 generations in and they don't look good, they look really bad. Model and texture quality is extremely sharp and crisp, but the lighting quality in games now is straight up terrible. Stalker 2 has worse lighting than the original from 2007 - lights don't cast shadows. AND you expect me to be willing to deal with generated frames and upscaling? No way, dude. This stuff is bunk. It's whack. Stop pretending this isn't sloppy.
No, you see, these games look so bad because the developers don't want to use these fantastic new features properly that will be the future of gaming whether you like it or not. If they did, you'd be in the gaming El Dorado right now. But who cares about such convoluted abstracts, let's hyperfixate on why ray tracing is the future by looking at it in an autistic spec sheet vacuum and act like it's on topic, because the currently discussed topic was already discussed 50 pages ago and you should know it all by heart by now.
 
Think about it. This :gunt: but it's three-dimensional and a part of the gameplay. Forget ray tracing. Fully rendered real-time :gunt: is the future of video games.
I wouldn't be surprised if you could AI-generate an animated 3D rainbow gunt with all the AI tools you have at your disposal nowadays. ComfyUI supports like five billion different AI tools and idk where to even begin with that shit. I know one of the things it can do is 3D models.
 
I wouldn't be surprised if you could AI-generate an animated 3D rainbow gunt with all the AI tools you have at your disposal nowadays. ComfyUI supports like five billion different AI tools and idk where to even begin with that shit. I know one of the things it can do is 3D models.
Wait that's a gunt? I thought it was a flattened Discord rainbow blob
 