GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

It's not about actually playing games for a lot of people, but rather the comfortable illusion of, "yea I could play that game if I wanted to and can do so whenever I feel like it because PC master race amirite." Once games exist that you can't play, people start to feel personally attacked even if they weren't gonna play them in the first place.
I remember that attitude when PC games started to be released on DVDs. A lot of seething about how the latest game couldn't be played by those with only CD drives. They had the latest graphics card and everything... this was in 2007 or something. CD+DVD reader/burners cost like 50 bucks at most.
 
Question: Who hated on T&L other than 3dfx fans?

3dfx fanboys in the late 90s were numerous and vocal, since the Voodoo 2 was absolutely the graphics card for a while. Reviewers complained that in Quake III, one of the only games at the time to support T&L, enabling it got you basically nothing over plain CPU vertex setup on the latest Pentiums, likely because QIII needed to run on hardware gamers actually had rather than on one specific GPU. Later on, you would see the 3dfx fans benchmark Voodoo cards in SLI on late-90s games like Unreal Tournament and Quake II to show their superiority, but the reality is there were already games coming out that wouldn't even run on a Voodoo because they outright required hardware T&L. The 3dfx crowd clung to their outdated benchmarks and blamed Microsoft for bribing companies to switch from OpenGL to Direct3D, when in reality the new game consoles were pushing polygon counts up and CPU-based T&L just wouldn't cut it any more.
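For anyone who never touched this era: "T&L" is just the per-vertex transform and lighting math. Before hardware T&L, the CPU ground through something like the loop below for every vertex, every frame. A toy NumPy sketch of the idea (my own illustration, not anything from an actual engine):

```python
import numpy as np

# Toy CPU-side "T&L": multiply every vertex by the model-view-projection
# matrix and apply one directional light per vertex. This is the per-vertex
# work that GeForce-class cards moved onto the GPU as fixed-function T&L.
def cpu_transform_and_light(vertices, normals, mvp, light_dir, base_color):
    n = vertices.shape[0]
    homo = np.hstack([vertices, np.ones((n, 1))])        # to homogeneous coords
    clip = homo @ mvp.T                                   # transform
    ndc = clip[:, :3] / clip[:, 3:4]                      # perspective divide
    intensity = np.clip(normals @ light_dir, 0.0, 1.0)    # simple diffuse term
    colors = intensity[:, None] * base_color
    return ndc, colors

# 100k vertices at 60 fps means ~6 million of these per second on a late-90s
# CPU, which is why console-scale polygon counts pushed this onto the GPU.
verts = np.random.rand(100_000, 3)
norms = np.random.rand(100_000, 3)
norms /= np.linalg.norm(norms, axis=1, keepdims=True)
mvp = np.eye(4)
ndc, cols = cpu_transform_and_light(verts, norms, mvp,
                                    np.array([0.0, 0.0, 1.0]),
                                    np.array([0.8, 0.8, 0.8]))
```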

I asked for how they came to their conclusions. You offered nothing.

What I offer is that burden of proof is on the accuser. It's not on me to commit corporate espionage or break an NDA to uncover NVIDIA's internal processes to disprove your accusation that NVIDIA is lying about their telemetry.

it is objectively true that developers are using upscaling as a bandaid.

Which specific games do you know for a fact would run at higher frame rates if DLSS technology did not exist?

The point is DLSS is not perfect technology.

That claim isn't just moving the goalposts; you've now moved the goalposts and switched to baseball. But sure, I agree that DLSS, along with rasterizing polygons, trilinear filtering, normal mapping, mip mapping, fur shading, anti-aliasing, anisotropic filtering, HDR, bloom lighting, dithering, stencil shadows, SSAO, and global illumination, is "not a perfect technology."

As we have seen in the past 35 years of gaming, though, "not a perfect technology" hardly means it's about to be dropped soon. AI-based upscaling is only becoming more widespread.
 
What I offer is that burden of proof is on the accuser.
Apart from the fact that I never claimed nobody is using it, I just said I don't believe their inflated statistics, which is reasonable considering we haven't seen how they calculate those numbers, and that some games literally turn it on by default.
Which specific games do you know for a fact would run at higher frame rates if DLSS technology did not exist?
It's not a hard concept to understand; people will use shortcuts available to them, even if it's a detriment to the end product. This should not be a point of contention, and the fact that you are so unable to accept this is telling.
 
I personally only use technology that is 100% perfect and infallible. I am, of course, talking about a 68k running a Forth interpreter. It's been all downhill since then...

So a laptop?
Yes, just with interchangeable, standardized parts, sane thermal management that actually considers sustained loads rather than only burst loads, and without the premium price tag.
 
68k running a Forth interpreter.
Pffft, 68k

[image: a Jupiter ACE system]
Z80 is the place to be.
 
Which specific games do you know for a fact would run at higher frame rates if DLSS technology did not exist?
Since this is still ongoing, I will point out that the Remnant 2 devs stated their game was "designed with upscaling in mind". I do think modern games should launch at least playable at native res on relatively modern hardware. I also don't get the extreme hate DLSS and frame gen receive. I have used both of them and never noticed insane artifacting or images turning into blurry messes, but I also game at 3840 x 1600, so my native res is already high.


I just watched the AMD CES announcements and they didn't even announce their GPUs. What the fuck is AMD doing? :story:
 
I only just got the 9800X3D CPU, and they announce the 9950X3D. What am I missing out on?
 
Since this is still ongoing, I will point out that the Remnant 2 devs stated their game was "designed with upscaling in mind". I do think modern games should launch at least playable at native res on relatively modern hardware.

Remnant 2 can run at 64 fps on a 4060 (non-Ti non-Super) at 1080p Ultra + native. 88 fps at Medium 1080p. That sounds really playable to me.


What people assume from quotes like this is that if it hadn't been for upscaling, developers would do "proper optimization," and you'd be able to run the current, absolute maxed-out settings at maxed-out resolution on the highest-end GPU at 60 fps. What you would actually get is what we saw for decades. Either you get games that flat-out cannot run maxed out on the latest hardware, with developers telling you, "wait to buy next year's hardware to see how amazing this game can really look," which is how it was with everything from Unreal 1 to Crysis, or you get ports of games for aging hardware where the PC version added absolutely nothing, like how the later COD games developed for the 360 ran at 100+ fps on new GPUs at max settings & 1080p. But it's not like gamers were thrilled; they just bitched about how dated the graphics were.

Here's Unreal 1 (1998) running on a Voodoo 3 (1999):

[screenshot: Unreal 1 running on a Voodoo 3]


27 years ago, and you still couldn't max out a 1-year-old game on a brand new GPU. No DLSS to blame.

I also don't get the extreme hate DLSS and frame gen receive. I have used both of them and never noticed insane artifacting or images turning into blurry messes, but I also game at 3840 x 1600, so my native res is already high.

You'll notice in all the videos complaining, they run the game at a base of 30-40 fps to maximize the artifacting, when you're intended to use it at a base of 60 fps or more.

Here's F1 running at a base 60-70 fps instead of the base 45 fps Hardware Unboxed used. You very rarely see a little flicker on a name tag. On the other hand, you're also seeing raytraced graphics at 140 fps. Anyone want to go back to the "proper optimization" days where that game would be running at 28 fps?
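Rough napkin math on why the base frame rate matters so much here (my own numbers, nothing NVIDIA publishes):

```python
# Back-of-envelope: with 2x frame generation, each generated frame has to
# bridge the gap between two real frames. The lower the base frame rate,
# the more the scene moves between real frames, so the more the generated
# frame has to invent -- and the more visible the artifacts.
def frame_gen_numbers(base_fps, gen_factor=2):
    real_frame_ms = 1000.0 / base_fps        # gap between real frames
    output_fps = base_fps * gen_factor        # displayed frame rate
    output_frame_ms = 1000.0 / output_fps
    return real_frame_ms, output_fps, output_frame_ms

for base in (30, 45, 60, 70):
    gap, out_fps, out_ms = frame_gen_numbers(base)
    print(f"base {base:>2} fps: real frames {gap:5.1f} ms apart -> "
          f"{out_fps:.0f} fps output, new frame every {out_ms:4.1f} ms")

# base 45 fps: real frames ~22 ms apart (lots of motion to guess at)
# base 70 fps: real frames ~14 ms apart (much less to invent per frame)
```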

 
Remnant 2 can run at 64 fps on a 4060 (non-Ti non-Super) at 1080p Ultra + native. 88 fps at Medium 1080p. That sounds really playable to me.
Yea, over the lifespan of the game there have been a lot of performance updates, but at launch it was a mess.


And to be fair to them, they aren't a huge AAA developer, but games releasing relatively unpolished is what causes this lashing out at DLSS and frame gen tech, when a lot of the anger should be directed at the suits who demand products launch before they're ready so they can go to the board and shareholders and point to the graph of the revenue they made that quarter.
 
A bigger price and hoping core parking issues don't crop up again.
No issues with the 9800X3D AFAIK; are you referring to the 7950X3D?

The reason I was concerned is that my setup died, conveniently, shortly before the 9800X3D came out. I abstained from buying a GPU since all this new stuff is right around the corner. However, since I can only do rudimentary tasks due to how everything insists on pulling at GPU resources (I've had to disable hardware acceleration on my browsers and media players to prevent crashing or render errors), I'm wondering if I should have just completely abstained from having a computer at all and become a phone user for an extra 3 months.
 
I just watched the AMD CES announcements and they didn't even announce their GPUs. What the fuck is AMD doing? :story:
I just glanced at Wccfkek and I believe the guy who says they're scared to commit until Nvidia fully announces and prices theirs. So maybe they'll have another announcement in a week or two. Weird to set the launch date as "Q1" though when partner models are already known. What is it, February?

From the naming (RTX 9070 cough I mean RX 5070) to the waiting to the pricing, AMD is playing the follow the leader act to the letter. Now Lisa Su fans have to watch Jensen.
Looks like FSR4 is going to be exclusive to RDNA4. It also puts AMD fans in a tough situation - the 7900 XTX outperforms the 9070 XT in raw raster, but it won't have access to FSR4.
Is it that tough for the few 7900 XTX owners? You just ignore the generation and cope. Leakers have already been confident for months that 7900 XTX would outperform RDNA4. FSR4 is apparently tied to the fortunes of FSR3.1 (which you can use), so it will take a while to be supported.
I only just got the 9800X3D CPU, and they announce the 9950X3D. What am I missing out on?
Nothing. You don't need it for gaming.
 
I believe the guy who says they're scared to commit until Nvidia fully announces and prices theirs.
If they were truly trying to achieve more market share as their strategy, like they claimed just a few months ago, then this would be mostly irrelevant. If their goal is market share, they know they have to take big haircuts on their margins, so they would price according to their input costs rather than to whatever Nvidia is doing.
 
I just glanced at Wccfkek and I believe the guy who says they're scared to commit until Nvidia fully announces and prices theirs. So maybe they'll have another announcement in a week or two. Weird to set the launch date as "Q1" though when partner models are already known. What is it, February?

From the naming (RTX 9070 cough I mean RX 5070) to the waiting to the pricing, AMD is playing the follow the leader act to the letter. Now Lisa Su fans have to watch Jensen.

Is it that tough for the few 7900 XTX owners? You just ignore the generation and cope. Leakers have already been confident for months that 7900 XTX would outperform RDNA4. FSR4 is apparently tied to the fortunes of FSR3.1 (which you can use), so it will take a while to be supported.

Nothing. You don't need it for gaming.
I’m a bit of a schizo user, not just a gamer. The 9800X3D was number 1 in Photoshop benchmarks, and I do photography and illustration as my main hobby/side gig, so this was a big deal for me. On any given photoshoot I take like 800 snaps, so if I can shave ten seconds off each photo I review, cull, and process, it all adds up.

For real work I mess with large data sets and huge spreadsheets.

As for gaming, most games I play are old… however, I would like to have a nice experience if I play the odd new game. BTW, freedom enjoyers may want to look into this game.

 
Looks like FSR4 is going to be exclusive to RDNA4. It also puts AMD fans in a tough situation - the 7900 XTX outperforms the 9070 XT in raw raster, but it won't have access to FSR4.
Looks like I'll still be using XeSS on my AMD GPU when I can. Because it totally makes sense that Intel would be able to deploy inferencing-based upscaling on AMD hardware, but AMD can't. :story:
 
If they were truly trying to achieve more market share as their strategy, like they claimed just a few months ago, then this would be mostly irrelevant. If their goal is market share, they know they have to take big haircuts on their margins, so they would price according to their input costs rather than to whatever Nvidia is doing.
I mean, if AMD were serious about market share, they would have pushed out RDNA4 in time for Christmas and Chinese New Year. They'd have taken a short-term loss to get rid of remaining RDNA3 stock before Q4 2024, and instead of everyone talking about Battlemage in December, we'd be talking about RDNA4.

I don't think AMD is actually serious about growing market share. They might think they are, but it looks like they're aiming to do another RDNA3, where they launch their cards at a $100 discount to Nvidia and then shrug their shoulders when no one buys them. And then, by the time they cut prices, no one gives a shit anymore and they go back to talking about how they need to focus on growing market share.

It's all so tiresome.
 
If they were truly trying to achieve more market share as their strategy, like they claimed just a few months ago, then this would be mostly irrelevant. If their goal is market share, they know they have to take big haircuts on their margins, so they would price according to their input costs rather than to whatever Nvidia is doing.
They have a history of dumb MSRP decisions (remember the "jebait"?). Taking in more info about how the juggernaut is pricing can only help. They can try to upstage with another announcement. But it remains to be seen if they actually care about market share with RDNA4 or if they are going through the motions.

AMD also shared some rather impressive results showing a Llama 70B Nemotron LLM AI model running on both the Ryzen AI Max+ 395 with 128GB of total system RAM (32GB for the CPU, 96GB allocated to the GPU) and a desktop Nvidia GeForce RTX 4090 with 24GB of VRAM (details of the setups in the slide below). AMD says the AI Max+ 395 delivers up to 2.2X the tokens/second performance of the desktop RTX 4090 card, but the company didn’t share time-to-first-token benchmarks.

Perhaps more importantly, AMD claims to do this at an 87% lower TDP than the 450W RTX 4090, with the AI Max+ running at a mere 55W.
Strix Halo: a mobile AI workstation chip first, gaming laptop mega APU second (no actual gaming benchmarks shared, just some synthetics compared to Lunar Lake).
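The 2.2X figure makes more sense once you do the napkin math on memory; a rough sketch assuming roughly 4-bit quantization, since the slide doesn't spell out the exact model config:

```python
# Napkin math on why a 70B model favors a 96GB unified-memory APU over a
# 24GB-VRAM desktop card. Assumes roughly 4-bit weights; the quantization
# AMD actually used isn't specified, so treat this as an illustration,
# not their benchmark config.
def weights_gb(params_billion, bits_per_weight):
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9  # decimal GB

for bits in (16, 8, 4):
    gb = weights_gb(70, bits)
    print(f"70B @ {bits:>2}-bit: ~{gb:5.1f} GB weights | "
          f"fits in 24 GB VRAM: {gb <= 24} | fits in 96 GB: {gb <= 96}")

# Even at 4-bit (~35 GB), the model spills out of 24 GB of VRAM, so the 4090
# ends up shuffling weights over PCIe, while the APU keeps everything in its
# 96 GB pool. That's where a tokens/sec win at 55 W can plausibly come from.
```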

HDMI 2.2 to offer up to 96 Gbps bandwidth
VESA introduces DisplayPort 2.1b and DP80LL (Low-Loss) specifications in collaboration with NVIDIA
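For a sense of scale on that 96 Gbps figure, some quick uncompressed-bandwidth math (my own napkin numbers, ignoring blanking and link-encoding overhead):

```python
# Rough uncompressed video bandwidth: width x height x refresh x bits-per-pixel.
# Ignores blanking intervals and link encoding overhead, so actual link-rate
# requirements are somewhat higher than these numbers.
def video_gbps(width, height, refresh_hz, bits_per_channel=10, channels=3):
    bits_per_pixel = bits_per_channel * channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = [
    ("4K 120 Hz 10-bit", 3840, 2160, 120),
    ("4K 240 Hz 10-bit", 3840, 2160, 240),
    ("8K  60 Hz 10-bit", 7680, 4320, 60),
    ("8K 120 Hz 10-bit", 7680, 4320, 120),
]
for name, w, h, hz in modes:
    print(f"{name}: ~{video_gbps(w, h, hz):6.1f} Gbps uncompressed")

# 4K240 and 8K60 at 10-bit land around 60 Gbps before overhead -- beyond
# HDMI 2.1's 48 Gbps but inside 96 -- while 8K120 (~119 Gbps) still needs DSC.
```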
 