GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Technological Neanderthal has decided to dunk on Nvidia for removing 32-bit PhysX support.
Shit like this is why I don't throw away my "old" electronics. I still have my old GTX980. And my old GTX580.... and two even older 9800GTs. Old motherboards, CPUs, RAM, all organized in climate-controlled storage. I'm trying to save up money and research if it's worthwhile building something like a Faraday enclosure or EMP-resistant solution because I'm autistically obsessed with ensuring I don't lose the ability to fall back on older hardware.

Those charts made me laugh out loud, Mafia 2 Benchmark rating the RTX 5080 a D.

:story:
 
Dunking on Nvidia gets him views. Gamers here have used a 750 Ti or 1030 as a dedicated card for 32-bit PhysX.
Honestly, it shouldn't have happened in the first place. This is perhaps the first time in GPU history where there is actual regression. How in the actual fuck does that happen to a company that has less niggers in the first place?
 
Honestly, it shouldn't have happened in the first place. This is perhaps the first time in GPU history where there is actual regression. How in the actual fuck does that happen to a company that has less niggers in the first place?
The last game to support 32-bit PhysX came out over a decade ago at this point, and from a security standpoint there's no point in supporting it. Even then, people are finding that even with cards like the 4090 it's still better to have a dedicated card for PhysX performance. Let's be honest here, this seems to be a cabal of tech YouTubers who want to turn the tide for AMD's GPU market share, despite AMD being equally guilty of the false MSRP while their cards are still being trounced in ray tracing and frame gen.
 
Honestly, it shouldn't have happened in the first place. This is perhaps the first time in GPU history where there is actual regression. How in the actual fuck does that happen to a company that has less niggers in the first place?
I'll agree here; the best thing would have been to write a 32-to-64-bit conversion wrapper.
 
Honestly, it shouldn't have happened in the first place. This is perhaps the first time in GPU history where there is actual regression. How in the actual fuck does that happen to a company that has less niggers in the first place?
Most importantly, as Steve has pointed out, this sets a precedent. What's the guarantee that in X years' time Nvidia won't ditch support for all the RTX features because they've shifted to some AI-only rendering and don't feel like supporting software compatibility for the old RTX rendering pipeline?

Not that it matters; Nvidia fanboys have adamantly defended the asinine pricing and SKU shifting, along with every other retarded decision like the objectively faulty connector and power delivery on the 40/50 series, blindly parroting Nvidia's blame shifting. So they'll defend this too, telling you to beat it and play newer games, or to play those older games without PhysX, since they're old so they're shit so who cares. Because who the fuck would care about playing old games on a platform famous for its extensive backwards compatibility?
write a 32-to-64-bit conversion wrapper
It's something Nvidia should have done the very fucking moment they dropped 32-bit CUDA. Microsoft made WoW64 the moment they made Windows 64-bit to keep the transition seamless. If an independent dev can write a wrapper that makes CUDA run near-natively on AMD, Nvidia can make a 32-to-64-bit wrapper for their own proprietary API.

Though I already know the excuses that'll be thrown around: not worth the time/effort/money, Nvidia is an AI company, they won't cater to your stupid gamer needs, yadda yadda, write your own wrapper, Nvidia can do no wrong and it's YOUR fault for being a whiny consumer. Like, Jensen won't give you a free 5090 for defending his leather-jacket ass, and ultimately you're getting shafted as a consumer.
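
Just to illustrate what the boring half of such a wrapper looks like, here's a toy Python sketch of the marshalling step. The "ABI" is made up for illustration, not Nvidia's actual PhysX/CUDA interface: take a call serialized with 32-bit pointer-sized fields and widen it before the real 64-bit implementation sees it.

```python
# Toy sketch of the marshalling half of a 32->64-bit thunk. The call layout
# here is invented for illustration; it is not Nvidia's actual PhysX/CUDA API.
import struct

CALL32 = struct.Struct("<III")   # (handle, buffer_ptr, num_bodies) with 32-bit fields
CALL64 = struct.Struct("<QQI")   # same call, pointer-sized fields widened to 64 bits

def thunk_call(raw32: bytes, pointer_map: dict) -> bytes:
    """Widen one marshalled 32-bit call into its 64-bit equivalent."""
    handle, buf32, count = CALL32.unpack(raw32)
    buf64 = pointer_map[buf32]   # a real thunk also has to remap 32-bit addresses
    return CALL64.pack(handle, buf64, count)

raw = CALL32.pack(7, 0x00A00000, 128)
print(CALL64.unpack(thunk_call(raw, {0x00A00000: 0x7FFE00400000})))
```

The actually hard part is that a 32-bit game can't load a 64-bit DLL, so on top of this kind of marshalling you'd need a WoW64-style boundary (a helper process or transition layer) between the 32-bit caller and the 64-bit CUDA runtime, which is presumably the work Nvidia didn't want to pay for.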
 
Honestly, it shouldn't have happened in the first place. This is perhaps the first time in GPU history where there is actual regression.

No, there have been many times in history where features got deprecated, including in GPUs. If you think this is the first time this has happened, you just haven't used that many computers or played that many old games on new hardware.

, or play those older games without PhysX since they're old so they're shit so who cares

If you have an AMD card like me, you're already playing those games without PhysX.

let's be honest here, this seems to be a cabal of tech YouTubers who want to turn the tide for AMD's GPU market share

"Are you upset that new cards from NVIDIA don't support an NVIDIA-exclusive feature from 10 years ago? Buy an AMD card which also doesn't support that feature instead."
 
Intel Arrow Lake Refresh reportedly confirmed, focusing on AI upgrade
According to the message on Weibo, Arrow Lake Refresh would come to both desktops (S-Series) and laptops (HX-Series), meaning we are probably talking about the same silicon. Previous rumors claimed that the refresh would focus on an update for the built-in NPU, rather than an increase in core count, which was also rumored for some time.
Boring NPU update and 18A delay confirmed?

Chinese retailer lists GeForce RTX 5060 12GB and RTX 5060 Ti cards with initial prices
5060 Ti 8 GB shouldn't be launched. Just put it in OEM PCs for plebs.

AMD’s unreleased Radeon RX 9070 XT “reference” design shows up in China
The lack of a reference design, also known as MBA (Made by AMD), is a reason why many gamers cannot buy RX 9070 cards at MSRP these days. AMD made a decision not to introduce this design to the market, perhaps because it was too expensive to make and the pressure to lower the MSRP was beyond what AMD expected.

Lossless Scaling Adaptive Frame Generation Delivers Consistently Smooth Output With Low Base Framerates Even on a Mid-Range Ryzen 5 3600, RTX 4060 System
Promo for Digital Foundry, something for my boy @The Ugly One
 
Its amazing. The 60-class cards are cards for niggercattle. Tech-illiterates that have been conditioned to accept GPUs marked for planned obsolescence and in turn, devaluing the hobby for anyone with half a brain's worth.
There's nothing wrong with 8 GB cards... at the price I want to pay, $100, to play only old/autistic games instead of current slop. Or 720p/1080p low settings.

The niggercattle are fighting back!
GeForce RTX 5070 Takes The Number One Spot On Amazon, More NVIDIA GPUs Replace AMD In Best Selling GPUs List
 
Lossless Scaling Adaptive Frame Generation
I paid this Ukie the equivalent of a single beer for it some time ago, and it's a pretty neat piece of software. Unfortunately, due to its nature it has major drawbacks.

My two main test beds for practical LSFG applications are an old Polish shovelware game called Syrenka Racer, which is capped at 24 FPS, and PCSX2 with games that lack a 60 FPS patch. I'll use the former as an example of the two main drawbacks. I used the latest Lossless Scaling version with adaptive frame generation targeting 73 FPS. Not that it matters whether it's 72 FPS (24x3) or 75 FPS (my monitor's refresh rate); the issues persist either way.

Issue #1: massive input lag. Unfortunately this is the biggest issue that disqualifies Lossless Scaling, or any frame gen for that matter, for filling in the gaps to playable FPS. You can see how big the delay is between the overlay activation and the game action.
Issue #2: visual artifacts. The more frames have to be generated, the more obvious they are. Here it's especially evident at around the 26 second mark where you can see a "force field" effect around the car. The car stays static relative to the camera while the rest of the frame keeps moving and that's how you get an artifact like this. I'm sure you'll be able to spot more of them.

Lossless Scaling also introduces input lag of its own just by having to hook into another window's frame output to process it, so latency will always be slightly elevated even if you're only using it for upscaling, or for smoothing out occasional FPS drops with frame gen, which is the entire point of frame gen, whatever Jensen wants people to believe. But still, I can apply frame generation to an obscure game and it works. That's pretty damn impressive for something written by a single drunk Ukrainian in some Kiev basement.

Too bad the low-FPS input lag issue will most likely never be resolved; otherwise it would be a godsend for old frame-locked games like this. I doubt we'll see some magical solution from DXVK, Special K or another injector/wrapper utility that unhooks input from the game's FPS.
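
To put a rough number on issue #1: interpolation has to sit on one real frame before it can show anything, so the added delay scales with the base frame time. Quick back-of-the-envelope below; the generation overhead figure is an assumed placeholder, not an LSFG measurement.

```python
# Rough estimate of the latency penalty of interpolation-based frame gen:
# it must wait for the *next* real frame before it can synthesize in-betweens,
# so the cost is one full source frame time plus the time to generate frames.
def added_latency_ms(base_fps: float, gen_overhead_ms: float = 3.0) -> float:
    # gen_overhead_ms is an assumed placeholder, not a measured LSFG value
    return 1000.0 / base_fps + gen_overhead_ms

for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps base -> ~{added_latency_ms(fps):.0f} ms on top of the game's own input lag")
```

Which is why a 24 FPS cap is the worst case: you're eating roughly 45 ms extra before the capture hook's own overhead even enters the picture.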
The 60-class cards are cards for niggercattle.
Hey, the 1060 was a fantastic little card. Don't put that evil on it, it served me well for over 9 years.
 
Honestly, it shouldn't have happened in the first place. This is perhaps the first time in GPU history where there is actual regression. How in the actual fuck does that happen to a company that has less niggers in the first place?
The kind of company that has to write drivers for an operating system that is maintained by something worse than niggers - jeets. Given recent messaging from Microsoft, I would expect that a lot of 32-bit applications might simply stop working in the near future as the current team running Windows seems increasingly unwilling to maintain WoW64.

We're unironically heading for a timeline where Linux with wine has wider support for Windows applications than Windows.

No, there have been many times in history where features got deprecated, including in GPUs. If you think this is the first time this has happened, you just haven't used that many computers or played that many old games on new hardware.
There's a whole era of PC games that sound completely wrong now that EAX is dead. Thank god stuff like ALchemy exists but still.
 
the current team running Windows seems increasingly unwilling to maintain WoW64
From what I've heard there's barely anyone developing and QAing Windows nowadays. It's not just the quality of the devs, but the quantity too. Microsoft is running Windows on a skeleton crew since Azure brings them all the cash.
Thank god stuff like ALchemy exists but still.
You should also look into OpenAL Soft and DSOAL. You can restore EAX as well as HRTF audio in many games thanks to those wrappers. Fully in software, on any audio hardware. Obviously it's not perfect, but at least OpenAL was open source for a while so it was somewhat easy to make a better OpenAL and then a wrapper for DirectSound3D.
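
For anyone who wants to try it, the setup is roughly this (file and option names from memory, so double-check against the DSOAL and OpenAL Soft readmes): drop DSOAL's dsound.dll and its bundled OpenAL Soft driver DLL next to the game's executable, then enable HRTF in an alsoft.ini in the same folder, something like:

```ini
[general]
# ask OpenAL Soft to render with HRTF on whatever output device is used
hrtf = true
```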

I know that ATi used to have something called TruForm, because Serious Sam was one of the few games that had it. I obviously never had an ATi card so I never saw it in action, AMD doesn't support it anymore either (they threw TruForm out of their drivers with Catalyst), and there don't seem to be any wrappers to bring it back in software despite the current computing power, so that's another piece of GPU tech that got abandoned. PhysX, however, was much more impressive, was used in more titles, and was supported by Nvidia for a long time, and technically still is. You can shit on Nvidia for many reasons, but software support is one of their strongest sides. Maxwell is 11 years old and Nvidia has yet to drop support for it, meanwhile GCN3 is the "latest" architecture still semi-supported by AMD.

Fuck Creative for destroying Aureal, and by extension HRTF audio in games, for no benefit, since Realtek went and made Sound Blasters obsolete anyway. But I already wrote an entire blog post about it.
 
Issue #1: massive input lag. Unfortunately this is the biggest issue that disqualifies Lossless Scaling, or any frame gen for that matter, for filling in the gaps to playable FPS. You can see how big the delay is between the overlay activation and the game action.

It's only an issue with frame interpolation. Imagine we've got three frames, Frame 0, Frame 1, and Frame 2. Frame 1 is the frame we view if we run natively, and we want to generate Frame 1.5 algorithmically.

Lossless Scaling interpolates between data from Frame 1 and Frame 2 to generate Frame 1.5. This requires waiting for Frame 2, introducing a full frame of delay plus the overhead of computing the interpolated frame. At 30 fps that's 0.033 seconds, which is a lot, and can make action games unplayable.

DLSS4 and FMF, by contrast, extrapolate on data from Frame 1 and possibly Frame 0 to generate Frame 1.5. Since extrapolation doesn't require waiting for the next frame, the only delay introduced is the overhead of computing the next frame, which is typically quite small. However, extrapolation is intrinsically more error-prone than interpolation, so IME I see more artifacts with FMF than I do with Lossless Scaling.
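
A stripped-down numeric version of that difference, tracking a single object position per "frame" instead of actual images (purely illustrative, not how either implementation works internally):

```python
# Interpolating frame 1.5 needs frame 2 to already exist; extrapolating it
# only needs frames 0 and 1, but has to guess how the motion continues.
def interpolate(f1: float, f2: float, t: float = 0.5) -> float:
    return f1 + (f2 - f1) * t      # must wait for f2 -> adds a frame of latency

def extrapolate(f0: float, f1: float, t: float = 0.5) -> float:
    return f1 + (f1 - f0) * t      # available immediately -> but it's a guess

f0, f1, f2 = 10.0, 20.0, 26.0      # object decelerating between frames
print(interpolate(f1, f2))          # 23.0 -> correct midpoint, one frame late
print(extrapolate(f0, f1))          # 25.0 -> on time, but overshoots (an artifact)
```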

Maxwell is 11 years old and Nvidia is yet to drop the support for it

Hopper doesn't support 32-bit applications, either. 64-bit x86 Windows came out in 2005, so for any applications that still matter, the writing's been on the wall for 20 years. Dropping PhysX for desktop GPUs seems to be downstream of dropping CUDA for 32-bit applications entirely. Given that the vast majority of CUDA applications are oriented toward the datacenter, and that market didn't even really take off until 2020, 32-bit CUDA is a rounding error that doesn't matter.

Well, to NVIDIA. To the 51 people playing Mirror's Edge right now, it matters.

(attached screenshot: Mirror's Edge concurrent player count)
 
The kind of company that has to write drivers for an operating system that is maintained by something worse than niggers - jeets. Given recent messaging from Microsoft, I would expect that a lot of 32-bit applications might simply stop working in the near future as the current team running Windows seems increasingly unwilling to maintain WoW64.
Goddamnit. Valve, start updating L4D/2 to 64bit before they pull the plug.
 
From what I've heard there's barely anyone developing and QAing Windows nowadays. It's not just the quality of the devs, but the quantity too. Microsoft is running Windows on a skeleton crew since Azure brings them all the cash.
I think one of the things that really hammers home how moribund Windows is: go to any Microsoft event where they put their developers on stage, and those developers... are invariably using MacBooks.

Windows is so neglected that not even the engineers working on it really want to use it.
 
From what I've heard there's barely anyone developing and QAing Windows nowadays. It's not just the quality of the devs, but the quantity too. Microsoft is running Windows on a skeleton crew since Azure brings them all the cash
We may actually be looking at a future where nobody uses Windows anymore; everyone will be on some flavour of Apple's *OS, Linux, or ChromeOS.
 