Gamers Nexus

Did either of you actually use 3dfx SLI? It was available for one card in a period of rapid development. I don't know a single person who used it. I'm pressing X to doubt that either of you used it, especially because I know Slav Power wouldn't have been old enough to know or care about SLI's performance at the time.
I used it. Voodoo 2 and I wanna say some diamond viper.
 
I used it. Voodoo 2 and I wanna say some diamond viper.
Voodoo 2 SLI had to be paired with a good CPU as well.

Here's a benchmark to show the performance uplift of SLI when using a contemporary CPU.
[attached benchmark screenshot]

V2 SLI is very CPU-bound, but don't forget that the Celeron 300A was released not long after the Voodoo 2, and if set to run on a 100 MHz FSB it would beat the snot out of the Pentium II 400 MHz seen above. The 300A could be bought for ~$120 while the P2 400 was around $500.
That cheap-ass CPU was basically a requirement to get anything out of SLI (meaning 200 FPS in certain games). Everyone had one, except a friend of mine who was a diehard K6, K6-2 and K6-III believer.

Just for fun, this is the difference between a single card and SLI running on a Voodoo 1 era CPU. It isn't fast enough.
[attached benchmark screenshot]


And this is the SLI difference when running on a 1400 MHz Pentium III, which no one ever did, because by then Glide was dead, 3dfx was dead, T&L had been introduced in D3D7, and even that felt antiquated next to the brand-new D3D8 and GeForce 3.
[attached benchmark screenshot]
 
Did either of you actually use 3dfx SLI? It was available for one card in a period of rapid development. I don't know a single person who used it. I'm pressing X to doubt that either of you used it, especially because I know Slav Power wouldn't have been old enough to know or care about SLI's performance at the time.

Nvidia’s SLI is a completely different technology. They just reused the acronym for the name recognition.
My point was that the scan-line approach simply wouldn't scale with the rapid development of technology during that period and would sooner or later cause more issues than Nvidia's multi-GPU approach did. Though even that was a complete failure and caused more issues than it fixed. Ideally you'd have a system that seamlessly combines the computing power of multiple GPUs without the software needing to be tailored for it, but, you know, pipe dream, especially with how Nvidia killed off NVLink in consumer GPUs so they wouldn't eat into their enterprise products.
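For what it's worth, the two SLIs divide work along different axes: 3dfx's Scan-Line Interleave splits the lines of a single frame between cards, while Nvidia's most common mode (AFR) alternates whole frames between GPUs. A toy sketch of the two mappings, purely illustrative (function names are mine, this is not driver code):

```python
def scanline_interleave(height, num_gpus=2):
    """3dfx-style SLI: alternate scanlines of the *same* frame go to each card."""
    return {y: y % num_gpus for y in range(height)}

def alternate_frame_rendering(num_frames, num_gpus=2):
    """Nvidia-style AFR: each GPU renders entire frames in turn."""
    return {f: f % num_gpus for f in range(num_frames)}

# Scan-line interleave: every card must finish its lines before any frame is
# complete, and per-frame setup work is duplicated on both cards — which is
# part of why the approach gets CPU-bound and scales poorly.
# AFR: each GPU owns a whole frame, which scales throughput better but brings
# its own frame-pacing and latency problems.
```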

Besides, nowadays the more viable multi-GPU setup is having an AMD APU, a beefy Nvidia GPU and a small Intel Arc GPU, so that you can use the benefits of all three vendors in games, like FSR/XeSS, AFMF, QuickSync etc., while having a main GPU that's actually powerful and worth a damn. Another reason Nvidia ditched SLI was that they started making single GPUs so powerful there was no need to try and combine multiples of them to reach higher performance; it basically wasn't economically viable to do this type of voodoo anymore as everyone moved onto better processes.
 
My point was that the scan-line approach simply wouldn't scale with the rapid development of technology during that period and would sooner or later cause more issues than Nvidia's multi-GPU approach did. Though even that was a complete failure and caused more issues than it fixed. Ideally you'd have a system that seamlessly combines the computing power of multiple GPUs without the software needing to be tailored for it, but, you know, pipe dream, especially with how Nvidia killed off NVLink in consumer GPUs so they wouldn't eat into their enterprise products.

Besides, nowadays the more viable multi-GPU setup is having an AMD APU, a beefy Nvidia GPU and a small Intel Arc GPU, so that you can use the benefits of all three vendors in games, like FSR/XeSS, AFMF, QuickSync etc., while having a main GPU that's actually powerful and worth a damn. Another reason Nvidia ditched SLI was that they started making single GPUs so powerful there was no need to try and combine multiples of them to reach higher performance; it basically wasn't economically viable to do this type of voodoo anymore as everyone moved onto better processes.
It's possible to do, it's just: why would anyone bother implementing it? For a high-end GPU they already recommend 750+ watt PSUs.
[attached benchmark image: explicit multi-GPU, Ashes of the Singularity, 1080p]
 
It's possible to do, it's just: why would anyone bother implementing it?
My point isn't using all three GPUs at once, but having an AMD APU (weak), an Nvidia dGPU (strong) and an Intel Arc dGPU (weak), so that you use the Nvidia card for the bulk of the game rendering, then use the AMD/Intel ones for extras like frame gen or upscaling, as IIRC you don't need to run the game on those GPUs to use them. For example, AFMF can be run off of an APU with an Nvidia GPU doing most of the work, and by having an extra Intel Arc GPU, even a very low-end one, you get the fantastic Intel video encoders/decoders/transcoders on an AMD CPU system. Plus, you could run Lossless Scaling off of that. This will be the multi-GPU meta nowadays: multi-vendor supplementary GPUs.
 
He's doing another kickstarter regarding AI, the US government, and other companies like Nintendo and Sega
This is already deleted, or subject to some shadow-ban fuckery. The only place I can get this link to load is on my phone in the YouTube app when I'm logged in, regardless of whether my phone is on my Wi-Fi or on the cell network. Everywhere else, logged in or Incognito, I get either a 404 or a message saying the video isn't available anymore. I'm presently being stymied by PreserveTube's captcha, trying to see if someone already archived it there.

Edit: captcha finally worked, someone did archive it with PreserveTube.
 
He's doing another kickstarter regarding AI, the US government, and other companies like Nintendo and Sega
>Palantir
He's gonna be so screwed knowing that Palantir is the NWO's security division. If BlackRock/Vanguard is their propaganda machine/moneymaker, and Palantir is the security, Nvidia fits the technology piece because their GPUs are highly sought after and can be used to create AI models with ease. Even Eurofag companies like IONOS use Nvidia GPUs.
 
Insane.

Off topic but I hate how people pronounce "Palantir". I just know that Tolkien would be cringing if he heard amerisharts saying "pALAN-teer" instead of "puh-lan-teer", as one flowing word, more like "plantation".
 
Guy who does ESPN SportsCenter-styled GPU benchmarks is now deep-diving into the government's security apparatus. Interesting development.
I'm just waiting for a comprehensive benchmark/breakdown of visible light emitted by different federal agencies' agents (lumen, measured in the absolute darkness of GN HQ's basement) now.
 
I'm just waiting for a comprehensive benchmark/breakdown of visible light emitted by different federal agencies' agents (lumen, measured in the absolute darkness of GN HQ's basement) now.
What I've always wondered is: Why do they always glow in green when Cherenkov radiation is blue and heat radiation is red -> yellow -> white with a lot of infrared?
 