Fun fact: James Bond's favourite fag brand is named after that guy.
And Ronald Reagan was one of the people who appeared in ads for that brand.
It took 117 computers running 24 hours a day for over a year in a temperature-controlled room to fully render Toy Story.
(I promise this is going somewhere; there's an opening and an ending.)
That's not surprising; computers were ass back then, and the hardware + software needed had a combined cost of more than you'd be OK with selling your ass for.
Pixar's renderer, RenderMan, is currently free to download. I think it was free in the past as well, but it only ran on IRIX (the 3D operating system of Jurassic Park fame), and a Silicon Graphics computer to run it on cost 20k and up. PowerAnimator had a similar price tag, same with Softimage|3D (guess how that's pronounced; it was also briefly owned by Microsoft, who sold it to Avid, which later sold it to Autodesk, the Unilever of 3D software).
Silicon Graphics (SGI, also of N64 fame) had the best 3D accelerators in the world, and they weren't for ordinary people; they also created OpenGL. Later on, some people left SGI and started their own company, 3dfx. They were the first to release an appealing, affordable, and actually good 3D accelerator for the home consumer: the 3dfx Voodoo.
Soon they had many competitors, and one of the more capable ones was Nvidia. Nvidia later released the first GeForce card around the same time the new generation of Alias|Wavefront (creators of PowerAnimator, mentioned previously) 3D software was released on platforms including PC/Win2k. That program was Maya; you might not have heard of it, it's pretty obscure.
The GeForce was low cost and absolutely tremendous value for money. I mean, build a 3D modelling computer competitive with a low-end SGI using consumer-priced, off-the-shelf parts? Wow, the cost of entry was 1/20th of what it used to be. PC already had LightWave, 3D Studio for DOS, and 3ds Max, but nothing classy.
Nvidia saw this and capitalized on it by releasing the first Quadro card, aimed at professionals. They did this by disabling some previously functional OpenGL features in the gaming card's drivers, features that games never use, like putting a hard limit on line rendering for wireframes. The Quadro was of course much more expensive, even though it was the exact same card/chip, just with fancy drivers.
Changing a thing in the registry fooled the installer into installing the Quadro drivers; no big deal, and it saved you a couple hundred bucks.
The GeForce 2 arrived, as did the new Quadros. Nvidia had become smarter; it wasn't just a registry hack separating the cards this time, it was a hardware feature that determined whether it was a pleb card or a big-money card.
That hardware feature was a resistor, either missing or present. Slightly more complicated than the Athlon overclock hack you could do with a pencil.
Between the PC release of Maya and the GeForce 1 and GeForce 2 (an 8-month period), SGI started to crater. They used to design their own CPUs and GPUs but went x86 (Pentium II/III, Windows 2000) to stay competitive. To add some special flair, they designed some of the hardware for their workstations themselves; I think they made their own northbridge for Rambus RAM on a P3 system, but the price tag was too damn high. They also started using Nvidia Quadro cards, sealing their demise: they were no longer selling their own hardware, they were a retailer.
Fun fact:
The splash screen of the 3D modelling/animation software Maya 1 or 2 had the first CG image from Sam Raimi's Spider-Man movie, years before it was released.