Workstations

Cupronickel

Ready or not, I'm the illegitimate son of God!
kiwifarms.net
Joined
Feb 6, 2021
Do you own any? In case you aren't aware, a workstation is pretty much any high-end PC that typically uses Intel's Xeon or AMD's Threadripper CPUs along with a Quadro or Radeon Pro GPU. I currently own a low-spec HP Z8 that I got used for around $1000. It's a wonderful machine and I like just how much expansion you can do with it.

Just curious.
 

Attachments

  • 1512044542_1378355.jpg
    305.3 KB · Views: 56
so an over-glorified gaming pc?
No. They differ in that their motherboards support ECC memory and usually a lot more of it. For example, the Z8 can take up to 3TB of RAM. Also dual CPUs and GPUs are much more useful for rendering and CAD shit.
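To put a rough number on the "more cores help rendering" bit, here's a toy Python sketch (a little Mandelbrot tile standing in for a real render kernel, all the names made up) that runs the same batch of tiles on one worker and then on every core:

```python
# Toy illustration only: render-style work splits into independent tiles,
# so it scales with core count in a way most desktop workloads don't.
import time
from multiprocessing import Pool, cpu_count

def render_tile(tile_index, size=120, max_iter=60):
    """Stand-in for a render kernel: one small Mandelbrot tile."""
    inside = 0
    for y in range(size):
        for x in range(size):
            cr = (x / size) * 3.5 - 2.5
            ci = (y / size) * 2.0 - 1.0 + tile_index * 0.001
            zr = zi = 0.0
            for _ in range(max_iter):
                zr, zi = zr * zr - zi * zi + cr, 2 * zr * zi + ci
                if zr * zr + zi * zi > 4.0:
                    break
            else:
                inside += 1
    return inside

if __name__ == "__main__":
    tiles = list(range(16))
    for workers in (1, cpu_count()):
        start = time.time()
        with Pool(workers) as pool:
            pool.map(render_tile, tiles)
        print(f"{workers:2d} worker(s): {time.time() - start:.1f}s")
```

On anything with a decent core count, the second run finishes in a fraction of the time of the first, which is the whole pitch for dual-socket boxes on render work.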
 
Basically what Octane said. Many types of workstations exist; I think they're talking specifically about "graphics" workstations, which are real good at graphics and rendering, but not necessarily realtime graphics and rendering.

Think the difference between playing Cawwadoody and working on a Pixar film. Not exactly the best analogy (Pixar has some crazy fucking proprietary internal cluster mega computer thing), but it's apt.

I currently don't have any real workstations. I had a computer stronk enough back in my college days that was basically one slash my gaming machine, but the inexorable march of time has basically nullified it as tech progressed forward. Workstations kinda got co-opted by the bitcoin mining fad for a bit, I believe. These days it's almost easier just to spin up and spin down some AWS configs.
 
No. They differ in that their motherboards support ECC memory and usually a lot more of it. For example, the Z8 can take up to 3TB of RAM. Also dual CPUs and GPUs are much more useful for rendering and CAD shit.
ECC is slipping into consumer motherboards for AMD chips. It's not officially certified, but the support is there. It's nice.
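If you want to sanity-check whether ECC is actually doing anything on one of those boards (on Linux, anyway), something like this rough sketch should tell you. It assumes the kernel's EDAC driver is loaded and exposing the usual /sys/devices/system/edac/mc entries:

```python
# Quick-and-dirty ECC check via the Linux EDAC sysfs interface.
# If no mc* directories show up, either the board/CPU isn't running
# ECC or the EDAC driver for that memory controller isn't loaded.
from pathlib import Path

EDAC_ROOT = Path("/sys/devices/system/edac/mc")

def ecc_status():
    controllers = sorted(EDAC_ROOT.glob("mc[0-9]*"))
    if not controllers:
        print("No EDAC memory controllers found (ECC likely inactive).")
        return
    for mc in controllers:
        ce = (mc / "ce_count").read_text().strip()  # corrected errors
        ue = (mc / "ue_count").read_text().strip()  # uncorrected errors
        print(f"{mc.name}: corrected={ce} uncorrected={ue}")

if __name__ == "__main__":
    ecc_status()
```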

Last time I ran a dedicated workstation was when SGI was still a major force in the market. Not quite worth it since then. Consumer-grade hardware is good enough for a lot of purposes.
 
I wanted to get one but it couldn't run Crysis at max settings so I decided against it.

High volumes of RAM are cool though.
 
So is this the kind of machine you'd make CGI shit with?
Theoretically, something with some Quadro video card(s) and Threadripper processors, yeah.

The average high-end PC for the enthusiast is pretty much all you need these days unless you're a PRO making PRObux and have PRO deadlines, but it was a different story in the 90s and early 00s.
I mean hell, I ran some tests on the Apple M1 Mac Mini and it kept pace with my i7 in some Blender renders, and that was early enough that Blender wasn't ARM-native yet (I know they released a beta that, oddly enough, wasn't running as well as the x86 version under Rosetta, so, like, something ain't optimized right yet. Maybe that's fixed by now, haven't gone back to check in a few months). The reason I mention this is that the Mac Mini is like $700 and far from what you'd think of as a "CGI Powerhouse Workstation" configuration of a computer.
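For anyone wanting to run that kind of comparison themselves: Blender renders fine from the command line, so timing one frame on each machine is easy. Rough sketch only; it assumes blender is on your PATH and you point it at whatever .blend file you care about:

```python
# Time a single headless Blender render of frame 1.
# Run the same script on both machines and compare the wall time.
import subprocess
import sys
import time

def time_render(blend_file, frame=1):
    start = time.time()
    subprocess.run(
        ["blender", "-b", blend_file,   # -b: run without the UI
         "-o", "/tmp/render_out_####",  # -o: output path pattern
         "-f", str(frame)],             # -f: render this one frame
        check=True,
        stdout=subprocess.DEVNULL,
    )
    return time.time() - start

if __name__ == "__main__":
    print(f"frame rendered in {time_render(sys.argv[1]):.1f}s")
```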
 
I took home some Dell towers from work. Does that count?
 
  • Thunk-Provoking
Reactions: Tookie
I'm regretting not going Threadripper for my recent desktop build. I thought that 5.3 GHz would make the PC much more responsive for large, realtime code analysis, but it really just didn't do that much. I had concerns about Ryzen's IPC too. Maybe that's solved by now.
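For what it's worth, analysis-type work usually cares more about core count than clocks, since every file can go to its own process. Something like this rough sketch (just parsing whatever .py files it finds, nothing fancy, all names made up) shows the shape of it:

```python
# Core count vs. clock speed for analysis-style work: parsing every
# .py file in a tree is one independent job per file, so throughput
# scales with workers rather than with a bit more boost clock.
import ast
import os
import time
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def parse_file(path):
    """One unit of 'analysis': parse a file and count its AST nodes."""
    try:
        tree = ast.parse(Path(path).read_text(errors="ignore"))
    except SyntaxError:
        return 0
    return sum(1 for _ in ast.walk(tree))

if __name__ == "__main__":
    files = [str(p) for p in Path(".").rglob("*.py")]
    for workers in (1, os.cpu_count()):
        start = time.time()
        with ProcessPoolExecutor(max_workers=workers) as pool:
            total = sum(pool.map(parse_file, files, chunksize=16))
        print(f"{workers} worker(s): {total} nodes in {time.time() - start:.2f}s")
```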
 
  • Feels
Reactions: The Real SVP
I can get by with very little but I'd never get by without ECC RAM. It should be the standard really.
 
I can get by with very little but I'd never get by without ECC RAM. It should be the standard really.
speaking of which, what's with most consumer CPUs supporting ECC while the majority of chipsets don't?
 
speaking of which, what's with most consumer CPUs supporting ECC while the majority of chipsets don't?
For CPUs specifically, I think what they normally do is just produce one model and then turn features on and off based on silicon quality/binning. It's probably a lot less expensive than producing two distinct models, since I think ECC would require an architecture change and (just guessing) might not be subject to silicon quality variations like core count is.

Getting ECC onto a motherboard might be a different story.
 
  • Agree
Reactions: Smaug's Smokey Hole
Anyone ITT used BeOS? I'm looking into it and its successor Haiku, and wow, it has some cool ideas.
I might still have the original disk and manual for R5 Professional Edition somewhere. I eventually replaced it with Windows 2000 and SuSE Linux, because BeOS lacked video editing and encoding software and the POSIX layer had trouble running daemons and X.
 
I ended up buying my old man an alright used Xeon-based workstation a few years ago now. He ended up getting some good use out of it when he was still doing web dev work. Myself, I was originally looking into getting a used one and just getting the flash storage and RAM upgraded, though I decided to just build a new gaming PC for about the same cost. I'm not much of a gamer these days, and I'm not doing anything too crazy beyond some Android and web development ATM. Though once I branch off into game development and things that require more memory, my machine should do the trick.

Though so far, two weeks in, it runs all the software I use day to day for development like a fucking breeze. Unlike my laptop, which would choke if it had too many Chrome tabs open.
 