Nah just buy the big boy gpu so I can @ grok make this guy naked natively on site.
> …want to write a media processor that converts user uploads into playlists.
I looked at this a while ago for the BMJ VODs: https://github.com/kaltura/nginx-vod-module
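For the curious, a minimal sketch of what that media-processor step could look like: assuming ffmpeg is available, each upload gets segmented into a VOD HLS playlist that nginx (or nginx-vod-module) can then serve as static files. Paths, encoder choices, and segment length here are illustrative assumptions, not site config.

```python
# Hypothetical sketch: turning a user upload into an HLS VOD playlist with
# ffmpeg. Filenames and encoder settings are illustrative, not site config.
import shlex

def hls_command(src: str, out_dir: str, segment_seconds: int = 6) -> list[str]:
    """Build an ffmpeg argv that segments `src` into a VOD HLS playlist."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-c:a", "aac",   # re-encode to a web-safe codec pair
        "-f", "hls",
        "-hls_time", str(segment_seconds),  # target segment length in seconds
        "-hls_playlist_type", "vod",        # finalized playlist, not a live one
        "-hls_segment_filename", f"{out_dir}/seg_%05d.ts",
        f"{out_dir}/index.m3u8",
    ]

print(shlex.join(hls_command("upload.mp4", "/srv/vods/1234")))
```

Building the argv as a list (rather than a shell string) sidesteps quoting bugs when filenames come from users.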
ok chyat how about this
1x https://www.ebay.com/itm/127566125829
4x https://www.ebay.com/itm/297951440555
I think someone offered me a bunch of WD Enterprise 12TBs recently, I need to go find that email. It came at a really bad time because I had already bought 16TBs for the server, but it would actually be SUPER DUPER PERFECT for this
> This would be literally a dream come true for me and every lolcow's worst nightmare. I constantly want to reference random videos and can't because they're buried in several-thousand-page threads with no obvious search terms in the post.
Off topic, but are you familiar with Bookmarks? Not as useful as having every vidya transcribed, but you can at least tag things you want to come back to:


> I offered 5 of the 12TB on your page earlier. I'm travelling right now, but later I can shoot you an email about donating them.
Do email me, thank you.
> Off topic, but are you familiar with Bookmarks? Not as useful as having every vidya transcribed, but you can at least tag things you want to come back to:
Yeah, I use those, but not religiously enough. My favorite one is this Philly guy talking about how they were justified to have a stadium full of people throwing batteries at Santa.
View attachment 8530350
View attachment 8530346

> I do not understand any of this but I am filled with overflowing optimism.
Me either, and me too.
> The Supermicros are Skylake-era, so they run at PCIe 3.0 speeds while the Arc Pro B60 runs at PCIe 5.0 x8, so you would take a small performance penalty in the number of streams you could transcode at once due to PCIe bandwidth limitations.
I really don't think we need a 32 Gbps bus.
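Back-of-envelope math on why the Gen3 slot probably doesn't matter much: when decode and encode both happen on the GPU, mostly compressed bitstreams cross the bus, not raw frames. The per-stream bitrates below are illustrative guesses, not measurements.

```python
# Rough arithmetic, assuming full GPU transcode (compressed in, compressed out).
# All numbers here are illustrative assumptions, not benchmarks.

# 8 GT/s per lane * 8 lanes * 128b/130b line coding ~= 63 Gbit/s usable raw
PCIE3_X8_GBPS = 8 * 8 * (128 / 130)

def max_streams(per_stream_mbps: float, bus_gbps: float = PCIE3_X8_GBPS,
                utilization: float = 0.5) -> int:
    """Streams that fit if each moves `per_stream_mbps` across the bus,
    keeping conservative `utilization` headroom for overhead and bursts."""
    return int(bus_gbps * 1000 * utilization / per_stream_mbps)

# e.g. ~20 Mbit/s in + ~8 Mbit/s out per 1080p transcode ~= 28 Mbit/s on the bus
print(max_streams(28))  # on the order of a thousand streams, far beyond any one card
```

The point of the sketch: at those numbers the GPU's encoder blocks saturate long before a Gen3 x8 link does.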
> ok chyat how about this
> 1x https://www.ebay.com/itm/127566125829
> 4x https://www.ebay.com/itm/297951440555
> I think someone offered me a bunch of WD Enterprise 12TBs recently, I need to go find that email. It came at a really bad time because I had already bought 16TBs for the server, but it would actually be SUPER DUPER PERFECT for this
I like this idea. A good rack-mounted server with as many good Arc GPUs as you can fit in it seems like it would be a good jack of all trades for your use cases.
> I don't think it's possible to build a server that does both media and AI without just getting big boy NVIDIAs, spending a ton of money, and plugging it into a 250W outlet.
You're looking for a cross between an ASIC and a GPGPU. Sophgo, maybe... In this area Intel/AMD should beat the crap out of Nvidia, since they've owned ASIC subsidiaries for a while now.
> $6000 is honestly too much for our workload if I managed to get something like NETINT Quadra cards, but it makes me wonder if we could meme together something that can do both media + AI inference.
I'm not experienced enough in this area to offer a hard opinion, but generally I'd say focus on one thing and get something solid built for that. You're already getting complex with the equipment whichever goal you have; the complexity just scales up when you try to do both at once.
> Ignoring the AI meme, and on the assumption that Intel continues to support Arc, Intel cards are great bang for buck.
Even if Intel gives up on dGPUs, they should still get support, since their integrated graphics are based on the same architecture.
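If anyone wants to sanity-check an Arc or iGPU box, one way is to probe whether the local ffmpeg build exposes Quick Sync (QSV), since the dGPUs and integrated graphics share the same media engine. The `-hwaccel qsv` flag and the `h264_qsv`/`hevc_qsv` codecs are standard ffmpeg; the sample output string in the demo call below is made up.

```python
# Hedged sketch: check whether the local ffmpeg build can use QSV at all
# before assuming an Arc/iGPU server can hardware-transcode.
import subprocess

def has_qsv(hwaccels_output: str) -> bool:
    """True if 'qsv' appears in the output of `ffmpeg -hwaccels`."""
    return "qsv" in hwaccels_output.split()

def probe() -> bool:
    """Run the actual probe (requires ffmpeg on PATH)."""
    out = subprocess.run(["ffmpeg", "-hwaccels"],
                         capture_output=True, text=True).stdout
    return has_qsv(out)

# A matching hardware transcode would then look something like:
#   ffmpeg -hwaccel qsv -c:v h264_qsv -i in.mp4 -c:v hevc_qsv -b:v 4M out.mp4

# Demo with a made-up output string, so this runs without ffmpeg installed:
print(has_qsv("Hardware acceleration methods:\nvdpau\nqsv\nvaapi\n"))
```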
also if we're reencoding media, niggermarking is back on the table