LSI / Broadcom HBAs and IT mode firmware - where to get for the 9400-16i?

TheBallPit

kiwifarms.net
Joined
Sep 26, 2019
Now that HDMI 2.1 has FINALLY come to iGPUs and DDR5 has become user-friendly (more or less), I'm preparing a system build, and the LSI / Broadcom 9400-16i is on the list, as I'll have a lot of drives to connect.

Does anyone have any familiarity with flashing LSI cards to IT mode? Can it be done under DOS or Windows or does it have to be in a Linux environment? Assuming this particular card needs it since it claims "tri-mode" OOTB, can anyone direct me to the proper firmware?
 
Just order your HBA from The Art of Server on eBay; it will come pre-flashed.
I don't think I can, as he doesn't typically sell the higher-end HBAs. He's also on an eBay hiatus. And I'd rather learn how to do it myself in case the firmware needs updating.
 
"Tri-Mode" refers to the fact that it supports SATA, SAS, and NVMe.

Anyway, the 9400-16i should just be a basic HBA or "IT Mode" by default.

From what I understand, all LSI adapters need flashing to IT mode in order to talk to drives in a JBOD non-hardware-RAID configuration. Perhaps the 9400 is different, but it's unlikely since IT firmware was required for the previous iterations.
 
Why on God's great, green, infinite, hollow yet flat earth, would they specifically label cards as "HBA" and "MegaRAID" in the same series if the HBA was not just an HBA? Compare the product briefs and the available software downloads.

 
What kind of drives? If it's just SAS (6 Gbit/s) or SATA, then get an older HBA like the 9207-8i (you can find them reflashed on eBay easily) and use a SAS expander so you can plug in enough drives (unless you're using an enclosure with a built-in expander like the NetApp DS4246, in which case get the 8e variant).

If you're looking for SAS 12 Gbps or need MiniSAS HD, then I can recommend the Dell H330, as you can set that to HBA mode in the option ROM or even crossflash it to a generic LSI card. It does have some quirks in non-Dell systems though: https://maidavale.org/blog/dell-perc-h330-tips-and-tricks/ and I don't know if the option ROM works outside of Dell systems. That guy links to an extremely comprehensive guide on crossflashing which may answer your questions re: reflashing/crossflashing.
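For what it's worth, the usual reflash dance on the older cards looks something like this. This is only a sketch: sas3flash is Broadcom's flasher for the 9300/SAS3 generation (sas2flash for 9200/SAS2), and it ships in DOS, EFI, Windows, and Linux builds, so you're not locked into Linux. The firmware/BIOS filenames below are placeholders, not the real image names:

```shell
# Sketch only -- commands are for older 9300-series (SAS3) cards using
# Broadcom's sas3flash (use sas2flash for the 9200/SAS2 generation).
# Firmware/BIOS filenames are placeholders; download the real images for
# your exact card from Broadcom's support page.
sas3flash -list                    # verify the card is seen; write down its SAS address
sas3flash -o -e 6                  # erase the flash region (do NOT reboot or power off here)
sas3flash -o -f SAS9300_8i_IT.bin  # write the IT-mode firmware image
sas3flash -o -b mptsas3.rom        # optional legacy boot ROM; skip if you never boot from the HBA
```

If the erase leaves the card with a blank SAS address, it can be re-programmed with `sas3flash -o -sasadd <address>`, which is why you note it down first.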

If you're instead interested in NVMe, you can find no-name cards on eBay that break out M.2 or U.2 connectivity via PCI-E bifurcation (note that your motherboard has to support bifurcation for the passive ones).

I'm not going to guarantee you don't have to reflash the 9400-16i but I'd be surprised if you do. I've found reflashing LSI cards is generally only necessary on older LSI cards and I'd speculate that's because of things like VMware vSAN causing a diminishing interest in hardware RAID. This STH thread on the 9400-16i is useful as one of the posts has a breakdown of alternative options using older cards and expanders like my first suggestion: https://forums.servethehome.com/index.php?threads/any-reason-i-shouldnt-buy-an-lsi-9400-16i.32182/
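If you do end up with the 9400, note that it's managed with Broadcom's storcli tool rather than sas3flash. A quick way to sanity-check what it shipped with, sketched under the assumption that it enumerates as the first controller (`/c0`):

```shell
# Sketch, assuming the 9400 enumerates as the first controller (/c0)
# and storcli64 came from Broadcom's support downloads.
storcli64 show          # list the controllers storcli can see
storcli64 /c0 show      # model, serial, firmware package, attached drives
storcli64 /c0 show all  # the long version; on tri-mode cards this should
                        # include the current personality/profile info
```

If that output already reports a plain HBA personality with current firmware, there's nothing to flash.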

Lastly, why so many drives that you need a card like this? You might be better served by getting a small number of higher capacity drives then doing SSD caching.
 
Sorry. Between all the goddamn DDoS'ing and Bad Gateways, it's been a real PITA to access the site. Things seem to be behaving now, though.

After asking around, it looks like the "HBA"s are actual JBOD HBAs and not cards that need to be flashed to IT mode. This surprised the hell outta me, as every time LSI cards are discussed, no matter the type, IT firmware has always been broached. I just thought it was de rigueur with these cards, hence my insistence.

Regarding the AliExpress SATA cards, I've heard that they (and similar cards) don't have much QC and ports can spontaneously fail, which is why I wanted to stick with a "known brand". I do admit, though, that they look interesting, and I'll be bookmarking them for future consideration.

Flaming Dumpster, they're just standard SATA drives, most of them shucked, so they're white-label WDs. I've accumulated ten 14+ TB ones and might get another if the price is right, especially when I fully get into the 4K HDR game. I mostly just want things to be simple, which is why I'm running Drivepool and not some fancy RAID setup.

My biggest problem is my indecision as to what to ultimately do. Having everything in one box would be nice, but would require a ridiculously large case and the final system weight would be ludicrous. Scaling the need back to a simple ATX case would probably save at least $150-$200 AND I wouldn't have to bother with the LSI card.

So, where would all the drives go? Into this, or something like it:

https://sabrent.com/products/ds-uctb

I could then build the actual system in something like this to put on my AV shelf:

https://www.silverstonetek.com/en/product/info/computer-chassis/GD11/

Like I wrote, I'm still vacillating between the two solutions, so nothing yet is set in stone. I am liking the Sabrent + Silverstone combo, though.
 
My biggest problem is my indecision as to what to ultimately do. Having everything in one box would be nice, but would require a ridiculously large case and the final system weight would be ludicrous. Scaling the need back to a simple ATX case would probably save at least $150-$200 AND I wouldn't have to bother with the LSI card.
I have that many drives, in a secondhand massive gamer case. I have a little cage thing that fits into three of the 5.25" bays to hold five 3.5" HDDs. The motherboard barely goes 2/3rds of the way down so I could probably fix at least one more of those cages to the bottom of the case if I wanted more drives. It does already require longer-than-usual SATA cables and possibly unsafe levels of SATA power extension though. Some sort of expansion chassis might not be a bad idea.
 
I have that many drives, in a secondhand massive gamer case. I have a little cage thing that fits into three of the 5.25" bays to hold five 3.5" HDDs. The motherboard barely goes 2/3rds of the way down so I could probably fix at least one more of those cages to the bottom of the case if I wanted more drives. It does already require longer-than-usual SATA cables and possibly unsafe levels of SATA power extension though. Some sort of expansion chassis might not be a bad idea.
I was going to say, I think my old Fractal Define R5 could fit 10 drives internally. I also have a Fractal Node 804 that fits 8. The Sabrent enclosure is neat, but does something like ZFS work over USB? 🤔
 
I was going to say, I think my old Fractal Define R5 could fit 10 drives internally. I also have a Fractal Node 804 that fits 8. The Sabrent enclosure is neat, but does something like ZFS work over USB? 🤔
From what I understand, the Sabrent presents Windows with drives as if they were connected directly to the motherboard. I have no idea how it performs this bit of sorcery from just a USB cable, but if that's the case, then anything that can be done in a standard box should be up for grabs.
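From what I've read, the usual sorcery is a USB-SATA bridge chip doing SAT (SCSI/ATA Translation). If you'd rather verify than take it on faith, smartmontools can tell you how transparent the bridge actually is from a Linux live environment. A sketch, where `/dev/sdX` is a placeholder for whatever node the enclosure's drive shows up as:

```shell
# Sketch for poking at a USB enclosure from a Linux live environment.
# /dev/sdX is a placeholder for whichever node the enclosure's drive gets.
lsblk -o NAME,SIZE,TRAN,MODEL   # TRAN column shows "usb" for drives behind a bridge
smartctl -d sat -i /dev/sdX     # drive identity via SAT (SCSI/ATA Translation) passthrough
smartctl -d sat -A /dev/sdX     # SMART attributes, if the bridge forwards them
```

If the real drive model and SMART data come through, the bridge is passing commands along fairly transparently; if not, anything that wants raw drive access (ZFS included) is going to have a harder time.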
 
It does already require longer-than-usual SATA cables and possibly unsafe levels of SATA power extension though. Some sort of expansion chassis might not be a bad idea.
I should've mentioned that I was also concerned about the power situation with that many drives. I know you can get cables from Amazon with all sorts of SATA power connectors hanging off the one cable, but I'm pretty leery of them. Since the Sabrent requires its own dedicated wall power and does its own internal multiplexing, that particular issue is solved.
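For anyone else doing this math, the scary part is simultaneous spin-up rather than steady state. A back-of-envelope sketch, where the per-drive amp figures are typical datasheet numbers for large 3.5" drives rather than measurements of any specific model:

```shell
# Back-of-envelope 12 V budget for a stack of 3.5" drives. The ~2 A
# spin-up and ~0.6 A idle figures are typical datasheet numbers for
# large 3.5" drives, not measurements of a specific model.
drives=10
spinup_ma=2000      # per-drive 12 V draw at spin-up, in mA
idle_ma=600         # per-drive 12 V draw once spinning, in mA

spinup_w=$(( drives * spinup_ma * 12 / 1000 ))
idle_w=$(( drives * idle_ma * 12 / 1000 ))
echo "worst-case simultaneous spin-up: ${spinup_w} W on the 12 V rail"
echo "steady state once spun up:       ${idle_w} W on the 12 V rail"
```

That ~3x gap between spin-up and steady state is exactly why enclosures with their own supply (and staggered spin-up) sidestep the daisy-chained-splitter problem.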
 
I should've mentioned that I was also concerned about the power situation with that many drives. I know you can get cables from Amazon with all sorts of SATA power connectors hanging off the one cable, but I'm pretty leery of them. Since the Sabrent requires its own dedicated wall power and does its own internal multiplexing, that particular issue is solved.
I ran into this problem too. Those SATA splitters are often pretty sketchy. Luckily I have two Seasonic power supplies, both of which came with modular 4xSATA power cables. So I was able to take the cable I hadn't used from my desktop machine and hook it up in my NAS box.

Are you going to be running Windows on this storage device, rather than something like TrueNAS or Unraid?
 
Yes, nothing but Windows and Drivepool. I know less than zero about NAS software and RAID gives me the heebie-jeebies, as I've managed to pull data off of dying drives in other systems, whereas I couldn't do that if the files were RAIDed in some arcane striped fashion. Drivepool is great in that it's just pooling software with some redundancy and evacuation options, so all the files are simply "there" waiting to be accessed by any OS, including live Linux distros.
 
Hmmm. I don't think the Silverstone is gonna work. I watched some YouTube reviews of the case and it looks like a cabling nightmare. Plus, there's no cutout underneath the CPU, so you'd have to take the whole damn thing apart should the cooler go on the fritz.

Ugh. Too many options. I just need a relatively simple case that's easy to work in and has at least one 5.25" front drive bay, as I'll be installing an optical drive to rip 4K discs. Maybe getting a giant case with room for two PSUs is the way to go after all....
 
Go all out on the autism and get a NetApp DS4246
They're dumb, reliable, and not that loud once the boot sequence is done. They're getting pretty cheap since corporations are pulling these (relatively old) NetApp filers apart all over the place. You can use SAS or SATA, they support large drives, and since they have a built-in SAS expander, you don't have to worry about getting an overly fancy HBA. You just need this relatively weird cable and something like an LSI 9200-8e.

If you still want to go down the path of using a desktop case, just do what I suggested earlier and get a cheaper HBA and SAS expander. No reason to get this pimped out 9400 when you aren't using flash storage.
 
The problem with those rackmount units is that they still require some serious cooling since drives are packed-in like sardines and can't get much airflow without resorting to Delta or Delta-type fans. That's just too loud for my use case, even if they only run intermittently. Don't get me wrong, they're beautiful units, but they're simply not practical for my situation.

This is why I wanted to separate the drives completely from the main system, as they could then be cooled in isolation requiring far less aggressive methods.

Also keep in mind that those rackmount units typically don't have bottom CPU cutouts, so if anything happens to either the CPU or the cooler, you've got to take the whole motherboard out to effect a repair.
 
Yes, nothing but Windows and Drivepool. I know less than zero about NAS software and RAID gives me the heebie-jeebies, as I've managed to pull data off of dying drives in other systems, whereas I couldn't do that if the files were RAIDed in some arcane striped fashion. Drivepool is great in that it's just pooling software with some redundancy and evacuation options, so all the files are simply "there" waiting to be accessed by any OS, including live Linux distros.
RAID and ZFS aren't really the same thing. I would still recommend checking out Unraid or TrueNAS Scale, especially if you are going to the trouble of hooking everything up to an HBA.
 