My old NAS in a PowerMac G5 case and my NUC home server are growing old, and I also want to upgrade beyond 1Gbit networking. So, as I usually do, I'm starting from the case I want for this project. It needs to be well below 20L in volume, it needs serious performance, and I want to reuse as many leftover parts as I can.
Case: Silverstone CS01-HS (link)
This case looks amazing, is small and has 6 easy-to-remove 2,5" bays that support disks up to 15mm thick. The outer shell is a single piece of 5mm anodized aluminium with two easily removed side panels. Its design reminds me of the PowerMac G5 it is going to replace, and at 14 liters it'll easily fit somewhere. But as usual with Silverstone's storage-oriented cases, it's not easy to install powerful components without caveats: it only supports one low-profile PCIe card, and CPU cooler height is limited to 68mm.
In the famous words of Barney Stinson: Challenge accepted!
CPU: AMD Ryzen 7 1800X (link)
I didn't have this one available, but I had wanted to upgrade to a newer Ryzen last year anyway. With the release of the Ryzen 3000 series I found a 2700X for cheap, upgraded my own PC, and had the 1800X left over for this project. Eight cores at a 3.6 GHz base clock is still reasonably nice for home server usage.
CPU Cooler: Cooler Master MasterLiquid 120 (link)
Maybe the biggest issue with this case is the CPU cooler. Ideally I wanted a Silverstone TD03-LITE, but those are hard to come by, at least in my region. The next best solution with an integrated pump is the MasterLiquid 120. Why? Because a 25-27mm maximum radiator thickness matters here: there is not much more than 40mm of space between the only fan mount and the motherboard. It also ticks the most important boxes, namely long lifetime and a lack of RGB.
Motherboard: ASRock AB350-GAMING ITX/ac (link)
This is another component I had "lying" around. I prefer ASRock boards for my kind of stupid projects because they support PCIe bifurcation out of the box. When conceptualizing this build, I wasn't sure whether I would go with bifurcation or use the M.2 to PCIe adapter. In the end, the bifurcation route would have cost me precious space I didn't have, but it would have been ideal PCIe-lane-wise.
RAM: 2x 16GB Samsung "B-die" DDR4-2400 ECC memory (link)
AMD's Ryzen CPUs support ECC memory, and Samsung B-die was (at the time) the best fit for a first-generation Ryzen. I found these sticks for a reasonable price just as B-die was on its way out.
Storage Controller: LSI SAS9207-8i based HBA card (link)
Like the popular IBM M1015, this HBA drives up to 8 SAS/SATA disks through a PCIe 3.0 x8 link. Many OEMs ship cards based on this controller and they are cheap to find. Since I'm only going to be hanging HDDs off it, being limited to 4 PCIe lanes shouldn't be a problem. It requires two "Mini-SAS (SFF-8087) to 4x SATA" breakout cables.
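To sanity-check that claim, here's a rough back-of-the-envelope sketch comparing a PCIe 3.0 x4 link against eight spinning disks. The ~130 MB/s sequential figure per drive is my assumption for 2,5" 5400rpm disks, not a measured number from this build:

```python
# Rough bandwidth check: 8 HDDs behind a PCIe 3.0 x4 link.
# The ~130 MB/s per-drive figure is an assumed sequential rate for
# 2.5" 5400rpm disks, not a measurement from this build.

PCIE3_GTS_PER_LANE = 8.0           # PCIe 3.0 raw rate per lane (GT/s)
ENCODING_EFFICIENCY = 128 / 130    # 128b/130b line encoding overhead

lanes = 4
link_gbytes_per_s = lanes * PCIE3_GTS_PER_LANE * ENCODING_EFFICIENCY / 8

drives = 8
per_drive_mbytes_per_s = 130       # assumed sequential throughput per HDD
drives_gbytes_per_s = drives * per_drive_mbytes_per_s / 1000

print(f"PCIe 3.0 x{lanes} usable: ~{link_gbytes_per_s:.2f} GB/s")
print(f"{drives} HDDs combined:   ~{drives_gbytes_per_s:.2f} GB/s")
# -> roughly 3.9 GB/s of link vs roughly 1 GB/s of disks: plenty of headroom.
```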
Storage Tier 1: 2x Samsung 2,5" SSD 830 SATA (link)
I had one of these and was able to buy a second one used, for not much more. This allows cheap but fast SATA SSD storage. It'll do for now.
Storage Tier 2: 6x Toshiba 2,5" 3TB 5400rpm SATA (link)
This was the toughest part. It has recently become known that a lot of HDDs are SMR-based, which you don't want for storage that involves random writes. I'm going to be using ZFS, and SMR is discouraged for that use. The largest 2,5" HDDs I could readily find are 2TB drives, which later turned out to be SMR-based as well. The only drives above 1TB that aren't SMR and also aren't 10,000rpm enterprise drives (needing cooling, making lots of noise) are the Toshiba MQ03ABB300, which are hard to come by. But they were available in my region, and still are, for about 100€ a piece. Much cheaper than SSDs and not much more expensive than SMR drives.
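I haven't spelled out the pool layout here, but to get a feel for what six 3TB drives yield, here's a back-of-the-envelope capacity sketch assuming a single raidz2 vdev (two drives' worth of parity); mirrors or raidz1 would obviously change the numbers:

```python
# Back-of-the-envelope usable capacity for 6x 3TB drives in ZFS.
# A single raidz2 vdev (two parity drives) is an assumed layout.

TB = 1e12                      # drive vendors use decimal terabytes

drives = 6
drive_size_tb = 3
parity_drives = 2              # raidz2 assumption

raw_tb = drives * drive_size_tb
usable_tb = (drives - parity_drives) * drive_size_tb

# Convert to the binary TiB that tools like `zfs list` report,
# ignoring metadata/slop overhead for simplicity.
usable_tib = usable_tb * TB / 2**40

print(f"raw: {raw_tb} TB, usable (raidz2): {usable_tb} TB ~= {usable_tib:.1f} TiB")
```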
Network Controller: Mellanox ConnectX-3 MCX354A-FCBT dual 40GbE card (link)
These are interesting beasts. They are cheap and offer two 40GbE QSFP+ ports. Direct-attach cables are easy to find, so for roughly 100-120€ (two cards and a DAC) you get a ~4GB/s link between two devices. Switches are more difficult, though. These cards are also interesting for 10GbE SFP+, with QSFP+ to SFP+ adapters readily available. Oh, and did I mention each card has two of those ports?
Even though 25/50/100GbE is becoming the norm, these cards are dirt-cheap because a lot of companies are migrating to that newer, better upgrade path.
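For context on that ~4GB/s figure: 40 Gbit/s works out to 5 GB/s of data rate, and standard Ethernet plus TCP/IP framing trims that to roughly 4.75 GB/s on paper, so ~4GB/s is a realistic real-world expectation. A quick sketch of the arithmetic, assuming a standard 1500-byte MTU:

```python
# Why ~4 GB/s over a 40GbE link is a reasonable real-world figure.
# Frame-overhead maths for a standard 1500-byte MTU; actual throughput
# also depends on CPU, NIC offloads and the PCIe slot the card sits in.

link_gbit = 40
link_gbytes = link_gbit / 8                    # 5.0 GB/s of data rate

mtu = 1500                                     # IP packet size
tcp_ip_headers = 40                            # TCP + IPv4 headers
eth_overhead = 14 + 4 + 8 + 12                 # header, FCS, preamble, IFG

payload = mtu - tcp_ip_headers
wire_bytes = mtu + eth_overhead
efficiency = payload / wire_bytes              # ~0.95

print(f"line rate:       {link_gbytes:.2f} GB/s")
print(f"TCP payload max: {link_gbytes * efficiency:.2f} GB/s")
# -> ~4.75 GB/s in theory; ~4 GB/s is what you can realistically expect.
```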
M.2 Adapter: ADT Link R43MR M.2 M-key to PCIe 3.0 x4 adapter (link)
I bought these before the build was underway, and luckily I did: bifurcating the PCIe x16 slot is not easy with the common x8/x8 adapters in such a tight space. These components also heat up considerably without active cooling. In the end it's the better solution for my build, but with it I am limited to PCIe 3.0 x4.
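To verify that whatever sits behind the adapter actually trained at the expected link width, a minimal sketch (assuming a Linux host with sysfs) that reads the negotiated PCIe speed and width for every device:

```python
#!/usr/bin/env python3
# List the negotiated PCIe link speed/width for every device that exposes it.
# Handy for confirming a card behind the M.2 adapter really came up at x4.
# Assumes a Linux host with sysfs mounted.

from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    speed_file = dev / "current_link_speed"
    width_file = dev / "current_link_width"
    if not (speed_file.exists() and width_file.exists()):
        continue  # not a PCIe endpoint with link attributes
    speed = speed_file.read_text().strip()
    width = width_file.read_text().strip()
    print(f"{dev.name}: {speed}, x{width}")
```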