NYTN: Not Your Typical NAS, or 20TB of SSDs in an SFF case.

TLN

Caliper Novice
Original poster
Mar 9, 2020
32
13
Finally decided to rebuild my personal storage and configure it the proper way. I planned this project a long time ago and have acquired a few components over the past year or two. It might sound a bit ambitious, but I hope it works out the way I planned.

Plan: NAS, HTPC, application server. Reliable and future-proof. All in one: a bit faster and a bit more complicated than your typical NAS or typical HTPC, in a sleek case.

1. NAS: I've been backing up my data recently and calculated that I have 6-8TB of data. Half is important, the other half less so. It can easily grow to 10TB or more if I upgrade my camera to something better, but 10TB sounds like a good start.
2. Some NAS boxes can run Docker containers, but they fall short if you want to run anything but a Docker container: low-end CPUs, not enough memory, and limited OS support. I need to support multiple operating systems and be able to run them all at once.
3. HTPC: While you can do lots of stuff with a Chromecast or an Nvidia Shield, sometimes you wish you had a normal PC for certain things: casual games with Xbox controllers, specific applications, etc.
4. Future-proof: a 1Gbps network is so 2019. I need something faster than that in 2020.
5. Reliable: the box will be running 24/7, so enterprise equipment over consumer stuff. You'll see a few cool parts below.

To run multiple operating systems we'll need a hypervisor. I'm going with VMware ESXi: I know it a bit better, having used it for years. It's backed by a big company and well documented. It can run multiple OSes at the same time, and I'm looking at 5-10 virtual machines.
Network: I'm looking at a 10Gbps network. It's not very common yet, but it's the way to go. My home switch already supports 10Gbps.
Data: If we want to run the NAS at 10Gbps speeds we need a bunch of hard drives in a RAID array, or some SSDs in RAID. I decided to go crazy and ended up with a Seagate Nytro 3330 at 15.36TB. It's designed for data centers and rated for 1 DWPD (Drive Write Per Day). With a 5-year warranty that gives us 15.36TB x 365 days x 5 years ≈ 28PB of endurance. A consumer SSD (Samsung 950 PRO), for example, gets you 400TB. The drive is pretty fast too, rated at 2100/1200 MB/s sequential read/write: faster than a 10Gbps network. Two cons: unlike SATA or NVMe drives, SAS requires an additional controller. And the price.
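The endurance math above can be sketched in a couple of lines (a back-of-envelope estimate from the rated 1 DWPD over the 5-year warranty, not an official TBW spec):

```python
def endurance_tb(capacity_tb: float, dwpd: float, years: float) -> float:
    """Total terabytes written allowed: capacity x writes/day x days."""
    return capacity_tb * dwpd * 365 * years

# Nytro 3330, 15.36TB at 1 DWPD for 5 years:
nytro_tb = endurance_tb(15.36, 1, 5)
print(f"Nytro 3330: ~{nytro_tb:,.0f} TB written (~{nytro_tb / 1000:.0f} PB)")
```

By that yardstick, a 400TB consumer endurance rating is tiny in comparison.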
VM storage: To store the virtual machines I'm going with an enterprise NVMe drive: an Intel P3600 series or a Samsung PM1725 in the so-called AIC (Add-In Card) format. The motherboard I'm going with doesn't support M.2 NVMe drives. The slower Intel P3600 is rated at 2700/2100 MB/s, while the Samsung does around 6000/4000 MB/s. Normally these drives are used for high-performance databases. The Samsung is rated for 55PB written.
Motherboard: There are some motherboards with integrated 10Gbps networking, and also a few boards with integrated SAS controllers. My board comes with both: the ASRock Rack EPC612D4U-2T8R. It's based on the C612 chipset and designed for Intel Xeon E5-v3 and E5-v4 processors. There's no M.2 port on the board, but with three PCIe slots I can live without one. Another issue is the Narrow ILM cooler mount, which limits the available CPU coolers: Noctua makes some air coolers for it, and Asetek has a Narrow ILM mounting kit for its AIOs.
CPU and memory: That's the easy part: there are lots of Xeon processors available, from 4-core CPUs under $50 to powerful 18-core chips. Intel Xeon E5-2600 processors are designed for dual-CPU systems, while E5-4600 chips were designed for 4-CPU systems. Both work perfectly fine in single-CPU boards. The motherboard supports up to 128GB of DDR4 memory, which I might max out. I have a few 16GB sticks around.
CPU cooler: Currently I have an Asetek 550 AIO with the Narrow ILM ring. No issues, and that's what I'm going to start with. I'm not so sure about an AIO for a 24/7 system, though, and will likely end up with a Noctua cooler. A Narrow ILM mount from Noctua is on its way: the L12S for a low-power CPU and better compatibility, or the C14S for max performance and low noise. I'm looking at top-down CPU coolers so I can also cool the 10GbE chip that sits next to the CPU.
Videocard: Most likely a Radeon; no idea which one yet. Consumer Nvidia cards don't work with passthrough out of the box, and the existing workarounds are not that stable. I'm not chasing fps and 4K here. I'd like a 5700XT, but there are some issues, see below.
Case: Trying to fit all of that into a Cerberus case.
PSU: Corsair SF450/600/750. I tested the system with my existing HX850i PSU on the bench: with an E5-2683v3 (120W, 12 of 14 cores assigned) and the Intel P3600 I maxed it out at 205W. Not a very scientific test, I just wanted some initial numbers. I'm fairly sure I'll be okay with the 2618Lv3 and an SF450.
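To sanity-check the SF450 choice, here's a rough power budget for the small build. Every wattage below is my own TDP-class assumption, not a measurement, so treat it as a sketch:

```python
# Back-of-envelope power budget for the small build.
# All figures are assumed TDP-class numbers, not measured draw.
budget_w = {
    "Xeon E5-2618Lv3 (75W TDP)": 75,
    "Radeon 5500XT": 130,
    "Motherboard, 10GbE, SAS": 40,
    "4x DDR4 RDIMMs": 15,
    "Nytro 3330 SAS SSD": 11,
    "Intel P3600 NVMe": 25,
    "Fans and misc": 15,
}
total_w = sum(budget_w.values())
print(f"Estimated peak: {total_w} W, SF450 headroom: {450 - total_w} W")
```

Even with these pessimistic numbers the total stays comfortably under 450W, which is consistent with the ~205W bench reading above.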

Below are two projected builds: a reasonable one and a maxed-out version.

Components:

Small build:
- CPU: Intel Xeon E5-2618Lv3 (8C/16T, 3.0GHz)
- Cooler: Noctua L12S or Asetek 550LC (single 120)
- Memory: 4x16GB DDR4-2133 ECC Reg
- Storage: Nytro 3330 15.36TB SAS SSD, Intel P3605 1.6TB NVMe, Samsung 850 PRO SATA SSD
- Videocard: Radeon 5500XT
- PSU: Corsair SF450
- Case: Sliger Cerberus

Big build:
- CPU: Intel Xeon E5-2683v3 (14C/28T, 2.0GHz), E5-4650v3 (12C/24T, 2.1GHz), or E5-4669v3 (18C/36T, 2.1GHz)
- Cooler: Noctua C14S or Asetek 650LC (dual 120)
- Memory: 4x32GB DDR4-2400 ECC Reg
- Storage: Nytro 3330 15.36TB SAS SSD, Samsung PM1725 6.4TB NVMe SSD, Samsung 850 PRO SATA SSD
- Videocard: Radeon 5700XT
- PSU: Corsair SF600 or SF750
- Case: Sliger Cerberus
 

TLN

Mar 9, 2020
Problems:
1. Drive partitioning. While I can install the hypervisor on the NVMe drive, it's usually a good idea to keep the hypervisor and the VMs separate. Most likely I'll end up with an extra SATA SSD just for the ESXi installation. I might use that drive for Windows and Linux as well. ESXi can be installed on a USB stick, and that's how it's done in servers, but...
2. USB controller. The board works a bit differently compared to my existing workstation (Asus Z10PE-D16 WS). I had to disable the Intel USB 3.0 controller to pass both controllers to a virtual machine. It also seems that both types of ports (2.0 and 3.0) hang off the same controller. With my ASUS I could pass the USB 3.0 ports to the NAS VM and the USB 2.0 ports to a Windows VM. I can always get an additional USB card, but I'd prefer not to.
3. Videocard options. If a videocard is installed in the top slot, it blocks the 2nd PCIe slot. So far it looks like I can get away with that, but I'd prefer to have all three slots available. If a long videocard is installed in the bottom slot (the most logical option IMHO), it blocks the onboard SAS ports. I'd have to use short (ITX) cards or mod a PCIe extender. I haven't decided yet, but I will likely start with the HP OEM RX 580 4GB: it's the only ITX RX 580 I'm aware of. Other options include the RX 570 ITX, the Vega 56 Nano, and the PowerColor 5700 ITX that's only available in Japan.
4. Cooling. I'm not worried about CPU cooling: I have a few options there. Boards like this are usually designed for strong front-to-back airflow, which isn't a problem in datacenters. I've seen my LSI SAS controller at over 90C in my workstation. I'm going with a top-down CPU cooler to also cool the 10Gb NIC, and I'm considering copper heatsinks for the C612 and SAS chips. Not a top priority, but it's on my list.

Configuration:

Photos:

Benchmarks:
 

TLN

Mar 9, 2020


Xeon 8/16, 4x32GB DDR4 ECC, HP Samsung 512GB M.2, Intel 905 480GB M.2, Supermicro 128GB SSD. I bought but did not install the 1030 video card. Micro PSU 250W, power supply brick 150 watts.
That looks like a Supermicro board with a Xeon D-1521 or D-1541 processor.
In fact, I was thinking about an Intel NUC or something similar. There's no expansion, and it gets pretty expensive very fast.
I'm also aware there are Supermicro/ASRock boards with the Intel Xeon D-1541 (8 cores/16 threads) plus integrated 10G and SAS controllers, but those run for $400-500.
 

TLN

Mar 9, 2020
Currently the build looks as follows:


I ran two VMs (Desktop and Storage) and did a quick test.
1st drive: the system drive (a VMDK stored on the Intel P3600 NVMe)
2nd drive: the same drive over the network (another VMDK connected to the NAS VM)
3rd drive: the Nytro connected to the NAS VM, over the network of course.

Quite interesting results. I expected it to be capped at 10,000Mbps (10Gbps), but it did better than that. I intentionally used 16GB files in CrystalDiskMark.
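Results above the nominal 10Gbps line rate aren't too surprising: both VMs sit on the same host, so VM-to-VM traffic stays on the virtual switch and never touches the physical NIC. For reference, the theoretical ceiling of a real 10GbE hop (ignoring protocol overhead) works out to:

```python
def line_rate_mb_s(gbps: float) -> float:
    """Convert a nominal line rate in gigabits/s to megabytes/s (decimal units)."""
    return gbps * 1000 / 8

print(f"10GbE ceiling: {line_rate_mb_s(10):.0f} MB/s")  # 1250 MB/s
```

Anything above roughly 1250 MB/s in these tests is the vSwitch at work, not the wire.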

This is how it looked from VMware:

I got cables that should be dual-ported, but I can't confirm whether I'm actually using dual-port SAS or not.
 

TLN

Mar 9, 2020
Received the case from Sliger. It turns out to be pretty light and small.

Got some storage:

It fits pretty nicely, but I want to place the Nytro on small standoffs for better cooling:

With motherboard in:
 

TLN

Mar 9, 2020
Decided to go with the VisionTek 5500XT videocard over an old OEM RX 580. It's the only "short" Navi card besides that unicorn PowerColor 5700. I was able to get one for a reasonable price from Dell. Performance-wise it's about the same, but it's newer and comes with a warranty. It turned out to be a good call: the card just overlaps the PCH radiator, and any card longer than that would block the SAS ports.

Ordered a Noctua C14S cooler. Still waiting for the Noctua mounts.
 

TLN

Mar 9, 2020
Turns out there's such a thing as the "AMD Navi reset bug". It affects the 5700XT, the 5500XT, and everything in between: the host cannot reset the videocard when you reboot a virtual machine, so you have to reboot the host itself. I'm not going to reboot all my storage every now and then, right?
Decided to give Nvidia a try. Nvidia cards require some workarounds, but I knew it could be done. After a few attempts it's finally working.


It's the shortest RTX 2070 besides the Gigabyte Mini ITX 2070 and the MSI Aero ITX 2070, both of which are $520+. The Asus is a bit longer than the VisionTek 5500XT or either 2070 ITX version, at 197mm. The card covers one SAS port, but I can live with that for now and fix it later if needed.

I was running Assassin's Creed Odyssey with the 5500XT and the CPU sat at 30% usage (2683v3, gaming in a VM). With the RTX 2070 I'm running ultra-high graphics and the CPU is utilized far more, at 50-60%.

5500XT:

2070RTX:

I'm using ESXi monitoring, which doesn't show GPU temps, but after 3 hours of gaming the temps are below 55C. The hottest components are the memory sticks at 53-54C and the PCH at 50C.
 

TLN

Mar 9, 2020
Everything together:

Sliger Cerberus
ASRock Rack EPC612D4U-2T8R
Xeon E5-2683v3 (14 cores)
128GB DDR4
3.2TB NVMe + 15.36TB SAS + 128GB SATA storage
ASUS RTX 2070


 

smitty2k1

King of Cable Management
Dec 3, 2016
That's crazy, love it! Need more SFF NAS around here.
That SAS drive sure is something else!
 

Sligerjack

Caliper Novice
Jul 29, 2019
www.sliger.com
I have no clue how I didn't see this thread earlier, but good god man, this thing is beautiful!

My personal Cerberus serves as my main system and my NAS, but this thing just put mine to shame! All of the kudos!