
Log Workstation-Server in a Briefcase

Brian McGuigan

Chassis Packer
Original poster
Sep 17, 2022

Background​

To understand this PC Build, you probably have to know a bit more about me, to figure out how I arrived at my requirements.

Firstly, I am 80. I started my career in computing in 1966, having left university with a degree in Mechanical Engineering. Back then, Computer Science was not a discipline - nor was Business Studies. If you wanted to do anything like that, you had to do a ‘Technology Degree’ – like Mechanical Engineering.

When I started, I worked on punched card tabulators. From there I progressed to batch-processing mainframe computers, like the IBM 360 – punched cards in, line printer out. In the mid-seventies, I got involved with my first interactive Mini-Computer – a PDP 11. Boy was that an eye-opener. You could actually see what was going on inside the thing! It even had a ‘Visual Display Unit’ - we now call them Monitors.

I wondered what I was going to use it for, to start with. Interactive debugging of programs, whilst they were still running! That was real progress, compared to waiting for overnight test runs, filled with debug statements, and spending the next day, figuring out what went wrong, and trying again the next night. (It was a wonder we got anything done!)

I have been using a laptop PC since the first 386 came out in 1985. I have never had a desktop PC, or therefore a GPU. I've never had a screen with better than HD 1080p resolution, either. Since I'm 80, I think a high-res monitor would be wasted on my poor old eyes. I have never used a PC for gaming, either – Solitaire doesn't count. I lost interest when Doom first came out in the early nineties. (I used to use the PDP11 at work, from home, before modems came out, over an acoustic coupler, to play text games like Moon Lander and Adventure. But I guess text games don't count, either.)

Unfortunately, early Nickel-Cadmium batteries were not good at holding their charge. So, I got into the habit, when I was ‘out and about’, of never using my laptop unless it was plugged in. I never wanted to use one in the airport, or on trains, planes or buses etc – they were simply too big. (I now use my Samsung Tablet or Nokia mobile phone for that, all the time.)

In 1995 I started my own company and got involved in computer support, training, as well as software development. If you want to see the details, they’re on www.linkedin.com/in/brianmcguigan

I bought my last laptop PC in 2007. It had an AMD Turion CPU with two cores running at 2 GHz each. It died after 14 years’ faithful service and multiple hard drive and O/S upgrades. I have been using a VM inside our old Server ever since. Since, at home, I always used my laptop with an attached keyboard and monitor, I hardly noticed the difference, as the VM was every bit as responsive as the laptop had been.

The only thing I’d really given up was portability, as the Server, which I built myself, is in a full-tower case. It is so heavy and hard to move that I fitted it with rollers. Portable, it is not!

Then a couple of years ago, the system drive on the Server also died. Having replaced the drive, I did something that in over 50 years I had never done. I dropped a screw on a live motherboard! The magnetic tip on my screwdriver picked up a screw that I didn’t know was there, and dropped it in just the wrong spot – end of motherboard! So, there I was - no laptop, and no Server either!

Now, I had been lusting after replacing my heavy, clunky Server with something more svelte and portable, ever since the Ryzen 2 came out in 2018. I had decided I didn’t really need a laptop. For when I’m out-n’-about, my Tablet and Phone are quite adequate for YouTube, surfing the Net, and so on. Besides which, laptops are not really upgradable – apart from their hard drives. So, when you’re buying a new one, you’re paying for a new keyboard and screen – when your old ones are perfectly OK. You can also never get a laptop that’s as powerful as a desktop, because of heating problems.

But, at the time I destroyed the Server, I was not yet ready to replace my laptop. Frankly, I was waiting for Zen 5 to be released, as the rumour mill had it that it would be a significant upgrade over Zen 4. But because the Server’s motherboard was by then 15 years old, there was no way I could replace it with anything that would run its existing CPU and Memory, so I had to replace the Motherboard, CPU, and Memory together.

Specs of Current Server​

I opted for the cheapest options I could get away with:
  • Motherboard - Gigabyte mATX Aorus B450 Elite 4x DDR4, 2x PCIE3.0 x16, 6x SATA3, 2x M.2,
  • CPU - AMD Ryzen 5 5600G 3.90GHz, Boost: 4.4GHz, 19MB Cache, AM4, 6 Core, 12 Threads,
  • Memory - 16GB Corsair Vengeance (3200MHz) DDR4.

I also decided to upgrade the System Drive to a:
  • SSD - Crucial 500GB, M.2 NVMe PCIe Gen3.

Besides these our current FILE-SERVER, to give it its correct name, ‘boasts’:
  • 2 x 1TB 2.5” SATA SSDs in an internal RAID 1 Array, which acts as our Data Drive,
  • 2.5” Hot Swappable Disk – currently used for backup.

FILE-SERVER also has two network adapters, and a Wireless/Bluetooth PCIe card. FILE-SERVER runs Windows Server 2022 Datacentre, simply so that I can do incremental backups, yet recover any file from however long ago I like. I use a Hyper-V Virtual Machine, inside FILE-SERVER, running Windows 11, 24H2. Another VM runs a pfSense Firewall that protects our internal network from the outside world.

Frankly, I think I made a mistake in buying a Micro ATX Motherboard, as I have now decided to go ahead and build my own ‘Tiny Workstation-Server’ case. This is because I need something that is portable by the end of January, as I’m going to be teaching again. For, even though I’m not yet ready to make my mind up on which Processor/Motherboard combo I’ll end up with, I KNOW it will be an AM5 Mini-ITX Motherboard.

Choice of Processor/Motherboard​

The reason for my indecision, at the time of writing, is that the Ryzen 9950X and 9900X have been released, but have had very lacklustre reviews. The Ryzen 9800X3D has also been released - this time to rave reviews.

Intel’s latest, the Core Ultra, has also just been released – and universally panned. (I would NEVER buy Intel, anyway. I believe they milked the computer market for ten years, insisting that 4 Cores was enough for anyone. So, I’m VERY glad they are finally getting their comeuppance.)

There are also rumours that a 9950X3D and a 9900X3D will be announced at CES, in early January next year. The AMD Strix Halo, now called the Ryzen AI Max 300 series, is expected to be released in early 2025. These will be APUs, so will have integrated graphics. They are expected to have up to 16 Zen 5 Cores, with 32 threads, and mid-market GPU performance. They will probably render mid to low end GPUs obsolete.

I’m anxiously awaiting reviews of the 9950x3D and the AMD Strix Halo. I could end up with either. Provided the performances are comparable, I’ll probably end up with the Strix Halo. Whichever one I get, I’ll be pairing it with a Gigabyte B650i Aorus Ultra ITX Motherboard, in a custom-built 5 Litre, DIY Case, with up to four hot-swappable 2.5” drive bays, and three M.2 slots, one of which is Gen 5.0 capable.

So, how did these design decisions, come about?

Case Reviews​

Initially I looked at available mATX or ITX cases for inspiration. I quite liked the Thermaltake Core P1. This is a Mini ITX case of 16.6 litres. The thing I really liked about this case was the simplicity of construction, and the fact that you could see everything inside the case. This matters because I still teach people about computers. So, it would be nice if I could show people what goes on inside. But it had no handle, and at 16.6 litres it could only be considered ‘luggable’.

I also liked the Corsair Graphite 380T. This is a Mini-ITX case of 16.6 litres. This one has a handle at the top. So, at least it can be carried. But the small window at the top gives a very limited view of the interior. Again, at 16.6 litres it’s still pretty big to carry. I really wanted something laptop sized.

The Corsair Carbide Air 240 is a Micro-ATX case of 24.4 litres. It has a very clear window in the side, and I just liked the look of it. I thought long and hard about adding a handle to the top.

Micro-ATX cases have the advantage of being able to take a Micro-ATX Motherboard. These have four RAM slots, compared to the two slots on a Mini-ITX Motherboard. So, they offer the tantalising prospect of being able to double the amount of RAM in a few years’ time, without having to throw away, or sell, what you already have.

However, you have to be VERY careful about what RAM you buy. Otherwise, it can lead to incompatibility issues. (I just decided, it would be easier to double the RAM to start with. So, no real advantage, there!)

Micro-ATX motherboards also have more PCIe Slots than Mini-ITX motherboards – which all have only one. However, since I have never had a GPU, I was beginning to doubt that I needed even one PCIe Slot! So, why would I need a board with 2 to 4 PCIe Slots?

So, despite these advantages, at 24.4 litres, I decided that, if 16.6 litres was too big - how the hell could I justify 24.4 litres, just because I liked the look of it!

I also had a dalliance with the NCASE M1, which is a highly popular Mini ITX case of only 12.6 litres. Obviously, things were getting a little smaller now. It is even available with a tempered glass side panel, as shown. But it was still not really as portable as I’d like.

So, I specifically researched ITX cases with handles, and discovered the Silverstone Milo ML08B-H, which is still 12 litres. It obviously has a handle, but no glass side panel.

I also came across the Lian Li Aluminium TU150-X. This has a retractable handle, and a glass side panel. Nice case, but at 23.6 litres, it was not a step in the right direction.

So, I researched for the ‘Smallest Mini-ITX case’, and came across the Dan Case A4 SFX. At only 7.2 litres, we were starting to get somewhere.

But no handle, though no doubt one could be added. There would be little point in adding a glass side panel, though, as everything is so tightly packed, you couldn’t see anything, anyway.

Summary of Case Reviews​

Having looked at multiple case reviews and builds on YouTube, I came to the conclusion, that most of the complications were caused by the size and amount of heat given off by modern GPUs.

But here I was, hoping that I might not need one anyway. This was for two reasons:
  • What you’ve never had, you never miss.
  • Both Nvidia and AMD seem to be abandoning the low to medium range of GPUs. I believe this is because they have realised that, with the advances being made in integrated graphics, separate GPUs will no longer be required for gaming - certainly at 1080p. (Integrated Graphics are commonly called iGPUs. AMD call them APUs.)

I really wanted a case that would accommodate hot-swappable drives for backup: if I was going to use the new computer as a Server, then it had to take at least one hot-swappable drive. None of the cases I’d seen came anywhere close to providing this capability.

Could I design one, with hot-swappable drive bays, that was even smaller and even more portable, without a GPU? I’d have to make a GPU optional, so that I could still add one later, should I ever feel the need, or should my expectations of integrated graphics remain unfulfilled.

Designing My Own Case​

Size of Case​

I first of all had to decide whether I wanted to design a case, with a handle, or design something that would fit inside my briefcase, or perhaps my old leather DEC Laptop Bag.

If I designed a case with a handle, I would need to find some way of protecting the case, itself. So, I measured my briefcase and laptop bag:

Briefcase Measures 430 x 325 x 98 mm

Laptop Bag will take anything up to 300 x 390 x 90 mm

Since I usually preferred taking my briefcase, rather than the laptop bag, I decided to base my dimensions on my briefcase. Though, it would be useful, if I could use either. The laptop bag is a bit more restrictive of height. But it’s flexible, so there might be enough ‘give’ in the laptop bag for it to be OK.

If I can’t make it fit my briefcase, I’ll need to add a handle. 😊
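As a sanity check, the bag constraint can be expressed as a tiny script. The bag measurements are the ones above; the candidate case footprint used here (340 x 150 x 98 mm, roughly five litres) is purely hypothetical, an illustrative stand-in for the final dimensions.

```python
from itertools import permutations

# Bag envelopes measured above, in mm (w, d, h).
BAGS = {
    "briefcase": (430, 325, 98),
    "laptop bag": (300, 390, 90),  # flexible, so the 90 mm is a 'soft' limit
}

def fits(case_mm, bag_mm):
    """True if some rotation of the case fits within the bag's envelope."""
    return any(all(c <= b for c, b in zip(p, bag_mm))
               for p in permutations(case_mm))

# Hypothetical 5-litre footprint: 340 x 150 x 98 mm (= 4.998 litres).
case = (340, 150, 98)
print({name: fits(case, bag) for name, bag in BAGS.items()})
# The briefcase fits exactly; the laptop bag is 8 mm short on height,
# so it would rely on the bag's 'give'.
```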

Fans​

Almost the first decision was the size of the case fans. These come in three main sizes: 60mm, 80mm and 120mm. 60mm fans tend to be noisy, and would restrict the height of components too much. 120mm was obviously too large for the briefcase height. So 80mm it would have to be. This would leave 18mm or so for material thicknesses, and so on.

I decided to go with a Noctua 80mm NF-R8 Redux Edition 1800RPM PWM Fan – based on Noctua’s reputation, and the fact that I don’t like Noctua’s usual brown colouring.

Case Structure​

I liked the simplicity of the Thermaltake Core P1’s – just a base, four standoffs, and a glass top. So, I started out with the concept of having a 3mm aluminium base, and a 3mm clear acrylic top, held up by 80mm advertising standoffs. I could then, if necessary, stretch aluminium or stainless mesh around the pillars.

Initially, I liked the look of Modders Mesh, which is a hexagonal steel mesh. Unfortunately, it comes from the States. So, the shipping doubles the cost. It is also plain steel, as opposed to stainless or aluminium. It therefore requires painting etc. I tried very hard to find an alternative supplier – but to no avail. The alternative meshes I found; I didn’t like.

I also began to get concerned about how I would fasten it at the ends and keep it under tension. I was concerned whether the mesh would bend smoothly around the standoffs, and whether I would be able to finish it neatly around the drive bays and any other component that needed access to the outside. I wasn’t sure.

At around the same time, I began to realise that the standoffs were also occupying a considerable area of the case, because you could not put anything in the corners of the case – because a standoff was in the way!

If you tried, it always wasted the width of the standoff along the other side. Since all PC components have square corners - I needed a case with square corners!

Not having the necessary equipment to bend aluminium or acrylic, or the inclination to buy or build it, I’d have to stick to what I was used to – wood. Making the sides of the case with wood, also solved my mesh problem, as I can simply cut slots in the wooden sides to support aluminium exhaust slats. I still decided to use mesh at the back of the case, where it’s not visible, to increase the air exhaust area.

PSU​

I calculated the TDP, or Thermal Design Power, of my likely components – excluding any GPU. It came to 154W. Of which, 120W were accounted for by the CPU. So, I was not going to need a powerful Power Supply Unit – unless I added a GPU.
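That TDP estimate can be sketched as a simple sum. Only the 120W CPU figure and the 154W total come from my notes above; the split of the remaining 34W across the other components is illustrative guesswork, not measured figures.

```python
# Rough power budget for the build (no GPU). The CPU figure is from the
# text; the remaining per-component estimates are illustrative only.
COMPONENT_WATTS = {
    "CPU (120 W TDP)": 120,
    "Motherboard": 15,
    "DDR5 DIMMs": 6,
    "M.2 NVMe SSD": 7,
    '2x 2.5" SATA SSDs': 4,
    "Fans": 2,
}

def total_draw(components):
    """Sum worst-case component draws in watts."""
    return sum(components.values())

def headroom(psu_watts, components):
    """PSU capacity as a multiple of the worst-case draw."""
    return psu_watts / total_draw(components)

print(total_draw(COMPONENT_WATTS))               # 154 W, matching the estimate
print(round(headroom(250, COMPONENT_WATTS), 2))  # a 250 W PSU gives ~1.62x headroom
```

On these numbers, a 250W supply leaves over 60% headroom without a GPU.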

ATX PSUs were obviously out, because they are 86mm high. SFX PSUs, at only 63mm high, were feasible. But at 100 x 125 mm, they still occupied a chunk of space, and quite frankly offered far more power than needed. I even investigated server type PSUs. But their tiny fans make them far too noisy for a living room.

In researching small PSUs, I came across the HDPLEX 250W GaN ATX Power Supply. This is tiny at only 170 x 55 x 25 mm. It is exactly the same length as an ITX Motherboard, so fits in nicely with an ITX build. The 25mm height still leaves plenty of room for airflow over it. It is even fanless, so makes no noise whatsoever, and gives off very little heat. Need more power than 250W? Use two of them in tandem!

HDPLEX also do a tiny 200W DC Converter that plugs directly into the Motherboard, so takes no space at all. BUT you have to use an external power supply ‘brick’, or find space for one internally. I don’t like using anything that hangs off the computer, if I can avoid it. So, I was unwilling to go down that road.

I do have an external DVD/RW Optical Drive, simply because I could not get an internal drive.

Cooling​

Initially I decided to use water cooling. Not because I was keen on water cooling, but simply because water blocks are smaller than air coolers, so you can still see where the CPU is. Some air coolers are so large that they cover the entire motherboard.

Initially, I looked at using an AIO, or an ‘All-In-One’ Water Cooler, which come prefilled with water and fixed length pipes. The pipes were just too long to fit in my tiny case. So, I’d have to use a ‘Custom Loop’.

I designed a ‘Custom Loop’, complete with extra thick 120mm radiator and fan. Because of the restricted height of the case, the radiator had to lie on its back. This made the case considerably larger in area, but it still fitted in the briefcase, so there was no problem there. It would have been pretty efficient too, for it sucked cool air in from the bottom of the case, and blew it directly out the top, without it going anywhere near the inside of the case – COOL, literally. 😊 THEN, I costed out all the components needed. The cost was quite a shock. So, I decided to research low-profile air coolers instead!

In doing that research, I came across the Thermalright AXP-100 Full Copper Low-Profile Air Cooler, which is capable of handling processors of up to 155W TDP. So, that should be more than adequate, as I don’t intend to do any overclocking. It is also quite attractive. With a profile height of only 38mm, there’d be plenty of clearance between it and the top of the case too. So, I shouldn’t need to cut a hole in the top of the case to feed the cooler - or so I thought.

It wasn’t until I downloaded an image of the cooler, and scaled it to fit the cross-sectional drawing below, that I realised that the 38mm profile height, didn’t include the 15mm fan on top! ☹ That’s why there is now a hole in the top, covered by a mesh filter. (It’s a magnetic filter, so that I can remove it easily. It will be held down to the acrylic top by old, cut up, fridge magnets glued to the top.)

My water-cooled case layouts, had enough space down one side for three inlet fans. I finally checked the specs of the fans, and looked up their airflow. This made me realise that, because the case was so small, three 80mm fans at full bore would change the air in the case nine times a second! Obviously three fans would not be required. One would be adequate. Opening the roof of the case for the low-profile air cooler, has obviously added more airflow. So, at full bore the air in the case will now be purged five times a second. 😉
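For anyone who wants to check the arithmetic: the air-change figure is just total fan airflow divided by case volume. I’ve assumed the NF-R8 redux-1800’s rated free-air flow of about 53.3 m³/h; the real figure, pushing against filters and case resistance, will be lower.

```python
# Air changes per second for a small case: total fan airflow / case volume.
# The 53.3 m3/h figure is the fan's assumed free-air rating, so these are
# best-case numbers.
def air_changes_per_second(case_litres, fan_m3h, n_fans=1):
    case_m3 = case_litres / 1000.0           # litres -> cubic metres
    flow_m3s = n_fans * fan_m3h / 3600.0     # m3/h -> m3/s
    return flow_m3s / case_m3

print(round(air_changes_per_second(5, 53.3, n_fans=3), 1))  # ~8.9: "nine times a second"
print(round(air_changes_per_second(5, 53.3, n_fans=1), 1))  # ~3.0 for a single intake fan
```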

Since both fans draw air into the case through filters, they will maintain a positive pressure, minimising dust build-up in the case.

Front Display​

Because this is the first case I have designed, I am understandably anxious that the above cooling will, in fact, be adequate.

I researched small display monitors, on AliExpress, that can display the computer’s vital signs, using something like MoBro, or NZXT CAM. That way I can see if the CPU is getting too hot, and do something about it. (I might even, not cut a hole in the acrylic top, until I’ve checked whether it’s necessary, or not.)

There are, in fact, two types of mini-display panel available.

Some use proprietary software, and use an internal USB Connection. However, they will only run their own software – nothing else.

Others are full Windows monitors, and will be recognised by Windows as such. So, you can display anything on them that you can display on an ordinary monitor. So, they plug into an ordinary HDMI Port. Since there are no internal HDMI Connections on any motherboard, this means that the display has to be connected to a port on the back panel – like any other monitor.

I opted for a full Windows monitor, because of the flexibility. I can, for example, use it during PowerPoint presentations to display Presenter’s View, which tells me what’s coming next. (I don’t need to read it – just recognise which slide is next.)

After a search I found the LESOWN Black HDMI IPS Monitor 800x480 for Windows on AliExpress. It is exactly the right size - 123 x 79 x 12.1mm.

It has its connections down the right-hand side - when viewed from the front. Which, as you can see from the ‘Internal Layout’ below, is exactly the wrong side for us. Fortunately, since it’s a Windows Monitor, this is no problem. All we need to do is change its ‘Display Orientation’ to ‘Landscape Flipped’. ‘U Type’ connectors will be used to enable connections to be made behind the Display Panel.

Because the chosen Display Panel, is simply a 5in Windows Monitor, and there are no HDMI Headers on the Motherboard, it will have to be connected to the back panel through a slit in the back.

The Angle Bracket, fastened to the side of the Drive Bays, is big enough to support the Display Panel. This enables the Display Panel to remain connected to the PC, even when the case is removed. (Since the bracket is stainless steel, and side panel is aluminium, a thin paper gasket between the two, will be needed to avoid galvanic corrosion.)

Monitor​

When it’s at home, I can use my current Monitor and Keyboard. But they are too big to cart around with me. Obviously the 5” Display, is not much use for anything, other than displaying the computer’s vitals.

My intention is to set up a Wi-Fi Hotspot on the computer, and use Remote Desktop Connect from my Samsung Tablet, so that I won’t need an Internet connection.

Noise​

The Noctua inlet fan makes 17.1 dBA at full bore. The fan on the air cooler makes 24.0 dBA. I was not sure that their combined noise would be the sum of the two. So, I consulted CoPilot, which produced the appropriate equation. Apparently, the two in combination should produce only 24.8 dBA – at full bore.

A sound level of 24.8 dBA is very quiet. For context, it's comparable to a whisper, the rustling of leaves, or a quiet library. It's the kind of noise level you might find in a peaceful, quiet room.
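For the curious, the equation in question is the standard power-sum for uncorrelated noise sources: sound levels add on a power basis, not linearly. It is easy to verify:

```python
import math

# Uncorrelated sources combine as:
#   L_total = 10 * log10( sum of 10^(L_i / 10) )
def combine_dba(*levels):
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels))

print(round(combine_dba(17.1, 24.0), 1))  # 24.8 dBA: the quieter fan adds under 1 dB
```

Note the general rule this illustrates: a source 7 dB quieter than another barely registers in the combined level.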

Drives​

Our current FILE-SERVER has a 500GB PCIe Gen 3.0 M.2 NVMe SSD as its System Drive, and two 1TB 2.5” SATA SSDs in an internal RAID 1 Array, which, together, act as our Data Drive. The System Drive has less than 40GB of files, whilst the Data Drive has less than 400 GB of data.

I intend to put the RAID 1 Array behind an M.2 NVMe Cache Drive, to speed up disk access. I have therefore purchased a 1TB PCIe Gen 4.0 NVMe M.2 SSD. So, effectively the contents of the entire RAID Array, will also be in the NVMe Cache Drive, as well.

I looked at purchasing a PCIe Gen 5.0 drive, but they have only just been released and are currently four to five times the price of a Gen 4.0 drive. Whilst Gen 5.0 drives’ read/write speeds are considerably faster, this only really shows up when dealing with huge files. For us, IOPS are much more significant, and on this metric Gen 4.0 and Gen 5.0 drives are much more comparable.
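To put some rough numbers on that trade-off: the sequential-speed and IOPS figures below are assumed ballpark values for high-end Gen 4.0 and Gen 5.0 drives, not specs for any particular product.

```python
# Illustrative Gen 4.0 vs Gen 5.0 comparison, with assumed ballpark figures.
GEN4 = {"seq_mb_s": 7_000, "rand_4k_iops": 1_000_000}
GEN5 = {"seq_mb_s": 12_000, "rand_4k_iops": 1_100_000}

def seconds_to_copy(size_gb, drive):
    """Best-case time to stream one large file, limited by sequential speed."""
    return size_gb * 1000 / drive["seq_mb_s"]

def seconds_for_small_ops(n_ops, drive):
    """Best-case time for many tiny (4 KiB) accesses, limited by IOPS."""
    return n_ops / drive["rand_4k_iops"]

# A huge 100 GB file clearly favours Gen 5.0...
print(round(seconds_to_copy(100, GEN4), 1), round(seconds_to_copy(100, GEN5), 1))
# ...but a million small random accesses are almost a wash.
print(round(seconds_for_small_ops(1_000_000, GEN4), 2),
      round(seconds_for_small_ops(1_000_000, GEN5), 2))
```

For a file-server workload dominated by small files, the IOPS line is the one that matters, which is why the Gen 4.0 drive wins on price.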

PCIe Gen 5.0 drives also run considerably hotter and require significant cooling. But I don’t think, with the air in the case being purged five times a second, this will ever be a problem.

I’m not sure at this point, what to do with our current System Drive. I could move it over to the new machine and keep it as the System Drive. But it’s only a Gen 3.0 Drive, so it will be better to partition the 1TB Gen 4.0 drive, as there is plenty of spare room – 100GB for the system partition and 900GB for the Data Cache Partition, perhaps. It could be kept as a spare drive, in one of the two spare M.2 Slots. Perhaps, install a Linux distro – who knows.

It is actually difficult to know what to do with spare M.2 Slots on a mITX motherboard, apart from using them for an NVMe Drive. This is because the second M.2 Slot is usually underneath the first, or sometimes on the bottom of the Motherboard. So, there is very little room for adapters with headers on them. You’d have to use the top M.2 Slot for that. However, the top Slot on all current boards, is the only Slot that is Gen 5.0 capable. ☹

You could always use an M.2 Slot underneath the Motherboard by increasing the height of the standoffs – provided there was enough space above the Motherboard. But then you’d have to work out, how to get any cables back on top.

Drive Bays​

FILE-SERVER has an ICY Dock 2.5" Hot swap Drive Bay for hard drive backups. This is going to be transferred to the new machine. So, it was only natural when looking for hot-swap bays for the RAID 1 Array, that I chose the matching ICY Dock flexiDOCK MB522SP-B. This has two hot-swappable drive bays, instead of one.

Since both Drive Bays are physically the same size, it gives us the possibility of buying another twin drive bay, and adding another 1TB drive to double the capacity, by changing the RAID 1 Array to RAID 5. (I really don’t think I’ll ever need to but, again – who knows.)
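The capacity arithmetic behind that option is simple: RAID 1 mirrors, so you get one drive’s worth of space; RAID 5 keeps one drive’s worth of parity, so you get (n − 1) drives’ worth.

```python
# Usable capacity: RAID 1 mirrors (one drive's capacity); RAID 5 stripes
# with one drive's worth of parity spread across the array.
def raid1_usable_tb(drive_tb):
    return drive_tb

def raid5_usable_tb(n_drives, drive_tb):
    assert n_drives >= 3, "RAID 5 needs at least three drives"
    return (n_drives - 1) * drive_tb

print(raid1_usable_tb(1))     # current array: 2x 1TB mirrored -> 1 TB usable
print(raid5_usable_tb(3, 1))  # add a third 1TB drive, go RAID 5 -> 2 TB usable
```

So adding one drive and converting does indeed double the usable space, while still surviving any single-drive failure.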

USB Front Panel​

Each of the ICY Dock Drive Bays, is 26mm high. So, this left 28mm below them for a Front Panel with USB Ports and a power switch. Unfortunately, there is no such thing. Well - none that incorporate a power switch, anyway. So, I had to get one off Ali Express. It has two USB A Ports, a USB C Port and Audio Ports. My intention is to replace the USB C Port with a power switch, that incorporates an LED Power light. (All my USB cables are USB A.)

Drive Bay Assembly​

This will consist of two ICY Dock Drive Bays at the top, and the USB Front Panel below. This is so that the Front Panel can be screwed to the base of the case. The Front Panel then supports 3mm aluminium drive bay sides, which hold the ICY Dock Drive Bays, above.

The Drive Bay Assembly will be made by, first, screwing on the Drive Bay sides. The assembly, will then be screwed to the Base from the inside. The Base will be drilled and tapped, to avoid use of nuts, underneath. If necessary, the Drive Bays could be removed to facilitate screwing.

Experimental drilling and tapping of the base, has already been performed to prove it can take an M3 thread, as I ordered 3mm aluminium chequer plate, which in parts is only 2mm thick. (It was cheaper.) Even the 2mm areas will take an M3 threaded screw! I tried pulling out a trial screw, with a pair of pincers. It bent the aluminium!

GPU Options​

When I started on this design, it was far from certain that AMD Strix Halo, now called the Ryzen AI Max 300, would be available for the AM5 socket. Many people thought that being an APU, it might only be available for laptops. So, I thought through, how to incorporate a GPU into my design, should the need arise.

In my original water-cooled design, I found that if I made the computer’s case as large as the full size of the briefcase, there was enough space for a full-sized, double-slot GPU, e.g. an RTX 3080, either in its own separate compartment, or outside the acrylic top, effectively leaving it out in the open air. It needed a PCIe Riser Cable, so that it could lie ‘flat on its back’, parallel to the motherboard. It also needed a second fanless HDPLEX 250W PSU, for a total of 500 Watts.

I also looked at External GPU Docks. These however require Thunderbolt 3, Thunderbolt 4, USB 4 or OCuLink on the Motherboard. This obviously limits you to more expensive Motherboards, or putting some sort of adapter in a spare M.2 slot – if there is room. Both of which add to the cost of the eGPU Dock. It then occurred to me that a far cheaper option would be to use a PCIe Riser Cable to extend the PCIe slot, underneath the side wall of the case, onto a short extension of the base of the case. This would then enable us to use a GPU of any size. This is the option I have made provision for.

However, now that AMD have announced, that the Ryzen AI Max 300, will be available for the AM5 socket, I will probably make the case without an extension, and add it later – if I change my mind.

Motherboard Options​

For our build, we need an ITX AM5 motherboard with:
  • Four SATA connections,
  • At least one M.2 NVMe slot - preferably one that supports PCIe Gen 5.0 for future upgradeability.

B650 Motherboards with Four SATA Connections​

There are only two ITX AM5 motherboards with four SATA connections:
  • Gigabyte B650I AORUS ULTRA, and the
  • MSI MPG B650I EDGE WIFI.

Gigabyte B650I AORUS ULTRA:
  • Supports a Gen 5.0 NVMe drive, plus
  • Two additional Gen 4.0 NVMe drives (one below the Gen 5.0 Drive, and one underneath the m/b).
  • Three M.2 drives on a Mini ITX motherboard, is excellent for server expansion.

MSI MPG B650i Edge WiFi:
  • Supports only two Gen 4.0 NVMe drives, and
  • Two M.2 Drives.
So, it’s not really in contention.

X870 Motherboards (mITX)​

Currently, there are two X870 mITX motherboards available, but they both have only two SATA Connections:
  • Asus ROG STRIX X870-I GAMING WIFI,
  • Gigabyte X870I AORUS PRO ICE.

Advantages of X870 Motherboards
  • More Memory: 128GB max vs. 64GB
  • Faster Memory: Up to 8400 MT/s vs. 6000 MT/s
  • USB 4 vs. USB 3.2: Useful for eGPU
  • PCIe 5.0 vs. PCIe 4.0

Considerations​

  • Both X870 boards have only two SATA headers and two M.2 slots - but only one is Gen 5.0.
  • The Gigabyte B650I AORUS ULTRA has one Gen 5.0 M.2 slot, another two Gen 4.0 M.2 slots, and four SATA headers. (I really don’t think I’ll ever need all the M.2 slots, but again – who knows.)
  • More PCIe lanes on X870 boards, but Mini-ITX boards can't fully utilize this.

Solutions for Additional SATA Headers​

M.2 to SATA 3 Adapter: use in one of the M.2 slots.
  • Limitation: only the top slot is Gen 5.0 capable, and the second Gen 4.0 slot lacks clearance for an adapter.
  • If a Gen 5.0 NVMe drive is desired in the future, we have to reserve the top slot. ☹

PCIe to SATA 3 Adapter: in the sole PCIe slot.
  • Advantage: Ample space for SATA connectors.
  • Limitation: Occupies the PCIe slot, making it unavailable for a graphics card. ☹
  • Alternative: Use USB 4 for an eGPU.

Conclusion​

An X870 board offers more and faster memory but comes with additional costs for the motherboard, memory, and possibly an eGPU. It will come down to cost, in the end, but the Gigabyte B650I AORUS ULTRA, currently appears to be the best solution given the requirements and future-proofing considerations.

Layout​

If you look at the layout of most motherboards, you will see that there is one edge that is lower than the others. If mounted in an upright tower case, it is the top edge. So, I thought that if you were blowing air into the case, it would make sense to do it from that edge. This made it seem sensible, to position the PSU, along the bottom edge.

I then realised that positioning it here, meant that any PCIe Riser Cable, would have to go over it. This might not be too good from the point of signal integrity, going through the cable. It would also inhibit the airflow over the PSU, and make the PCIe Riser Cable longer than it needed to be.

So, I was already going off that idea when I saw where the 24 Pin Motherboard Connector was – the top right-hand corner. The 8 Pin CPU EPS Connector is even worse. It’s in the top left-hand corner. Therefore, if I left the PSU at the bottom, I was going to have a whole mess of wires going across the front of the Motherboard. This was bound to obstruct the airflow leaving the case from the front. That is why I decided to position the PSU along the top edge – despite the air having to go over it. It was simply the lesser of two evils. (It's actually not too bad, because the HDPLEX 250W GaN PSU is only 25mm high. So, there is still 55mm clear above it for airflow.)

In the ‘Internal Layout’ Drawing below, the so-called ‘Top’ of the Motherboard is on the left, and the front is at the top of the drawing. I’ll refer to the orientation used in the drawing, from now on – for the sake of clarity.

Placing the PSU to the left-hand side of the Motherboard, in the Drawing, minimises cable runs as the:
  • 24 Pin ATX Cable will be as close to the 24 Pin Connector as possible, and the
  • 8 Pin CPU EPS Cable travels the absolute minimum - hopefully, in the wind shadow of the PSU across the motherboard itself, to avoid restricting the air flow.

Having placed the PSU to the left of the Motherboard, it makes sense to position the Drive Bay Assembly, to the left of the PSU, thereby minimising cable runs to the Drive Bays and Front Panel.

This leaves the Inlet Fan to be positioned in the corner behind the Drive Bays. It could go in either the side, or the back of the case. For cosmetic reasons, I’ve put it in the back.

We obviously want the airflow from there to travel across the Motherboard and exit at the two far corners of the case. So, air outlets have been provided at the front and back in each of those corners. The Display Panel will obviously make up the remainder of the front.

Placing the PSU to the front of the Motherboard, is almost as good from a wiring standpoint. The case would be 55mm deeper, and 55mm narrower. However, this would preclude using a Front Display, as the air exhaust slats would be far too narrow.

Air Flow is across the Motherboard into both far corners. An additional vent, cut in the back above the Motherboard's I/O Shield, encourages airflow into that corner of the case. I was initially worried about the Motherboard chosen, as it has its four SATA Ports and the Front Panel Connectors on a daughter board, right at the front in the middle of the Exhaust Slats. Obviously, the SATA Cables could restrict airflow. However, now that there is an exhaust at the rear as well, this should ameliorate the problem. Both Fans in concert, at full bore, should purge air from the case five times every second. (The noise made by both fans combined will be quieter than a sole fan on a water-cooled radiator. The PSU has no fan.)
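The "five times every second" figure can be sanity-checked with a little arithmetic. The fan airflow ratings below are assumptions purely for illustration (two fans at roughly 27 CFM each) – they are not the actual parts chosen:

```python
# Rough sanity check of the "purge the case five times a second" claim.
# Fan airflow figures are ASSUMED for illustration, not the real parts list.
CASE_VOLUME_L = 5.0          # case is "less than 5 litres"
CFM_TO_L_PER_MIN = 28.317    # 1 cubic foot = 28.317 litres

fan_cfm = [27.0, 27.0]       # assumed: inlet fan + CPU cooler fan, ~27 CFM each
total_l_per_s = sum(fan_cfm) * CFM_TO_L_PER_MIN / 60

air_changes_per_s = total_l_per_s / CASE_VOLUME_L
print(f"{air_changes_per_s:.1f} air changes per second")  # ~5.1
```

Two modest fans at around 27 CFM each do indeed turn over a 5-litre case roughly five times a second at full speed, so the claim is plausible.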

The Air Cooler obviously sits on top of the CPU, but there is a choice as to which way the air exits the cooler. It can either exit to the front and back, or to each side. It will need to be orientated to the front and back, to avoid fighting the air coming in from the inlet fan on the left.

Wiring​

It was realised right from the beginning that there was no room for a separate cable management chamber. So, DIY Custom Sleeved Cables, will be absolutely necessary - given the confined nature of the case. I have chosen four colours of sleeving to match the Air Cooler – burgundy, orange, bronze and gold.

Wiring will probably be the major challenge of the build, as I've never made cables before. I've never sleeved them, and in the past my soldering has been woefully inadequate. Crimping should not be a problem. The strategy, therefore, will be to make the case and get everything going in the open air, with the cables supplied by HDPLEX. I will then replace one cable at a time, checking it thoroughly with my multimeter before putting it into service.

The 24 Pin ATX Cable and the 8 Pin CPU EPS Cable have already been covered above. This leaves the:

SATA Power Cables These will run from the head of the PSU, down half its length, then round the corner and up the back of the Drive Bays. (Fortunately, their connections will be vertically above each other.)

SATA Data Cables I don’t want to have to make these, as I don’t want to risk bad connections - through inadequate soldering. So, to start with I’ll use some of the cables I already have. Then I’ll look at what sort of connections I’ll need on the Front Panel Daughter Board, to minimise impact on the airflow through the front slats. I’ll then route them, hopefully below the height of the RAM, to the side of the Drive Bay Assembly.

The USB Front Panel Bay underneath the Drive Bays is 27mm shorter than the Drive Bays above, and is fairly empty behind a PCB at the front. There is also a 3mm gap between the bottom of the Drive Bays and the top of the PSU. This should enable three SATA Data Cables, which are less than 3mm thick, to be routed through the gap and up the back of the Drive Bays – losing a small amount of excess cable en route.

Front Panel USB and Audio Connections will pass through a 16 x 26 mm gap cut in the side of the Front Panel, then along the front of the Motherboard to their headers.

Unfortunately, the cables that came with the USB Front Panel are far too long for my tiny case. So, I'm either going to have to lose the excess behind the Front Panel, or shorten them and solder new connectors on the end. (Some of these pins are so fine, I honestly don't fancy my chances of doing so successfully.) So, I'll lose the excess cable in the space between the Front Panel and the air inlet fan. (Provided I keep the cabling below the height of the PSU, it should not affect the airflow.)

Some thought will have to be given to bundling the front panel cables together for support, as they sit above the bottom aluminium bar at the front, which is removed when the cover comes off. So, unless they are held in place somehow, getting the case back on again could be a problem. ☹

Power Button and LED Connections These will follow the same path as the other Front Panel Connections.

Sys Fan Connection This header is in exactly the wrong spot, as it is diagonally opposite the Air Inlet Fan. So, I propose taking the connection below the motherboard.

Description of Solution​

The PC is less than 5 litres in size and consists of an ITX Motherboard with an HDPLEX 250W GaN ATX PSU at one side. The PSU, too, is only 170mm long, so it fits quite nicely. It's fan-less, and does not use an external power adapter. The PC is also capable of using any size of GPU, via a riser cable to an external extension of the case base.

The whole case is only 370 x 201 x 92mm, and features up to four Hot-Swappable Hard Drives and up to three M.2 NVMe SSDs, one of which can be Gen 5.0. (Without the Drive Bays, the width would be 107mm less, and the case only 4.2 Litres – nearly half the size of the Dan Case A4-SFX's 7.2 Litres. But the latter houses a GPU.)

The design of the case is relatively simple too: 3mm aluminium base plate, 19mm Tasmanian Oak sides, and 3mm clear acrylic top, fastened with either epoxy glue, or clear double-sided tape. The front 25mm is black aluminium exhaust fins, with the black drive bays and a 5in Display Panel, for the PC’s vital signs.
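The "less than 5 litres" figure can be roughly cross-checked against the external dimensions and the wall thicknesses just described. Treating all four sides as 19mm oak and the top and base as 3mm plates is an assumption for the sake of the sketch (the front 25mm is actually aluminium fins), so take it as an approximation only:

```python
# Rough internal-volume check for the 370 x 201 x 92 mm case.
# ASSUMES 19mm walls on all four sides and 3mm base/top plates;
# the real front is aluminium fins, so this is approximate only.
ext_l, ext_w, ext_h = 370, 201, 92   # external dimensions, mm
wall, plate = 19, 3                  # oak wall and base/top plate thickness, mm

int_l = ext_l - 2 * wall
int_w = ext_w - 2 * wall
int_h = ext_h - 2 * plate

volume_litres = int_l * int_w * int_h / 1e6   # mm^3 -> litres
print(f"Internal volume: {volume_litres:.2f} litres")  # ~4.65
```

Under those assumptions the usable interior works out at roughly 4.65 litres, which is consistent with the "less than 5 litres" description.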

The whole Case Cover is held on by two large black thumb screws, one each side at the back. All of the working components of the computer, remain attached to the base plate, and can be run as an open-air PC, if required.

Urgency Strikes​

I really need to get this case built by the end of January 2025, as at the age of 80, I am going to be teaching again at the start of February.

So, I need something that is mobile. But if the processor I want is not even going to be announced until CES on January 6th, with delivery sometime later in the year, it does not look like it will suit my timeline.

I have therefore decided to build my case anyway, but use the CPU and Memory from our current server:
  • CPU – AMD Ryzen 5 5600G 3.90GHz, Boost: 4.4GHz, 19MB Cache, AM4, 6 Core, 12 Threads,
  • Memory – 16GB Corsair Vengeance (3200MHz) DDR4.

Unfortunately, I cannot use the Motherboard from FILE-SERVER, as it is mATX. So, I've bought an:
  • ITX Motherboard - Gigabyte A520I AC MB, Rev 1.4, A520, 2x DDR4, PCIe3.0, 1x M.2 NVMe, 4x SATA3, 4x USB 3.2 Gen1, Display Port, 2x HDMI, WIFI AC
as this is capable of running my existing CPU and Memory until I make up my mind and upgrade further.
(You never know, if it takes too long, I might be tempted to hang on for Zen 6! 😊)

Fortunately, the layout of the interim Motherboard is the same as the selected, eventual Motherboard. So, all the design considerations, discussed above, remain the same.

ENGINEERING DRAWINGS​

All the engineering drawings that follow, were done in MS WORD. I don’t have a CAD System, and I didn’t think that it was worth learning one, given I could do everything I wanted to do in MS WORD. (Remember I have been using WORD, and teaching people how to use it, ever since it first came out.) All I needed to do was:
  1. Insert some rectangles, representing the various components. I used a scale of 2:1 to simplify the mental arithmetic.
  2. Add text to tell me what they were.
  3. Wrap Text, square, around them, so that I could drag them around the page.

Doing the design, then simply involved dragging various components around, until I got something that fitted the space available, in both plan and elevation.

[Engineering drawings attached here as images.]

hrh_ginsterbusch

King of Cable Management
Silver Supporter
Nov 18, 2021
764
298
wp-devil.com
This whole build reminds me of the person who built themselves a "sophisticated" e-mail machine inside a Densium 4, with just an AMD 7950X running 3 screens all on the iGPU. Apparently they eventually had to upgrade it with a (50W) dGPU, because of AMD driver issues under Windows. That build also runs everything from just an HDPLEX GaN 250.

There's even an update video - I guess they streamed it live - in which they upgraded to a 9950X:



Also see the original post about it on r/sffpc.

cu, w0lf.