The point was to do a water-cooled Team Red build, running Linux and fitting all these components
in a 13 liter case. The point was also to make it a learning experience.
Now as we all know, a build is never "done", is it?
The SFF itch got to me and I knew that something more could be done. In my case, I have a Ryzen 7 5800X CPU, and after seeing the temperatures in benchmarks I settled on the 65W Eco mode, as opposed to the 105W mode it is capable of.
Failed attempt
In which the frustration of the average SFF'er is shared, in self-explanatory images
One really low-hanging fruit (apparently) is the size of the radiator (80mm), which is really the smallest one sees in a build of this kind. Looking at the case dimensions, one would think that a 2x80mm radiator could somehow be used. So much did I want to believe this that I made an impulsive buy:
Even for the shortest one I could find, the actual difference is huge:
As if poor BIG1 hadn't had enough, I was seriously considering a radical case modification:
... but went on to collect some more Steam achievements instead...
... and some time later, during a particularly sunny day on an undisclosed southern beach, I met with my engineering team. And I was told, in no uncertain terms, that my current build is terribly sub-optimal. Therefore, improvement must, without the shadow of a doubt, be done, to said build. The concerns were noted, and some arrangements were made.
Thus begins my search for a new home for my PC components, and this build log.
Search for candidates
By "room for improvement" I mean that we want everything: a full-power CPU and lower GPU temperatures, in less volume.
I had managed to get my GPU temperatures equal to or slightly below open-air, and benchmarks told me that GPU temperature is probably the first priority.
After a thorough search, the Sliger SM560 seems to be the perfect candidate: the GPU sits right below the fans, a 120mm radiator fits with a small adjustment, the GPU still fits, and the volume is slightly lower.
The Sliger Conswole was also considered, for having the virtue of the smaller footprint, but didn't seem likely to deliver the best GPU temps.
In the end it was necessary to hold a brainstorming session with my quality team (which is a different team than the engineering team) in which we would first figure out which kind of case we want and then hunt it down.
In which we learn of advanced design and prototyping techniques
My build is particular in the sense that I use an external AC-DC brick, so there is a "box" of 125mm x 100mm x 65mm that most PC cases need and I don't: that space will instead be occupied by the radiator. In that regard, my understanding is that 140mm should be more than enough, so I went for an Alphacool Nexxxos XT45.
This leaves us with three "boxes" to rearrange inside a case, like a tangram, in whichever way we like. Our very competent engineering team introduced the data in our very advanced design program:
Who needs CAD
Now, we may not have a 3D printer on our premises, but we have cardboard, polystyrene (EPS), a pair of scissors, and good intentions:
We can cut to measure some components in EPS, just to visualize the size, also to see where the I/O and cables go:
Who needs a 3D printer
Some thinking happens, and some drafts emerge:
Production level stuff.
After some playing around with the components, we decide on something resembling a Conswole (more like Consuela, as baptized by Level1techs), but having some more breathing room for the GPU. The GPU would be on top instead, given this odd tendency of hot air to always go upwards. Finally, this breathing room gives us a chance to put some Noctuas on the "roof of the chimney", which this engineering team is really fond of.
This arrangement seems to please everyone, and much joy is had. The only task left is to find a similar case, or a case that can be modified into this one.
A cursory look through the SFF master list reveals that we don't find these dimensions right away. Surprisingly enough, this doesn't appear to be a big deal for our engineering team. Apparently, "this is just a small problem". It seems that "this issue can be solved easily".
In an emergency meeting with the logistics team (which is a different team than the quality team and the engineering team) we are told that a realization of such case in aluminum is possible at the main premises of the organization, where more advanced tools will be available.
Not one to waste a breath, and placing full faith in my team, I went on to order the missing 140mm radiator, which arrived just in time before my preparations for the extraction to the main location. Here it is in all its glory, side by side with its near-identical twin:
And here it is, fitting in its hypothetical case:
Last but not least, I hunted for a bag which could carry such a case, if it was finished, back home (like I said, full faith in my team):
Pictured in the photo is what will be my carrying bag for the case, and inside is the case I made out of cardboard.
The time for Steam achievements is over: now I have fully dismantled my current build, duly packaged components in cases and I'm ready for extraction towards main premises.
Despite the high-level technical terms imposed by all the professional teams involved in the project, along with all the latest design technologies employed here, I surprisingly managed to understand most of it and even found the reading very entertaining!
Thanks for bringing a smile to my face, you've really got writing skills!
You're not bad at drawing either, especially at straight lines, so labeling each dimension twice was not needed, but I guess that is a requirement imposed by the engineering team!
So the case is going to be 296 x 360 x 76mm, 8.1 liters, right?
Very good question, that is the plan, yes, but to be honest, if these end up really being the final dimensions, this will have been an absolute wonder of planning and accuracy.
First of all, there is the little 11x332 part you mention. If you look at the pic below, oriented as if the case were standing vertically, you can somehow see that there is a little extra platform in the "roof" above where the covering panel would go, and a kind of tray, on top of which three 92mm fans could fit. The 11x332 part could be a grid that is placed on top of the fans and protects them.
As project manager I see the fan cover as an optional feature, only feasible if it leaves enough remaining time to observe the obligatory cups of wine in between work periods.
Anyway, some quick math, adding the 25mm height of the fans to the 296mm of the panel, would still give me ~8.8 L. It somehow seems too small to be true; I really want to see that in practice.
The second factor is the brick, which I think should be taken into account as well. However, I was also using a brick with the earlier 13L case, so there will still be a net volume reduction. In theory, almost every number will improve.
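That quick math, as a one-liner (the 25mm fan height and the panel dimensions are the ones discussed above):

```shell
# planned footprint: (296 mm panel + 25 mm fan height) x 360 mm x 76 mm, in litres
awk 'BEGIN { printf "%.1f L\n", (296 + 25) * 360 * 76 / 1e6 }'
# prints: 8.8 L
```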
In which our heroes learneth of ancient drills and all kinds of screws and screwing around
Ahead of me was a plane flight carrying what I estimated was thousands of euros in components in my backpack. I decided to mentally prepare for it by reading further about how scarce GPUs are, how everyone wants one and how the prices keep getting inflated. You know, because that helps one (not) calm down.
Laying it out
As I arrive at our premises, I unpack the actual components on a table, and realize how much of an advantage it is to have them at hand, compared to the cardboard and plastic models.
The strategy is the same: arrange the components first, make the case later. Seeing it laid out like that, we realize we could make the case much thinner; but why would we, if it's already quite portable? The airflow should definitely be better this way.
Here's where all our armchair theorizing goes: the hot air all flows in the same direction and against the side panel (to be tested), and also goes upstairs, because it's hot air and because it's helped by the Noctuas on the roof (to be tested). In my experience the radiator under high CPU load blows rather temperate air (maybe my loops are suboptimal), but definitely not as hot as the GPU at high load. So in my opinion the GPU should be placed upstairs.
Looking at this picture, we chicken out and give the case some extra length (335mm), so that there's some margin of error for the radiator and motherboard, plus more breathing room for the GPU. With this length, we compromise on three case fans, despite huge pressure to add a fourth from influential Noctua representatives within the project.
Getting up close and personal with the screws
We believe this can all be done with aluminum panels, aluminum bars, aluminum corner sections, some rubber feet and screws. Everything can be put together with just screws, lots of screws. The panels need to be white (no paint), but if we are happy with the result we can try to order some better looking panels in black. Pretty much everything else, we found in black in the hardware store.
On our side, there is a whole arsenal of tools whose age and purpose are not 100% known. Against us is time.
The enclosing
In which a box is built
First of all, the enclosure that contains the volume of the case will be the most straightforward place to start. We already rehearsed this by cutting cardboard; now it's just a matter of cutting aluminum instead.
Of course we have an angle-measuring mount for a saw laying around: it's the main premises after all. With a 45º cut at the end, all the pieces will fit and be rather orthogonal-ish.
Once the angles are cut, we mark with a nail where the screw holes will be drilled. Pictured here is the inaugural drill, with the piece fixed in place by a tool of immense arcane power:
This particular piece is for the upper tray which will house the three case fans. 335mm is a bit more than the length of three 92mm fans, so we'll use leftover sections as padding. The first padded piece looks like this:
In the background of the photo you can see that we measured length by just putting the three fans together, the sections are literally being cut to measure.
Whenever two pieces are joined permanently, we don't need screws and can use rivets instead. Of course we have 3mm rivets and some riveter laying around: it's the main premises after all.
For this particular piece, an aluminum corner is cut, joining the two sections at the two points seen.
Here is the finished "cap" of the case. The three fans will be sanded down the same way I did for BIG1, so that the power connector can be placed underneath:
The base of the case is made in the same fashion, a bit simpler since it doesn't hold fans. What is left now is to start adding the vertical sections that will join the two pieces. A square set will help keep the 90º angle (I wish), and we came up with the following wooden piece of the needed width, which will keep the corner sections parallel:
This is really important: as we drill holes and add screws, the angle becomes more and more difficult to correct.
... and then, as they say, draw the rest of the owl ...
PS:
It is now that I can give an exact answer for the case measurements, since I can just go ahead and measure it. It is 360mm (height) x 336mm x 105mm, so roughly 12.7 L.
We plan on using flat aluminum bars as platens, which will fix the components to the case. In the spirit of building to measure, the bars are placed at the right height and the drilling marks are made with the actual motherboard. See the little crosses below.
(What's not in the picture is the cold sweat as I move the motherboard around.)
This marks the beginning of more drilling adventures. We need at least three platens on the side of the right panel: two for the motherboard and at least another one at a different height for the 140mm radiator. We also need another two bars on the back, working as a "slot" to hold the GPU.
For each bar we need at least three holes, so do the math. We use a mounted drill, a corded drill, and a cordless drill, whose battery we drain halfway through the afternoon. If all else fails, we have a manual drill (because why wouldn't we):
Which I actually like and end up using, because it turns out quite handy for thinner aluminum bars. Who needs a Bosch anyway.
Now, this frame is something we want to be able to mount and unmount. After each hole is drilled, we thread it with a tapping tool (which we of course have laying around somewhere) so that we can screw and unscrew it.
Where possible, we use flat-head screws and finish the hole with a bigger drill bit, making it "countersunk". Eventually we realize there's such a thing as countersink bits for drills, and that we actually already have one of those laying around the premises.
So the next ones we make will look better.
I want to make a special mention for the particularly diabolical shape needed for the platens in the GPU side:
That overcomplicated black section recycled from a leftover of the corners, filed down to smoothen those killer pointy corners, and that impossible screw that needs a nut and some prayers to stay put.
It would have been much easier to just bend that side of the GPU, but hear me out: this way the GPU is unmodified, so it can be put in another case, and another GPU can be put in this case. I think the sacrifice is worth it.
Moving the case around a bit, we quickly realize what is going to be possibly the biggest headache of this whole project...
The sagging
In which a GPU is let know its place
Being, as we are, rather new to the PC building business, we had no idea what GPU sagging was. In my previous case the GPU was vertical, and its weight rested nicely on the side where the screws are.
Looking around the loose side of the GPU, we are perplexed by the fact that we find no screw that could be repurposed, nor anything that could hold a screw of our own. We stop and wonder how the PC industry has (or hasn't?) standardized around this issue. Our perplexity increases as we find other people with "commercial" cases facing similar issues.
After further brainstorming and exploration through our vast premises, we come up with the following piece as a solution:
Which I believe is the most brilliant little thing we have come up with so far, probably for this whole experience. In record time, too. It is worth mentioning that it was cleaned, cut to measure, and bent from an old aluminum bathroom screen (hence the white color) which we found laying around.
This fixes the GPU in place so well that one can really move the box around freely (picture me screaming "careful!" in the background and wiping off my cold sweat).
In which the shameful insides of a case are covered up with shameful outsides
Preparations
If we look at the upper tray in this photo, we can see an extra 2mm of height left on the sections compared to the end of the tray. This is intentional, so that a cover can potentially sit on top of it. Our time for this project is limited, but it now seems we'll have time to make such covers as well, at least some rough ones.
Another important preparation, for the side panels, is to double-check whether the resulting angles of the case are actually 90º. It turns out they aren't (not totally):
A bit of a downer, but good to know. Our plan is to first make templates out of poster board, hopefully templates that can be re-used later for future, better-looking black panels.
Last but not least, the right cover would be kind of loose in the middle with only 4 screws, so we will add two little platens to address that:
The left panel and upper panel will both have openings for the fans. I was OK with a rectangle but my team wanted to get cute:
The templates themselves are used to mark the sheets:
For this we employ a professional plotting tool based on graphite, commonly known as a "pencil".
Similarly precise tools will then be employed for the cutting of the sheets:
On a side note, I must say that I have really enjoyed working with aluminum. It is a foldable material, can be sanded and cut easily, and is reasonably robust when one drops a piece on the floor (which happens often). Aluminum dust ended up everywhere around me, sort of like midichlorians; I have it all over my clothes. It has possibly become part of my diet and my body composition.
The sheets are cut roughly, with some margin around the outline, then sanded here and there to make the shapes less rough:
Now it's also the time to drill some more holes for fixing the case fans to the tray.
We could have 12 but we decide that 6 is enough. We keep running out of battery on the drill and, soon, out of screws.
Difficult pieces: honorable mentions
This is the point where I feel most immersed in the SFF experience, starting to realize how difficult it is to find parts of exactly the needed size. In our case, that was screws.
Going to a hardware store asking for 3mm screws of >30mm length can earn you some funny faces.
This picture exemplifies this whole story perfectly:
The hole in the fan had to be drilled out to 4mm, since no 3mm screw was available at that length. To the right we see why at least half of the corner screws have been sawn off 3mm shorter. And just below the corner, we see that some threading was needed for the right screw to "perforate" the rivet.
I also want to give a special mention for the particular kind of hell that is the back cover of the case:
This piece has it all. A nice bend at the end with that nice rounding, three holes that must match the underlying screw holes and that width change in the middle of it... just chef's kiss.
Finishing touches
Now we have a pretty good idea of the space left for yet another four screws. It's a good moment to add the feet to the case. The solution will be a bit quick and improvised, although we already have the feet. It will involve big holes and screw anchors, and it will look exactly like this:
At this point we are running a bit out of time, and upper management is getting impatient and starting to ask questions; so we get ready for a quick assembly. We probably cannot go as far as turning the PC on, but we should have enough time to address the question of where to put the power button, and fill the water loop.
I go for the loop first. Now, I have no idea what the airport policy is on PCs with water loops, showing that much liquid in the scanner and all, and I really don't have time to research it, so I am not going to carry it in my backpack. I think I have time to fill the loop and package it separately.
After my previous experiences, this was a pretty easy water loop. Perhaps the left tube could have used an extra angled fitting or at least an anti-kink spring, and I will probably add that if/when I refill the loop in the future. This time I thought I'd be cute and instead of adding a female-to-female connector I added a temperature sensor:
I thought I was thinking ahead, but it was the other way around. Somebody experienced enough can spot what's wrong with this photo: the ASRock motherboard doesn't have a connector for temperature sensors. I learned that after filling the loop, and it was a bucket of cold water.
From here I have some options, and I don't like any of them:
- I could replace it with another sensor that has an LCD included, but then the temperature reading is just for my viewing pleasure; I cannot use it as an input reference, e.g. in benchmarking.
- I could get a fan controller, but it's a big component, which kind of defeats the purpose of an SFF case. Plus, driver availability on Linux is not very good.
For the time being, the sensor will stay there as a glorified female-to-female connector.
Also, this time, I got an Allen key for the draining cap in the Eisbaer (in green), so that I have two different points through which I can fill the loop.
The red cap will be used as an air inlet. After moving it around and getting to the point where it seems to be filled, I then started filling through the other, green point. Now, the radiator is easy to fill because of its shape, but it wasn't until now that I realized how difficult it is to tell when the Eisbaer is filled. I am also surprised by how much capacity this radiator has: I almost ran out of coolant.
Now, to the power button. This should be rather straightforward, it is enough that we know the diameter of the drill we want. As can be seen, we even have the option to put it in the back (if it gets packed):
Hey, look at it go! Doesn't it look cute? Doesn't it?
Now comes the final, final test. Did we make our prototype cardboard case big enough?
I am proud of this team and what we have accomplished. Now, to some well deserved celebration, and prepare my trip back to the extraction location with the high tech cargo on me.
In which we first introduce the rival to beat, the dreaded Open Air Build
After an uneventful trip (other than a much more expressive face on the employee scanning my bag at airport security), and after getting settled, I finally get around to unpacking the filled loop and installing it into the case that came in my bag.
I was never fully satisfied by the fact that during benchmarks, the air coming from the radiator wouldn't feel as hot as the air coming from the GPU, even though the sensors report that the CPU is clearly getting toasted. That is why I am very careful about filling the loop without leaving air bubbles inside, and this time I am going for a slightly more premium thermal paste as well:
We'll see if that does something. I have also gotten some modest temperature improvements in the past, just by properly cleaning the cooler's surface and applying the paste better.
If you think about it, we have made some bold assumptions about this case, in terms of better airflow and better GPU cooling. So bold, that we went ahead and built the case.
Also, I put a damn 140mm radiator in there (compared to 80mm earlier), so having an improvement in CPU temperatures wouldn't hurt, either.
Since we went this far, we consider it a first requirement that the final "product", closed, with the three case fans, gets lower temperatures than leaving the components in open air.
Later we can do other tests, for example tweak the fan curves, compare results to previous case, etc. But all with open case as a reference.
For that we need, first and foremost, a build that looks exactly like this...
... and secondly, a set of benchmarks to use as a reference.
Open benchmarking
Now we finally get into software territory, which is more of my little bubble. We are now in Helsinki, a city where videogames are kind of a thing...
... and so here's my problem with benchmarks. Whenever there is a Steam sale, I see 3DMark on sale. Trust me, I see it! But then there's also a pretty good game on sale as well. So, what do I do?
Quite often, the game can be used as a benchmark anyway. And you can play it. And it's a much more realistic use case. Right? Right?
So anyway that's my excuse to come up with the following, surprising, disturbing, unlikely, battery of tests:
- Build GCC: Never fails to toast my CPU, consistently reaches the highest temperatures even sometimes in Eco Mode.
- Unigine superposition: Never fails to toast my GPU. I have only managed to get below max temperature by configuring some pretty aggressive fan curve through fanctl.
- Shadow of the Tomb Raider: The game-benchmark that has become a staple in most hardware reviews, AMD/Nvidia product presentations, etc. (because it's such a great benchmark).
- Total War Warhammer II: Big battles, lots of special effects, very reliable benchmark, wildly popular strategy game.
- The Riftbreaker: A 2021 game that is a big party of explosions and body parts, and a great CPU benchmark at that.
- Dawn of War 3: Lots of explosions and possibly the most reliable benchmark of the bunch. Never crashes, never needs a re-run.
- Company of Heroes 2: If you protest that nobody plays Dawn of War, here's a very similar game with a consistent player base in the thousands.
- Ashes of the Singularity: Escalation: Lots of explosions. Used to be one of the poster boys for DirectX12, but requires too many runs to be reliable.
So my criterion for choosing a benchmark is basically explosions.
Before disassembling my previous build, I was careful enough to automate this sequence of tests, did a system update on both installations (Garuda and SalientOS) to pick up any new performance improvements, ran the tests, and stored the results away. Those will show up in the results as well, and will give me some rough first impressions.
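The automation itself was nothing fancy; here's a sketch of the idea in shell, where the benchmark commands in the usage comments are placeholders rather than my actual installs:

```shell
#!/bin/sh
# Minimal benchmark-runner sketch: run a workload while logging sensor
# readings, one log per benchmark per distro. Paths are placeholders.
OUT=${OUT:-$HOME/bench-results}

logname() {
    # e.g. /home/me/bench-results/garuda-superposition.log
    printf '%s/%s-%s.log' "$OUT" "$(. /etc/os-release; echo "$ID")" "$1"
}

run_bench() {
    name=$1; shift
    mkdir -p "$OUT"
    # sample temperatures every 2 s in the background while the benchmark runs
    ( while sleep 2; do sensors -u; done >> "$(logname "$name")" ) & mon=$!
    "$@"
    kill "$mon"
}

# usage (substitute your actual benchmark commands):
#   run_bench superposition ~/Unigine_Superposition/bin/superposition
#   run_bench gcc-build make -C ~/src/gcc-build -j"$(nproc)"
```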
Am I ready yet? No, now I need a name for these test results, meaning that I kind of need a name for this case.
I am calling it "Big Conswela", maybe for the fact that it is a big console-style case. I am open to ideas, though.
OK, now I can run the benchmarks:
Between SalientOS and Garuda we see the usual back and forth. We might be "victims" here of a Linux kernel upgrade to 5.15.12. Anyway, overall, many victories for Big Conswela. What might be the reason? The hardware is the same and the CPU is still in Eco mode. Could it be a result of better CPU cooling...?
Now let's look at maybe the most important part, the CPU toaster:
There is an improvement in temperature, and for the first time ever I am seeing the max temperature get below 80ºC. The average temperature went from 69 to 63. That is six degrees... it's OK, I guess, but a bit underwhelming. I don't think I would disable Eco mode for just those 6 degrees.
Now let's look at the GPU toaster:
Hungry as always, making the GPU consistently draw 240W from the wall no matter the OS or the case.
The temperatures are actually slightly lower in open air than in my earlier build. This means I forgot to activate fanctl when I made those earlier measurements; the case fans would have reacted to the GPU temperature and the results would have been closer.
Finally, let's not forget that we are using BTRFS, which is a copy-on-write filesystem with cool functionality such as snapshots that allow you to rollback if you screw up... but which increases your NVMe temperatures quite spectacularly. Just look at this:
First of all, temperatures are higher on Garuda (BTRFS) than on Salient (ext4) because of the filesystem, but both are within the BIG1 range, where two case fans were connected directly to the CPU cooler header on the motherboard. Meaning that during this test, the airflow was much better in the closed BIG1 case than in the open-air test, as we can see here.
We also see what can happen when using filesystems like BTRFS, if we don't tame them.
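As an aside, you don't need anything exotic to watch these NVMe temperatures yourself: recent kernels (5.5 and later) expose the drive's composite sensor through hwmon. A small sketch, with the only assumption being that the kernel names the node simply `nvme`:

```shell
#!/bin/sh
# Print the composite temperature of any NVMe drive the kernel exposes.
# temp1_input is reported in millidegrees Celsius.
for hw in /sys/class/hwmon/hwmon*; do
    [ -r "$hw/name" ] && [ "$(cat "$hw/name")" = nvme ] || continue
    awk '{ printf "NVMe composite: %.1f C\n", $1 / 1000 }' "$hw/temp1_input"
done
```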
It's a new year, and a new case. I am excited to try it out, but reticent at the same time, realizing that I am way too invested in the case by now.
Reality has a way of making things not do what you expected them to do... let alone what you wanted them to do.
Let us bring back, for example, the issue of screws of precisely a certain length. Behold, a new addition to our little zoo: the flat, flat-head, 4 mm/35 mm screw:
And how about the case feet that aren't tall enough for the power cable, and you never realized until you hand-made your case?
Luckily, we are not dead yet. We can leave it like this:
... Or like that:
I decide on the left one, which seems to leave room for further cable management... perhaps even a tray below the table where the brick could go. I have done something similar with my laptop and its brick.
Very much in the spirit of this whole project, during my screwing/unscrewing one of the panels, I come up with a highly sophisticated way to remember which screw goes where (in the panel):
It's finally time to think about the cooling. There are many ways to skin this cat:
This time, I will connect the radiator fan and the fan immediately above it to the same plug, since my intention is for the hot air to go up in this case.
The remaining two fans to the right are, I assume, more closely coupled to the GPU, and it's probably a good idea to have them respond to the GPU temperature sensor.
Armed with my trusty screwdriver and my trusty egg carton, the time has come to behold the sacred artifact:
It will be a matter of minutes to turn it on and arrange the cooling, so that the two aforementioned fans react to the GPU temperature. This is pretty much the "fanctl" configuration I had before, the only difference being that now there is only one fan on the CPU "side".
About that first test, I have some explaining to do: this is a result that I will show here for, let's say, "historical purposes":
Here we see good ol' Unigine doing its Unigine thing, pulling those same 240W, and the GPU temperatures seem to have improved both with respect to open air and with respect to the previous case. All seems OK, until I look to the left and realize that the GPU fans are stopped.
Since the testing has finished, I start some game. I look again at the GPU. Fans are not moving.
Now I launch the Tomb Raider benchmark. I wait a bit. Fans are not moving.
I start getting paranoid, but now I launch Unigine again, and wait a bit for it to warm up: yes, fans are slowly, timidly moving.
Now, what I think happened here, is that those two fans on the top are already enough to get temperatures to where the governor in the GPU wants them, so then it doesn't spin its own fans until it's really, really necessary... I guess?
Anyway, that's not going to work for us. When we were planning this case, we were thinking of an "air tunnel" or more of an air wall, that works like this:
Cold air in blue, coming in. Hot air in red, coming out
The point is that I want those fans spinning at least a little bit. So I am going to "cheat" and tweak the GPU's fan curves a bit, so that they spin, albeit very quietly, to get this airflow going. I will call this configuration fangpu, to differentiate it from the aforementioned fanctl. Then, rinse and repeat:
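(For the record, on Linux with an AMD card this kind of "cheating" can go through the amdgpu hwmon interface: switch the card to manual fan control and set a small floor. A sketch, where the ~20% duty cycle is an assumption for my setup; remember to write 2 back to `pwm1_enable` to restore automatic control afterwards.)

```shell
#!/bin/sh
# Sketch: give the GPU fans a constant floor via the amdgpu hwmon interface.
# Needs root and an amdgpu card; exits quietly if none is found.
HWNAME=$(grep -l amdgpu /sys/class/hwmon/hwmon*/name 2>/dev/null | head -n1)
[ -n "$HWNAME" ] || { echo "no amdgpu hwmon found" >&2; exit 0; }
HW=$(dirname "$HWNAME")
echo 1  > "$HW/pwm1_enable"   # 1 = manual fan control (2 = automatic)
echo 51 > "$HW/pwm1"          # 51 of 255 ~= 20% duty cycle
```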
Haha OK well... now this is a bit overkill. But what's really important is that the CPU temperatures have very, very timidly gone down as well (although reliably, as I repeated this measurement a bunch of times):
My goal with this case was to optimize GPU airflow (careful what you wish for), and I took the CPU for granted, by virtue of the bigger radiator. Now I realize that the CPU is going to be the actual bottleneck. As I see it, the next step from here is to set all three case fans to react to the CPU temperature, a configuration which I will baptize with the very original name of fancpu.
To be perfectly clear: everything else stays the same, the only change will be that the two fans in blue:
... Will be reacting to the CPU temperatures, like the fan in red. That was the only change, and yet, we seem to be breaking a bunch of performance records:
Further proving my hypothesis that the CPU was the culprit.
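If your fan daemon is lm-sensors' fancontrol (an assumption; substitute your tool of choice), the whole fancpu idea reduces to pointing all three case-fan PWM outputs at the CPU temperature input. A fragment along these lines, where the hwmon indices are made up — `pwmconfig` reports the real ones:

```shell
# /etc/fancontrol fragment (sketch): all three case fans follow the CPU.
# hwmon3 = motherboard PWM controller, hwmon1 = k10temp -- both assumptions.
INTERVAL=2
FCTEMPS="hwmon3/pwm1=hwmon1/temp1_input hwmon3/pwm2=hwmon1/temp1_input hwmon3/pwm3=hwmon1/temp1_input"
MINTEMP="hwmon3/pwm1=40 hwmon3/pwm2=40 hwmon3/pwm3=40"
MAXTEMP="hwmon3/pwm1=80 hwmon3/pwm2=80 hwmon3/pwm3=80"
MINSTOP="hwmon3/pwm1=60 hwmon3/pwm2=60 hwmon3/pwm3=60"
```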
I will be the first to admit, that this is not a spectacular improvement: I don't think I would ever overclock this CPU (not that I know how to do it)... and maybe the GPU has some room for overclocking (again, if I knew how to do it).
Anyway, what matters is that we have now proven an improvement in nearly every way.
These are some solid 10 degrees below my first build... and roughly 2-3 degrees below the open-air setup. Plus, a pretty respectable 15 degrees less on the NVMe with respect to open air:
Drive temperatures are the one aspect in which the BIG1 case still holds an advantage.
A possibility could be to open some ventilation slots in front of the motherboard area, like the BIG1 has. But that, like Michael Ende would say, is another story, and shall be told another time.
At this point, I have "some idea" of how my build performs. Wouldn't it be great if we engineers were satisfied with having "some idea".
Now, I will go about demonstrating that this improvement holds across multiple test runs, and multiple operating systems.
In which we leave it to the operating systems to do the fighting for us
As we all know, the reasonable approach to testing is to repeat your measurements in at least four different operating systems.
Since my previous build 11 months ago, shortages and price inflation have meant that not many exciting things happened in the hardware world in 2021. However, a lot has been happening in the world of software and, more specifically, in the world of Linux.
Right off the bat, I find some benchmarks that won't run. This is not entirely surprising, considering this distro is targeted more towards cloud and not desktop gaming:
If anything, I am positively surprised by having most tests working out of the box. I won't spend extra time on making the rest work... maybe later, if I decide to give this distro another try.
On a general note, though, I like these results. I am particularly satisfied with the performance of DX11/DX12 games, and just in awe of the native game results. Good job by Intel's open-source team: whatever black magic they are doing, it's working, and I'm sure they never expected somebody to install their OS just to run Tomb Raider.
From my earlier build log, I got some offline comments regarding my obscure OS choices. Therefore, this time I am going to change the lineup a bit, introducing some heavyweights. (That is, if Intel's own distro is not heavyweight enough):
- Introducing heavyweight number one, with more than [90K packages](https://en.wikipedia.org/wiki/Comparison_of_Linux_distributions#Package_management_and_installation): Fedora 35, also known as the meme distro.
The installation is buttery smooth, and I think it's giving me a slight wink by displaying the ASRock logo during startup, perhaps the brand is detected by the installation software in some way.
This is a first. But, at the same time, pretty much the level of polish that one would expect from the apple of Red Hat/IBM's eye: Fedora is the operating system which anticipates what will become mainstream in the whole world of Linux in the coming years.
Shouldn't I? Watch me go:
After putting Fedora through the usual benchmarking routine, I got the following results:
And whaddayaknow! Some more records were broken. This time, there were some catches, though.
- The CPU frequency scaling was set to powersave mode by default. I had to follow the Fedora documentation to switch it to performance. Performance actually did improve, which is why I added "cpufreq" to the name of this particular result.
- The DX11/DX12 games run noticeably slower (~10 FPS) than on other distros, and that is even after tweaking the CPU frequency scaling.
- One of the games didn't launch.
Fedora shines in native Linux tasks, such as compilation. The CPU performance increase came with a one-degree impact on temperature:
Now, let us set Fedora aside and introduce heavyweight number two. With more than 125K packages to its name, another hipster OS you may never have heard of, called Ubuntu 21.10 Impish Indri; a devilish animal (the Indri), which makes an appearance quite early in the installation process:
Looking Impish indeed, this Indri.
There is a reason why so many people use Ubuntu. The installation process is so simple that it required fewer steps than my latest Windows 10 installation. Nowadays, I consider Ubuntu the de-facto fool-proof OS (including my own foolishness).
First, all the benchmarks worked. The Impish Indri manages to steal (impishly) some records from Fedora:
Here, I just want to highlight how unlikely it was for any setup to break any record whatsoever at this point.
The results in DirectX games are slightly better, though still not the best. I place special emphasis on DX11/DX12 games because, at the end of the day, most playing hours are spent in games that fall into this category.
This time around, we pay no price in temperatures for the extra performance:
Very nice experience overall.
The aftermath
My little trip through Linux distros brought me a bunch of new numbers. My impressions are mostly positive:
- The GPU temperature gains of this fan configuration held stable across all OSes. Unigine Superposition used to be a scary test: the air came out of the GPU receptacle burning hot, the fans ran at full capacity, there was coil whine. Now, it's almost a joke:
- Staying on the topic of temperatures, Fedora 35 sports the lowest temperatures I've seen so far on a BTRFS drive partition:
- In a world where we are told that "every FPS counts", some people spend a few extra hundred dollars on a graphics card for a marginal increase in performance. I thought it would be worth computing the geometric mean of my results in terms of FPS:
The distro that stands out to me is, still, Garuda: the eagle distro doesn't hold many records, but it does hold enough second-bests so that it gets first place on average. Add to that: the extremely easy update, the automated snapshots of the disk, the possibility of rollback if we break something, the GUI approach to configuration where changing a kernel is just a tick box... and I think we have a winner.
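The geometric mean is a reasonable way to aggregate FPS across games, since it dampens the effect of a single outlier result. A minimal sketch of the computation, with made-up placeholder numbers (not my actual benchmark results):

```python
from statistics import geometric_mean

# Hypothetical per-game FPS results, purely for illustration
fps_by_distro = {
    "Garuda":    [62.0, 118.0, 75.0, 90.0],
    "Fedora 35": [55.0, 121.0, 77.0, 92.0],
}

for distro, fps in fps_by_distro.items():
    print(f"{distro}: {geometric_mean(fps):.1f} FPS (geometric mean)")
```

This is why a distro with many second-best results can still come out on top on average.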
Beyond that, the distro choice has only a marginal impact. CPU cooling improvements have a much bigger impact on performance, which hints that CPU power/voltage/frequency tweaking would have a considerable effect.
The GPU cooling improvements didn't move the performance needle, but that doesn't mean we can't try undervolting/overclocking and see what happens.
With holidays coming up, there was some extra time to devote to hobbies, and there were a bunch of improvements for my build that were left pending.
On the other hand, getting other people involved sometimes results in nice surprises, and that was the case with a nice little gift from my aunt: a foam cover (and black, as I originally intended).
For comparison, here is the micro-fiber sheet that I was previously using, to prevent dust from accumulating in the fans:
And here's my aunt's version:
With a nice opening in the back for the cables as well.
Anyway, with such a demonstration of skill, it was only right that we got our stuff together and made the black panels, which, once the exact dimensions were known, should be much easier work than it was before. And it was, but as usual I can't help but show our very sophisticated CAD project:
From here, you know the drill (oh, I see what I did there). The work is much easier than before, only now with very good-looking panels. And here's the compulsory protective-plastic peel, which I had hoped would be more satisfying than it actually was:
Other than that, once opened, this shiny black just looks so much better:
Here is the right and upper fans side. I wonder, is this the point at which I start putting LEDs in my case?
And here are the left and front sides. I shouldn't be the one to say it, but for manual work, man, those holes are pretty well done:
The holes are smaller now. Is the airflow the same? Have I lost my cool? Time for some benchmarks:
GCC compilation produces the maximum CPU temperatures (the CPU is at 65W TDP). The bottom result is with the new cover. It seems to be even 2 degrees cooler (which is quite small, and may just be run-to-run variance).
Unigine Superposition produces the maximum GPU power consumption and temperatures. The bottom result is again with the new cover.
Again, we get the same or better temps; nothing was lost, and the OS updates probably brought some performance improvements, which would explain both the better results and the same or slightly lower temperatures.
And finally, here's the Shiny build finished with DualSense and Shine for scale:
Victory is sung, joy is had, build is finished (...?)
Excellent question, and basically yes. But in the spirit of tinkering, and of this forum, things both which I love, I will provide a completely unnecessarily long and documented answer.
With the drawings shown above, we mark where the patterns will be on the aluminum. We will just drill two holes at the beginning and the end of each section, and then connect the holes.
There are, of course, specialized drill bits with some kind of top that can be attached to a guide, so that the drill can move along a perfectly straight line. We were at the beach, so we had neither the equipment nor the time for any of that.
For joining each pair of holes, we use one of those really small saws that one would normally use for wood, but since aluminum is such a nice material to work with, it can also be used here:
These are the patterns, once cut:
A little bit of extra sanding, so that it looks more rounded:
Exactly! I wasn't actually sure that the pattern would be visible, so it's good to get confirmation. The truth is that there were good intentions to make it more visible using paint, but it doesn't quite show. Like I said, maybe this is the edge case in which I finally use LEDs for something.
For the name to stand out, we thought that a contrast between bigger and smaller holes would be enough. We cut the bigger holes first. Here the name is pretty obvious:
Then, the rest of the holes, way smaller:
In addition, we are going to try to paint (with varying degrees of success) the inner part of the smaller holes. The idea is to mask the big ones from the paint:
Haha yeah, to be fair I should have specified: "the two Linux-native tests, out of the few that I have tried, that produced the maximum temperatures in the CPU (a 90.7ºC sensor reading seems to be as high as it ever gets) and the GPU (240W seems to be the maximum possible power consumption), respectively."
Because the hardware stays the same and I am not overclocking, I am mainly trying to measure the cooling improvement from building a new case (especially GPU cooling), and those two tests are enough for that.
For joining each pair of holes, we use one of those really small saws that one would normally use for wood, but since aluminum is such a nice material to work with, it can also be used here
I wouldn't have bet that such a tool could deal with aluminium, but the results show your team of experts was right about that! Hopefully you did not lose a finger in the process!
I wonder now if the said tool can still work with wood or any other material after what you made it endure!
Yes, this has been my nickname through the years, Baron Heinrich Zemo (just kidding). The ID "BHZ" is usually taken, so I tend to use a derivative like BHZet or BHZeto. Pretty meaningless to the casual onlooker, but some of my old friends have laughed out loud after recognizing the pattern in the case.