Log Monochrome 2: My custom fanless 7.5 L Strix Halo system (completed)

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
I had a look into it. I don't think one can separately adjust the voltage of the iGPU. That's why I then had a look at the exposed sysfs files for the CPU. I could watch frequency and power draw in real time, but it did not show voltage either (giving 0 V readings), nor did I see any voltage-related files. Maybe one can indeed modify the voltage, but things are often more locked down with APUs. I think I'll leave it be. I do have my "Turbo Boost" button in the system tray, if I am ever really desperate for an 8% multicore performance boost ;)
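For anyone wanting to poke at the same files: the kind of probing I did can be sketched like this. The hwmon directory index varies per system, and the file names (in0_input for millivolts, power1_average for microwatts) are just the generic hwmon conventions; whether the APU exposes them at all is exactly what is in question here.

```shell
# Minimal sketch: probe a hwmon directory for voltage and power readings.
# The hwmon index varies per boot; search /sys/class/hwmon/hwmon* for the
# one whose "name" file says amdgpu.
probe_hwmon() {
    dir="$1"
    if [ -r "$dir/in0_input" ]; then
        # hwmon voltage inputs are reported in millivolts
        echo "voltage: $(cat "$dir/in0_input") mV"
    else
        echo "no voltage reading exposed"
    fi
    if [ -r "$dir/power1_average" ]; then
        # hwmon power averages are reported in microwatts
        echo "power: $(($(cat "$dir/power1_average") / 1000000)) W"
    fi
}

# Usage: probe_hwmon /sys/class/hwmon/hwmon3
```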


On another note, I have started looking into some easy LLM stuff. LM Studio works fine in general, even though it refuses to work with ROCm on Strix Halo, mine at least. But GPU support via Vulkan works, so it is not such a big deal. What I had trouble with were models larger than 32 GB, even though I have set the GTT to anywhere between 44 and 58 GB (those values and everything in between led to similar behaviour). It insists on going to swap and then moves only glacially, taking ages to even load. Not sure if some settings are wrong or what's up there. However, the 28 GB model Qwen3 32B 6-bit works quite nicely, and I also think its output is of useful quality, at least when asked for Linux-related problem solving or advice; it gives at least some templates one can work with and ideas to investigate further. It runs at 7-8 tokens/s, which I find perfectly serviceable. I'll see if I can figure out working with Docker and getting Ollama to run on ROCm. It is all completely new to me, so still quite confusing.
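For reference, enlarging the GTT is done via kernel parameters. The names and numbers below are what I have seen suggested for Strix Halo and should be treated as assumptions to verify against the kernel documentation (gttsize is in MiB, pages_limit in 4 KiB pages, so the two values have to agree with each other):

```
# Kernel command line additions for a ~56 GB GTT (example values):
amdgpu.gttsize=57344 ttm.pages_limit=14680064
```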
 
Last edited:

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
Some more benchmarks, which I find quite interesting.

I add Shadow of the Tomb Raider benchmarks, comparing Balanced mode and Performance mode at 1080p and 1440p.

Shadow of the Tomb Raider, in-built game benchmark, highest preset, no FSR:

                               Balanced (115W/100W)   Performance (140W/120W)
1080p (TAA)       Average FPS  145                    147
                  Min FPS      109                    111
1080p (SMAA T2x)  Average FPS  142
                  Min FPS      108
1440p (TAA)       Average FPS  94                     95
                  Min FPS      76                     78

There was a 1-3% increase in FPS, both average and minimum, at both resolutions. In other words, 1-2 FPS more.
Without knowing the actual power draw of the 8060S, this looks to me as if it can't make full use of the 120-140W power limit with its 2900 MHz max frequency. This is very much in line with the result from Cyberpunk 2077, where there was also barely any difference between power levels.

So all in all, I don't see how I would be missing out on a lot by staying in Balanced mode, unless I really want max multicore compute power. On the other hand, maybe there is substantial overclocking potential if one dares to push the max frequency higher than default. I am not sure what limits are possible and what would be dangerous for the APU. That isn't a question for my system anyway.


A short note on the Tom's Hardware benchmarks. They also tested Cyberpunk 2077 and Shadow of the Tomb Raider. The former was tested at RT Ultra, presumably without FSR (that is enabled by default in the preset). They got 22 FPS, in Windows. I repeated the benchmark at the same settings without FSR, but in Tumbleweed, and got 17 FPS average. This is in line with reports about a problematic RT implementation in the Linux drivers that can eat up varying degrees of RT performance. Needless to say, this is no great loss for the Strix Halo; it is simply not built for RT.

What confuses me, however, are their Tomb Raider results (highest settings, 1080p) in Windows: 87 FPS, I assume average. In Linux I got 145 FPS average at these settings in Balanced mode, and even the min FPS were substantially above 87. Maybe there is a mix-up and those are results for 1440p? That would be much closer. Alternatively, maybe Linux is simply that much better in this game on Strix Halo. I doubt it is because of my passive build; at heart it's also just a Framework Desktop.
 

irads

What's an ITX?
New User
Aug 27, 2025
1
2
Indeed, I found two reviews from Phoronix (not much else though). One did not give me the info I was looking for, but the other really showed a long full-CPU-load run that appeared to fully saturate the heat sink.



It does appear that a 60-140W load (averaging a bit above 100W) did not push the sustained CPU temperature beyond 80°C. There were short bursts to Tjmax, but those coincided with power bursts beyond 140W. That means a single side of the Airtop3 heat sink can manage the CPU load. I am fairly confident that my heat sink should have at least an equal heat dissipation capacity and then some. Heat pipe capacity is a question, but theoretically it should be sufficient in my case.

Overall I find that a good sign, and I also figured out an easy, fairly safe way to limit peak power draw of the CPU: reducing the preferred max frequency via the cpupower command, which can be made persistent via cpupower.service. The nice thing about that should be that it really only cuts back the CPU a bit without affecting the GPU power budget, in case it is harder to dissipate heat from the tiny CCD than from the fairly large iGPU.
Nice project.
To get lower temperatures of the Strix Halo at 140W, you may want to optimize the TIM between die and heat plate. If you reduce it to under 0.1 K/W, the CPU should stay under 90°C.
At the time, we paid a lot of attention to the TIM during Airtop development, which proved critical for getting it to work.
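The arithmetic behind that figure, for anyone following along: the temperature drop across the TIM is roughly power times thermal resistance, so at 140 W a 0.1 K/W interface adds about 14 K on top of the heat plate temperature.

```shell
# Delta-T across the TIM: P * R_th (140 W through a 0.1 K/W interface)
awk 'BEGIN { printf "%.0f K\n", 140 * 0.1 }'   # prints "14 K"
```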

Just my $0.02.

Best regards,
Irad
 

theoldwizard1

Average Stuffer
Sep 10, 2018
58
7
I am almost "crazy" enough to build something like this for myself! I would probably use a Strix Point motherboard so that I could use a 150W picoPSU and maybe a Noctua NH-L9a.
 

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
To get lower temperatures of the Strix Halo at 140W, you may want to optimize the TIM between die and heat plate. If you reduce it to under 0.1 K/W, the CPU should stay under 90°C.

Thanks. I agree that there is likely room for further optimisation regarding the die/heat plate TIM. However, this thing is a nightmare to disassemble, and I have done so already and feel no urge to do it again right now ;)
The main issue I see is that I might be able to bring down APU temps but not the heat sink temp. I feel comfortable with up to 65°C sustained under full load. That's what I get at a sustained 100W. Add to that the pretty small performance gains I see even at 140W.
 
  • Like
Reactions: Soul_Est

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124

Phoronix has investigated the very question I was looking at just days ago, but of course in a much more extensive fashion regarding benchmarking. Their results appear to confirm what I was seeing. The rendering and game-relevant performance at "Performance" is almost the same as for "Balanced" mode. The GPU cannot get beyond its power limit even at Balanced, so it has little to gain from a higher power limit as long as the CPU is not pushed very hard. However, in mixed loads, when the CPU is getting really busy as well, it might start eating into the iGPU power budget, and then up to around 25% higher results are possible for "Performance". I am not sure which games would fall into this category, probably something like Cities: Skylines 2: games with 3D graphics and extensive simulation. But something like Cyberpunk 2077 does not appear to benefit much at all.



A note on undervolting
I was mistaken about the undervolting potential of the Strix Halo. Today I learned that ryzenadj can in fact undervolt the 395's CPU, but that function is not even documented in the Git readme, only in the version notes. According to those, the GPU indeed has voltage settings separate from the CPU. Unlike for its predecessors, though, GPU undervolting is not implemented yet. So this means it is probably not worth it for me. In case they implement GPU undervolting, I'd be interested in trying it out though.
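For the record, the option in question is ryzenadj's --set-coall. My understanding from the version notes (an assumption, so verify before applying to real hardware) is that it takes a raw 20-bit two's-complement curve-optimizer count, not millivolts, so a negative offset has to be encoded:

```shell
# Encode a negative curve-optimizer offset for ryzenadj --set-coall.
# Assumption: the option takes a 20-bit two's-complement count (not mV),
# per my reading of the ryzenadj version notes. Verify before applying.
offset=20   # desired negative offset, in counts
printf 'ryzenadj --set-coall=%d\n' $((0x100000 - offset))   # prints "ryzenadj --set-coall=1048556"
```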
 
Last edited:

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
After a short while of usage I have to say, I am really happy with how everything turned out. The only regret I have is that I cut it 500 EUR short and went for 64GB instead of 128GB. Now, I did not expect that I would come to that conclusion this fast ;)

My wrong assumption was that the Strix Halo is too slow for larger LLMs. But with the new mixture-of-experts models, this is not true at all anymore. You can make good use of 128GB unified memory for nice MoE models that fill 60-80GB, with the rest left for context. Anyhow, I made a choice and it has to suffice, and indeed I am happy to report that I even got a 60GB model like GPT-OSS-120B to run, one of the brand-new, much-talked-about models of the MoE kind. It does basically exile my system to swap in the process (and surprisingly, it stays fairly usable, as long as you don't start any other heavy tasks and don't mind 1-2 s of lag when switching windows).

So far my favourite models are Qwen3-32B (which feels pretty perfect for the 64GB Strix Halo in terms of memory demands and is serviceable at 6 tokens/s) and, well, GPT-OSS-120B. The latter is blazingly fast at 50 tokens/s, but with memory at the limit it just manages one long answer, maybe with luck 1-2 follow-up answers, before it fails.
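The memory demands line up with simple back-of-envelope arithmetic. The parameter counts and quantisation widths here are my assumptions (roughly 32B parameters at 6 bits, and roughly 117B at about 4.25 bits for the MXFP4-quantised weights); real files add overhead for embeddings and context:

```shell
# Rough weight-file size: parameters * bits-per-weight / 8 bits, in GB
awk 'BEGIN { printf "%.0f GB\n", 32e9  * 6.00 / 8 / 1e9 }'   # ~Qwen3-32B at 6-bit
awk 'BEGIN { printf "%.0f GB\n", 117e9 * 4.25 / 8 / 1e9 }'   # ~GPT-OSS-120B
```

That also hints at why the MoE is so much faster despite being larger: per token, a dense 32B model has to stream all of its weights through memory, while the MoE only touches its few billion active parameters.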


But enough LLM talk. Here are some images of my final setup. It probably is not your average gamer desktop.



 
  • Like
Reactions: NinoPecorino

protocolsix

Caliper Novice
Dec 24, 2022
29
15
Added aluminium tape and then Kapton tape to the interior of the side cover to improve the incomplete shielding (the mesh is ASA; I can't be bothered to figure it out in metal).

Amazing build, thank you for the detailed writeup! Out of curiosity, if you didn't add this shielding would you expect problems with EMI? Not something I often see with DIY builds (3d printing, carbon fiber, etc).
 

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
Amazing build, thank you for the detailed writeup! Out of curiosity, if you didn't add this shielding would you expect problems with EMI? Not something I often see with DIY builds (3d printing, carbon fiber, etc).

I have the system standing right next to PC speakers; that's why I wanted to have it. But I don't think it is strictly necessary to have it shielded. I had the board without the side cover, just a few cm further away from the speakers, and could not see any issues with it.

If I am not mistaken, commercial PCs require shielding, or at least commonly have it (even though I am not sure all Chinese mini PCs do). My reasoning was that I really did not want any ASA up close to the board without some non-static Kapton tape in between. I already had aluminium tape, so why not do it a bit more properly and add shielding on the cover behind the board too. I looked up whether partial shielding even does something or is a complete waste, and it appears that if only the top and bottom are open, it should still reduce EMI, albeit less.

I assume my choice of material isn't quite your standard DIY printed-case thing either. I printed everything in UL94 V-0 ASA (not certified, but it behaves like it: it burns if assisted, but does not drip material and stops a few seconds after losing the external heat source). I mean, PLA or PETG wouldn't cut it anyway, given that the heat sink already reached 74°C and the base has to support around 5 kg of weight.
 

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
Two developments which I cannot quite make sense of. On the one hand, thermal performance appears to have substantially degraded over time. I am now approaching 90°C under full multicore load within 10-15 min at Balanced, and in Cyberpunk I am now at Tjmax, though apparently not throttling by much.

I tightened the mount a bit, but without a positive effect. I really do not want to dismantle the whole thing again, and even then it would only make sense with some PTM sheet at hand. I think I'll just get ryzenadj, set a thermal limit of 90°C or so, and let it throttle as needed.

The other thing I see is that I suddenly get considerably higher FPS in Cyberpunk with the Ultra preset (FSR Balanced): around 10% more at both 1080p and 1440p. Other than system updates, the only thing that has changed that I can think of is that the GTT is now fixed at a pretty high level (almost all of the memory, actually).

Cyberpunk (Ultra Preset with FSR Quality)
1080p: Av. 106 FPS, Min 85 FPS
1440p: Av. 68 FPS, Min 57 FPS

This is now almost exactly what notebookcheck measured for the Z2 Mini G1a which runs at highest power settings... on Windows I presume.



I also finally got around to measuring real power draw at the outlet. It is pretty much where I expected it to be:


                                          Power Save   Balanced   Performance
Idle                                      10 W         11 W       11 W
YouTube fullscreen                        23 W         25 W       33 W
s-tui stress test, boost (multicore)      68 W         154 W      186 W*
s-tui stress test, sustained (multicore)  68 W         135 W      n.d.

*possibly thermally throttled


EDIT:
After some trial and error getting a ryzenadj.service working, I have now hard-set the max Tctl temperature to 88°C, something I am still comfortable with. Crazy as things are, I got identical 1440p results, even with a bit of throttling. Now that I have ryzenadj up and running, maybe I'll try undervolting at least the CPU... but not today.
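A minimal sketch of what such a ryzenadj.service can look like, assuming ryzenadj lives in /usr/bin (the unit name and layout here are illustrative; --tctl-temp is a documented ryzenadj option):

```
# /etc/systemd/system/ryzenadj.service (illustrative sketch)
[Unit]
Description=Set APU thermal limit via ryzenadj

[Service]
Type=oneshot
ExecStart=/usr/bin/ryzenadj --tctl-temp=88

[Install]
WantedBy=multi-user.target
```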
 
Last edited:
  • Like
Reactions: NinoPecorino

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
Undervolting the CPU worked, even though I am not clever enough to find a way to see the actual core voltages on my Strix Halo system.
Quick testing showed that with -40mV I can recover the losses from throttling at the Balanced power level. -50mV led to a funky crash of the desktop environment, in which my desktop was replaced bit by bit with empty black cubes on my monitor and then with artifacts, when I stressed the CPU with Geekbench 6.

As expected the undervolting option for the GPU was locked for Strix Halo.
 

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
Shedding some light ... or rather heat.

On longer gaming loads it does get nice and toasty. I think after a few hours it can indeed reach up to barely 70°C at the hottest heat sink parts (around 65°C at the cooler ones). I guess this will grill the capacitors sooner or later. Under other loads the system does not get nearly as hot, so it is really just gaming that is probably counting down their lifespan. But hey, let's see how long they last.

The PSU, on the other hand, is staying somewhere between 40 and 50°C on its outside, which I think should be fine.

 

windv

Cable Smoosher
Jun 4, 2023
8
5
Thanks for sharing it all. It's helpful.

I will be starting work on my project too using some vapor chambers. It's going to be fun :)
 
  • Like
Reactions: TheJiral

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
So Notebookcheck.com has published its review of the Framework Desktop, one of the most complete reviews of that machine for sure. They too did not talk much about the power levels though; maybe they did not even realise that it has three easily available, very different power levels. So they appear to have tested it in the default "Balanced" power mode, which is basically what I am doing (minus the modifications mentioned above).

Long story short, they were running Windows from what I can see, but when I compare their Cyberpunk Ultra QHD (no FSR) results with mine, I am intrigued by how similar they are, almost identical: 51-52 FPS.

It is only one game, but I take it as an indication that one can run Strix Halo entirely fanless, on Linux, and get the same performance out of it as when running it without crazy cooling mods, on Windows.

 

TheJiral

Average Stuffer
Original poster
Jun 29, 2025
55
124
Regarding power management: I was told on the Framework forum that in Windows, changing the power management settings to "Performance" does not actually increase the power draw to 140W boost / 120W sustained. This could explain the Notebookcheck.com results, and indeed the results from many other reviewers, which look as if they did not operate the system beyond 100W sustained. Could it really be a Windows thing?

Apparently ryzenadj should still be an option, also in Windows, but I guess that is a lot scarier to use for many than an option in the OS power management. Not that this would be a problem for me, as I have no plans to use Windows on that machine, nor to go to higher power settings either. It is still interesting information.
 
  • Like
Reactions: Soul_Est