msystems' gaming FC5 (Streacom FC5) 6.6L

Lee.III.Will

Caliper Novice
Jun 7, 2017
24
19
Some of the bends look a bit sharp, but the diameter is above 4mm for the most part, so they are still functional.
Too little too late, and I don't know if it would have helped, but when I had to bend my heatpipes by hand (I also couldn't use the pipe bender), I used solder to keep the shape. I picked up some cheap small-diameter solder from a local hobby store (I believe it's meant for stained glass windows) and wound it into a custom-diameter 'bending spring' around the heatpipe. Since the solder is wound tightly around the pipe, it minimizes the horizontal crimping you'd otherwise see. Couple that with a bit of heat and a slow bend and you may see better end cooling capacity. (If you wanted to spend more on another Accelero just to chance it, that is.)

Also, glad to see the HDPlex heatsinks being put to use!
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
Temps? Excellent heatpipe work.


That's some special looking heatpipe noodle ramen you have there. I'm wondering if there is any benefit, because the GPU cooling looked almost done before adding the Arctic Accelero.

Thanks! Great questions... and free motivation to run some tests.

Do the Accelero's crazy bent heatpipes work properly? Honestly, I don't know for sure :D

Here's all the evidence I came up with:

1. The temperature delta between the GPU core and the outermost fin on the Accelero (furthest from the heat path) is 9C, measured by a thermistor throughout passive stress testing. That 9C is comparable to the measurement I recorded at the end of the long heatpipes in my earlier testing. I watched this temperature climb along with the GPU core early in the test, before the entire case shell could heat up, so it's not a false result from ambient heat absorption.

2. Very low tech method here, but... just by touch. I compared how the various heatpipes felt as the core was increasing in temperature when running a stress test. The Accelero heatpipe with the crazy bends was heating up about the same as all the others.

3. The passive performance level is measurably higher. It reaches passive thermal equilibrium at 75C after 1 hour on the .8V (115W) Afterburner preset. [Phuncz: previously, it was not able to reach equilibrium on this .8V setting without the Accelero. It was close, but temps exceeded 80C after roughly 20 minutes and I stopped the test.]

4. Active airflow (about 40 CFM from a 120mm fan) is successfully dropping the temps.... a lot. I started an airflow test after reaching the 75C passive equilibrium: I recorded a 10C drop after 5 minutes, 15C after 10 minutes, 20C after 20 minutes, reaching 55C equilibrium with active cooling. If the Accelero was not working, then I don't think the active cooling would have that much of an effect. I don't have much to compare it to here, except that I've tried running air over the card before, and the resulting temp drops were nowhere near 20C.
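If you want to put a rough number on what those equilibrium points mean, you can back out a case-level thermal resistance. A minimal sketch; the ~25C ambient is my assumption, not something measured in these tests:

```python
# Rough steady-state thermal resistance from the equilibrium data above.
# Assumption (mine, not from the post): ambient temperature ~25 C.
def thermal_resistance(t_equilibrium_c, t_ambient_c, power_w):
    """Thermal resistance in C/W at steady state: delta-T over power."""
    return (t_equilibrium_c - t_ambient_c) / power_w

# Passive: 75 C equilibrium on the 115 W (.8V) preset
r_passive = thermal_resistance(75, 25, 115)   # ~0.43 C/W
# Active (~40 CFM): 55 C equilibrium at the same load
r_active = thermal_resistance(55, 25, 115)    # ~0.26 C/W
print(f"passive: {r_passive:.2f} C/W, active: {r_active:.2f} C/W")
```

The gap between the two numbers is another way of seeing how much work the 120mm fan is doing.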

But wait! There's more. With active airflow, it can run at more than .8V...

It can run 2012MHz @ 1.0V with stable temps in the low 70s!

Oh... but there's just this one thing - all the tests have been with the lid off. :eek:

In its current state, this is the world's first GTX 1080 Ez-bake oven.
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,947
4,952
That's awesome news! You've really achieved something here that will no doubt inspire others to see what's possible with passive cooling!
 

BaK

King of Cable Management
Bronze Supporter
May 17, 2016
967
958
Late to the party once again, but... WOW, impressive!
Congratulations on all the work done and all the detailed information you've shared throughout the thread! :thumb:
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
GPU Airflow Planning

I've been taking a bit of a break on this lately, mostly sidetracked by joining the HWbot team and overclocking the piss out of this rig, which has turned out to be pretty fun. I'll get to that in the next post. But first, you might be wondering what happened when the lid was put on.

If you guessed that the lid trapped the heat and case temps rose, you'd be right, sort of. But something else happens too. The lid is a 2mm-thick piece of precisely machined aluminum, and it got extremely hot (60C+) via conduction from both sides of the chassis.

The net effect is that the lid only traps a little more heat than it helps dissipate through its massive surface area. If the lid were made of a typical chassis panel material, such as steel, plastic, or a thin sheet of stamped aluminum, the result would not be like this at all: the heat would simply be trapped inside.

So it's not so bad, but still needs airflow for best performance.

You might remember the CPU duct from earlier. To make airflow testing easier, I reused that design to create a simple and attractive ABS frame.



I also picked up some fan splitters, so I can run these off of a single header, and an adapter for the miniature GPU fan header.



While these fans weren't great at cooling the GPU directly, it turned out they work PERFECTLY as exit fans.



Even these weak 6 CFM fans have a noticeable effect here. Essentially it just improves the natural convection airflow, which was already designed into the chassis.

I am looking for the best intake fan to use. One idea is to use this:

It's the only quiet blower fan model I could find (under 20dB), but it's a USB fan, and it runs at 4-6V rather than 12V.

The exact voltage had to be determined.



The "High" setting read about 5.1 volts.

Next, I tried to see if I could "create" the same voltage in Speedfan by adjusting the fan speed while measuring the voltage through the CPU fan header pins... not the safest thing :S



So 41% "speed" would match the same voltage for this fan (5.13v).
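For the curious, that 41% lines up reasonably well with a naive linear model of a DC fan header. This is just a sketch assuming a 12V rail and perfectly linear voltage control, which real headers don't quite have; that's exactly why measuring with a multimeter is the safer route:

```python
# Naive linear model of a voltage-controlled (DC) fan header:
# output voltage = duty% x rail voltage. Real headers deviate from
# linear, so treat this as an estimate, not a calibration.
RAIL_V = 12.0

def duty_for_voltage(target_v, rail_v=RAIL_V):
    """Speedfan percent a perfectly linear 12V header would need."""
    return 100.0 * target_v / rail_v

print(duty_for_voltage(5.13))  # ~42.8%, vs the 41% measured in practice
```

The ~2% gap between the model and the measured value is the header's non-linearity.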

It's possible to hook it up to the motherboard header like this, but it seems unsafe. It's very likely that it could be given the wrong voltage by accident; all it would take is one mistake and the fan would be fried (or worse), so I am hesitant to try it.

I have a couple of ideas to still make it work. One is to just use a common "fan speed controller" to dial down the output voltage from Speedfan. Then there would be no risk of damaging the fan.

Another is just to have it run off USB/Molex power and experiment with a thermal switch to control the activation threshold:

This switch reacts only to thermal effects and requires no electricity to operate. The normally-open switch closes at the rated temperature, sending power to the fan. A swivel bracket allows the switch to be mounted to any flat surface.



It can control the fan activation at 50C without relying on the motherboard or software. Pretty neat, but a bit more work to install it.
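The switch's behavior is simple enough to model in a few lines. Here's a software analog of a normally-open thermal switch; note the real part is purely mechanical and needs no code, and the 45C reset point is a hypothetical value I picked to illustrate hysteresis (real switches reset a few degrees below the trip point so the fan doesn't chatter):

```python
# Software model of a normally-open (NO) thermal switch.
# Below the rated temperature the circuit is open (fan unpowered);
# at or above it, the contacts close and the fan gets power.
TRIP_C = 50.0   # rated trip temperature from the product blurb
RESET_C = 45.0  # hypothetical reset temperature (hysteresis)

def switch_state(temp_c, currently_closed):
    """Return True if the contacts are closed (fan powered)."""
    if not currently_closed and temp_c >= TRIP_C:
        return True    # contacts close -> fan powered
    if currently_closed and temp_c <= RESET_C:
        return False   # contacts reopen -> fan off
    return currently_closed  # no change inside the hysteresis band

print(switch_state(55, False))  # True: trips above 50 C
print(switch_state(47, True))   # True: stays closed inside the band
```

The hysteresis band is the whole point of these parts: without it, a fan sitting right at 50C would cycle on and off constantly.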
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
Overclocking (for the hell of it)

Like I said... I got sidetracked by the HWbot thread. I was able to reach a decent 4.6GHz overclock on stock voltage. Is there any point in going higher? It's already questionable whether the case can handle the temps.



I benched the 6700K up to 5050MHz at 1.392V, but it required disabling two cores because of the immense temperature spikes. The chip could instantly spike from idle (36C) to 82C, a 46C delta T.

The GTX 1080 reached 2,126MHz (+32.30%) core / 1,364MHz (+8.99%) memory while maintaining a 4,900MHz (+22.50%) overclock on the 6700K in Unigine Heaven, basic settings. It reached a similar but slightly lower overclock of 2088MHz / 4.8GHz in Heaven Extreme, due to the higher power draw. GPU temps didn't exceed 60C.
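Those percentages are just gains over base clocks. A quick sketch of the math, assuming the reference base clocks (1607MHz core for the GTX 1080 and 4.0GHz for the 6700K; my assumption, the post doesn't state them):

```python
# Overclock gain as a percentage over the assumed base clock.
# Assumed bases: GTX 1080 core 1607 MHz, i7-6700K 4000 MHz.
def oc_percent(base_mhz, oc_mhz):
    """Percentage increase of oc_mhz over base_mhz."""
    return 100.0 * (oc_mhz / base_mhz - 1.0)

print(f"GPU core: +{oc_percent(1607, 2126):.2f}%")  # ~+32.30%
print(f"CPU:      +{oc_percent(4000, 4900):.2f}%")  # +22.50%
```

Both match the figures quoted above, which is a good sanity check on the assumed base clocks.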

It was not easy to stay within the capacity of the HDPlex 300W brick, and it required adjustments to limit maximum power draw. Despite this, the brick could be pushed very hard and supplied up to a 320W average load.

These are surprising results. It's probably going to be impossible to achieve this level of performance in the finished build.

Why stop at lapping the CPU heat spreader...why not go all the way and delid it?

Ah... well then. Ask and ye shall receive.








Trying to remove the IHS glue is maybe the most tedious thing ever. I did not have good results trying to scrape it off and ultimately resorted to using a razor blade. It's a really terrible idea for sure, but it worked.


This is after completing the prep work. These are sitting in a re-lidding tool. The part holding the IHS can be dropped right on to the other part and snapped into place.


Liquid Ultra on the die and silicone adhesive on the IHS.


Clamping it up.

The first results were very bad. Temps were immediately over 50C after loading the BIOS.

This is why:

Only a very tiny bead of TIM made contact with the IHS.
I don't know what the exact problem was: either too much silicone, not enough clamping force, or too thin a layer of TIM. So I removed the silicone, applied more Ultra, and then replaced the IHS without using any sealant.


Above: Observing the Idle->Load temp delta after delidding. (1 core)


Now I'm only seeing up to a 33C delta T from idle to load, an improvement of 13C, depending on the number of cores in use. This is a great improvement and should allow a higher turbo setting for 1-2 core performance. Max TDP will still have to be limited to prevent excessive power draw with all 4 cores in use.
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
Since exit fans seem to help, I continued down that road, and this is what I came up with:



Essentially it's just double the size of the earlier design, to handle four fans. I also made an attempt at cable management with some clips for the cabling, but they still need a little more work.



I'm currently using a 6-to-1 fan splitter. What really needs to happen is for the individual fan cables to be shortened and spliced so there isn't so much extra cable.

It looks like this:



It sits in there just fine, but is going to need some kind of attachment bracket so the fans stay level and secure.

You can get the idea even though they aren't perfectly aligned without a bracket yet.





I put the previously made 2-fan rack on the CPU side, and also installed an Arctic 92mm PWM fan on the GPU fan header.

Not quite how I imagined it when I started, but I think it's going to work. The Noiseblocker fans are quieter than the VRM noise and aren't audible until 2000 RPM.

The 92mm Arctic fan is there as a mockup. I don't know for sure what will ultimately be best for an intake fan, but I think it's time to cut a hole and start testing.
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
Yes, except all testing was done with more fans and the lid off, like above. I did try a bigger SFX PSU last week to see how much performance is lost by using this 300W brick. In truth, though: not very much, if any. There isn't enough cooling to run this at 4.9GHz anyway, so it won't matter.

If I use an external brick though :) the possibilities
 

MarcParis

Spatial Philosopher
Apr 1, 2016
3,672
2,786
Have you tested your OC'd CPU in Intel Burn Test (high stress, 10 cycles)?

How have you validated the stability of your OC?
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
No, in fact it's nowhere near stable at 5GHz. It took disabling cores to get above 4.8GHz, and it began to not matter how much voltage I added; I could see the end was near and stopped at 1.41V at 5050MHz. The memory controller (or something) was causing real instability problems and did not like any higher memory settings at all.

For benchmarking, though, I would just run Cinebench to see if it blows up, and then Prime95 for only a minute or two to see whether it failed instantly. If not, it's good enough to get through a gaming benchmark at least, since those don't stress the CPU nearly as hard.

To get perfectly stable I would need to run MemTest86 and Prime95 (AVX disabled) for an extended period; I don't dare try that on anything above 4.8GHz.

That was all just for fun though; the setting I actually prefer to run is 4.4GHz with an offset voltage of -0.150V, because the temperatures and power usage are excellent that way. All the extra power and cooling headroom is needed for the 1080.
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
So it's been a while. Picking this project back up, the next step was to cut the hole for the GPU intake fan.

After some deliberation I decided against crudely cutting the hole myself with a handheld drill and hole saw. Too much risk of damaging the beautiful finish or warping the panel. Instead of risking ruining it, it made sense to have it professionally CNC cut.

I'm very glad I did...


Money well spent. There is just no way I could have done this myself and achieved this level of accuracy and quality.


Now for the fans. Option #1 is a 92mm axial fan, an Arctic F9, mocked up here:







Option #2 is the AC Infinity 120mm radial blower fan:







Using the 120mm blower fan will require trimming part of the heatsink, because there is only about 100mm of room, so I re-assembled the build to test the performance of the axial fan first.


On to the performance tests!


Test Configuration

For the performance testing below, I evaluated the maximum performance at different voltage levels and the fan level needed to keep temperatures in check. If the GPU reached 80C or the CPU reached 75C during a test, it was considered a failure. Here is the breakdown of what each level means, in terms of fan RPM and noise:

"Silent" < ~20db (Estimate)
50mm GPU Exit Fans x4 (CHA header, Speedfan controlled): up to 40% speed, <2000 rpm
50mm CPU Exit Fans x2 (CPU header, Speedfan controlled): up to 40% speed, <2000 rpm
92mm Axial Intake Fan x1 (GPU header, Afterburner fan curve): up to 30% speed, <1000 rpm

"Audible" < ~30db (Estimate)
50mm GPU Exit Fans x4 (CHA header, Speedfan controlled): up to 60% speed, <3000 rpm
50mm CPU Exit Fans x2 (CPU header, Speedfan controlled): up to 60% speed, <3000 rpm
92mm Axial Intake Fan x1 (GPU header, Afterburner fan curve): up to 40% speed, <1300 rpm

"Max" > ~30db (Estimate)
50mm GPU Exit Fans x4 (CHA header, Speedfan controlled): up to 80% speed, >3000 rpm
50mm CPU Exit Fans x2 (CPU header, Speedfan controlled): up to 80% speed, >3000 rpm
92mm Axial Intake x1 (GPU header, Afterburner fan curve): up to 100% speed, >1900 rpm
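To make the setup explicit, the whole test configuration boils down to a small table of fan caps plus one pass/fail rule. A sketch of that logic, with values copied from the level definitions above:

```python
# Fan profiles from the test configuration. Percentages are the
# speed caps on the exit fans and the 92mm intake fan respectively.
PROFILES = {
    "Silent":  {"exit_pct": 40, "intake_pct": 30},   # < ~20 dB
    "Audible": {"exit_pct": 60, "intake_pct": 40},   # < ~30 dB
    "Max":     {"exit_pct": 80, "intake_pct": 100},  # > ~30 dB
}

# Pass/fail rule: a run fails if the GPU reaches 80 C or the CPU 75 C.
GPU_LIMIT_C = 80
CPU_LIMIT_C = 75

def run_passes(max_gpu_c, max_cpu_c):
    """True if the run stayed under both temperature limits."""
    return max_gpu_c < GPU_LIMIT_C and max_cpu_c < CPU_LIMIT_C

print(run_passes(76, 70))  # True: under both limits
print(run_passes(80, 70))  # False: GPU hit the 80 C limit
```

The testing procedure is then just: run the workload at each profile from quietest to loudest until one passes.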

Test results

Unigine Heaven Extreme @ 800mV (~1800MHz)
Silent: Pass
Audible: N/A
Max: N/A
Conclusion: Silent

Unigine Heaven Extreme @ 900mV (~1900MHz)
Silent: Fail
Audible: Pass
Max: N/A
Conclusion: Audible

Unigine Heaven Extreme @ 1000mV (~2025MHz)
Silent: Fail
Audible: Fail
Max: Pass
Conclusion: Max fans

Witcher 3, Ultra settings @ 1000mV (~2025MHz)
Silent: Fail
Audible: Pass
Max: N/A
Conclusion: Audible

Stalker: Call of Pripyat w/graphics mods, Ultra quality @ 1000mV (~2025MHz)
Silent: Fail
Audible: Pass
Max: N/A
Conclusion: Audible


Summary

With the addition of the GPU intake fan, the rig is able to dissipate enough heat to achieve a silent gaming system at a decent level of boost. It remains to be seen if it will be possible to maintain maximum boost in all games while still having a silent profile. This won't be known until the blower fan is tested in my next post, because I estimate that the blower fan will be much more effective than the axial fan. Additionally I plan to add ducts to channel the airflow for further improvement.

In the two games I tested, real gaming performance turned out to be less demanding than the Unigine synthetic stress test. Unigine just hammers the GPU load and power limit to maximum the entire time. In real games such as The Witcher, the engine is apparently limited to 60FPS, and I observed the GPU getting some "rest," so to speak, in less demanding scenes, with the voltage dropping down into the .8V range when not needed. This reduced the fan requirements quite a bit and means this game can be run at high boost speeds.

Stalker COP does not have an FPS limiter like Witcher 3, and voltage was pinned at 1.0V the entire time. Still, it appeared to generate less heat than Unigine, and boost speeds could be maintained fairly high, if not quite as high as in The Witcher.

So next, I will try out the blower fan and see if performance is improved further.
 

MarcParis

Spatial Philosopher
Apr 1, 2016
3,672
2,786
The final look of the cut made it worth having it done by a pro :)

In fact, since Unigine is a benchmark, it's not limited by vertical sync... the FPS goes as high as the CPU allows. In games, you should check whether vsync was on.

At what resolution did you run your benchmarks?

PS: monitor the power usage of your GPU using HWiNFO... it's the best value for tracking GPU utilization.
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
Thanks Marc, I'll add power usage to my overlay. These tests were at 1080p.

Yeah, I don't know why the Witcher settings include a forced slider bar, called "frame limit" I think, which can only go up to 60.
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
GPU Cooling update- Radial Fan

I went to swap out the axial fan and get the 120mm blower fan into the case. It takes up quite a bit more room, so about 15mm had to be permanently trimmed off the Accelero heatsink fins, along with a bit of the blower fan's exit duct.





The blower fan sits flush with the heatsink fins and it gets plenty of air in there. It still needs a vertical duct/wall on either side of the card to channel the airflow all the way to the end though.



At the moment the blower fan is manually controlled via that switch behind the case, since it is powered off of USB and I don't have a good solution for controlling it yet. On the "Low" setting it's audible but not unreasonable.

GPU temps are well handled now, so the challenge is really still just the overall heat factor as the entire case turns into an EZ-bake oven. CPU temps creep up and start hitting the high 70s after a while during gaming. This is not dangerous, but I plan to redo the heatpipes on the CPU side with longer pipes that reach the end of the case, just like on the GPU side. It should lower the thermal resistance a little more. Also, the exhaust fans need a proper support/attachment system; they are just sort of hanging there and aren't working as efficiently as they would if they were flush against the lid.
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
804
1,405
That looks awesome, super clean build and clean wiring. I would like to see what you did to the GPU^^

Time for an update

I have been using this rig as my daily machine, but I still have not finished the few remaining things. One issue I've noticed is that the HDPlex AC-DC converter eventually overheats under heavy load after an hour or so (when supplying over 250 watts). It's hard to manage the waste heat this AC-DC unit gives off with the lid on. I'm of the opinion that the HDPlex needs active cooling at peak load. So I just leave the lid off the case for heavy gaming, and that cures the issue. (It's funny to me that this is what ended up being the point of failure.)

If I could do it again, I would use an external DC brick. I also probably wouldn't bother with the extra heatsink over the CPU; I have some doubts about whether it's even a net benefit to the overall thermal design. In the original design I planned to have ducted airflow on the CPU side, but since that ended up not being necessary, the heatsink there is just accumulating heat inside the case. I'm very happy with the way the GPU side came out, and the effectiveness of the thermal transfer to the chassis is amazing.

At some point if I do finish the last few things here, I can start up a new project. 1080TI in a FC10? Why not!
 