Log: DeskMini X300 "Turbine", a 12VDC brickless vehicle build

HydrAxx747

SFF Lingo Aficionado
Feb 23, 2021
97
116
I found that 2000MHz FCLK with good timings and the 2200MHz GPU at 1.15V SoC was a good compromise. The Corsair B-die kit here could probably keep going, but vDIMM is limited to 1.35V.



2000MHz CL17 is about as fast as 2100MHz CL18, but more stable for the FCLK and GPU.

Nice tweaking of your Corsair Vengeance 3800MHz CL18 kit. I have essentially the same type of kit as you, the model above it: the Corsair Vengeance 32GB (4x8GB) 4000MHz CL19 kit, of which I use only 2 modules, like you (it's a shame we don't have 4 DDR4 SODIMM slots on our ASRock X300M-STX motherboards; we'd have 32GB of high-binned Samsung B-die 😊). I wanted to know: did you try to push the frequency higher than 4000MHz CL17 and 4200MHz CL18, for example to 4400MHz or even more? And if so, do you have any screenshots of your timing settings in ZenTimings? I'm in the process of completely rebuilding my DeskMini X300-based configuration in another case (an Akasa Cypher ST to which I've made some modifications), and I've gone from a Ryzen 7 PRO 4750G to a Ryzen 7 5700G, because past 4000MHz on DDR4 the 4750G's Vega 8 iGPU sees its VRAM speed stuck at 2000MHz, which is not the case on the 5700G.

One more thing: I also picked up a Crucial Ballistix 32GB (2x16GB) 3200MHz CL16-18-18-36-1T kit for overclocking and latency-tweaking tests, because these modules are apparently equipped with Micron E-die chips, which are reputed to reach at least 4000MHz at CL18-20-20 fairly easily; I'll see whether they can really get there, or even push further. If they can, it would finally let me go from 16GB to 32GB of total RAM. Otherwise, I saw rumors on a Japanese forum about OEM/generic SK Hynix 2666MHz 32GB (2x16GB) kits with SK Hynix CJR or DJR chips that supposedly run at 4000MHz or more with latencies between CL18-21-21 and CL19-22-22, under 1.35V of course.
 
Reactions: msystems

msystems

King of Cable Management
Original poster
Apr 28, 2017
784
1,370
Did you try to push higher in frequency than 4000MHz CL17 and 4200MHz CL18? For example 4400MHz or even more?
I was able to boot at 4400MHz, but when I tried to bring the Infinity Fabric up to 2200MHz (using Ryzen Master; the ASRock BIOS won't set above 2000 no matter what it says) it crashed badly and I had to reset the CMOS, so I wasn't sure it was worth pursuing over 4200 or 4000.

From what I read, Hynix is almost as good (it can hit high frequencies, with worse timings) but might take a bit of luck, because it's likely 2666-binned 16GB SODIMMs and they aren't paired. But they are cheap, around $50 per DIMM, so it's worth a shot.

I am looking into how to voltmod the board to get 1.5V on the memory, which in theory could make 4000MHz CL14 possible.
 
Reactions: HydrAxx747

HydrAxx747

SFF Lingo Aficionado
Feb 23, 2021
97
116
I am looking into how to voltmod the board to get 1.5V on the memory, which in theory could make 4000MHz CL14 possible.
Someone tried to do it in software by modding BIOS P1.70 of the ASRock X300M-STX motherboard, changing the hexadecimal value at the offset linked to the "1.35V" DRAM voltage setting, but apparently it does not work; the DRAM voltage values may be locked in hardware.

Source: see post #75 on page 5 and post #84 on page 6 of this thread I found on an outside forum:


and


So it would take a hardware mod, I think, but I don't know enough about it to attempt that ;-)
 
Reactions: msystems

msystems

King of Cable Management
Original poster
Apr 28, 2017
784
1,370
Nice find; looks like it has to be hardware modded. I need to locate the DRAM voltage regulator. I have some pictures of the board that I found on the internet, but the text on the chips isn't very legible. I need to take some high-res photos and then try to ID them.
 
Reactions: HydrAxx747

SFFMunkee

King of Cable Management
Jul 7, 2021
673
681
Nice find; looks like it has to be hardware modded. I need to locate the DRAM voltage regulator. I have some pictures of the board that I found on the internet, but the text on the chips isn't very legible. I need to take some high-res photos and then try to ID them.
ASRock are sometimes responsive to these sorts of questions too, I've seen them put up beta BIOS versions for similar things. Worth asking their support team to see if they can get you in contact with an engineer anyway! They might be able to solve it with a beta BIOS, or point you in the direction of hardware mods :)
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
784
1,370
Did a little electrical prep work for a potential eGPU; still in the early stages.

It could be powered either by an externally regulated 12V DC-DC converter such as a Victron Orion, or by a "wide-input" DC-ATX unit internally, such as the HDPlex 400 DC-ATX, which requires an external 19V step-up. There are a few 12V DC-ATX units out there, but the problem is they aren't wide-input or don't have a strong 12V rail.

12VDC eGPU (Victron Orion, 30A @ 12V):
-Smaller eGPU: 12V plugs right in, no DC conversion done inside
-Built-in switch, relay, and on/off via Bluetooth phone app
-Can automatically cut power if battery voltage drops too low
-Electrically isolated
-Slightly less ripple (2mV vs 10mV)
-Reusable for many other things (it is also a charger)

19VDC eGPU (19V 20A step-up + HDPlex 400 DC-ATX, 35A @ 12V):
-Cheaper (half the cost)
-Uses lighter-gauge wiring, less voltage drop
-Less risk of damaging the GPU with incorrect voltage, since the HDPlex will filter it
-Not any less efficient (97% × 95% vs. 88% for the Victron)
-Can use the 19V output for the X300 or a 19V laptop too
-Uses a brick when on mains (less clutter than a PSU?)
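A quick sanity check of the efficiency claim in the comparison above: the 19V path has two conversion stages (step-up, then DC-ATX), while the Victron is a single stage. The percentages are the ones quoted in the list; this is just the arithmetic.

```python
# Sanity check of the efficiency comparison above: two chained stages
# multiply, so the 19V path is 0.97 * 0.95, not an addition.
step_up_eff = 0.97   # claimed efficiency of the 19V boost converter
dc_atx_eff = 0.95    # claimed efficiency of the HDPlex 400 DC-ATX
victron_eff = 0.88   # claimed efficiency of the Victron Orion

chained_eff = step_up_eff * dc_atx_eff
print(f"19V path: {chained_eff:.1%} vs Victron: {victron_eff:.1%}")
# Two stages chained still land around 92%, ahead of the Victron's 88%.
```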

Since I already had an HDPlex 400 DC-ATX, I got the 19V converter. It came in a plain white box with no label, and it definitely has that distinctive no-frills industrial look. These are about $25 on AliExpress.



Test setup using the X300 as a 19V load tester



With a 120-watt load in benchmarks for about 15 minutes, the converter's temperature did not seem to increase at all. Comparing input and output amperages with a clamp meter, I estimate it was only wasting a few watts, so the claimed 97% efficiency seems plausible. This should work with the HDPlex.
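The clamp-meter estimate works out like this. The readings below are hypothetical placeholders for a ~120W load, not the actual measurements from the post; the method is just power in vs. power out.

```python
def converter_efficiency(v_in, i_in, v_out, i_out):
    """Estimate converter efficiency from voltages and clamp-measured currents."""
    p_in = v_in * i_in
    p_out = v_out * i_out
    return p_out / p_in, p_in - p_out  # (efficiency, watts lost as heat)

# Hypothetical example readings (NOT the post's actual numbers):
eff, loss = converter_efficiency(v_in=13.2, i_in=9.4, v_out=19.0, i_out=6.3)
print(f"efficiency {eff:.1%}, about {loss:.1f} W wasted as heat")
```

With numbers in that ballpark, "only wasting a few watts" is consistent with a mid-90s efficiency figure.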

I am not sure if it really matters, but I could split the 19V output to continue powering the X300 this way. It might add a bit of protection from spikes and noise on the DC system, and it puts less amperage on the DC barrel.
 
Last edited:

SFFMunkee

King of Cable Management
Jul 7, 2021
673
681
Since I already had an HDPlex 400 DC-ATX, I got the 19V converter. These are about $25 on AliExpress.

With a 120-watt load in benchmarks for about 15 minutes, the converter's temperature did not seem to increase at all. Comparing input and output amperages with a clamp meter, I estimate it was only wasting a few watts, so the claimed 97% efficiency seems plausible.
Is that step-up designed for a slightly variable input, or does it expect a regulated 12V? If it's intended to work with a wobbly input then it should do well, but if not I'd worry about its longevity.
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
784
1,370
12-19V input according to the product page, although the label on the device only says 12V.

It doesn't really look built to last, but if it fails catastrophically, I have it fused, and there's some comfort in knowing the HDPlex is standing between it and the GPU.
 

msystems

King of Cable Management
Original poster
Apr 28, 2017
784
1,370
A while back I read a story about crypto miners being used to heat apartment buildings so I decided to try that concept, but on a micro-scale in the van camper.

In the previous post I got the 19V boost converter working, with the intention of using it with the HDPlex DC-ATX. Then I made an eGPU assembly from parts I had lying around: the DeskMini mounted onto a passively cooled GTX 1080, with one 120mm fan for airflow, so it's basically silent at night. The HDPlex has its own latching anti-vandal button to power it up independently.



With the 19V wiring in place, transitioning the eGPU system from mains power over to DC was really simple. All it involved was replacing the HDPlex's PCIe input connector with XT-90 plugs, which have a built-in resistor to precharge the HDPlex's capacitor bank when it is connected to the live DC system.



With everything hooked up, the system was drawing about 140 watts when mining (the 1080 doing 33MH/s at 112W). 140W × 8 hours = 1.1kWh, and the bank capacity is 2.5kWh, so that's less than a 50% discharge overnight.
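The overnight battery math above can be checked directly, using the figures from the post:

```python
draw_w = 140      # system draw while mining (watts)
hours = 8         # overnight run time
bank_kwh = 2.5    # battery bank capacity (kWh)

used_kwh = draw_w * hours / 1000        # energy consumed overnight
depth_of_discharge = used_kwh / bank_kwh
print(f"{used_kwh:.2f} kWh used, {depth_of_discharge:.0%} of the bank")
# → 1.12 kWh, about 45% of the bank: under the 50% figure quoted.
```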

As for the heating: it did raise the ambient temperature by a few degrees, definitely noticeable, but 140 watts is only about 477 BTU/h. By comparison, a diesel heater is 7000 BTU/h (way overkill for such a small space). So it's only about 7% as powerful as the diesel heater, but it doesn't need fuel and it doesn't make noise. It seems to have some utility now, since it is summer, but I don't expect it to do much in the winter.
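The watts-to-BTU comparison uses the standard conversion of 1 W ≈ 3.412 BTU/h:

```python
W_TO_BTU_H = 3.412                   # 1 watt ≈ 3.412 BTU/h

mining_btu_h = 140 * W_TO_BTU_H      # heat output of the mining rig
diesel_btu_h = 7000                  # diesel heater rating from the post
ratio = mining_btu_h / diesel_btu_h
print(f"{mining_btu_h:.0f} BTU/h, {ratio:.0%} of the diesel heater")
# → 478 BTU/h, roughly 7% of the diesel heater's output.
```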

As for the mining, it makes a very insignificant amount of money overnight, since it's only 33MH/s ethash. There is only electrical headroom for another ~60 watts of TDP, maybe a 3070. Scaling the idea up beyond that would need a larger vehicle with more solar capacity and a massively larger battery bank.


The last thing worth mentioning is that the 5700G iGPU didn't just become useless. It's a bit clunky, but I found the feature that allows setting the GPU at the application level really useful.



I couldn't figure out a way to switch between the GPUs with a keyboard button like a laptop can, but I could manually set which programs use which GPU. After that was done, it worked perfectly: the 1080 could just continue mining while all other rendering passed to the iGPU, even running games on the 5700G at the same time with no noticeable impact.
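For reference, the per-application setting lives in the Windows graphics settings UI (Settings > System > Display > Graphics), which stores one preference per executable in the registry. The fragment below reflects my understanding of where Windows keeps these entries, and the game path is a made-up example, not anything from this build:

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
; GpuPreference=1 → power saving (iGPU), GpuPreference=2 → high performance (dGPU)
"C:\\Games\\example\\game.exe"="GpuPreference=2;"
```

Setting the preference through the Settings UI is the safer route; the registry view just shows why the choice persists per program.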
 
Last edited:

msystems

King of Cable Management
Original poster
Apr 28, 2017
784
1,370
Just updating this since I finally got the chance to upgrade to a 3060 Ti for a decent price, so I have removed the passive GTX 1080 from the equation. I remade the GPU adapter system to attach the 3060 Ti directly to the main case and incorporate a PicoBox 200-watt DC step-down module. The current design is a lot more compact and lightweight: 3.5 liters when operated brickless on the 14V lithium system.


The Pico is just hardwired "on", so there is no latching switch on the GPU side, for simplicity. However, it's still possible to leave the GPU "unplugged" to save power when powerful graphics aren't needed, and just use the APU. There is a complete write-up with details on the GPU module system here.



Cyberpunk Ultra 1080p performance: about 85fps average on 150 watts GPU + 30 watts CPU.

The gaming thermals and noise are decent: under 70°C at about 60% fan speed. For safety, the 3060 Ti is limited to 75% power so as not to push the Pico too hard, so the fans don't need to ramp high either. The CPU side uses the Noctua 120mm, which is basically silent and keeps the memory really nice and cool.

 
Last edited: