Prebuilt [SFFn] ASRock's DeskMini A300 - Finally!

Valantar

Shrink Ray Wielder
Jan 20, 2018
2,201
2,225
Hi,

Currently, I am on Radeon 20.12.1 & Ryzen 2400G.

Is it OK if I want to revert the Radeon software to an older version? This new version makes Windows 10 fonts too crisp for me.

If it is OK to do that, I just install the old version, right?
Why not just run through the Windows ClearType setup to get font rendering where you want it?
 

gustav

Cable-Tie Ninja
Jun 18, 2020
193
90
Hey buddy, thanks for asking. Yeah I'm still stuck at 2400 MHz. On the bright side, it's stable at very tight timings like 14-15-15-30-48 or something like that.

Would you care to speculate why my 2200G is stable at 2933 MHz but my 4650G is only stable at 2400 MHz?

Yes I had a spare AM4 mobo the other day. I built a budget system for my ex-manager, with Asrock A320-HDV and my old 2200G. Why? What are you thinking, @gustav ?
Hey :)
Yeah, the whole thing with your Renoir is still pretty strange. I have a thought:

Maybe the SOC VID was too high for Renoir and its integrated memory controller. To rule out a defective CPU, I would suggest trying it in a different mainboard, if you have the ability, of course.

Maybe you have a friend whose system you could test your 4650G in. It's very strange behavior. That way you could rule things out, then proceed step by step.

First would be: Test the CPU on a different mainboard.

Update us, if you were able to do so :)
 

rubicoin

Cable-Tie Ninja
Jan 12, 2020
164
104
Happy to say, everything works like a charm. Thanks to @rubicoin for making it so easy by pointing out what to get and how to set this up. Had over 99% in 3DMark stress tests before and after the change. Some numbers to crunch on (sorry, they're all 3DMark... kind of lazy, but good for comparison) against the i7-7700 when it had this same GPU (which is a pretty comparable CPU at 4/8 and a max of 4.2GHz):

| Benchmark | Overall score | System | Graphics score | CPU score | Date | Max CPU/GPU temps | Notes |
|---|---|---|---|---|---|---|---|
| Time Spy | 1349 | A300 3400G 16GB | 1210 | 3893 | 11/14/2020 | 72/56 | iGPU, 3.60 BIOS |
| Time Spy | 5404 | A300 3400G 16GB | 5699 | 4180 | 1/13/2021 | 67/69 | GTX 1070, 3.60K |
| Time Spy | 5412 | A300 3400G 16GB | 5688 | 4246 | 1/13/2021 | 67/67 | GTX 1070, 3.60S |
| Time Spy | 5735 | i7-7700 16GB | 5990 | 4624 | 12/27/2020 | 71/67 | Stock cooler |

Looks like the BIOS updates did help, raising the CPU score. The graphics score shows a roughly 5% decrease compared to the i7 desktop, as expected, presumably due to the adapter. And now it's perfect for gaming on a 4K 60Hz TV, you know, if you're into that sort of thing.
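As a quick sanity check on that ~5% figure, here is a minimal sketch using the Time Spy graphics scores quoted above (the variable names are just for illustration):

```python
# Gap between the eGPU setup and the i7-7700 desktop,
# using the Time Spy graphics scores from the table above.
i7_graphics = 5990    # i7-7700 + GTX 1070 (desktop, direct PCIe)
a300_graphics = 5699  # A300 3400G + GTX 1070 via eGPU adapter

loss_pct = (i7_graphics - a300_graphics) / i7_graphics * 100
print(f"eGPU graphics-score loss: {loss_pct:.1f}%")  # about 4.9%
```

So "roughly 5% for the adapter" checks out against the posted numbers.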

wow, awesome to hear that i inspired your build and i'm very happy that you are satisfied with the results! great scores you have there, and a nice cpu uplift with the new bios as well. 5% for the pcie loss is also very reasonable. did you cut the i/o shield the same way?

btw the GPU market is so f***ed up right now (bitcoin/eth to blame) that i've just sold my radeon 5500 xt for much more than i purchased it for last year :\ my eGPU adapter is waiting for a radeon 6700/6600 series card sometime later this year (or the next one, maybe). no need to rush; till then i'm playing cyberpunk 2077 on geforce now with a 1-year founders edition sub for a very reasonable fee. so i guess it makes no sense to burn our money on new higher-end gpus at nonsense/scalper prices. still, if you were lucky enough to get a 3060 ti and also have a 1070 laying around, this is a great and economical way to utilize it in a 2nd pc. thumbs up for that!
 

limsandy

Average Stuffer
Jul 3, 2020
71
31
Hey :)
Yeah, the whole thing with your Renoir is still pretty strange. I have a thought:

Maybe the SOC VID was too high for Renoir and its integrated memory controller. To rule out a defective CPU, I would suggest trying it in a different mainboard, if you have the ability, of course.

Maybe you have a friend whose system you could test your 4650G in. It's very strange behavior. That way you could rule things out, then proceed step by step.

First would be: Test the CPU on a different mainboard.

Update us, if you were able to do so :)

Why yes.... My main rig is a Ryzen 3600 with 3600 MHz memory tested stable on Asrock B450M Steel Legend. I could try my 4650G there but I can't be bothered to swap the CPUs, apply thermal paste, swap back, etc..... I might do it if I really have nothing to do. We'll see..... 🙄
 

paul

Cable Smoosher
Aug 6, 2020
9
8
wow, awesome to hear that i inspired your build and i'm very happy that you are satisfied with the results! great scores you have there, and a nice cpu uplift with the new bios as well. 5% for the pcie loss is also very reasonable. did you cut the i/o shield the same way?

btw the GPU market is so f***ed up right now (bitcoin/eth to blame) that i've just sold my radeon 5500 xt for much more than i purchased it for last year :\ my eGPU adapter is waiting for a radeon 6700/6600 series card sometime later this year (or the next one, maybe). no need to rush; till then i'm playing cyberpunk 2077 on geforce now with a 1-year founders edition sub for a very reasonable fee. so i guess it makes no sense to burn our money on new higher-end gpus at nonsense/scalper prices. still, if you were lucky enough to get a 3060 ti and also have a 1070 laying around, this is a great and economical way to utilize it in a 2nd pc. thumbs up for that!
Haven't cut the I/O shield yet (just left it off for my testing there), but that is this afternoon's adventure. I'm going to take a close look; not sure if I can cut a slot and leave the edge of the shield in place, or if the notch approach you used is better. Either way I'll double-check alignment first, of course. And again, appreciate the photos and walkthrough!

Yea, I debated selling too. Ugh, it's way too crazy right now... but I also maintain the PCs for a number of people in my family, so there's always a home for a GPU (I'm actually surprised how well quad-core processors continue to hold up, even if they're not bleeding edge). I think I might swap the 1070 for a 1650 super in a computer I built for my nephew, due to power consumption and the fact that I won't use it *that* much with the 3060 Ti around (and I am super lucky - snagged it for retail).

Given that the power brick is 220W, I'm a little concerned I'm right on the edge of its power. The 5500 XT was 135W TDP IIRC, while the 1070 is 150W and the 1650 Super is only 100W... my understanding is they'll pull more than that at times (with the 1070's factory overclock it could be at 170W already, and I think 176W is the right target for the brick at 80%). Ultimately I'd like to maximize efficiency and minimize performance loss, space, and waste. Or maybe I'm right where I should be.
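That power budget can be sketched quickly; this is a back-of-the-envelope check only, using the wattages quoted in the post above, the 80%-of-brick target being the poster's own rule of thumb, and ignoring what the rest of the system draws:

```python
# Rough power-budget check for the 220 W brick:
# keep worst-case GPU draw under ~80% of the brick's rating.
BRICK_W = 220
HEADROOM = 0.80
budget = BRICK_W * HEADROOM  # 176 W target

# Worst-case draw figures from the post (1070 assumes its factory OC).
gpus = {"RX 5500 XT": 135, "GTX 1070 (OC)": 170, "GTX 1650 Super": 100}
for name, draw in gpus.items():
    status = "fits" if draw <= budget else "over budget"
    print(f"{name}: {draw} W -> {status} ({budget:.0f} W target)")
```

By that crude measure even the overclocked 1070 squeaks in under 176 W, but with only a few watts to spare before the rest of the system is counted.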

And I like the Geforce NOW idea... definitely an excellent alternative to scalpers. I get a free one year with my 3060 Ti, so I'll have to see how that performs too. It's such a great time to be into SFF.
 

rubicoin

Cable-Tie Ninja
Jan 12, 2020
164
104
Haven't cut the I/O shield yet (just left it off for my testing there), but that is this afternoon's adventure. I'm going to take a close look; not sure if I can cut a slot and leave the edge of the shield in place, or if the notch approach you used is better. Either way I'll double-check alignment first, of course. And again, appreciate the photos and walkthrough!

Yea, I debated selling too. Ugh, it's way too crazy right now... but I also maintain the PCs for a number of people in my family, so there's always a home for a GPU (I'm actually surprised how well quad-core processors continue to hold up, even if they're not bleeding edge). I think I might swap the 1070 for a 1650 super in a computer I built for my nephew, due to power consumption and the fact that I won't use it *that* much with the 3060 Ti around (and I am super lucky - snagged it for retail).

Given that the power brick is 220W, I'm a little concerned I'm right on the edge of its power. The 5500 XT was 135W TDP IIRC, while the 1070 is 150W and the 1650 Super is only 100W... my understanding is they'll pull more than that at times (with the 1070's factory overclock it could be at 170W already, and I think 176W is the right target for the brick at 80%). Ultimately I'd like to maximize efficiency and minimize performance loss, space, and waste. Or maybe I'm right where I should be.

And I like the Geforce NOW idea... definitely an excellent alternative to scalpers. I get a free one year with my 3060 Ti, so I'll have to see how that performs too. It's such a great time to be into SFF.

learn from my error and cut only a much smaller part of the i/o shield, right at the optimal spot :p

i consider the dell 220w power brick to be ok for a gpu with max. 150w typical gaming / 200w stressed power consumption, so besides price this should be the main factor when choosing my next radeon 6000 series card. i agree with you, a low-power sff system like the a300 is best fit with an efficient, lower-power egpu around 100-125w tgp anyway.

geforce now has its limits: you need a good cable internet connection with low latency, and be aware that the drop in streamed picture quality is very noticeable compared to a local render. also forget rtx with demanding titles like cyberpunk on nvidia's last-gen server architecture; it completely demolishes performance (i'm not counting dlss, no way i'd apply it and reduce image quality even further). still, it is a great service; you can play your purchased steam/gog/epic games virtually anywhere. just set up an old laptop with an ethernet port in 5 min, hook its video output up to a tv and you are ready to go. and gfn is even better for expanding the graphics capabilities of the a300 for a fraction of the cost of a new egpu setup. this way you not only keep prices down but also keep the awesome 1.92 L form factor!
 

A300

SFF Lingo Aficionado
Jul 13, 2019
96
14
Why not just run through the windows cleartype setup to get font rendering to where you want it?

Thanks for your hints. It is a little bit better, but still not as comfortable as with the old drivers.

So, in the end, I uninstalled the new driver and reverted to the old one, version 19.6.3.

This is a very old one, but I am comfortable with it.
 

Hammerfest

Trash Compacter
Jul 15, 2019
47
40
I really hope we get a beta BIOS update with the latest AGESA for EITHER line (1.2.0.0 beta on my gaming and server builds has been rock solid for 2 days, when I couldn't get that before); the current "release" 1.0.0.1 is a disaster :/

Come on ASRock, you're rocking it for 300/400/500 on AGESA updates for all other form factors. WE STILL EXIST
 

alles_alles

SFF Lingo Aficionado
Aug 11, 2020
107
28
still no official renoir/cezanne support for a300. and never will be.

and still waiting for someone here to post pictures of a working a300 with the x300 bios. my last info is that saving settings does not work with it. someone please prove me wrong and confirm that all is ok :p
Not officially. But it works well with most CPUs :)
 

Nunuji

Caliper Novice
Jan 17, 2021
23
8
Hi everyone! Long time reader of this thread (and a few other A300 and SFF ones), but first time poster.

I've had a lot of success with my A300 shenanigans so far and plan to share them in the near future, but I figured I'd ask a few questions first...

So, does anyone have any experience with modding heatsinks? Specifically, removing "non-essential" material to make them fit, like removing a few fins to reduce length/width, or potentially sanding down the tops of the fins to reduce height (NOT lapping the base that interfaces with the IHS or chip).

And then a follow-up to that: does anyone have any experience with the Noctua NH-L9x65, or know the distance from the base to the top of the heatpipes? Basically, I want to see if I could fit something like the L9x65 into an A300 by removing some material (almost 20mm of it?) from the top of the heatsink, above the heatpipes. I understand the fan would no longer clip in, but I have a workaround for that.
 

rubicoin

Cable-Tie Ninja
Jan 12, 2020
164
104
Hi everyone! Long time reader of this thread (and a few other A300 and SFF ones), but first time poster.

I've had a lot of success with my A300 shenanigans so far and plan to share them in the near future, but I figured I'd ask a few questions first...

So, does anyone have any experience with modding heatsinks? Specifically, removing "non-essential" material to make them fit, like removing a few fins to reduce length/width, or potentially sanding down the tops of the fins to reduce height (NOT lapping the base that interfaces with the IHS or chip).

And then a follow-up to that: does anyone have any experience with the Noctua NH-L9x65, or know the distance from the base to the top of the heatpipes? Basically, I want to see if I could fit something like the L9x65 into an A300 by removing some material (almost 20mm of it?) from the top of the heatsink, above the heatpipes. I understand the fan would no longer clip in, but I have a workaround for that.

i'm all about modding the a300, i really am. still, for me it'd make no sense to mod and squeeze a cooler with 65mm height into the a300, which has a max clearance of 46mm. especially when you consider that the 37mm height of the NH-L9a-AM4 gives you the option to swap the 14mm fan for a 25mm one without problems, for a lower price. see my posts about the swap here back in early 2020 if you missed them. i guess the heatsink size of the NH-L9a should easily be enough for any supported apu (based on the fact that my 4650g runs cool even at 900 rpm in this 25mm fan configuration).
 

gustav

Cable-Tie Ninja
Jun 18, 2020
193
90
i have a great comparison with the exact same ram, the 4650g completely demolishes the 2400g:
https://valid.x86.fr/tsv63a
Hahahah, well that's very nice.
Now I'm actually really thinking about a 4650G purchase, since your system is pretty much 1:1 with mine, except for the CPU and hard disks.
You're reporting no problems using the 4650G... It will probably be my next upgrade. Even your RAM clock is the same as mine!

@alles_alles unofficial is unofficial. I have to agree with @Hammerfest, ASRock can do things if they want! Look at all those AGESA v2 rollouts. We are left on our own. @ASRock System
 

Nunuji

Caliper Novice
Jan 17, 2021
23
8
@rubicoin So ultimately my plan is to put a 120x120x15mm fan in there, so I wanted as tall a heatsink as possible to reduce turbulence. I would use a 120x120x25mm fan, but it won't clear the RAM, and if I use a 95x95x25mm one it won't cool both sticks of RAM.

I want the RAM to get a little more cooling because I currently have it at 3600 CL16-19-19-19-38 and it's a little toastier than I'd like. ^^;

I might end up just getting an L9a like you suggested and seeing how it handles the 120mm fan.
 

rfarmer

Spatial Philosopher
Jul 7, 2017
2,588
2,702
I need some advice from the experts. I have a DeskMini 110 that I have been using for several years as an HTPC/Plex server: 6600K, 20GB RAM, 128GB M.2, and 2x 2TB 2.5" HDDs. It works well, but I have been wanting something with more graphics power and more than 4 cores, so I just ordered a DeskMini X300W. I want to pair it with a 4650G, which I realize is OEM and not readily available. I found it on AliExpress for $196.61. My question: is AliExpress the best buying option for shipping to the US, or are there better ones? Thanks for any input.
 

yuusou

SFF Lingo Aficionado
Mar 16, 2019
115
70
The seller on AliExpress selling it for just under 200 bucks is a new store with no reputation so far. I'd be wary of that one in particular.