Prebuilt [SFFn] ASRock's DeskMini A300 - Finally!

gustav

Cable-Tie Ninja
Jun 18, 2020
193
90
Hi guys,

I have display scale set to 125%, but, probably when there wasn't enough RAM, the screen went blank for a while, and when it came back, the display scale had reset to the default.

Had to reboot the PC to restore the display to its previous settings.

The Radeon app (ver. 20.11.1) doesn't open when I double-click it. CPU: 2400G, BIOS 3.41, Windows 10 Pro, 16GB RAM.

Anyone have the same experience as me?
You think a 125% DPI scale consumes more VRAM (GPU RAM) than usual? Nope, that shouldn't be the case.
A blank screen hints at a driver problem. How is your monitor connected? HDMI or DP?

Offtopic: I got a ThinkPad L14 with an AMD 4500U, 16GB 3200MHz @ CL22, 1T - I have to say it's amazing. And the 6 Renoir cores (no SMT)
are outrunning the 4C/8T Ryzen 2400G. Basically "no" iGPU difference (Valley benchmark: 1850 points (2400G) vs 1450 points (4500U)),
and it's still much better than the i5-10210 (about the same price point). I'm amazed. The iGPU looks even better from a laptop standpoint.

Cinebench R20:
Ryzen 2400G = 1780 points
Ryzen 4500U = 2100 points
 
Last edited:

rubicoin

Cable-Tie Ninja
Jan 12, 2020
164
104
Offtopic: I got a ThinkPad L14 with an AMD 4500U, 16GB 3200MHz @ CL22, 1T - I have to say it's amazing. And the 6 Renoir cores (no SMT)
are outrunning the 4C/8T Ryzen 2400G. Basically "no" iGPU difference (Valley benchmark: 1850 points (2400G) vs 1450 points (4500U)),
and it's still much better than the i5-10210 (about the same price point). I'm amazed. The iGPU looks even better from a laptop standpoint.

Cinebench R20:
Ryzen 2400G = 1780 points
Ryzen 4500U = 2100 points

wow, is it a 15W part? or is it configured to 25W max?

my 65W 6/12 core 4650G scores 3707 in Cinebench R20 and 1967 in the Valley benchmark, so the 4500U is definitely a very capable 6/6 core mobile APU! a 6/12 core 4600U laptop would make the 2400G/3400G obsolete. not to mention the 4700U/4800U or H/HS Renoir versions, nor the upcoming Cezanne parts ⚡
 
  • Like
Reactions: gustav

A300

SFF Lingo Aficionado
Jul 13, 2019
96
14
You think a 125% DPI scale consumes more VRAM (GPU RAM) than usual? Nope, that shouldn't be the case.
A blank screen hints at a driver problem. How is your monitor connected? HDMI or DP?

Offtopic: I got a ThinkPad L14 with an AMD 4500U, 16GB 3200MHz @ CL22, 1T - I have to say it's amazing. And the 6 Renoir cores (no SMT)
are outrunning the 4C/8T Ryzen 2400G. Basically "no" iGPU difference (Valley benchmark: 1850 points (2400G) vs 1450 points (4500U)),
and it's still much better than the i5-10210 (about the same price point). I'm amazed. The iGPU looks even better from a laptop standpoint.

Cinebench R20:
Ryzen 2400G = 1780 points
Ryzen 4500U = 2100 points
I don't think the DPI scale consumes more RAM. I meant that there wasn't enough RAM; I was running a few applications
when this blank screen happened.

I suspect the AMD Radeon software doesn't run well when there isn't enough RAM.

Btw, the monitor is connected via HDMI.
 

Valantar

Shrink Ray Wielder
Jan 20, 2018
2,201
2,225
I don't think the DPI scale consumes more RAM. I meant that there wasn't enough RAM; I was running a few applications
when this blank screen happened.

I suspect the AMD Radeon software doesn't run well when there isn't enough RAM.

Btw, the monitor is connected via HDMI.
Sounds like a display driver crash; if it persists I would remove the old driver with DDU and reinstall the newest driver. Highly unlikely to be related to RAM usage.
 
  • Like
Reactions: gustav

rubicoin

Cable-Tie Ninja
Jan 12, 2020
164
104
Still waiting for someone to prove (or disprove) you can run 4750G with 3200 CL16 memory on an A300 or X300...

it's been months, but no new beta BIOS has been published on jzelectronic.de for the A300. i guess they couldn't fix this issue at this level and it remains random behavior (CPU & RAM pairing lottery). if you want to play it safe, you should choose the 4650G instead (at least for now, or wait for Cezanne on the X300).
 

gustav

Cable-Tie Ninja
Jun 18, 2020
193
90
wow, is it a 15W part? or is it configured to 25W max?

my 65W 6/12 core 4650G scores 3707 in Cinebench R20 and 1967 in the Valley benchmark, so the 4500U is definitely a very capable 6/6 core mobile APU! a 6/12 core 4600U laptop would make the 2400G/3400G obsolete. not to mention the 4700U/4800U or H/HS Renoir versions, nor the upcoming Cezanne parts ⚡
Yeah, indeed, I fully agree with you.
I can say that the Intel i5-10210 I had here for 2 months went into thermal throttling while doing anything. Well, it was 4C/8T, boosted up to almost 4GHz, and was 15 watts XD. Why?! They want to sell MHz while on the other hand the chip isn't allowed to run that high, plus single-channel memory, and it was still slower than a 4500U (by about 1/3, synthetic).
The 4500U should be 25W, as the case is about ~0.4cm thicker, and there's no throttling, even during the CS:GO benchmark. Now this is an upgrade :D
I don't think the DPI scale consumes more RAM. I meant that there wasn't enough RAM; I was running a few applications
when this blank screen happened.

I suspect the AMD Radeon software doesn't run well when there isn't enough RAM.

Btw, the monitor is connected via HDMI.
Maybe you can look up the cause of the crash in the Windows event log (in German: 'Ereignisanzeige/System').
Hope it's not the THREAD_STUCK... exception. :/

On the other hand, if it is exactly that exception, welcome to the club. I experience the same thing on my setup. It's very sporadic. There can be a week with no crashes, and sometimes up to 3 crashes a day. I haven't managed to find the root cause.
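For example, a rough, untested sketch of how you could dump the most recent error-level System log entries from a script (it just shells out to wevtutil, which ships with Windows):

Code:
# Minimal sketch: print the 20 most recent critical/error entries from the System log.
# wevtutil is Windows' built-in event log CLI; the XPath query keeps Level 1/2 events only.
import subprocess

query = "*[System[(Level=1 or Level=2)]]"
result = subprocess.run(
    ["wevtutil", "qe", "System", "/c:20", "/rd:true", "/f:text", f"/q:{query}"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # look for display/driver entries around the time the screen went blank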
 

A300

SFF Lingo Aficionado
Jul 13, 2019
96
14
Maybe you can look up the cause of the crash in the Windows event log (in German: 'Ereignisanzeige/System').
Hope it's not the THREAD_STUCK... exception. :/

On the other hand, if it is exactly that exception, welcome to the club. I experience the same thing on my setup. It's very sporadic. There can be a week with no crashes, and sometimes up to 3 crashes a day. I haven't managed to find the root cause.
Windows didn't exactly crash; only the DPI scale reset to default and the Radeon app couldn't be opened.

When the Radeon app was clicked, nothing happened, but Windows still operated normally.
I noticed this when I had lots of apps open that consume a lot of RAM.

After Windows rebooted, everything was back to normal again.
 

rubicoin

Cable-Tie Ninja
Jan 12, 2020
164
104
off topic:
i jumped a bit on the Cyberpunk 2077 hype train yesterday with the 4650G and a 5500 XT 4GB. 1080p medium default settings & 25-40 fps only, but man, this game is pretty and fun. i guess the next step should definitely be an eGPU upgrade to a 6500/6600 XT, as fortunately the a300 is far from being cpu-bound at 2560x1080. good to know that this tiny little rig still has the potential to run even the most demanding new games at moderate resolution and settings.





update: someone just reported that CP2077 runs smoothly on the 3400G iGPU, 16GB RAM, 1440x900 low settings (it lags at Full HD though).
 
Last edited:

rubicoin

Cable-Tie Ninja
Jan 12, 2020
164
104
try this

check comments


thanks, i've previously seen this exe patching thing, but in my case 4650g is so underutilized (15-25% indoors) that i did not even care.

but now i checked task manager, as you can see all cores are used out of the box (1.04 steam version):

this explains why i'm so far from being cpu-bound; the 4650g would need a much beefier gpu to shine in games :p
 
  • Like
Reactions: D3NPA

Valantar

Shrink Ray Wielder
Jan 20, 2018
2,201
2,225
I would imagine the 4650G could feed pretty much any GPU on the market, especially in a heavily GPU-limited game like Cyberpunk. Very cool to see it running (even at very low resolution) on the iGPU though! If it hits 60fps somewhat steadily at 1600x900, that's way better than previous-gen consoles - the base PS4 struggles to maintain 30fps (dipping below 20 at times), and scales dynamically between 720p and 900p, with the former being more common.

thanks, i've previously seen this exe patching thing, but in my case 4650g is so underutilized (15-25% indoors) that i did not even care.

but now i checked task manager, as you can see all cores are used out of the box (1.04 steam version):

this explains why i'm so far from being cpu-bound; the 4650g would need a much beefier gpu to shine in games :p
The low utilization of SMT threads is pretty obvious in that screenshot - the 2nd and 4th columns show much lower loads than the 1st and 3rd ones. Of course the utilization past 4 native threads is also pretty low, but I would imagine the load on the SMT threads there is just background tasks and Windows processes. Kind of odd for the game not to use SMT threads on Ryzen though, given that these are typically much more performant than SMT threads on Intel (AMD gains ~50% MT perf with SMT on, while Intel typically gains ~25% at best) - but then again that's in compute loads, and early Ryzen (1000 and 2000, especially Threadripper) did typically perform better in games with SMT off. Might be an attempt at improving performance on those systems, but that seems rather heavy-handed.
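To put those rough uplift figures in context, a quick back-of-the-envelope sketch (the per-core score is made up; the ~50%/~25% factors are just the ballpark numbers from the paragraph above, not measurements):

Code:
# Back-of-the-envelope: how a rough SMT uplift factor translates into a multi-threaded score.
def mt_score(cores: int, per_core: float, smt_gain: float) -> float:
    return cores * per_core * (1.0 + smt_gain)

per_core = 250.0  # hypothetical per-core Cinebench-style score
print("6C/12T, ~50% SMT uplift:", mt_score(6, per_core, 0.50))  # 2250.0
print("6C/12T, ~25% SMT uplift:", mt_score(6, per_core, 0.25))  # 1875.0
print("6C/6T,  SMT off:        ", mt_score(6, per_core, 0.00))  # 1500.0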
 
  • Like
Reactions: rubicoin

yck3110

Cable Smoosher
Jul 16, 2019
11
0
Anyone have experience with overclocking the RAM frequency of Samsung C-die SODIMM RAM with a Zen 2 APU on the DeskMini X300?

I'm having difficulties overclocking an F4-3200C22D-32GRS kit (2x 16GB) with a 4750G on the DeskMini X300.
Originally, this RAM kit has timings of 3200-22-22-22-52 at 1.2V, and I could only tighten the timings a bit to 3200-18-20-20-52 at 1.35V.
If I increase the RAM frequency above 3200MHz and the FCLK above 1600MHz, it can boot into Windows and run some benchmarks, e.g. 10 minutes of R23,
but eventually it will randomly crash. I just can't stably overclock the RAM frequency.
Does anyone have any advice on overclocking C-die RAM?
Is it a must to increase the SoC voltage to stabilize RAM overclocking?
Would it help RAM overclocking if I leave the CPU frequency and voltage at auto? I currently fix the CPU frequency at 3.9GHz at 1.15V.
 

yuusou

SFF Lingo Aficionado
Mar 16, 2019
115
70
Afaik you're already lucky if you can get 3200 with the 4750G. Check your power draw at the wall, you may be overwhelming the power brick.
 
  • Like
Reactions: rubicoin

Valantar

Shrink Ray Wielder
Jan 20, 2018
2,201
2,225
Anyone have experience with overclocking the RAM frequency of Samsung C-die SODIMM RAM with a Zen 2 APU on the DeskMini X300?

I'm having difficulties overclocking an F4-3200C22D-32GRS kit (2x 16GB) with a 4750G on the DeskMini X300.
Originally, this RAM kit has timings of 3200-22-22-22-52 at 1.2V, and I could only tighten the timings a bit to 3200-18-20-20-52 at 1.35V.
If I increase the RAM frequency above 3200MHz and the FCLK above 1600MHz, it can boot into Windows and run some benchmarks, e.g. 10 minutes of R23,
but eventually it will randomly crash. I just can't stably overclock the RAM frequency.
Does anyone have any advice on overclocking C-die RAM?
Is it a must to increase the SoC voltage to stabilize RAM overclocking?
Would it help RAM overclocking if I leave the CPU frequency and voltage at auto? I currently fix the CPU frequency at 3.9GHz at 1.15V.
I don't have an A300 or X300, so I can't speak specifically to that, but I can at least give some basic general guidance:
-Use 1usmus' dram calculator + taiphoon burner to find your ideal timings.
(-Open taiphoon burner, press "read" and pick any DIMM, press "report", then scroll all the way down and click "show delays as nanoseconds". Export->Complete HTML report, then import this into Dram Calc by pressing "Import XMP".)
-CPU voltage and clock speed have little relation to DRAM clock and voltage, so don't bother adjusting those.
-SoC voltage is the DRAM controller voltage, so you might need to increase that slightly - I needed 1.15V to hit 3800MT/s on my ITX Renoir build.
-Remember that increasing DRAM clocks also increases Infinity Fabric clocks (unless you've set those manually), so that might also require an SoC voltage increase if you're pushing it.
-In general though, Zen 2 (Renoir) APUs should handle >3200MT/s on stock voltages, but YMMV obviously.
-There have been some weird reports of 4750Gs not handling 3200MT/s, which I find hard to believe given how easily my 4650G hit 3800, but ... I guess it's possible? I've yet to see any clear proof of this though.
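Not part of the advice above - just a quick sketch of the arithmetic behind the timing/FCLK points, assuming the usual DDR convention of two transfers per memory clock and 1:1 MCLK:FCLK coupling (the 3800 CL16 line is purely illustrative):

Code:
# Sketch of the usual DDR4 arithmetic: absolute CAS latency in ns, and the
# Infinity Fabric clock when it runs 1:1 with the memory clock (MCLK = MT/s / 2).
def true_latency_ns(mt_s: float, cas: int) -> float:
    return cas * 2000.0 / mt_s  # CL cycles divided by the memory clock in MHz

def fclk_mhz_1to1(mt_s: float) -> float:
    return mt_s / 2.0

# Example points from this thread plus one illustrative faster kit:
for mt, cl in [(3200, 22), (3200, 18), (3600, 18), (3800, 16)]:
    print(f"DDR4-{mt} CL{cl}: ~{true_latency_ns(mt, cl):.2f} ns, FCLK {fclk_mhz_1to1(mt):.0f} MHz (1:1)")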
 

gustav

Cable-Tie Ninja
Jun 18, 2020
193
90
@Valantar Fully agree with you. That's the same way I managed to get my results with the Ryzen 2400G. But it took a couple of rounds of trial and error. Sometimes the A300 needed some time to recover from failing timings because the BIOS wouldn't POST.

@yck3110 Indeed, you're a lucky one with regard to the 4750G and 3200 on the X300 platform. Can you give us more insights? :) I would appreciate it, and I think many of us would ;)

Possible points of interest:
* Power draw on the wall outlet
* Cinebench 15
* Cinebench R20
* UNIGINE Valley Benchmark
* 3DMark Time Spy (to compare with rubicoin)
* Memory Bandwidth (GB/s read/write/(copy))

Well, those are a couple of points, but points which would give us a better understanding of what's possible.

PS: I was able to make the A300 "run just fine" at 3200. YMMV, as Valantar said.
Are there only 2 DRAM voltage options (1.x and 1.5V), or are you able to set an offset in e.g. +/-0.5V intervals?
(I'm not at home, maybe those 2 voltage values are a bit off, but I think you get the point)

Regarding your question about the SoC voltage: exactly my experience! Not stable without adjusting the SoC voltage. I had to bump mine to 1.1375V, I believe.
Basically, my theory was that the voltage rails on the mainboard itself are not good. It doesn't cope well with the Vdrop once the APU needs bandwidth. The SoC voltage is actually not directly helping your CPU as much as it helps the board.

More premium boards have a BIOS setting called "LLC" - Load-Line Calibration - where you can set the level at which it compensates for Vdrop.

Unfortunately, this is also why I see no point in the X300, when they don't even redesign the rails and then advertise it as OC-ready. Make a Renoir-ready beta for the A300, see if it works, and then make a "new" product out of it. Seems like a legit marketing move.

I'm looking forward to your findings.

PS: On the point where Valantar said it's hard to believe Renoir can't handle >3200MT/s: I agree with that too, since Renoir has the redesigned Zen 2 memory controller, which features both LPDDR4X (4266, I believe) and DDR4 interfaces.

Feel free to correct me if I'm stating something wrong.

Edit: I'm not arguing with your choice to buy the X300, it's all fine, I also find it a nice small powerhouse, and if I hadn't had this experience with the A300, I would buy it too :) It's a statement towards ASRock.

Additional: I used HWiNFO64 to track the SoC Vdrop, it's very handy. The Vdrop is more pronounced once the iGPU is under load - run at a low resolution with as many frames as possible to trigger a significant Vdrop (since that stresses the SoC-based IMC: high bandwidth, and the iGPU loves bandwidth). Can you observe the same behavior?
The UNIGINE Valley benchmark is very good at triggering iGPU-induced Vdrop spikes (~0.1V, or about 10% of the 1.0V default).
Additional 2: On the A300, once you manually adjusted any frequencies above stock, it automatically applied 1.1V to the SoC, but that was not enough. (Seems ASRock engineers also observed the Vdrop in testing.)
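If you want to put a number on the Vdrop, here's a rough, untested sketch for a HWiNFO64 CSV log taken during a Valley run - the file path and the sensor column name are assumptions, check your own log header:

Code:
# Rough sketch: find the worst-case SoC voltage dip in a HWiNFO64 CSV log.
# The column name below is an assumption - adjust it to match your log's header.
import csv

LOG_PATH = "hwinfo_log.csv"                  # hypothetical path to the exported log
SOC_COL = "CPU SoC Voltage (SVI2 TFN) [V]"   # assumed sensor column name

samples = []
with open(LOG_PATH, newline="", encoding="utf-8", errors="ignore") as f:
    for row in csv.DictReader(f):
        try:
            samples.append(float(row[SOC_COL]))
        except (KeyError, ValueError):
            continue  # skip summary/footer rows that HWiNFO appends at the end

if samples:
    print(f"min {min(samples):.4f} V / max {max(samples):.4f} V "
          f"-> droop {max(samples) - min(samples):.4f} V over {len(samples)} samples")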

Cheers
 
Last edited:
  • Like
Reactions: Valantar

yck3110

Cable Smoosher
Jul 16, 2019
11
0
I don't have an A300 or X300, so I can't speak specifically to that, but I can at least give some basic general guidance:
-Use 1usmus' dram calculator + taiphoon burner to find your ideal timings.
(-Open taiphoon burner, press "read" and pick any DIMM, press "report", then scroll all the way down and click "show delays as nanoseconds". Export->Complete HTML report, then import this into Dram Calc by pressing "Import XMP".)
-CPU voltage and clock speed have little relation to DRAM clock and voltage, so don't bother adjusting those.
-SoC voltage is the DRAM controller voltage, so you might need to increase that slightly - I needed 1.15V to hit 3800MT/s on my ITX Renoir build.
-Remember that increasing DRAM clocks also increases Infinity Fabric clocks (unless you've set those manually), so that might also require an SoC voltage increase if you're pushing it.
-In general though, Zen 2 (Renoir) APUs should handle >3200MT/s on stock voltages, but YMMV obviously.
-There have been some weird reports of 4750Gs not handling 3200MT/s, which I find hard to believe given how easily my 4650G hit 3800, but ... I guess it's possible? I've yet to see any clear proof of this though.

I have already tried checking in taiphoon burner; there is no XMP profile for my F4-3200C22D-32GRS RAM kit, which is why I also can't use the DRAM calculator to adjust the sub-timings while overclocking the RAM frequency.

What chipset did you pair the 4650G with? If it's a B550 chipset, then as far as I know a Zen 2 APU with a B550 chipset can generally be overclocked easily to something like 4000MHz from 3200 CL16;
however, the DeskMini X300 is not the same case.

I saw some reviews on YouTube of people using a 4650G/4750G with the DeskMini X300; they could easily overclock from 3200 CL16 (Micron E-die) to 3600, or from 2666 CL19 (Samsung C-die) to 3600, and both were very stable,
so I am wondering what is preventing me from stably overclocking the RAM, i.e. from 3200 to 3600?
The RAM kit itself (fk the Samsung C-die), the BIOS, or my knowledge / method of overclocking?
 

Valantar

Shrink Ray Wielder
Jan 20, 2018
2,201
2,225
I have already tried checking in taiphoon burner; there is no XMP profile for my F4-3200C22D-32GRS RAM kit, which is why I also can't use the DRAM calculator to adjust the sub-timings while overclocking the RAM frequency.

What chipset did you pair the 4650G with? If it's a B550 chipset, then as far as I know a Zen 2 APU with a B550 chipset can generally be overclocked easily to something like 4000MHz from 3200 CL16;
however, the DeskMini X300 is not the same case.

I saw some reviews on YouTube of people using a 4650G/4750G with the DeskMini X300; they could easily overclock from 3200 CL16 (Micron E-die) to 3600, or from 2666 CL19 (Samsung C-die) to 3600, and both were very stable,
so I am wondering what is preventing me from stably overclocking the RAM, i.e. from 3200 to 3600?
The RAM kit itself (fk the Samsung C-die), the BIOS, or my knowledge / method of overclocking?
You shouldn't need an XMP profile for taiphoon burner+dram calc to work, it should still be able to read the JEDEC SPD profiles off the DIMM (it does on mine, at least). All XMP and DOCP do is add non-JEDEC profiles in addition to the required JEDEC ones. It might however require more tuning, as JEDEC profiles are far less optimized for performance and simply align to an established standard.

As for B550 vs. the X300, this should have zero impact on RAM speeds, as the chipset is nothing more than an I/O hub and has no relation to RAM. If there are differences, these are in ASRock's BIOS configuration, which they can of course have limited in various ways (many of which are already well documented). But there's nothing a chipset or the lack of one can do to affect memory clock speeds; the limitations there lie in the available BIOS options, the capabilities of the on-die memory controller, the motherboard PCB quality and trace layout, and the DIMMs and the DRAM chips themselves. It's quite likely that the X300 has a relatively unoptimized trace layout, which would indeed hurt OC capabilities, but it also has the (significant) advantage of very short trace lengths, which ought to make up for that unless they did a really shoddy job. I guess it's possible that AMD has some sort of microcode instructions telling the APUs to act differently depending if there's a chipset connected or not (which would make sense to lock down DRAM OC on laptops, for example), but that again sounds rather unlikely. And as @gustav said, the IMC is capable of handling LPDDR4X-4266, so DDR4->3200 should be more than possible, especially at relatively loose timings. There's no reason why the IMC or other parts of the APU should be limiting you here, nor the combination of Renoir+chipset-less A300/X300 platform.

I'm not particularly experienced with DRAM OCing, but I can see a couple of points worth looking into:
-Your memory kit looks like it's 32GB. If so, is it dual rank? (I would assume so.) If yes, then that will inherently allow for lower clock speeds than a single rank kit using the same die - but it will perform better at the same clock speed. (There are plenty of videos around investigating dual vs. single rank performance with Ryzen.)
-Dual rank also typically requires looser timings than single rank.
-Dual rank might require a higher voltage to be stable at the same clock speeds.
-The lack of an XMP profile might require more manual tuning of timings, as the baseline you're starting from is less optimized.
-Even with a known die type, there can still be duds. Samsung C-die sold as any type of "performance" or "gaming" RAM with a high-speed XMP profile is likely to be a better/higher bin than something sold only with JEDEC profiles. It might be that your lack of success is simply down to the silicon lottery.
 
  • Like
Reactions: gustav

Serj47

Efficiency Noob
Oct 3, 2020
5
0
We need a quiet cooler for the 3400G.
Is the Noctua much quieter than the Wraith Stealth?

Today I received my new Ryzen 5 3400G for the A300 to replace my old Athlon 200GE processor.
To be honest, I don't really see the difference in performance... Win10 did not load faster, internet surfing got faster, but not by much.
But where the upgrade really pays off is Vega 11! After Vega 3, it's night and day.

Now, the essence of the problem and why I am writing here...
I installed the Wraith Stealth cooler. With my old Athlon processor, it was always quiet. In the BIOS on the Athlon, I set the fan profile to 900 rpm up to 60 degrees, and the CPU did not heat up to that temperature even in games, so it was always quiet.

On the new 3400G, especially when the GPU is in use, the CPU immediately jumps to an excessive 70-80-90 degrees with the fan at maximum RPM.
At the same time, YouTube and working in Explorer leave the processor quiet.

I'm already thinking of returning the processor and getting the Renoir 4350G instead. It has a similar rated TDP, but in practice its heat output is noticeably lower than the 3400G's.

Do you guys have any ideas on how to reduce the noise of this 3400G machine?