Motherboard ASRock X370 Gaming-ITX/ac and AB350 Gaming-ITX/ac

DwarfLord

Trash Compacter
Oct 13, 2018
53
31
The wattage figures I gave you were from a watt meter plug, not HWMonitor (hence the complete-system comment).

I seem to remember that the values given by HWMonitor were higher, so I would not completely trust HWMonitor on this: it relies on the motherboard sensors which, from past experience, can be misleading at times.
If I were you, I would find a watt meter plug and check whether the wattage for the whole system is consistent with what HWMonitor shows for the CPU alone.

With stock settings and turbo enabled on both versions, I don't see how a 2400GE (45 W) would hit 100 W+ alone when my 2400G (65 W) hits 110 W for the whole system!

EDIT:
Also, do not forget that TDP is different from power consumption. TDP is the maximum heat the CPU is designed to dissipate through the heatspreader, a value that is also given in the specifications of CPU coolers (for example, the Noctua NH-L9i is rated for a max TDP of 65 W).
So even though it is linked to consumption (higher wattage = higher dissipation), it is not directly equal: you could draw more than 65 W even with a 65 W TDP rating.
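To put rough numbers on that distinction, here's a tiny sketch (the draw figure is hypothetical, just for illustration). Nearly all the electrical power a CPU draws leaves it as heat, so draw and dissipation track each other closely, but neither is capped by the TDP label:

```python
# Illustrative only: hypothetical reading for a 65 W TDP part.
TDP_RATING_W = 65          # the cooler-sizing figure on the box

package_draw_w = 78.0      # hypothetical draw under a heavy boost load
# A "65 W TDP" cooler is sized for sustained base-clock load,
# not for short boost excursions like this one.
exceeds_rating = package_draw_w > TDP_RATING_W
print(f"Draw {package_draw_w} W vs TDP {TDP_RATING_W} W -> over rating: {exceeds_rating}")
```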
 
Last edited:
  • Like
Reactions: vinhom

parlinone

Trash Compacter
Nov 3, 2017
54
19
EDIT:
Also, do not forget that TDP is different from power consumption. TDP is the maximum heat the CPU is designed to dissipate through the heatspreader, a value that is also given in the specifications of CPU coolers (for example, the Noctua NH-L9i is rated for a max TDP of 65 W).
So even though it is linked to consumption (higher wattage = higher dissipation), it is not directly equal: you could draw more than 65 W even with a 65 W TDP rating.

This is totally incorrect. TDP does indeed relate directly to the max CPU wattage and has nothing to do with heat generation. Turbo Boost, however, does, depending on your cooling setup and BIOS settings.

Until the 7th-gen Intel Core, wattage wouldn't exceed the rated TDP even under turbo. Only since Coffee Lake does the Turbo Boost 2.0 short-duration PL2 state exceed the TDP, provided your heatsink can handle the heat. TDP is now the max wattage for the long-duration PL1 state. It's up to the motherboard manufacturers how they set the PL1 state, but the 9900K takes about ~160 W under max turbo load. Any PL2 setting under 160 W will generally throttle the turbo speeds even if your heatsink can handle it. Your power supply, especially in small form factors, may not survive such power states. And you thought you were all fine and dandy with a power supply that can handle double the TDP of that 9900K. But how much of that power from your PSU can actually be delivered on the 12 V rail? For AMD it's less transparent, but XFR is basically the same thing.
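As a toy model of how those two limits interact (the numbers below are placeholders, not Intel's actual defaults for any specific chip):

```python
# Toy model of Intel's two power limits (placeholder numbers).
PL1_W = 95     # long-duration limit; Intel ties this to the TDP
PL2_W = 160    # short-duration limit; boost draw is clipped to this
TAU_S = 28     # seconds of turbo budget before falling back to PL1

def package_limit(t_seconds, requested_w):
    """Power the package is allowed to draw, t seconds into a load."""
    limit = PL2_W if t_seconds < TAU_S else PL1_W
    return min(requested_w, limit)

# A 9900K-style chip wanting ~160 W gets it briefly, then drops to PL1.
print(package_limit(5, 160))    # early in the load: PL2 applies
print(package_limit(60, 160))   # sustained load: clipped to PL1
```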

My advice would be to always set the PL2 state manually below the 12 V wattage rating of your DC-ATX power supply, with at least a 20% margin. For XFR I have no idea how far above TDP it operates, but I think it stays quite close to TDP, maybe about 10 W extra? If you use a separate GPU that draws from the 12 V rail, you should subtract its wattage as well. So don't look at the total wattage rating of your DC-ATX, but at the wattage rating for the 12 V rail, and set your PL2 state a healthy margin below that. This is all under the assumption that your AC/DC adapter is rated at least 20% above the 12 V rating of your DC-ATX.
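That margin rule works out to something like this (all wattages here are hypothetical examples; check your own DC-ATX spec sheet):

```python
# Back-of-the-envelope PL2 budget (hypothetical example numbers).
rail_12v_rating_w = 250.0   # 12 V rail rating of the DC-ATX board
gpu_draw_w = 75.0           # subtract anything else on the 12 V rail
margin = 0.20               # the suggested 20% safety margin

safe_pl2_w = (rail_12v_rating_w - gpu_draw_w) * (1 - margin)
print(f"Set PL2 no higher than about {safe_pl2_w:.0f} W")
```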
 
Last edited:

DwarfLord

Trash Compacter
Oct 13, 2018
53
31
This is totally incorrect. TDP does indeed relate directly to the max CPU wattage and has nothing to do with heat generation.
...Seriously? TDP literally stands for Thermal Design Power...

https://en.wikipedia.org/wiki/Thermal_design_power

"The thermal design power (TDP), sometimes called thermal design point, is the maximum amount of heat generated by a computer chip or component (often a CPU, GPU or system on a chip) that the cooling system in a computer is designed to dissipate under any workload."
 

parlinone

Trash Compacter
Nov 3, 2017
54
19
...Seriously? TDP literally stands for Thermal Design Power...

https://en.wikipedia.org/wiki/Thermal_design_power

"The thermal design power (TDP), sometimes called thermal design point, is the maximum amount of heat generated by a computer chip or component (often a CPU, GPU or system on a chip) that the cooling system in a computer is designed to dissipate under any workload."

I just explained how Intel, and to a lesser extent AMD, use the TDP. But you choose to reply with a Wikipedia quote?

You realise TDP is defined in watts, not degrees Celsius? A thermal design point might be the temperature, say 90 °C, at which the CPU will start to throttle. Based on wattage and maximum temperature, manufacturers can rate their heatsinks accordingly. But to make full use of turbo, you need to look at the maximum wattage a CPU will consume, or is allowed to consume, under full load. The TDP is no longer a safe guide to the power behaviour of Intel K and AMD X chips.

You understand there is a difference between a definition and the way it's actually implemented? Do you read any reviews where they actually measure the TDP? The TDP gives (or in Intel's case, gave) you guidance on the number of watts a processor puts out under normal full-load usage (burn software like Prime95 usually exceeds that).

I'm wondering why I'm even replying to someone who doesn't bother to respond to what I actually wrote. I'd advise you to read this article on how TDP nowadays is defined by Intel: https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo
 
Last edited:

loader963

King of Cable Management
Jan 21, 2017
660
568
TDP is almost meaningless to me anymore. On the thermal argument: I remember a 100 W-rated C7 could cool a 4790K easily, yet 65 W 8700s throttle with them.

And power draw really hasn't fared any better. If the only way to meet the number is to turn off all the default settings like Turbo and HT, just to barely meet the spec, should it count?

To me, it's really becoming another advertising spec, like those contrast numbers on TVs.
 
  • Like
Reactions: Supercluster

DwarfLord

Trash Compacter
Oct 13, 2018
53
31
The TDP is no longer a safe guidance for the power behaviour of Intel K and AMD X chips.
Which is exactly what I told him.

I'd advise you to read this article on how TDP nowadays is defined by Intel: https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo
Same again: exactly what I told him, i.e. not to use TDP as a basis for knowing his processor's consumption. Which was his initial question, because he didn't understand why his consumption readings in HWMonitor were higher than expected.

This doesn't change the fact that TDP is heat dissipation, not power consumption.
The watt is not only used for electricity, just so you know.
Again, from Wikipedia (I know...): https://en.wikipedia.org/wiki/Watt
"The watt (symbol: W) is a unit of power. In the International System of Units (SI) it is defined as a derived unit of 1 joule per second,[1] and is used to quantify the rate of energy transfer."
Which you flatly (and rather aggressively) denied by saying "This is totally incorrect. TDP does indeed relate directly to the max CPU Wattage and has nothing to do with heat generation".

And if you still don't believe me on this specific point, I will just copy here the Intel definition, directly from any processor's details on the official ark.intel.com site :
"TDP
Thermal Design Power (TDP) represents the average power, in watts, the processor dissipates when operating at Base Frequency with all cores active under an Intel-defined, high-complexity workload. Refer to Datasheet for thermal solution requirements."
 
Last edited:

DwarfLord

Trash Compacter
Oct 13, 2018
53
31
TDP is almost meaningless to me anymore. On the thermal argument: I remember a 100 W-rated C7 could cool a 4790K easily, yet 65 W 8700s throttle with them.

And power draw really hasn't fared any better. If the only way to meet the number is to turn off all the default settings like Turbo and HT, just to barely meet the spec, should it count?

To me, it's really becoming another advertising spec, like those contrast numbers on TVs.

My thoughts exactly. Especially in SFF cases, where heat and power management are essential.
 

ChinStrap

Cable-Tie Ninja
Sep 13, 2017
192
174
Using eBay's last global coupon, I picked up some G.Skill F4-3600C15D-16GTZ.

BIOS results:
4.6: nope
4.7: nope
4.9: will boot at 16-15-15 but crashes in Windows

Played with 1.35-1.5 V VDIMM for each.
Played with SoC voltage for each.

Kind of a bummer. I'll still hold on to it; maybe the X570 version of this board will boot it. Never know. :)
 

ChinStrap

Cable-Tie Ninja
Sep 13, 2017
192
174
Playing around with BIOS 4.9 a little more with the 2400G. Picked up a new MSI Optix MAG24C from Best Buy for $169.99 on sale.

The X370 version of this board will run HDMI @ 120 Hz with FreeSync enabled. So far I'm loving this monitor.

 
  • Like
Reactions: Phuncz

ChinStrap

Cable-Tie Ninja
Sep 13, 2017
192
174
HDMI out on this board is reporting FreeSync @ 48-120 Hz, just an FYI. I'm still pretty new to FreeSync, so I don't know whether that range is bound by the monitor or the CPU/mobo.
 

ChinStrap

Cable-Tie Ninja
Sep 13, 2017
192
174
I'll get back to you on temps.

I just wanted it to work for testing, so I went max. Once I look for a 24/7 number, I imagine it will be lower. I hope so, anyway.
 

vinhom

Cable Smoosher
Dec 29, 2017
8
2
The wattage figures I gave you were from a watt meter plug, not HWMonitor (hence the complete-system comment).
I seem to remember that the values given by HWMonitor were higher, so I would not completely trust HWMonitor on this: it relies on the motherboard sensors which, from past experience, can be misleading at times.
If I were you, I would find a watt meter plug and check whether the wattage for the whole system is consistent with what HWMonitor shows for the CPU alone.

With stock settings and turbo enabled on both versions, I don't see how a 2400GE (45 W) would hit 100 W+ alone when my 2400G (65 W) hits 110 W for the whole system!

I managed to verify that the HWiNFO64 readings were erroneous and the uProf ones were correct; my system is respecting its TDP settings just fine. Right now, set to 35 W, it boosts to 42 W for a period of time until it backs down to 35 W. Thanks again for the help with this issue!

Has anyone tested the new 5.30 BIOS from ASRock?
 
  • Like
Reactions: DwarfLord

ruddevil

Chassis Packer
Aug 30, 2017
17
8
Has anyone here tried a Ryzen 3000 processor (non-GPU) with the latest 5.70 BIOS? I'm planning to upgrade to a Ryzen 5 3600 without replacing my AB350 board.
 

wywywywy

Airflow Optimizer
Aug 12, 2016
272
219
Was wondering about the same thing.

I'm hoping to get a 3900X but am unsure about the power delivery and VRM of this board.
 

ChinStrap

Cable-Tie Ninja
Sep 13, 2017
192
174
Info dated Aug '17, per Buildzoid: the controller is an ISL 95772 running in a 3 (doubled) + 1 configuration. An 8c/16t all-core OC @ 4 GHz on a 1700/1700X/1800X @ 1.42 Vcore pulls 100 A, and this VRM will produce 9 W of heat. He reports the VRMs will be fine and can push power; the heatsink is the weak link. Active cooling is suggested when the board is pushed harder or put in a smaller case.
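A quick back-of-the-envelope check of those quoted numbers (this only does the arithmetic on the figures above; it doesn't validate Buildzoid's measurements):

```python
# Sanity-check the quoted figures: 100 A at 1.42 Vcore through a VRM
# that reportedly dissipates about 9 W as heat.
vcore = 1.42
current_a = 100.0

cpu_power_w = vcore * current_a          # power delivered to the CPU
vrm_loss_w = 9.0                         # heat left behind in the VRM
efficiency = cpu_power_w / (cpu_power_w + vrm_loss_w)

print(f"CPU power: {cpu_power_w:.0f} W, implied VRM efficiency ~{efficiency:.1%}")
```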
 
  • Like
Reactions: Elerek

ChinStrap

Cable-Tie Ninja
Sep 13, 2017
192
174
This post got the wheels spinning a little. Perfect timing: TechPowerUp has an article on running the 3900X on a B350 board that has an (arguably) better VRM heatsink than the ASRock ITX (the ASRock has the better actual VRM). I would be VERY careful running that chip, and would put some active cooling on the VRM heatsink if possible.
 

wywywywy

Airflow Optimizer
Aug 12, 2016
272
219
Just a reminder that some ASRock ITX boards have better VRM heatsinks than others.

The poorer one has a super-thick thermal pad; ASRock did send out replacements for a short period of time, but then stopped doing so.

 
  • Like
Reactions: Phuncz