Raven Ridge HDMI 2.0 Compatibility — 1st Gen AM4 Motherboard Test Request Megathread

HomeDope

Cable Smoosher
Apr 22, 2018
12
1
We just got the case closed up and will have it hooked up to the home theater AV receiver in a minute. So far, everything hardware-wise, including the Wi-Fi upgrade, M.2 PCIe, heatpipes, IR, RAM, and PSU, has been flawless, thanks in large part to @confusis’s build photos for the FC8 case from his review. We also installed drivers on our test bench with a no-frills full HD monitor, which worked well. Now we just have to finish installing the other half of the drivers out here (we already installed the chipset, graphics, and Wi-Fi drivers; what is left is Bluetooth, LAN, audio, and IR). We should be able to connect within the hour and see how 12-bit color over 4K at 60Hz works.

UPDATE: It works!

Are you using the ASRock Fatal1ty AB350 Gaming-ITX/ac motherboard?

Does 4K 60Hz work?
Does HDR work?
 

a13antichrist

Average Stuffer
Apr 20, 2018
86
29
I see from his earlier messages that he does mention the ASRock AB350.

As explained earlier, I do have to settle for SDR conversion with madVR on this system, but I already knew this since the 4K TV we have is just not up to snuff. Our TV is an early-specification HDR Samsung unit which is well documented (there are many posts online detailing this for NVIDIA graphics cards) as not working properly with any HDR signal, regardless of the device. But I can plainly tell that HDR would be working if our TV were up to spec, since the video has the same appearance in color and brightness as it did on my old GeForce GTX 1080, which I know for a fact works on other fully HDR-compliant TVs.

I found I get the dimness on my (modern, but cheap) 4K HDR Philips if I use the following options:

8-bit 4:4:4
8-bit 4:2:2
12-bit 4:2:0

However, if I use 4:2:0 at either 8- or 10-bit, the colour returns to normal. I am guessing that is because my TV is probably an 8-bit panel with 10-bit extensions or something like that, i.e. not fully HDR-spec; I see that other cheaper HDR-ready TVs are really only 8-bit colour as well.
 

HomeDope

Cable Smoosher
Apr 22, 2018
12
1

Alright. My TV is rather new, and it should have a 10-bit panel, so I hope I am getting everything correctly.
 

Tubamajuba

What's an ITX?
Jun 25, 2018
1
0
I'd like to chime in and confirm that the GIGABYTE GA-A320M-S2H mATX mainboard (with a Ryzen 3 2200G) fully supports 4K@60Hz over the HDMI connection. I've tested the HDMI connection to a Samsung U28E590D on HDMI Port 2 (Port 1 on that monitor only does 30Hz). Both the AMD driver and the monitor's OSD information confirm 60Hz.
Just in case anyone is wondering about the cheapo mainboards.
Thanks for making this thread.

EDIT: Adding detail:
FreeSync is not supported with the Samsung U28E590D
Depth is 8 bpc only
RGB 4:4:4 is working

Hi there,

I have a question for you: I have the exact same combination of motherboard, processor, and monitor as you do, and I am experiencing random dropouts where the screen goes black for two to three seconds at a time. When the image comes back, the "HDMI 2" indicator pops up in the top left corner, which seems to indicate the signal dropping out. On rare occasions the image dropout is preceded by static. Have you ever experienced this? Thank you in advance.

EDIT: Forgot to mention, the manual claims the motherboard is only HDMI 1.4 compatible, with a max resolution of 4096x2160 at 24Hz... weird.
 

gwilly7

Cable Smoosher
Feb 13, 2018
10
9
I am experiencing the same dropouts on an ASRock X470 Gaming-ITX in Windows using a 2400G, but only when I am in the GUI. If I am playing a movie or a show, it does not happen. It also does not happen in LibreELEC, at least in my limited testing so far.
 

Filmstar

What's an ITX?
Jul 25, 2018
1
0
Official information from ASRock says that the HDMI version depends on the CPU, not the motherboard:

My question to ASRock regarding the A320M-HDV:
I want to upgrade my A-series CPU to a Ryzen 3 2200G. This CPU should support HDMI 2.0. Will it be possible to get a 4K at 60Hz picture on your A320M-HDV motherboard, or do I need a dedicated video card?

Answer:
Thanks for contacting ASRock.
The integrated graphics depends on the CPU's specification.
If Ryzen 3 2200G can support to 4K 60Hz, the motherboard can have this feature.
Therefore, please check the specification with AMD for more detail.

Thank you!

Have a great day,
ASRock TSD
 

Specific

What's an ITX?
Jul 27, 2018
1
0
Umm, this thread is kind of blowing my mind, as I have been in contact with AMD and ASRock, and ASRock told me their HDMI 2.0 motherboards do not support HDR. I've bought a 2200G myself and am looking at getting a future-proof B450 motherboard (with HDR support).

While AMD said HDR should be supported over HDMI 2.0, they also said to contact the specific motherboard manufacturer. I contacted ASRock, and after a longer conversation they contacted their technical department; this is the relevant part of the dialogue:

"Hello,


got feedback from BIOS/Hardware department:

Due to this board HDMI is only 2.0, so there is no support HDR function.

best regards,

ASRock Support"
 

arshavin12

Efficiency Noob
Mar 18, 2018
6
2
I've finally completed my RVZ02 / 2400G / ASUS X470-I build. I have a Samsung KS7000-series TV and can only enable HDR in Windows in the following modes:
8-bit 4:4:4 (RGB)
10- and 12-bit 4:2:0 (YCbCr)

Now, the TV was sold as having an HDR, 1000-nit, 10-bit panel, and it does automatically adjust picture/backlight settings when it receives HDR metadata. Running madVR with HDR passthrough in exclusive fullscreen mode, the TV picks up the metadata and the picture looks fantastic (HDR in Windows is turned off, by the way). The only issue is that you have to exit fullscreen mode to bring up the menu, but I can live with that thanks to keyboard shortcuts. If HDR in Windows is turned on, the picture is too bright and oversaturated, as if the TV and madVR are doubling up the HDR effect.

I have tried changing the madVR settings to convert HDR to SDR and leaving Windows 10's HDR mode on, but the colours seem far too bright and unrealistic (unless somebody knows of good settings for this). Samsung's own processing of HDR appears more natural, even though it only seems to kick in at 8-bit 4:4:4 and not at 10/12-bit 4:2:0. So it's probably not a true 10-bit panel, and the TV possibly processes the HDR in software in fullscreen mode, provided the source has the correct HDR metadata. Maybe someone with a true 10-bit panel could confirm whether HDMI 1.4/2.0 can output the full RGB 4:4:4 range with HDR.
 

arshavin12

Efficiency Noob
Mar 18, 2018
6
2
I've just been doing a bit more research into this and came across a very good article:
http://community.cedia.net/blogs/david-meyer/2018/05/16/hdmi-data-rates-for-4k-hdr

Using HDMI 2.0, HDR can be displayed at 4K 60Hz, but only with 4:2:0 or 4:2:2 subsampling. To achieve it with 4:4:4 sampling, we will need to wait for the greater bandwidth of HDMI 2.1. So the Samsung surely is a 10-bit panel; however, I'm still trying to work out why the Radeon panel doesn't let me select 10-bit at 4:2:2, only at 4:2:0. Can TV panels be manufactured with just one of those two chroma samplings? I'm using an HDMI 2.0 cable specified at 18 Gbps.
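To make that concrete, here is a rough bandwidth sketch (my own simplification, not from the linked article): it assumes the standard 594 MHz pixel clock for 4K60 with blanking, and treats HDMI 2.0's usable video payload as 18 Gbps × 8/10 = 14.4 Gbps after TMDS 8b/10b encoding. The function names are just for illustration.

```python
# Rough check of which 4K60 formats fit in HDMI 2.0's payload budget.
# Assumptions: 594 MHz pixel clock (CTA-861 4K60 timing, blanking
# included); usable payload = 18 Gbps * 8/10 = 14.4 Gbps (8b/10b).

PIXEL_CLOCK_HZ = 594e6
PAYLOAD_LIMIT_BPS = 14.4e9

# Average samples carried per pixel for each chroma scheme.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def payload_bps(bit_depth, subsampling):
    """Video payload rate for a 4K60 signal in the given format."""
    return PIXEL_CLOCK_HZ * bit_depth * SAMPLES_PER_PIXEL[subsampling]

def fits_hdmi20(bit_depth, subsampling):
    return payload_bps(bit_depth, subsampling) <= PAYLOAD_LIMIT_BPS

for depth in (8, 10, 12):
    for sub in ("4:4:4", "4:2:2", "4:2:0"):
        gbps = payload_bps(depth, sub) / 1e9
        verdict = "OK" if fits_hdmi20(depth, sub) else "too much"
        print(f"{depth:2d}-bit {sub}: {gbps:5.2f} Gbps  {verdict}")
```

By this back-of-the-envelope maths, 8-bit 4:4:4 squeaks in at about 14.26 Gbps while 10-bit 4:4:4 needs about 17.82 Gbps, so it has to wait for HDMI 2.1. Interestingly, 10-bit 4:2:2 would fit bandwidth-wise; as far as I understand, HDMI carries 4:2:2 in a fixed 12-bit container, which may be why the Radeon panel never offers 10-bit 4:2:2 as a separate option.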
 

irrelevantspechhacks

Efficiency Noob
Sep 15, 2018
5
0
Having read all six pages, it looks like the APU may well be capable of decoding high-bitrate HDR content, but the surrounding technology (TV, motherboard...) is just not ready to transfer HDR 4K @ 60fps.

Because, basically, even if the player decodes your HDR 4K 60fps content and the TV is showing it, that doesn't mean your TV actually received the content in HDR / without subsampling.

Since most movies are still around 30fps anyway, what I actually wonder is whether there is any setup (one of yours) where 4:4:4 HDR (10-bit) 4K @ 24fps is possible.

That is basically what most UHD movies are (either real HDR or just 10/12-bit colour), so roughly the same kind of bitrate would be needed.

But for now, with the specs you are seeing in the settings, it looks like chroma subsampling is ALWAYS active for 4K 10-bit content.

That is what makes me wonder whether it isn't worth waiting for newer versions of HDMI to become more standard, and so maybe for new motherboards.

Just to remind everyone: a $100 Android box can do the same as your complete PC build with Ryzen, i.e. play HDR 10-bit 4K, thanks to its HDMI 2.0a output.

It looks like zero mini-ITX boards (even ATX) do HDMI 2.0a/b. The ASUS Strix one mentioned is tricky: it says "HDMI 2.0b ready" but still says the output is HDMI 2.0. This is not clear, and it comes at a premium that could buy almost three 4K HDR HDMI 2.0a Android TV boxes.
Of course it would be a real PC, and that's nice. I'm just saying that a $600 config with AMD's latest APU seems to struggle for correct 4K HDR output, while a $100 Android box does it just fine.
 

Choidebu

"Banned"
Aug 16, 2017
1,196
1,204
just saying that 600$ config with amd last apu etc seems to struggle for correct 4KHDR output while android box at 100$ do it just fine.
Yeah, IMO it's just that the demand for 4K in PCs wasn't big enough.
In the ARM SoC world it is easy enough to embed a 4K H.265 decoding block because the architecture permits it. In x86, things move rather slowly: vendors have to weigh silicon space because they have such a big instruction set and decades of backward compatibility to consider. Any new feature must be a new instruction set extension, and after that it is still up to software developers to make use of that specific extension, and perhaps also to provide a software fallback when it doesn't exist. See, even x86 software devs have to consider backwards compatibility.

This is the benefit of RISC: you can have a pretty generic SoC, and you can also have a highly specialised one.

I don't have much expertise in this area, but it would be nice if we had a solution combining the two: one part holds and streams the data, and the other decodes and displays it.
 

ncohafmuta

Minimal Tinkerer
Feb 19, 2018
4
4
Since most movies are still around 30fps anyway, what I actually wonder is whether there is any setup (one of yours) where 4:4:4 HDR (10-bit) 4K @ 24fps is possible.

That is basically what most UHD movies are (either real HDR or just 10/12-bit colour), so roughly the same kind of bitrate would be needed.

Unless you're a gamer, there's no point to 4:4:4 right now. Nobody distributes in it; even UHD Blu-ray is only 4:2:0.
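A quick sample-count sketch shows what 4:2:0 actually saves (my own illustration; the function name is hypothetical). Per 2×2 block of pixels, 4:4:4 keeps chroma for all four pixels, 4:2:2 for two, and 4:2:0 for just one:

```python
# Chroma sample counts for one 3840x2160 frame: one full-resolution
# luma plane (Y) plus two chroma planes (Cb, Cr) whose resolution
# depends on the subsampling scheme.
W, H = 3840, 2160

CHROMA_DIVISORS = {
    "4:4:4": (1, 1),  # full chroma resolution
    "4:2:2": (2, 1),  # chroma halved horizontally
    "4:2:0": (2, 2),  # chroma halved both ways
}

def samples_per_frame(subsampling):
    """Total stored samples (Y + Cb + Cr) for one frame."""
    dx, dy = CHROMA_DIVISORS[subsampling]
    luma = W * H
    chroma = 2 * (W // dx) * (H // dy)
    return luma + chroma

for scheme in CHROMA_DIVISORS:
    total = samples_per_frame(scheme)
    share = total / samples_per_frame("4:4:4")
    print(f"{scheme}: {total:,} samples ({share:.0%} of 4:4:4)")
```

So UHD Blu-ray's 4:2:0 stores exactly half the raw samples of full 4:4:4, which is a big part of why 4K HDR fits through existing pipes at all.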
 

irrelevantspechhacks

Efficiency Noob
Sep 15, 2018
5
0
Unless you're a gamer, there's no point to 4:4:4 right now. Nobody distributes in it; even UHD Blu-ray is only 4:2:0.

Yeah, since that post I now realise my post was confused. I don't remember exactly which, but even the highest-end movies are 4:2:0 or 4:2:2.
Actually, on a side note for those who are into detailed tech (not important to know): even UHD movies are often not real 4K. They are usually 1080p; much of the "4K" benefit is in the brightness information, that kind of thing.

Anyway, I was looking again at building a small form factor PC like you guys did (tired of my MacBook Pro overheating even when running Chrome JavaScript games).

And it looks like there is still no well-known, all-round answer to the specific question:

Is there a mini-ITX AM4 motherboard that supports HDR (so HDMI 2.0a minimum)? Same question for HDR+HLG (so HDMI 2.0b)?

Because in this thread some say ASRock said it doesn't, and some say ASRock said it depends ONLY on the CPU.

Maybe going from HDMI 2.0 to HDMI 2.0a or b is just a software update (apparently that is even the case for HDMI 2.1, which is not released yet). Or maybe the motherboard itself has to be compatible (software update or not).

I just want to see my options for building a small form factor PC / smart TV / HTPC multi-use system. And I really want to know beforehand whether, if I buy an HDR TV in the future, the build will be compatible or not. There is too much good HDR content for me not to consider it before buying. A $150-$200 motherboard is not something I want to replace in a year or two when I get an HDR TV/monitor.
 

Legion

Airflow Optimizer
Nov 22, 2017
357
386
HDR on Win 10 is still a mess; right now I just wouldn't bother building a media machine on a Ryzen APU (or anything else) for use with HDR.
I've been through several AM4 ITX motherboards (ASRock, Gigabyte and MSI), and none of them handle HDR with a Ryzen APU on a 4K HDR TV.
The ASRock didn't even work properly at 4K 60Hz (I RMA'd one board thinking it was a fault, and the replacement did exactly the same thing).
I've tried several APUs to eliminate that, and multiple RAM kits.


I'm currently using the Gigabyte B450-I-AORUS-PRO.
It works fine running an LG OLED TV at 4K 60Hz (HDR is still a mess).

If you want HDR that just works, get an NVIDIA Shield TV, or just use the apps on the TV you buy for Netflix etc.
https://www.nvidia.com/en-gb/shield/shield-tv/

Or just build the Ryzen system for light gaming and 4K 60Hz playback and forget about HDR (for now).
You could also investigate an Intel system using a Kaby Lake (or newer) CPU and a low-end NVIDIA card (such as a GTX 1050).

I just got so fed up with HDR on Win 10 that I've completely given up on it until it works properly!
 

Stevo_

Master of Cramming
Jul 2, 2015
449
304
Yeah IMO it's just the demand for 4K in PCs that wasn't big enough.

It seems like the latest rounds of Intel UHD iGPUs handle VP9 and H.265 quite well and are only limited by the motherboards; I still don't see many with an HDMI 2.0x rating. On the other hand, the ASRock J5005 embedded mobo I have runs 4K60 HEVC seamlessly, while the GT 1030 (rated HDMI 2.0b, bought just for the resolution, not for 60Hz) in my 3770K-based system struggles at 4K60 (same on the DP port) using the same demo videos from 4kmedia.org (the lower-FPS and HDR demos are OK). Benchmark tests at Phoronix indicated the 1030 would possibly tap out around 54Hz, and that seems to be the case, though it could still be a kernel/driver issue. Both systems are on the same kernel and video stack, apart from the NVIDIA and Nouveau hardware drivers.

I think the adoption rate for motherboards has been slow due to demand, which only now seems to be picking up steam. I'm watching this thread as I'm looking hard at Raven Ridge, or its follow-on, for a main PC replacement. Linux kernel 5, AMDGPU, and Mesa are just about there for drop-in support. Phoronix also reported yet another Intel speculative execution vulnerability: https://www.phoronix.com/scan.php?page=news_item&px=Intel-SPOILER-Attack
 

ooztuncer

What's an ITX?
Mar 10, 2019
1
1
Hi,
You can add the Biostar X370GT5-NF to the list. Please see the screenshot.

My rig consists of the aforementioned mobo > Ryzen 2200G > Monoprice HDMI 2.0 cable > Denon S730H > Monoprice HDMI 2.0 cable > Optoma UHD300X projector.

I installed a trial version of JRiver with the Adrenalin 19.3.1 GPU driver. I set HDR off and the 10-bit 4:2:0 colour space under the Windows display settings, and selected passing the metadata through to the projector under madVR (so that the projector receives the HDR metadata and displays it correctly).

Thanks,

Screenshot: https://ibb.co/NSJ7dVK
 

SunMount3r

Case Bender
Aug 15, 2019
2
0
Hello!

Boards like the MSI B350M MORTAR are on the confirmed list, but I can't see any B450M boards listed. For the MSI B450M Mortar (and its latest MAX version), does anyone have any info regarding HDMI 2.0a support?

I'm about to build an HTPC on a Raven Ridge APU, possibly with a 2400G or 3400G, and 4K / 60Hz and HDR support would be essential features.

Thanks!
 

Hifihedgehog

Editor-in-chief of SFFPC.review
Original poster
May 3, 2016
459
408
www.sffpc.review
Due to this board HDMI is only 2.0, so there is no support HDR function.

ASRock Support
Barring extremely low-quality traces on the motherboard (and over the course of this thread's history, I have seen no such occurrence), full HDMI 2.0 specification support, including HDR, should work flawlessly on any HDMI-equipped B350 or X370 motherboard.