
PC to HDR TV settings help


2-12-2019 23:06:32
I just bought an HDR10 TV that I have my media/gaming PC plugged into. I am familiar with all the color settings; some look great and others don't. I would like to know what the recommended settings for video output to the TV are.

1) RGB (16-235) limited 8-bit (doesn't look amazing)
2) RGB (0-255) full 8-bit (looks the best in my opinion, but it isn't 10-bit, and when set to HDR, Windows reports HDR mode 8-bit dither)
3) YCbCr 4:2:2 (16-235) limited 10-bit (from my understanding, this is what true HDR content uses, but the picture is not very pleasing on the desktop, and it seems the limited color range is graying out blacks while whites aren't as bright)

In games, I notice a difference: RGB full range looks the best. In movies, I don't really notice a huge difference between RGB full 8-bit dither and 4:2:2 limited 10-bit.

I don't like the idea that 4:2:2 shares chroma between pixels to save bandwidth, but the difference is negligible to my eyes. My initial thought from these findings is to just set RGB full 8-bit dither, but I don't want to miss out on the full capabilities of HDR in movies or games that support it.
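For anyone unfamiliar with the chroma sharing, here is a minimal numpy sketch of what 4:2:2 subsampling does (illustrative only, not how a GPU actually implements it):

```python
import numpy as np

# Sketch of 4:2:2 chroma subsampling: luma (Y) is kept for every pixel,
# while the two chroma channels (Cb, Cr) are averaged across each
# horizontal pair of pixels, halving their resolution (and bandwidth).

def subsample_422(ycbcr):
    """ycbcr: (H, W, 3) float array with W even."""
    y = ycbcr[:, :, 0]
    cb = (ycbcr[:, 0::2, 1] + ycbcr[:, 1::2, 1]) / 2
    cr = (ycbcr[:, 0::2, 2] + ycbcr[:, 1::2, 2]) / 2
    return y, cb, cr

def upsample_422(y, cb, cr):
    """Rebuild the image by repeating each chroma sample across two pixels."""
    return np.stack([y, np.repeat(cb, 2, axis=1), np.repeat(cr, 2, axis=1)], axis=-1)

# Two neighbouring pixels with very different chroma end up sharing one value:
pair = np.array([[[120.0, 40.0, 200.0], [120.0, 220.0, 60.0]]])
y, cb, cr = subsample_422(pair)
print(upsample_422(y, cb, cr))  # both pixels now carry Cb=130, Cr=130
```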

Does anyone have any input on recommended settings and why?

2-12-2019 23:06:34
For the desktop, the output from the PC to the TV should be SDR 8-bit RGB 0-255 4:4:4 at 4K 60Hz.

Many TVs require a PC mode to correctly display RGB 4:4:4. In addition, TVs call the 0-255 setting a variety of things, like HDMI Black Level, RGB High, Full, etc. If the TV does not have an Auto mode, it should be set to High/Full manually.
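To see why the wrong black level setting grays out blacks: limited range puts black at code 16 and white at 235, and the TV has to expand that to 0-255. A minimal sketch of the expansion (the exact processing varies by TV):

```python
# If the TV treats a limited-range signal as full range instead of
# expanding it, code 16 displays as dark gray rather than black, and
# code 235 never reaches peak white.

def limited_to_full(v):
    """Expand an 8-bit limited-range code value to full range, clamped to 0-255."""
    out = round((v - 16) * 255 / (235 - 16))
    return max(0, min(255, out))

for v in (16, 126, 235):
    print(v, "->", limited_to_full(v))
# 16 -> 0 (black), 126 -> 128 (roughly mid-gray), 235 -> 255 (white)
```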

The HDMI 2.0 interface does not have the bandwidth for HDR 10-bit RGB 4:4:4 at 4K 60Hz, hence the need for chroma subsampling (YCbCr 4:2:2/4:2:0).
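Rough numbers behind that, assuming the nominal 4K60 pixel clock of 594 MHz (which includes blanking) and HDMI 2.0's 18 Gbit/s TMDS limit with 8b/10b encoding overhead:

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check for 4K 60Hz formats.

PIXEL_CLOCK_HZ = 594e6    # 4K @ 60 Hz CTA-861 timing, incl. blanking
HDMI20_LIMIT_GBPS = 18.0  # raw TMDS bit rate ceiling

def tmds_rate_gbps(samples_per_pixel, bits_per_sample):
    data = PIXEL_CLOCK_HZ * samples_per_pixel * bits_per_sample
    return data * 10 / 8 / 1e9  # 8b/10b TMDS encoding overhead

for name, samples, bits in [
    ("RGB 4:4:4 8-bit   ", 3, 8),
    ("RGB 4:4:4 10-bit  ", 3, 10),
    ("YCbCr 4:2:0 10-bit", 1.5, 10),
]:
    rate = tmds_rate_gbps(samples, bits)
    verdict = "OK" if rate <= HDMI20_LIMIT_GBPS else "exceeds HDMI 2.0"
    print(f"{name}: {rate:5.2f} Gbit/s  {verdict}")
# RGB 8-bit comes in at ~17.82 Gbit/s (just fits); RGB 10-bit needs
# ~22.28 Gbit/s (too much); 4:2:0 10-bit needs only ~11.14 Gbit/s.
```

HDMI also carries 4:2:2 in the same per-pixel budget as 8-bit 4:4:4, which is why 10-bit 4:2:2 still fits at 4K 60Hz.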

The built-in Windows HDR mode just forces everything into HDR with SDR emulation; I would avoid using it.

In addition, HDR support in Windows software is not at all common, and your TV's internal media player is better suited for HDR than anything on Windows itself. Install the Emby or Plex server on the PC and check your TV's app store for their clients; use those to play HDR content stored on the PC with fancy artwork (if you don't care about that, just use Serviio).

For games, just leave it as SDR RGB 0-255 output; only a very small number of games have actual HDR support. The game should detect that your TV is HDR capable and allow enabling the setting. I'm not sure whether it outputs 8-bit HDR or 10-bit HDR, but I found it hard to see a difference vs. the base PS4, which outputs HDR in 8-bit too.

Perhaps 8-bit vs 10-bit HDR matters more for movies than games, but that's just my guess.
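A toy sketch of why 8-bit plus dithering can come close to 10-bit: a shade that falls between two 8-bit codes is rendered as a noisy mix of the neighbouring codes whose average lands on the target, trading banding for fine noise (illustrative only, not how the GPU actually dithers):

```python
import numpy as np

# A flat patch of a shade exactly between 8-bit codes 100 and 101.
# Plain quantization snaps the whole patch to one code (a visible band);
# adding +/- half a step of noise before quantizing spreads the error so
# the patch averages out to the true in-between value.

rng = np.random.default_rng(0)
target = 100.5 / 255
patch = np.full(100_000, target)

def quantize8(x):
    return np.round(x * 255) / 255

plain = quantize8(patch)
dithered = quantize8(patch + rng.uniform(-0.5, 0.5, patch.size) / 255)

print("plain 8-bit mean:   ", plain.mean() * 255)     # 100.0 -> banding
print("dithered 8-bit mean:", dithered.mean() * 255)  # ~100.5 -> on target
```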
