
Photographers' views on HDR on TVs (Dolby Vision etc)

2-12-2019 06:47:30
I recently switched my 1080p LCD for one of last year's 4K OLEDs, so I've only recently seen 4K and HDR (High Dynamic Range) content in real life.

My understanding was that the improvement from 1080p to 4K was modest, and the real improvement was HDR over Standard Dynamic Range. I watched the Blu-ray of Interstellar (1080p SDR, upscaled to 4K) and it looked every bit as incredible as Planet Earth II on Netflix (4K SDR).

My iTunes versions of the Nolan Batman movies are all in 4K HDR (free upgrades, thanks Apple!), and I've got to say I'm not too impressed with the HDR part. To me, it looks like someone just dialled up the saturation and contrast sliders on a regular SDR broadcast. Not a very appealing image at all, and it made me look to see if I could turn it off, as changing the picture settings on the telly to compensate makes any non-HDR content look washed out and flat. It's not the increase in specular highlight brightness that irks, but the OTT saturation.

My telly isn't calibrated, and these are only my early impressions, but I wondered whether the appeal of HDR movies is similar to when photographers first started pushing the unnatural limits of HDR landscapes, where people (in the wider sense) gravitate to the most colourful images without considering accuracy or subtlety of tone. Are photographers more sensitive to this boost?

Have you seen, and do you like HDR content?

2-12-2019 06:47:32
I've not gone the 4K route on my TV yet. The motion resolution of 4K still doesn't beat that of my plasma, and there are other motion issues yet to be resolved, so I don't feel any urgency to change.

I've not been a big fan of the HDR content I have seen but, to be fair, this has been on uncalibrated TVs so far. It does seem to share the unnatural look of overdone HDR photos, but I don't know how much calibration would alleviate that.

Either way I'm not looking to replace my TV anytime soon, at least until someone else's setup makes me drool my way to AV GAS.

2-12-2019 06:47:33
I would just like to point out that you can't really see actual HDR photos. What you are talking about is the tonemapping of an HDR image down to SDR for display on a screen. You can do the tonemapping in such a way that it looks 'garish' or 'unnatural', or you can do it in such a way that it looks natural and people don't really notice.

Author | 2-12-2019 06:47:34
I'm not certain I understand your point. Are you suggesting HDR photographs aren't actually HDR because they're viewed on an SDR screen? Interesting.

Photo HDR: bracketing multiple images, or recovering highlights/shadows from a single image, to display detail from areas otherwise beyond the capabilities of the sensor and more accurately reflect what the human eye would see. Typically measured in "stops" of light.

Video HDR: increases the range of colours and brightness that can be resolved by a display. Typically measured in "nits".
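
Stops and nits are really two scales for the same underlying thing: the ratio between the brightest and darkest luminance you can represent. As a rough back-of-envelope sketch (the figures here are my own example numbers, not from either spec):

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops (doublings of luminance)."""
    return math.log2(peak_nits / black_nits)

# Illustrative numbers only: a 1000-nit HDR display with a 0.005-nit black level
print(round(dynamic_range_stops(1000, 0.005), 1))   # ~17.6 stops
# versus a 100-nit SDR reference with a 0.1-nit black level
print(round(dynamic_range_stops(100, 0.1), 1))      # ~10.0 stops
```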

High-dynamic-range imaging - Wikipedia

https://www.dolby.com/us/en/technologies/dolby-vision/dolby-vision-white-paper.pdf

From the Dolby white paper linked above:
"The maximum brightness for broadcast TV or Blu-ray discs is 100 nits. But modern TVs often have 300 to 500 nits maximum brightness, so TV manufacturers stretch the brightness of the content to try to use the capabilities of the display. This distorts the images. And because each manufacturer stretches the output differently, every viewer will experience a movie, TV show, or game in a different and unpredictable way."

So there seems to be a fundamental difference between photo and video HDR. Strange, then, that the presentation of the film I watched (Batman Begins) had problems I associate with poorly done HDR photos: exaggerated contrast and unnaturally saturated colours. The increase in brightness was good, but I found the colour issues distracting. Maybe it was just that film? I really need to watch more HDR films!
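
For what it's worth, I think the "stretching" the Dolby paper complains about comes from SDR being a relative format, while PQ-based HDR (HDR10, Dolby Vision) encodes absolute luminance in nits. A rough sketch of the standard ST 2084 encoding curve (the constants are from the standard; the comments are my own wording), just to show what "measured in nits" means in practice:

```python
# SMPTE ST 2084 (PQ) encoding used by HDR10/Dolby Vision.
# It maps an absolute luminance in nits to a 0-1 signal value, which is why,
# in principle, a mastered highlight means the same thing on every display
# instead of each TV stretching 100-nit-referenced SDR content differently.

M1 = 2610 / 16384           # PQ constants from ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance (0-10,000 nits) -> PQ signal (0-1)."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

for level in (0.1, 1, 100, 1000, 10000):
    print(f"{level:>7} nits -> PQ signal {pq_encode(level):.3f}")
# 100 nits (the old SDR peak) lands around 0.51, leaving the top half of the
# signal range for highlights all the way up to 10,000 nits.
```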

Author | 2-12-2019 06:47:35
Yeah, I only went to 4K because my last telly gave up the ghost. More than the increased resolution (I would struggle to tell the difference between a good 1080p and 4K image at 55", I think), the true black levels of an emissive display have been incredible and made me wish I had switched sooner!

I never owned a plasma, but the faster response times of OLED, and I guess more powerful processing in more modern screens, mean I'm seeing much better motion handling than on my previous LCD (no smearing and reduced judder), but this is an area where personal perception makes a big difference. Some people abhor the soap opera effect; others love it! I'm blown away by the picture quality overall, I just can't get to grips with HDR yet.

2-12-2019 06:47:35
Any image displayed on an SDR screen is SDR, as HDR cannot be displayed on SDR devices.

Any image saved as a JPEG is not HDR; JPEG cannot hold HDR imagery. Therefore pretty much all 'HDR' images you will see on the web aren't really HDR, though they will have been created with HDR somewhere in the workflow.

There are HDR image formats (.hdr, 32-bit floating-point TIFF, and DNG if you are using Lightroom/ACR), but there is no way of displaying them in HDR. At the moment HDR on computers is limited to the video realm, and only if you have compatible software, OS, video card, and display.

The HDR photography workflow is to take several images of the scene at different exposures so the entire dynamic range is captured, merge these photos together to create an HDR image, then tonemap the HDR image into an SDR image to produce a usable output. It is during the tonemapping process that you can go for the naturalistic look or the more extreme grungy/surreal looks.
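
For anyone who wants to see it concretely, that merge-then-tonemap workflow looks roughly like this in OpenCV (a minimal sketch only; the file names and exposure times are made up, and the intermediate .hdr file is the 32-bit float stage that a JPEG can't hold):

```python
import cv2
import numpy as np

# Hypothetical bracketed exposures of the same scene (example file names)
files = ["scene_-2ev.jpg", "scene_0ev.jpg", "scene_+2ev.jpg"]
exposure_times = np.array([1/250, 1/60, 1/15], dtype=np.float32)  # seconds

images = [cv2.imread(f) for f in files]

# 1. Merge the brackets into a single 32-bit float HDR radiance map
merge = cv2.createMergeDebevec()
hdr = merge.process(images, times=exposure_times)
cv2.imwrite("scene.hdr", hdr)          # Radiance .hdr keeps the full range

# 2. Tonemap the HDR image back down to SDR for display / JPEG
tonemap = cv2.createTonemapReinhard(gamma=2.2)    # a 'natural' looking operator
ldr = tonemap.process(hdr)                        # float values roughly 0-1
cv2.imwrite("scene_tonemapped.jpg",
            np.clip(ldr * 255, 0, 255).astype("uint8"))
```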

When people talk about single-image HDR they aren't really using HDR, as one image has its dynamic range fixed when the shutter button was pressed; it can't be expanded later.

Author | 2-12-2019 06:47:36
I don't disagree that you may be technically correct, but IMO it doesn't make any practical sense to say HDR photography can't be displayed on an SDR display, or that no JPEGs are HDR. It's pretty much universally accepted that where the steps to include more dynamic range than can ordinarily be captured in one exposure have been followed, it is an HDR image, even where it's saved as a JPEG and displayed on a VA panel - technical accuracy be damned.

Photo HDR vs. video HDR discussion aside, I watched The Dark Knight Rises in 4K HDR and found it much more palatable than Batman Begins - it looked quite beautiful. But then I tried an HDR PS4 game and felt like my eyeballs were melting out of my face.

2-12-2019 06:47:37
I think the problem is that the term "HDR" had a very specific meaning to digital photographers as a way of combining multiple exposures before "HDR TV" came along.

A very similar thing happened when "HD TV" launched, but the internet had been using "HD" for hard drive for years. I remember lots of confusion about HD recorders - were they hard disk recorders (SD or perhaps HD) or specifically "high definition recorders"?
And let's not start on "HD Ready"...

As TVs and movies are massively more popular than photography, I expect "HDR" in the TV/display sense will stomp all over the photographers' term. Done well it can be brilliant, but it can also make you want to close your eyes and run away screaming.

2-12-2019 06:47:38
I have my Photos synced with my Apple TV, and also my MacBook and iPhone. All connect to my LG OLED, which recognises an HDR signal and switches its settings. That way I can display HDR photos.

2-12-2019 06:47:39
I'm in both camps: a photography enthusiast and a 4K HDR fan. I think the HDR in the latest films (be it 4K disc, Netflix, Amazon Prime or 'Dynasties' on BBC iPlayer) is absolutely stunning - see it to believe it! Don't forget 'Wide Color Gamut' (WCG) too. Rec.2020 just blows the archaic Rec.709 away, which is what we STILL get when viewing JPEGs on a 4K HDR TV!!!
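
For a rough feel of how much bigger Rec.2020 is, you can compare the area of the two gamut triangles on the CIE xy chromaticity diagram (a back-of-envelope sketch using the published primaries; xy area is only a crude proxy for perceived gamut size):

```python
# Primaries (CIE 1931 xy chromaticities) from the two standards
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(pts):
    """Shoelace formula for the area of a triangle given its three vertices."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

a709, a2020 = triangle_area(REC709), triangle_area(REC2020)
print(f"Rec.709 area:  {a709:.4f}")
print(f"Rec.2020 area: {a2020:.4f}")
print(f"Rec.2020 covers roughly {a2020 / a709:.1f}x the xy area of Rec.709")
# Prints roughly 1.9x on this crude measure.
```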

Which brings me to a related question.....

Is it possible to view photos in WCG on a current consumer TV? I guess both the source device and the TV would have to support a higher-bit-depth file format like TIFF?

Just imagine, photos with... 'NO BANDING!'

This to me is more important than 'HDR' for viewing photos. I can't believe we're still stuck with only being able to view photos as 8-bit JPEGs on a large-screen TV, with all their limitations and banding issues, whilst video technology - WCG, HDR and all - is progressing. Yet stills have been able to store WCG for years, well before Rec.2020 came along.

So glad I've still got all my photos in RAW and TIFF formats that hopefully could one day be unleashed, WCG and all!

Or is this already possible and am I missing something?
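
On the banding point, the benefit of going beyond 8 bits is easy to demonstrate: quantise a smooth gradient at different bit depths and count how many distinct steps survive (a toy sketch, nothing to do with any particular TV or file format):

```python
import numpy as np

# A smooth horizontal gradient, e.g. a sky, sampled across a 4K-wide frame
gradient = np.linspace(0.0, 1.0, 3840)

for bits in (8, 10, 12):
    levels = 2 ** bits
    quantised = np.round(gradient * (levels - 1))
    steps = len(np.unique(quantised))
    print(f"{bits:>2}-bit: {steps} distinct levels across the gradient")

# 8-bit tops out at 256 steps over the whole ramp, so each visible band is
# roughly 15 pixels wide; 10-bit gives 1024 steps, and 12-bit offers more
# levels than the frame has pixels, so banding effectively disappears
# (even before any dithering is applied).
```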