LCVG
NickC

The Official Television and Display Technology Thread - Enter of your own will.....(and leave with a lighter wallet)


HDR10+ is supported by Fox as their HDR of choice, Lionsgate appear to soon be going with both HDR10+ AND Dolby Vision on disc, and Warner, having previously pledged support, may I suspect follow Lionsgate’s lead in slapping both on the same disc. A couple of anime titles Disney distribute in Japan are also HDR10+, but whether that indicates anything for Disney as a whole right now is unknown (they’ve currently ditched support for DV on disc, allegedly due to their dissatisfaction with playback consistency between various TVs and players. Pixar are also said to have been dissatisfied at the encoding level, which may have trickled down to Disney as a whole ceasing to support DV for the time being).

 

As Iain notes, you may not see either flavour of dynamic HDR on projectors. It really depends. Projectors are getting a lot better at HDR, and Panasonic and Oppo’s players having great player-led tone mapping has helped projector handling of HDR a great deal (not to mention Lumagen’s external video processors at the high end). It’s possible we could see HDR10+ on projectors at some point. I’m basing that on the fact that it’s an open standard, whereas Dolby Vision commands control of every device in the chain, something that’s tough to govern at home vs theatres where Dolby install everything themselves, as they want to know everything from screen size, to brightness, to distance from projector...etc. Honestly, nobody knows what the future might be for either with projectors right now. I wouldn’t worry too much. Worst case, get yourself a Panasonic player which, again, plays very nicely with projectors if you use their player-based simulated dynamic tone mapping.


Dolby Vision doesn’t work if you rip your media for playback on a media server, but HDR10 does. Any idea if HDR10+ is any different from its predecessor in that regard?


More insight into the updates on this year’s LG OLED sets:

 

 

 

 

You really have to applaud LG for consistently listening to feedback. Some great little tweaks there, and they have indeed made smooth gradation its own separate control this year.


Well, I've decided to make a 2019 goal: to have a 4K television by early summer/early October (my birthday).  I'll start researching now.


My only hang-up would be the lack of HDR10+. If it’s not down to the tight relationship they have with Dolby, it occurred to me that LG may be against offering it purely because of the bitter rivalry between them and Samsung, who had a large hand in making HDR10+ a thing. I guess we’ll see as the year goes on and more discs get released. Since Panasonic don’t sell their OLED TVs in the US (which is INSANE, frankly) there’s no HDR-agnostic TV in the US this year. I’m really surprised I haven’t seen Vincent, AVS or AV Forums even ask LG reps about their current stance on HDR10+ at CES, especially since LG have a good history of offering as many bells and whistles as possible on their OLED sets.


Vincent seems more impressed with the Sony 8K Z9G series full array backlight Master Drive sets than he was with last year’s Z9F model. The G’s are only coming in 85 and 95 inch sizes though.

 

 

Interesting that he notes these will do HDR10+ (flashes up a note on screen saying it’d need a firmware update). First I have heard of Sony offering it. The specs make no mention of this so I don’t know his exact source.

 


HDTV Test did a very interesting interview about the progress of Quantum Dot technology and its potential application in both OLED and LED-based displays:

 

 

CES 2020 could see some really notable progress in existing display tech.

On 1/8/2019 at 7:39 PM, AlbertA said:

I can’t find much info on the uled stuff yet. Are they using quantum dots as color filters? 

 

If they can make uled cheaper than OLED they’ll have a hit on their hands.

 

An affordable uLED 120” wall (say less than $4k) would start to erode the projector business.

 

On 1/8/2019 at 7:47 PM, Angry the Clown said:

What’s uLED? Isn’t that Hisense’s branding?

 

 

Returning to this. I had missed Hisense's demo at CES of what they are branding ULED XD, which sandwiches a 1080p monochromatic panel with a quantum dot 4K panel to achieve exceptionally high contrast performance...etc. The monochromatic panel controls the luminance, with the demo display allegedly able to hit a 2900nit peak brightness. This isn't new technology, as it's been used in some industry monitors before, but it's the first I've seen of it being implemented for consumer displays:
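A back-of-envelope sketch of why the dual-cell approach is so potent (purely hypothetical figures): light has to pass through both layers, so to a first approximation the two panels' native contrast ratios multiply.

```python
# Hypothetical figures: stacking a monochrome luminance-modulating cell
# behind a colour LCD means black-level light leakage is attenuated by
# both layers, so the native contrast ratios roughly multiply
# (an idealisation that ignores optical losses and alignment issues).

def stacked_contrast(colour_panel_cr: float, mono_panel_cr: float) -> float:
    """Approximate combined contrast ratio of a dual-cell LCD."""
    return colour_panel_cr * mono_panel_cr

# e.g. two ordinary ~3000:1 LCD cells:
print(f"{stacked_contrast(3000, 3000):,.0f}:1")  # → 9,000,000:1
```

Which is the sense in which a second modulating layer behaves like millions of tiny dimming zones.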

 

https://www.engadget.com/2019/01/07/hisense-ces-2019/

 

It's expected to debut to the Chinese market later this year. If it's become more feasible at a consumer level then it would be exciting if, over the next year or two, we find out that the likes of Sony and Panasonic have been putting R&D into it. LCD still has some really interesting places it could go over the next few years.


Oh that's clever - so a bit like having 2 million local dimming zones? Panasonic were talking a year ago about doing a double layer of LCD, weren't they? This is presumably the same thing.

 

I wonder, when signals are typically 4:2:2 anyway, why they didn't do a 1080p colour layer and 4k luminance, rather than vice versa, though? I presume there's a good reason.
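For what it's worth, the logic behind the question can be sketched with raw sample counts: in the 4:2:0 subsampling used for consumer video (4:2:2 is common in HDMI transport), the chroma planes of a 3840x2160 signal are already only 1920x1080, so a full-res luminance layer plus quarter-res colour would match the signal. A minimal sketch, arithmetic only:

```python
# Sample counts per frame for a 3840x2160 signal under common chroma
# subsampling schemes, showing where the detail in the signal lives.

def sample_counts(width: int, height: int, scheme: str):
    """Return (luma_samples, chroma_samples_per_plane)."""
    luma = width * height
    if scheme == "4:4:4":
        chroma = width * height                 # full-resolution chroma
    elif scheme == "4:2:2":
        chroma = (width // 2) * height          # chroma halved horizontally
    elif scheme == "4:2:0":
        chroma = (width // 2) * (height // 2)   # chroma halved both ways
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    return luma, chroma

for s in ("4:4:4", "4:2:2", "4:2:0"):
    print(s, sample_counts(3840, 2160, s))
```

Note that the 4:2:0 chroma plane of a 4K signal works out to exactly 1920x1080 samples, which is presumably the intuition behind asking why the low-res layer isn't the colour one.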

On 1/11/2019 at 4:11 PM, iainl said:

As I understand it, very few projectors have the dynamic range to get much out of HDR of any kind, so it’s not hugely worth worrying about, I believe. 

Wait, so me thinking about, and figuring out, what the future holds for my home theater with regard to a projector purchase that supports HDR is basically fruitless?  Instead I should be targeting an 85 inch TV to replace my 88 inch screen/projector?  That isn't crazy.  That is under consideration in fact.  A little sad to get rid of my setup, but if that is the wisest thing to do, I'm glad to emotionally accept that the future is bright and that brightness will come from a flat panel instead of a projector.

 

Consider that I'm going to get a few more years out of my 1080p Optoma projector.  So 2 years from now, I'll reconsider.  But maybe a BIG flat panel will be the likely outcome.


If you're projecting, you're sitting in a dark room, and you don't need 1000+ nits of brightness searing your eyeballs. So you may be happy to just get the extra crispness of the resolution. But otherwise, you're going to want to look at TV panels, sorry.

 

On the bright side, if you're not getting HDR or a wider colour gamut (low-end "4k" DLP projectors only have Rec. 709 gamut as well) then you can be happy about how Blu-rays are half the price of UHDs as well.

On 1/11/2019 at 7:30 PM, Angry the Clown said:

Does it not rip the DV layer at all or is it just a matter of media server software players not supporting DV? 

 

I missed your reply on this the other day. If I remember correctly when I looked into it, it's something about the way the data streams or is transported. It's something technical about the actual process, not the lack of support. It didn't seem impossible, but the fact no one had figured it out yet was telling.


 

26 minutes ago, iainl said:

then you can be happy about how Blu-rays are half the price of UHDs as well.

You're basically saying "Sorry I knocked all your teeth out, but think of the money you'll save on toothpaste and floss."

6 minutes ago, foogledricks said:

 

You're basically saying "Sorry I knocked all your teeth out, but think of the money you'll save on toothpaste and floss."

 

Fair enough - get saving your pennies for a massive TV, then...

1 hour ago, iainl said:

 

Fair enough - get saving your pennies for a massive TV, then...

I think I'm gonna focus on higher quality ball bearings. It's all ball bearings nowadays.

2 hours ago, foogledricks said:

Wait, so me thinking about, and figuring out, what the future holds for my home theater with regard to a projector purchase that supports HDR is basically fruitless?  Instead I should be targeting an 85 inch TV to replace my 88 inch screen/projector?  That isn't crazy.  That is under consideration in fact.  A little sad to get rid of my setup, but if that is the wisest thing to do, I'm glad to emotionally accept that the future is bright and that brightness will come from a flat panel instead of a projector.

 

Consider that I'm going to get a few more years out of my 1080p Optoma projector.  So 2 years from now, I'll reconsider.  But maybe a BIG flat panel will be the likely outcome.

 

You’d have to ask yourself what you really want, whether the BIG screen experience of having a projector is of greater value vs improved contrast, HDR performance and all the other bells and whistles of a flat panel. I think there’s still an immensely compelling argument to be made for having a great projector over a flat panel. If I had the money and a dedicated light controlled room I wouldn’t hesitate to prioritise a projector over a TV, but great HDR and Wide Colour Gamut capable projectors are still few and may exceed what you’re willing to spend.

 

Iain covered it well, really. HDR is a minefield of variables. Projectors, even dual stacked, just can’t get as bright as flat panels, and short of some miracle breakthrough they never will. Theatrical Dolby Vision for example has a max value of 108nits (no zero missing there. One hundred and eight), that’s what theatrical DV is mastered to. At home, HDR/Dolby Vision can by comparison be mastered for up to 10,000nits, with 4000nits generally being the current accepted max standard for movies (these encoding metadata values vary depending on the studio and mastering monitor they use, but it’s important to note that those are maximum encoding figures baked into the metadata. Blade Runner 2049 for example reads as being encoded as 10,000nit on the US Warner disc, and 4000 on the UK Sony disc, if you bring up the display info on a capable player. In actuality the film’s MaxCLL never exceeds 250nits or thereabouts. Confusing gobbledygook I know, but the point is that while you will hear 1000nits, 4000nits, 10,000...etc thrown around by manufacturers, it’s important to note that not all content hits those peaks, and we are talking about peaks here, not a blanket beginning-to-end brightness value). In short, you can only allow yourself to get so hung up on this issue because we still seem to be a long way away from any display being able to handle whatever HDR content you throw at it optimally.

 

We’ve still not seen consumer TVs reach 4000nit capability yet. LED/LCD displays are putting out anything between 1200 and 2900nits depending on the model, and OLED technology is unlikely to ever significantly exceed the 800nit max range the best models are currently capable of once calibrated (short of some major breakthrough). When displays cannot display the peaks of some content they have to resort to what is known as “tone mapping,” the approach to which varies from manufacturer to manufacturer but is arguably becoming more sophisticated year on year; it’s basically a TV or projector’s way of compensating for max brightness values it can’t display natively. HDR is really not just about max bright and dark values, but about preserving and displaying detail within those values.
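As a toy illustration of what tone mapping does (this is a made-up roll-off curve, not any manufacturer's actual algorithm, and the `knee` fraction is an arbitrary assumption): values the panel can comfortably show pass through unchanged, while out-of-range highlights get compressed toward the panel's peak instead of being clipped to a flat white.

```python
# Toy tone-mapping sketch (NOT any manufacturer's real algorithm):
# content below a knee point passes through; highlights above it are
# rolled off asymptotically toward the display's peak, so detail in
# bright areas is compressed rather than clipped away.

def tone_map(nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Map a content luminance value (nits) onto a display's range."""
    knee_point = knee * display_peak
    if nits <= knee_point:
        return nits                      # within comfortable range: pass through
    excess = nits - knee_point           # how far above the knee the content goes
    headroom = display_peak - knee_point # panel range left above the knee
    return knee_point + headroom * excess / (excess + headroom)

# A 4000-nit highlight shown on an ~800-nit OLED:
print(round(tone_map(4000.0, 800.0)))  # → 789
```

The point of the sketch is that a 4000-nit mastered highlight still lands just under the panel's 800-nit ceiling rather than blowing out, which is exactly the compensation described above.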

 

Consumer projectors can deliver brighter HDR performance than cinemas, as the screen sizes are dramatically smaller than a movie theatre’s and the projectors are closer to the screen, but you’re still looking at maybe 200-300nit peak performance on the best models in a light-controlled environment. Is that a big loss compared to what TVs are capable of? Personally I’d view BT2020/DCI-P3 support and performance in a projector to be of greater importance than peak brightness capabilities. HDR on the best consumer projectors is still very striking indeed, particularly if paired with a decent player that can do internal tone mapping like Oppo or Panasonic’s players, or better yet an external video processor from Lumagen (though the latter doesn’t come cheap). Again, it’s a matter of looking at how much you’re willing to spend and whether the truly big screen projection experience with more limited HDR performance strikes a better balance than a flat panel display.

1 hour ago, Starhawk said:

 

I missed your reply on this the other day. If I remember correctly when I looked into it, it's something about the way the data streams or is transported. It's something technical about the actual process, not the lack of support. It didn't seem impossible, but the fact no one had figured it out yet was telling.

 

Interesting. I never got to the bottom of what makes Dolby Vision more complicated on disc compared to Dolby Vision streaming. There is absolutely a difference, we know that much, as it’s partially why it took so much longer for DV to arrive on disc: the encoder requirements were seemingly very different to those for streaming, and Disney were also allegedly unhappy with issues on the physical media end, hence them dropping support a while back (every chance they will regain confidence of course, and some have claimed it’s less the encoding issues causing concern than the decoding variables, such as the headaches Sony have faced, that put Disney off, as Disney want to ensure consistency for consumers with their products. I’ve no idea what’s true. Some have claimed Pixar were not happy with DV encoding tests that they did and that their concerns spread to Disney as a whole, causing them to abandon it. Coco’s director, when questioned on Twitter, was I believe the source for that information, at least in commenting on why Coco didn’t have Dolby Vision on disc).

5 minutes ago, Angry the Clown said:

 

Interesting. I never got to the bottom of what makes Dolby Vision more complicated on disc, compared to Dolby Vision streaming. There is absolutely a difference, we know that much, as it partially resulted in why it took so much longer for DV to arrive on disc as the encoder requirements were seemingly so much different to encoders for streaming, and Disney also were allegedly unhappy with the issues on the physical media end hence them dropping support a while back (every chance they will regain confidence of course, and some have claimed it’s less the encoding issues causing concern but the decoding variables, such as the headaches Sony have faced, that put Disney off. I’ve no idea what’s true. Some have claimed Pixar were not happy with DV encoding tests and that their concerns spread to Disney as a whole causing them to abandon it. Coco’s director when questioned on Twitter I believe was the source for that information).

 

I started googling it again, and it seems like it has to do with the DV data being on multiple layers that are read at once. This can't be remuxed into a .mkv for example. You can still rip everything, but you can't use it unless you burn a disc I believe.


Right. Yeah, the multiple layers of a disc being the culprit sounds familiar from something I’d read once (as far as the hang-ups delaying DV on disc, when it was always common on streaming, are concerned anyway).

 

I wonder if there will ever come a day where understanding HDR doesn’t make your head hurt and has 100% defined standards. 😀

14 minutes ago, Angry the Clown said:

I wonder if there will ever come a day where understanding HDR doesn’t make your head hurt and has 100% defined standards. 😀

 

That day being when a combination of improvements to panel brightness/contrast and the software that controls that panel mean that HDR10 is good enough without needing the extra metadata crutches of DV or 10+, at a guess.

2 hours ago, iainl said:

 

That day being when a combination of improvements to panel brightness/contrast and the software that controls that panel mean that HDR10 is good enough without needing the extra metadata crutches of DV or 10+, at a guess.

 

Perhaps, yes, but we may be talking heads in jars by that point. Don’t forget it’s one thing for TVs to keep improving on peak luminance, and another for them to keep improving their native coverage of HDR’s bt.2020 max colour volume. We’re sort of lingering around 73% for the latter on some of today’s best displays, though TCL were boasting that they’ve achieved over 90% in prototypes at CES.

 

Whatever rapid improvements we see to displays, I think we can most assuredly expect the release of exclusively static HDR content to fade away, and the future to be various flavours of dynamic metadata based HDR, because of the varying standards of how content can be delivered. HDR10/HDR10+ can go to 4000nits for example, Dolby Vision to 10,000, and HLG to (I think) 5000 if the content creators so choose. Dolby Vision also carries 12bit precision, which tests have shown seems to be of benefit even downconverted to contemporary 10bit panels, so if native 12bit, 10,000nit, 100% bt.2020 displays are going to be the optimal standard then we may be waiting some time.
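The 10bit vs 12bit point can be made concrete with the PQ curve that HDR10 and Dolby Vision are built on. A sketch using the published SMPTE ST 2084 EOTF constants (full-range code values for simplicity, ignoring limited-range video levels): more code values spread over the same 0-10,000 nit range means smaller luminance steps between adjacent codes, hence less risk of visible banding.

```python
# SMPTE ST 2084 (PQ) EOTF, using the published constants. Full-range
# normalised code values are assumed for simplicity; real video
# signals use limited range, but the comparison holds either way.

def pq_eotf(n: float) -> float:
    """Map a normalised PQ code value (0..1) to luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = n ** (1 / m2)
    return (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1) * 10000.0

# Luminance step between two adjacent mid-range code values:
for bits in (10, 12):
    codes = 2 ** bits
    mid = codes // 2
    step = pq_eotf((mid + 1) / (codes - 1)) - pq_eotf(mid / (codes - 1))
    print(f"{bits}-bit: {step:.4f} nits between adjacent codes near mid-range")
```

The 12bit steps come out roughly a quarter the size of the 10bit ones, which is the headroom that dynamic metadata and dithering are otherwise used to paper over.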

 

 

