By Ian Nock, July 2017
The Ultra HD Forum booth at NAB17 demonstrated Ultra HD, and particularly HDR, technologies that are deployable today. For the Forum, HDR has now matured to the point where live services can be deployed with today’s production technology. The Forum also demonstrated several developing technologies that will mature over the next 12 to 18 months, building on today’s technology to further enhance the consumer video experience.
The Forum has been working on its UHD guidelines since 2015. The current version (1.4) focuses on what we term “Phase A” technologies, which are available to deploy in 2016/2017 and correspond to what some standards organisations refer to as “Phase 1”. Phase A covers 4K resolution at 50 or 60fps, HDR based on static metadata or HLG, Wide Color Gamut, and multichannel sound solutions. Future Phase B technologies include further developments in HDR such as dynamic metadata, High Frame Rate (>100fps) and object-based audio solutions such as AC-4, DTS:X and MPEG-H.
The booth had five thematic Pods and this is the second of five blogs about those Pods and their demos. I’ll describe the lessons we learned and how the demos were received at NAB17.
This second pod combined two different aspects: mixing SDR/HDR, and the quality of UHD HDR.
Mixing SDR/HDR
Using HLG, as demonstrated in the first pod, is not the only way of providing a universal service to consumers. With the right client capabilities in the STB or TV, the operator or broadcaster could broadcast in all the different formats and allow the STB to dynamically convert the content to whatever suits the display.
An STB connected to an HDR10-capable display can process SDR video, applying color conversion and tone expansion so that the content is presented ‘up-converted’ to the TV.
Additionally, if the TV connected to the STB supports only HDR10 and the broadcaster distributes some services in HLG10, then the STB, knowing that the display supports only HDR10 or SDR, could convert the HLG10 service into HDR10 rather than falling back to SDR.
Similarly, where the operator transmits in PQ10 and the TV is SDR-only, the STB can down-convert/tone-map the content to SDR.
These approaches all aim to present content at the highest possible quality to every consumer, using detection capabilities in the STB to identify the content format and either pass it through or convert it accordingly. They adapt both to the incoming content type and to the capabilities of the consumer’s TV or other device.
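The STB decision logic described above can be sketched as a small selection function. This is purely an illustrative assumption, not a real STB or SoC API: the function names, the format labels and the table of supported conversions are all hypothetical, chosen to mirror the pass-through and conversion scenarios discussed in this post.

```python
# Hypothetical sketch of the STB format-selection logic; names and the
# conversion table are illustrative assumptions, not a real STB API.

# Conversions the (hypothetical) SoC can perform, as (source, target) pairs.
SUPPORTED_CONVERSIONS = {
    ("SDR", "HDR10"),    # up-convert: color conversion + tone expansion
    ("HLG10", "HDR10"),  # transfer-function conversion
    ("PQ10", "SDR"),     # down-convert / tone-map
    ("HLG10", "SDR"),
}

# Prefer HDR outputs over SDR when the display allows it.
PREFERENCE = ["HDR10", "HLG10", "SDR"]

def choose_output(content_format, display_formats):
    """Return (output_format, action) for the connected display."""
    for target in PREFERENCE:
        if target not in display_formats:
            continue
        # Pass through when the display natively supports the content format.
        if content_format == target:
            return target, "pass-through"
        # Otherwise convert, if the SoC supports this conversion.
        if (content_format, target) in SUPPORTED_CONVERSIONS:
            return target, "convert"
    # Every display can at least show SDR.
    return "SDR", "convert"
```

Under these assumptions, an HLG10 service reaching an HDR10-only display yields `("HDR10", "convert")`, while a PQ10 service reaching an SDR-only display is tone-mapped down to SDR, matching the scenarios above.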
Operators could also harmonise the delivery of their content in the headend: a channel containing a mix of formats (such as SDR adverts inserted into HDR content) can be converted into separate simulcast SDR and HDR services, regardless of the source content format.
These scenarios were reproduced on Pod 2, demonstrated using Huawei STBs and pre-processed content produced by Harmonic and BBright, all members of the Forum. Shifting this complexity to the STB is a big step, but it is a feasible approach for bringing a universal service to the consumer’s display, as long as the SoC supports the necessary capabilities. Unfortunately, many early deployment chipsets have limited capabilities in this respect, so not every operator can take this approach.
Quality of UHD HDR
Most people think of UHD content as being in 4K resolution; however, 4K is not an intrinsic part of UHD’s definition. The Ultra HD Forum Guidelines define UHD as extending down to 1080p, a resolution that some operators and broadcasters are considering when they are bandwidth restricted.
In addition, mobile devices make use of a wide variety of profiles to cope with bandwidth variation, so consumers will need access to lower-resolution content profiles. It was with this in mind that NeuLion demonstrated Adaptive Bit Rate (ABR) delivery of UHD HDR using an app on a Sony UHD display.
This fully functional demonstration allowed us to manually select each of the profiles to show how HDR improves the perceptual quality of the content at any resolution, and to show the quality variations from 4K resolution at 60fps all the way down to 640x360p60 in HDR10. We believe this was a world first, a never-before-demonstrated scenario, as we quipped to visitors. They found that HDR remained significant to the perception of quality all the way down to the lowest resolution, clearly showing that content at every resolution was improved by HDR.
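A profile ladder like the one in the demo can be sketched as follows. Only the endpoints (4K at 60fps and 640x360p60, all in HDR10) come from this post; the intermediate rungs and all the bitrates are hypothetical assumptions for illustration, and the selection function is a generic ABR sketch rather than NeuLion’s actual implementation.

```python
# Illustrative UHD HDR ABR ladder, ordered best-first.
# Endpoints match the demo (2160p60 down to 640x360p60, all HDR10);
# intermediate rungs and bitrates are hypothetical assumptions.
LADDER = [  # (width, height, fps, bitrate_mbps)
    (3840, 2160, 60, 18.0),
    (2560, 1440, 60, 10.0),
    (1920, 1080, 60, 6.0),
    (1280, 720, 60, 3.5),
    (960, 540, 60, 2.0),
    (640, 360, 60, 1.0),
]

def select_profile(available_mbps):
    """Pick the highest-bitrate profile that fits the measured bandwidth."""
    for profile in LADDER:  # ladder is ordered best-first
        if profile[3] <= available_mbps:
            return profile
    # Below the lowest rung, take the 360p profile anyway.
    return LADDER[-1]
```

The point the demo made is that HDR10 is carried on every rung of such a ladder, so the perceptual benefit of HDR survives even when bandwidth forces a drop to the lowest resolution.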
Our previous blog looked at Operator HDR. Our next blog in this series will look at the third demonstration pod… the near future with Dynamic Metadata.