FutureSport Analysis: High Dynamic Range – Separating Hype from Reality

Mark Grinyer, Brian Clark and Tim Borer speaking at FutureSport HDR session, December 2 2015

Under probing guidance from moderator Adrian Pennington, the FutureSport session ‘HDR: Separating Hype from Reality’ went a long way towards disentangling the misconceptions around a very complex technology area. Opening the session, Tim Borer, Lead Engineer at BBC R&D, provided an overview of the standardisation process (ITU-R RG24) and focused on the crux of the issues that need exposing. The HDR solution Hybrid Log-Gamma (HLG) is a “roll our own” effort after the BBC saw the need for a live production standard.

“There is quite a lot of progress, but obviously we need a suite of standards to go from the camera at the production end right through to consumer displays,” said Borer. “Almost all of those are either in place or will be very soon, but not all the issues are resolved.”

HLG was developed in partnership with Japan’s NHK with an eye on the established standards Rec. 601, Rec. 709 and Rec. 2020, and on Japan’s ARIB STD-B67.

“The mother standard comes from the ITU, and that is a difficult one to get through because it has to be by consensus,” said Borer. “HLG has not yet been ratified, but that’s not a showstopper because the next RG24 meeting is in February and we have a preliminary draft of the new recommendation. The standard already in place is the Dolby proposal utilising perceptual quantization (PQ). This is in the Blu-ray spec and it is likely to be part of the ITU standard, but it is best suited to movie and drama production, and less suitable for live production.”
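
The PQ curve Borer refers to is absolute and display-referred: a code value maps directly to a luminance on the screen, peaking at 10,000 cd/m² (nits). As a point of reference (not part of the session), a minimal sketch of the EOTF using the constants published in SMPTE ST 2084:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the standard
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a non-linear PQ code value in [0, 1] to display luminance in nits."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
```

Because the output is an absolute luminance, PQ content is graded against a known reference display, which is why metadata (and a grading environment) comes with it.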

We should expect a dual approach pairing PQ with HLG then, with STD-B67 in the frame as many broadcasters like it. “They tie together in a theoretical framework using the Opto-Optical Transfer Function (OOTF) as the way of making sure they are consistent on a reference monitor,” said Borer.

One big problem is legacy UHD receivers that do not know about HDR. This raises the issue of backward compatibility.

“These are pre-standard receivers, and by the Olympics there will be about 2.6 million,” said Borer. “They will need to display the same signals as HDR sets, so we have signalling put into the DVB standard to allow us to do that.

“And now we are working on UK programme delivery standards. These will be using H.264 rather than HEVC because they are about production and do not need quite such heavy compression. For the DPP standard, we will be back-porting some of the signalling put into HEVC into H.264. We obviously need to indicate that there is an HDR signal going to consumer sets, and that requires signalling over HDMI,” he added.

This led him to the OETF (Opto-Electronic Transfer Function) for which he said HLG is earmarked. Asked to clarify why the BBC proposal is hot to trot, he said: “We take the conventional approach of coding what the camera sees in the signal, while the PQ approach actually codes the picture on a reference display and needs metadata for that. In live environments you don’t have reference displays.”
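
The distinction Borer draws is visible in the curve itself: the HLG OETF maps relative scene light straight to a signal value, with no mastering-display metadata required. A minimal sketch, using the reference constants later standardised (with scene light normalised to [0, 1]) in ITU-R BT.2100:

```python
import math

# HLG OETF constants (BBC/NHK proposal, as standardised in ITU-R BT.2100)
a = 0.17883277
b = 1 - 4 * a
c = 0.5 - a * math.log(4 * a)

def hlg_oetf(scene_light: float) -> float:
    """Map normalised scene light E in [0, 1] to a signal value in [0, 1]."""
    if scene_light <= 1 / 12:
        # square-root (gamma-like) segment for the darker part of the range
        return math.sqrt(3 * scene_light)
    # logarithmic segment compresses the highlights
    return a * math.log(12 * scene_light - b) + c
```

The square-root segment keeps the lower signal range close to a conventional gamma curve, which is what makes HLG broadly compatible with non-HDR displays.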

The world has gone full circle

Brian Clark, Sales Director with NEP Major Events, looked at how wider colour gamut impacts on the final picture as produced by an OB company.

“The practical applications were always how do we get a 2020 colour space, a P3 colour space, and an HD colour space in Rec. 709? People forget there is a high dynamic range and a colour space that applies to this, and how that works,” said Clark. “Broadcast is not one medium. You have to consider at the very point of capturing 4K what standards you capture in and what colour space. If you get that wrong you screw up every (subsequent) colour space.

HDR session line-up (L/R): Nemeth, O’Carroll, Conidaris, Grinyer, Clark and Borer with moderator Pennington

“You cannot rack a camera in Rec. 709 colour space and expect the patches to look great in 2020. Lessons have been learned about how you apply different lookup tables (LUTs). There are different applications for how you would rack a camera, what you need to rack in, and what you need to shade in,” he added. “There is amazing colour space in grading, and the world has gone full circle for me. Years ago I sat with guys using telecine machines to grade pictures, and we are coming back to that.”
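
At its simplest, the gamut mismatch Clark describes is a change of primaries: linear-light Rec. 709 RGB can be re-expressed against the wider Rec. 2020 primaries with a 3×3 matrix. A sketch, using the conversion coefficients published in ITU-R BT.2087:

```python
# Rec. 709 -> Rec. 2020 primary conversion matrix (ITU-R BT.2087),
# applied to linear-light (pre-gamma) RGB values
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """Convert a linear-light Rec. 709 RGB triple to Rec. 2020 primaries."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in M_709_TO_2020)
```

White maps to white, but a fully saturated Rec. 709 red lands well inside the 2020 gamut, which is why a picture racked in one space and displayed in the other shifts in colour.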

As exists now with HD, there will be a button his colleagues push to denote BBC UHD productions. “However, that is not a sustainable commercial model for an OB company,” said Clark. “We need to be able to do what the BBC wants, what Sky wants and what NT Live wants to do for cinema. You cannot think about it in silos. It is a compromise.”

Mark Grinyer, Head of Sports Business Development at Sony Europe, was asked how a vendor develops kit when all the standards are so unsettled.

“The deliverables from a production entity are increasing, and what has been vexing us is that we need to think about what we are recording. There are standards beyond some of those covered – gammas like S-Log3 and the S-Gamut3 colour space, which mean we can almost record in RAW like the movie industry does,” he said. “Then we can look at how we are going to deliver it – in the P3 colour space, or for the BBC using HLG.

“The other vexing question is what goes to the viewfinder of the camera. With some of these gamma curves the cameraman cannot focus on the highlights,” he added. “And what are you going to show on screens in the truck, and how is the truck workflow going to change? Our process is to capture in the best possible way, and work with everyone on the deliverables.”

Craft technique has to be evolved

Christo Conidaris, Regional Sales Director with Quantum, was asked to tackle the implications for HDR storage. He said: “More space is what you are going to require. That is driving two very important points in parallel. First, because you are dealing with more data everyone wants to save more money and that really means utilising different types of technology to reach the right cost of ownership. People are monetising their content so much better so they want to store it for longer.

“Secondly, we are talking about data getting bigger. It becomes more difficult to share that data or to copy it. You have to find ways in which you can share that data across different platforms.”

David O’Carroll, Head of Technology with Presteigne Broadcast Hire, took on the practical aspects of workflow.

“With HDR and the wide colour spaces and the number of deliverables we are trying to achieve with different parameters it is very important we tailor the original,” he said. “A number of vendors are already supporting wider colour space and HDR acquisition, but I actually think a lot of the investment will be spent on less glamorous bits of the chain.”

He had in mind the importance of monitoring in OB trucks. Seeing the various subsets is crucial to knowing that the production teams are delivering against the right standard. He added: “We found that when we moved to 4K, focus suddenly became far more critical and harder to see.”

Grinyer added: “There can be implications in what the camera shaders are doing. It might change their position in the workflow.”

Clark added: “We are talking about moving from four or five stops to 14-15 stops with a camera. That is a big change!”
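
The jump Clark quotes is even more dramatic in linear terms: each stop is a doubling of light, so n stops of dynamic range corresponds to a contrast ratio of 2ⁿ:1.

```python
def contrast_ratio(stops: int) -> int:
    """Contrast ratio (brightest:darkest) for a dynamic range given in stops."""
    return 2 ** stops

# ~5 stops (conventional video) -> 32:1
# ~14 stops (HDR capture)       -> 16384:1
```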

Borer commented: “Because HDR pictures look significantly different, there is quite a lot of production and craft technique that has to be evolved.”

Breaking any form of automated workflow

John Nemeth, EMEA VP of Sales for Elemental, added: “It is also going to have to be dynamic, and that is the big challenge.” He continued: “We have done demonstrations with Technicolor, Dolby and Philips, and we are talking to the BBC. What came out of that was there is a compelling reason for HDR in terms of enhancing the viewing experience, but on different material it did show a need to do it dynamically.

“There are all the discussions around backward compatibility and whether you do single layer or dual layer,” he added. “A dual layer needs twice as much encoding, and while a single layer is less complex in terms of encoding it is more complex in terms of metadata. The closest we have got to a standard at the moment is HDR10, which utilises some of the SMPTE recommendations. Like Dolby, Philips and Technicolor we will be rolling out HDR10 as far as we can very shortly.

“What has been very apparent in doing this is the way metadata has become an issue. It is different between all those proponent companies, and different again for VOD and live,” he added. “The good news is with our software based encoder we can handle all of this. The bad news is that without a standard it breaks any form of automated workflow. Looking at all the different material we have played with, the need for some kind of dynamic metadata or change is apparent.”

In the closing remarks Borer said: “Trying to achieve consistency between panels is one of our big concerns, so we are developing that as part of the standard.”

Further regarding RG24 he added: “Hopefully the ITU standard will be agreed in February and ratified later in the year. It will have the PQ ST 2084 and the HLG aspects as well, and we will tie those together so they look the same on a reference screen.”

Nemeth added: “We can guarantee there will be different subsets of standards and different operational patterns, and it is pretty certain metadata will be involved somewhere. With VOD you can probably think about it being in a separate file within an MXF wrap. With realtime it may well still be in the video essence. Nobody has mentioned captions, graphics, ad insertions and markers, all of which have the potential to break with HDR. We need to think about all of these within a standard as well.”
