SportTech 2017: Making the move to High Dynamic Range

High Dynamic Range panel, June 7 (L/R): Simon Gauntlett, Dolby; Klaus Weber, Grass Valley; Peter Schut, Axon Digital Design; Mark Grinyer, Sony Europe

Some people believe that HDR has a more significant impact on the viewer than Ultra HD. However, there are considerable challenges in adopting it for live workflows, as panellists revealed at SportTech 2017 in London on June 7.

HDR is not just wider dynamic range, “we also mean wide colour gamut,” said Peter Schut, CTO of Axon. “You need both to get the ultimate experience.” He bought an HDR UHD set at Christmas “and what was really an eye-opener was what a fantastic picture a company like Netflix could give to the home, with a limited infrastructure, and quite a competitor to us people in the linear television industry.

“What was very obvious is that it was so good that everything else looks broken now and that is, I think, our chance to move on to the next era. 4K, fine. I’m not too passionate about 4K, but 4K is the vehicle to get HDR and wide colour gamut into our homes and we, as an industry providing mainly linear television, we have to go with that beautiful picture and make sure we don’t lose more viewers than we already have.

“So, understanding what’s going on is quite important. It is not easy. It took me quite a while to understand why it’s so different, but in the end, technically, I think we will solve it sooner rather than later. The next step is making sure we produce it in the right way.”

The colours that human vision can see extend all the way from the edge of ultraviolet to the edge of infrared. Unfortunately, Rec. 709, as used for HD, covers only a fairly small triangle of what the visual system can see, so the industry is moving to Rec. 2020, which comes close to a full colour space and allows more light. Older televisions were limited to about 100 nits, whereas HDR is standardised up to 10,000 nits, “massively increasing the volume of information that we still need to fit in, say, a 10-bit infrastructure, and that’s actually where the technical problem comes from,” said Schut.
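Schut’s “fairly small triangle” can be made concrete with a little chromaticity geometry. As an illustrative sketch (in Python, using the published CIE 1931 xy primaries of the two standards), comparing the areas of the Rec. 709 and Rec. 2020 gamut triangles:

```python
# CIE 1931 xy chromaticity coordinates of the R, G, B primaries.
REC_709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(primaries):
    """Shoelace formula for the area of a gamut triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = triangle_area(REC_2020) / triangle_area(REC_709)
print(f"Rec. 2020 triangle is {ratio:.1f}x the xy area of Rec. 709")  # ~1.9x
```

The 2020 triangle works out at roughly 1.9 times the xy area of 709, which is why the industry treats the move as a step change rather than an increment.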

Beyond bottlenecks

“From a technical point of view, HDR and wide colour gamut don’t have to be connected together. There are very good reasons to connect them, and I think, or I hope, there is no question of using them separately, because that creates even more choices, which makes the market even more confused,” said Klaus Weber, Grass Valley, Senior Product Marketing Manager, Cameras.

One bottleneck in HDR is having to squeeze a much wider range of contrast into the 10-bit resolution we have in broadcasting. “So we are now hitting, or getting much closer to, the limits where we see banding effects and so on. Therefore it would be really good if we could go to 12-bit, or maybe even 14-bit.

“Unfortunately the whole infrastructure available on the market today — switchers, servers, everything we use — is limited to 10 bits, so we have to cope with that. Even so, it’s not an ideal situation. Because of it, we have to be very careful when selecting the workflows and the ways we use the different signals together, to use them in the most optimal way and avoid these banding artefacts.”
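Weber’s banding concern is easy to see with some back-of-envelope arithmetic. This is purely illustrative, not how any broadcast standard actually encodes video (perceptual curves such as PQ and HLG exist precisely to avoid it): a naive linear 10-bit quantisation of a 0–10,000 nit range gives steps that are invisible near peak white but enormous near black.

```python
# Illustrative arithmetic only: what a *linear* 10-bit quantisation of
# 0-10,000 nits would look like, to show why perceptual transfer curves
# (PQ, HLG) are used instead.
levels = 2 ** 10              # 1024 code values in a 10-bit signal
step = 10_000 / (levels - 1)  # ~9.8 nits between adjacent codes

# Near black (say 1 nit) one step is ~10x the signal itself: gross banding.
# Near peak white (10,000 nits) the same step is ~0.1%: invisible.
print(f"step = {step:.2f} nits")
print(f"relative step at 1 nit:      {step / 1:.1%}")
print(f"relative step at 10000 nits: {step / 10_000:.3%}")
```

Perceptual transfer curves spend the 1024 codes where the eye needs them, which is why 10-bit HDR is workable at all — if, as Weber says, only just.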

Wide colour gamut is relatively easy to achieve on a camera, but no camera fully covers the 2020 colour space. “We are far beyond 709 with the latest cameras, but if one goes into detail they will realise we do not completely capture 2020. But this is more a theoretical limitation than a practical one,” said Weber.

Lunchtime at The Emirates Stadium, home to Arsenal Football Club

While Rec. 709 and 2020 are essentially broadcast standards, Netflix is able to deliver content that doesn’t necessarily use these standards, because it comes from the film world, said Mark Grinyer, Sony Europe, Head of Sports Business Development. If consumer televisions and companies using internet delivery can jump ahead of what broadcast infrastructure can deliver, broadcasters need to make sure “we capture the best images we can catch for our archives and for our other delivery mediums, because television isn’t the only thing that’s using these images now.”

It’s not a matter of the broadcast industry doing something wrong, but “I think we’re being led down a road of ‘this is how we’ve always done it’, like the BNC argument. We’ve always done it. We’ll hold on to the BNC for a while and we’ll do both things.” However, he thinks there are things to learn from the movie industry, especially from the delivery capability of some of the partners coming in. “There’s a whole lot of competition out there that isn’t worried about this.” They just want to deliver the best picture.

As a television and camera manufacturer, Sony has to support all the standards, “so that as a production company you can offer that to your producers” — whichever mode they want to shoot in, from 2020 to Raw.

Simon Gauntlett, Dolby Laboratories, Director of Imaging Standards and Technology, believes that when talking about the colour and luminance ranges that standards support, what matters is the container you’re delivering rather than the absolute limits of the standards. “The PQ standard goes up to 10,000 nits. There’s absolutely nothing that can capture or display 10,000 nits at this stage, but there’s a container that says ‘here you go, put your values into this container, and when we reproduce it today, and when we use it in 10 years’ time on better technology, you’ll be able to accurately represent what was captured and what was displayed’.

“So, if you take the 2020 colour spectrum, we’re not talking about trying to use colours that are right on the edge of it, although you can. So even though the cameras today can’t quite manage it, you can capture all of the capabilities the cameras do have today within that container, deliver that container, and hopefully eventually deliver that container to the consumer.”
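The 10,000-nit container Gauntlett describes is the PQ curve standardised as SMPTE ST 2084. A minimal sketch of its transfer function, using the constants published in the standard:

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard.
M1 = 2610 / 16384       # 0.1593017578125
M2 = 2523 / 4096 * 128  # 78.84375
C1 = 3424 / 4096        # 0.8359375
C2 = 2413 / 4096 * 32   # 18.8515625
C3 = 2392 / 4096 * 32   # 18.6875

def pq_eotf(code):
    """Non-linear PQ code value in [0, 1] -> luminance in nits (cd/m^2)."""
    p = code ** (1 / M2)
    return 10_000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits):
    """Luminance in nits -> non-linear PQ code value in [0, 1]."""
    y = (nits / 10_000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# Code value 1.0 is the top of the container: 10,000 nits.
print(pq_eotf(1.0))  # 10000.0
```

Nothing today reaches the top of the curve; the container simply reserves headroom for future capture and display technology, exactly as Gauntlett describes.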

Back to black

Dolby Vision delivers HDR complete with dynamic metadata that describes the images “to create the best images possible”. There are still few Dolby Vision theatres in Europe, beyond Dolby’s own facility in London and some in the Netherlands, although more are coming soon; there are many of them in the US and in China.

Session moderator Ken Kerschbaumer, Editorial Director of Sports Video Group, saw Guardians of the Galaxy Vol. 2 in Dolby Vision in New York recently, and said that “it was unbelievable when the screen was dark, the theatre was dark. There wasn’t that grey screen in front of you. It was completely black.” The colours were “so amazing” that he didn’t know whether the movie was created to show off Dolby Vision or whether this was something that’s always been in movies but wasn’t capable of being displayed before.

There is now a lot of experience of creating movies and getting the best out of the standards that are available, something they are now starting to learn in TV, responded Gauntlett. “What we learned from the BT Sport work with the UEFA Champions League final is that we can now use those tools to create a good sports experience that makes the grass look the right colour, makes the white Juventus shirts jump out and shine as they did in the stadium and look right.”

Workflow in the shade

In Sony’s tests with broadcasters and HBS it can see that “we’re at a transition period for the shaders,” said Grinyer. “At the moment the best way to shade is if you’ve chosen to work with a production standard like S-Log3 which is pretty close to as much as our images can catch, so it’s a fair representation of Raw in terms of what they’d be using in a movie.”

However, he advises: “don’t let shaders see the 2020 colour space or the HDR screen, because they get confused. The image isn’t as you’d expect, as using S-Log is quite flat, but it allows you to capture everything.” This will change, but monitors are only just coming on the market that give a fair representation of HDR, and he admitted that Sony’s BVM monitors are probably too expensive to put in front of every shader. “As we stand today, shaders can carry on doing their job as they are. What you do need is someone who’s looking at the HDR, just double-checking and doing a bit of QC on it and is aware of what the differences are.”
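The “flat” look Grinyer describes falls directly out of the log curve. Here is a sketch of the S-Log3 encoding, with constants taken from Sony’s published S-Log3 description — treat it as an illustration rather than a reference implementation:

```python
import math

def slog3_oetf(x):
    """Sony S-Log3: scene reflection (0.18 = mid-grey) -> code value in [0, 1].
    Constants from Sony's published S-Log3 description."""
    if x >= 0.01125:
        return (420 + math.log10((x + 0.01) / 0.19) * 261.5) / 1023
    return (x * (171.2102946929 - 95) / 0.01125 + 95) / 1023

# Mid-grey lands around 41% of the code range, and seven stops above it
# (0.18 * 2**7 = ~23x reflection) still fits below clipping -- which is
# why the unconverted image looks so low-contrast on an SDR monitor.
print(f"mid-grey: {slog3_oetf(0.18):.3f}")
print(f"+7 stops: {slog3_oetf(0.18 * 2**7):.3f}")
```

Because so many stops of highlight are squeezed into the top of the curve, an S-Log3 feed viewed raw looks washed out, even though, as Grinyer says, it captures everything.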

“It also depends on the workflow that has been chosen,” said Weber, “because there are different ways to produce HDR, and especially HDR/SDR-compatible footage, and there is for sure learning required, especially for the camera shaders but also for the people in graphics: how to handle post graphics. Again, that depends a lot on the exact workflow chosen, and each workflow has slightly different requirements.”

On the Champions League Final the majority of the cameras used were HDR cameras, even the Spidercam. BT Sport did a test the week before and it was a real mix of cameras, as there probably will be for most applications, with HDR, SDR and even SD referee cameras. “When you are trying to cut them together there are real challenges in how you then present that to the final user,” said Gauntlett. “The changes in quality and colour are quite noticeable, so you have to be quite careful in how you pull them all together for your final output.”
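For the SDR-into-HDR half of that mixing problem, one widely used convention is ITU-R BT.2408, which places SDR reference white at 203 nits in a PQ signal. Here is a hedged, display-referred sketch, assuming a simple gamma-2.4 SDR display model; real up-converters also handle colour conversion and tone mapping:

```python
# Sketch of mapping an SDR camera feed into a PQ (ST 2084) programme.
# Assumes a gamma-2.4 SDR display model and the ITU-R BT.2408 convention
# of SDR reference white at 203 nits; real up-mappers are more elaborate.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Luminance in nits -> PQ code value in [0, 1]."""
    y = (nits / 10_000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def sdr_to_pq(sdr_code, gamma=2.4, white_nits=203.0):
    """SDR code value in [0, 1] -> PQ code value, SDR white pinned to 203 nits."""
    display_light = white_nits * sdr_code ** gamma
    return pq_encode(display_light)

# SDR peak white comes out near 58% of the PQ range, leaving the rest
# of the container as highlight headroom for the true HDR cameras.
print(f"SDR white -> PQ {sdr_to_pq(1.0):.2f}")
```

A mismatch in this mapping between sources is exactly what makes cuts between HDR, SDR and SD cameras so noticeable if they are not handled carefully.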
