How the BBC designed UHD HDR for Euro 2020: Islands in the sun

[Image: A screenshot of the BBC’s Euro 2020 virtual reality studio in use]

By Simon Thompson, BBC Research and Development, senior R&D engineer

The last few weeks have been busy with the UEFA European Football Championship. Viewers may have noticed our two major studio complexes, at MediaCity in Salford and at Wembley Stadium in London, as well as a smaller studio in Glasgow. I visited MediaCity and Wembley Stadium to help with ultra-high definition (UHD) high dynamic range (HDR) television production settings.

Here, I’ll look at some of the ‘islands’ that influence workflow design. ‘Islands’ is a term I first came across during the transition from tape-based to file-based production: an island is an area of your production process that does not yet conform to the new process. In this article, we’ll look at some existing standard dynamic range (SDR) islands, their effect on the UHD HDR workflow and what we are doing to remove them.

UHD HDR for Euro 2020
The feed from each match was handed over to the BBC at UEFA’s International Broadcast Centre in Haarlem in the Netherlands. UEFA’s host broadcaster was in charge of creating the football match feed for all broadcasters, and the BBC added in elements such as the pundits in the studio, match commentary and pitchside interviews.

We used the workflow that we pioneered for the 2019 FA Cup (and presented in various fora, including the EBU and FKTG) for the BBC parts of the coverage. In this workflow, the programme is produced in UHD HDR, this feed is supplied to BBC iPlayer, and the main BBC One HD feed is created by down-mapping the UHD HDR feed and down-scaling it to 1080i25. One of the advantages of hybrid log gamma (HLG) is the control it gives the creative team over achieving their desired look. They have the flexibility to deliver HLG’s native, natural-looking images or, with greater latitude than SDR workflows allow, to apply saturation and tone adjustments to achieve other artistic intents, such as the traditional sports look.
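The down-mapping step that creates the SDR feed can be illustrated with a simplified scene-light conversion in the style of ITU-R BT.2408, which anchors SDR reference white at 75% of the HLG signal. This is a minimal sketch only: the production chain uses broadcast conversion equipment, the hard clip of highlights here is an illustrative assumption, and the colour-gamut (primaries) conversion from BT.2020 to BT.709 is omitted.

```python
import math

def hlg_inverse_oetf(e):
    """ITU-R BT.2100 HLG inverse OETF: signal [0,1] -> relative scene light [0,1]."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if e <= 0.5:
        return (e * e) / 3.0
    return (math.exp((e - c) / a) + b) / 12.0

def bt709_oetf(l):
    """ITU-R BT.709 OETF: relative scene light [0,1] -> SDR signal [0,1]."""
    if l < 0.018:
        return 4.5 * l
    return 1.099 * l ** 0.45 - 0.099

# Scene light corresponding to a 75% HLG signal (~SDR reference white, BT.2408).
SDR_WHITE_SCENE = hlg_inverse_oetf(0.75)

def hlg_to_sdr(e_hlg):
    """Down-map one HLG signal value to SDR, clipping highlights above SDR white."""
    scene = hlg_inverse_oetf(e_hlg) / SDR_WHITE_SCENE  # rescale so 75% HLG -> 1.0
    return bt709_oetf(min(scene, 1.0))
```

Because the conversion is fixed and anchored at reference white, shadows and mid-tones in the SDR output track the HDR feed directly, which is what makes a single shading operation for both feeds practical.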

Specialist cameras
Small cameras used in difficult-to-access places (for example, drones, cricket stump-cams, football net-cams) and newsgathering cameras used for pitchside interviews will probably never achieve the same dynamic range as a large-sensor or broadcast triple-sensor camera. So that these cameras can still be used, we have created a mapping that takes the standard-compliant (ITU-R BT.709) SDR video and maps it into the HLG HDR video feed in such a way that colours, shadows and mid-tones subjectively match. Many small cameras have proprietary HDR outputs, which could be transformed to HLG HDR, and this is an area that needs more work.
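The idea behind such an up-mapping can be sketched as a scene-light conversion that places SDR peak white at 75% of the HLG signal (the reference-white level suggested by ITU-R BT.2408), leaving the top of the HLG range free for genuine highlights from native HDR cameras. This is an illustrative sketch, not our production conversion, and it again omits the BT.709-to-BT.2020 primaries conversion that a real mapping also performs.

```python
import math

def bt709_inverse_oetf(e):
    """ITU-R BT.709 inverse OETF: SDR signal [0,1] -> relative scene light [0,1]."""
    if e < 4.5 * 0.018:
        return e / 4.5
    return ((e + 0.099) / 1.099) ** (1.0 / 0.45)

def hlg_oetf(l):
    """ITU-R BT.2100 HLG OETF: relative scene light [0,1] -> HLG signal [0,1]."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if l <= 1.0 / 12.0:
        return math.sqrt(3.0 * l)
    return a * math.log(12.0 * l - b) + c

# Scene light at which the HLG signal reaches 75% (~SDR reference white, BT.2408).
SDR_WHITE_SCENE = (math.exp((0.75 - 0.55991073) / 0.17883277) + 0.28466892) / 12.0

def sdr_to_hlg(e_sdr):
    """Map one BT.709 SDR signal value into the HLG signal range.

    SDR peak white (1.0) lands at 75% HLG, so inserted SDR sources sit
    alongside native HLG cameras with subjectively matching mid-tones.
    """
    scene = bt709_inverse_oetf(e_sdr) * SDR_WHITE_SCENE
    return hlg_oetf(scene)
```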

Current uncompressed graphics file formats used for virtual reality (VR) set insertion and overlays in video feeds use the sRGB colour space, standardised in 1996. sRGB-based files cannot store images with highlights above the diffuse white point, nor can they store colours more saturated than their limited gamut allows (sRGB primaries are identical to ITU-R BT.709 primaries). This is an issue, especially for sports, where accurate reproduction of players’ jerseys (which are often fluorescent and very saturated), countries’ flags and metallic trophies is desirable.

This meant camera shots that included a VR view, such as the main studio in MediaCity, had to be rendered in the ITU-R BT.709 colour space and then mapped into the ITU-R BT.2100 HLG colour space. Our Wembley outside broadcast studio, which had no VR elements, used cameras in the native ITU-R BT.2100 HLG colour space. It also meant that graphics overlays (either the BBC’s or UEFA’s) were limited in saturation and brightness.

Though new proprietary image formats can store HLG data, they are often patent-restricted or take a lot of processing power to render. We have been working with the World Wide Web Consortium (W3C) to extend the Portable Network Graphics (PNG) specification, which will allow a whole host of non-sRGB formats to be stored correctly, with simple-to-read data tags identifying each format. This new HDR graphics format will allow HDR rendering of VR studios and graphics in the future.
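To show how little metadata is involved, here is a sketch of building the proposed colour-tagging chunk (named `cICP` in the W3C PNG work at the time of writing); the four code points follow ITU-T H.273, and the helper names are my own. This is an illustration of the chunk structure, not a complete PNG writer.

```python
import struct
import zlib

def png_chunk(chunk_type: bytes, data: bytes) -> bytes:
    """Serialise one PNG chunk: 4-byte length, type, data, CRC-32 of type+data."""
    return (struct.pack(">I", len(data))
            + chunk_type
            + data
            + struct.pack(">I", zlib.crc32(chunk_type + data)))

def cicp_chunk_hlg() -> bytes:
    """Build a cICP chunk tagging an RGB PNG as BT.2100 HLG.

    Code points per ITU-T H.273: colour primaries 9 (BT.2020/BT.2100),
    transfer characteristics 18 (HLG), matrix coefficients 0 (identity,
    i.e. RGB), full-range flag 1.
    """
    return png_chunk(b"cICP", bytes([9, 18, 0, 1]))
```

Four bytes of payload are enough to identify the colour system, which is what makes this approach so much lighter than embedding a full ICC profile.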

Replay servers
Replay servers are used for three types of replay at a sports event: action replays of the match; reaction shots of the pundits in the studio; and pre-prepared packages, for example montages, archive footage and interviews.

Three main issues have slowed the implementation of replays in HDR: the availability of replay servers that record using a 10-bit video codec; the availability of HDR sources from the host broadcaster for the alternative angles, player feeds, crowd feeds and so on; and the need for portable UHD HDR editing systems that support the relevant video standards, together with portable HDR screens of sufficient quality (either standalone or built into laptops).

This is slowly beginning to sort itself out. Servers using 10-bit codecs are now available to buy, but they are not yet widely deployed. An increasing number of HDR cameras are being used in production, and portable HDR edit facilities are becoming available. Hopefully, we will soon see HDR replays and inserts being introduced. The choice between running a replay server at 3840 x 2160p50 or at 1920 x 1080p50 comes down largely to cost and to the availability of replay feeds from the host broadcaster. Operating a replay server at 3840 x 2160p50 rather than 1920 x 1080p50 requires around four times the storage and data-bus bandwidth.
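The "around four times" figure follows directly from the pixel counts, assuming the same frame rate, bit depth and codec efficiency at both resolutions:

```python
# Pixels per frame at the two candidate replay-server formats.
uhd = 3840 * 2160   # 8,294,400 pixels per frame
hd = 1920 * 1080    # 2,073,600 pixels per frame

# Same frame rate (p50), bit depth and codec assumed at both resolutions,
# so storage and data-bus bandwidth scale with the pixel count.
ratio = uhd / hd
print(ratio)  # 4.0
```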

Reference monitors
Reference monitors for HDR are significantly more expensive than non-HDR displays (approximately five times the price for a similar-sized screen). This means that not all production monitors at an outside broadcast or studio are HDR; typically, the camera operators are looking at an SDR signal. We found a few years ago that the SDR output of an HDR camera was difficult to work with: the controls did not affect the HDR and SDR signals in the same way, and the signals often drifted apart as the camera exposure changed. We therefore use a workflow in which the camera’s SDR output is ignored and operators look at a down-mapped HDR camera signal, using the same conversion used to create the BBC One HD feed. This way, our most-viewed signal maintains the highest quality, and the HDR shadows and mid-tones follow it exactly.

Production monitors will still be SDR for quite some time, but as more HDR monitors become available, there will be a time when productions can move to camera controllers monitoring on HDR screens, exploiting more of the available HDR colour volume.

Static conversions
Today, the HDR video signal follows the SDR signal but with better, more natural highlights and the ability to display a greater range of naturally occurring and manufactured colours. This is possible because the static conversions used, in conjunction with camera control carried out while viewing the SDR signal, match the shadows and mid-tones of the HDR input and SDR output signals.

However, in future, if we fully explore the HDR volume with camera controls, we may prefer to allow a greater exposure latitude in HDR and need to dynamically alter the conversion used to map the HDR signal into the SDR colour volume. There are some exciting candidate technologies out there but, at present, they act over the entire video plane. To truly work within a sports-style workflow, they will need to dynamically map the video elements in a signal while identifying, and applying a fixed transform to, graphics elements. You would not want a football score graphic to change brightness, or a national flag to change colour, because the action moved from a sunny area into a shaded area.

This blog was first posted on the BBC R&D website.
