BBC trials IP Studio at Commonwealth Games with dramatic storytelling consequences

What would a live studio need if it worked directly on IP networks? That was the task BBC R&D set itself with a project that began in 2012 and which will hit a peak when it plays a central role in the world’s first live end-to-end IP production in Ultra HD to be conducted at the Commonwealth Games.

IP has been used by many broadcasters, the BBC included, to link a studio to remote locations, but the missing piece has been a full production environment that uses internet protocols to switch and mix the video itself.

The BBC R&D trials conducted during the CWG next month promise to do just that, while also testing the limits of network performance by shunting 4K data around the UK in a collaborative production workflow – live.

“The concept is to introduce software and IP into the overall chain so it can be used alongside existing technology like DTT,” said Matthew Postgate, controller, BBC R&D. “IP will enable us to be more flexible with services we already produce, and longer-term, to introduce new kinds of services.”

BBC R&D describes IP Studio as an open source software framework for handling video, audio and data content, composed of off-the-shelf IT components and adhering to standards like IEEE 1588, the Precision Time Protocol used to synchronise clocks across packet networks.
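
As a rough sketch of why that synchronisation matters: each unit of media (a "grain") can carry a timestamp derived from an IEEE 1588-disciplined clock, so flows arriving from different machines can be aligned. The structure and function below are illustrative assumptions in Python, not IP Studio's actual interfaces.

```python
from dataclasses import dataclass

@dataclass
class Grain:
    """One unit of media: a video frame or a block of audio samples.

    Hypothetical structure; the article states only that IP Studio
    synchronises flows using IEEE 1588 (the Precision Time Protocol).
    """
    flow_id: str    # which source flow this grain belongs to
    pts_ns: int     # presentation timestamp in nanoseconds on the PTP clock
    payload: bytes  # the raw essence

def align(flow_a, flow_b, tolerance_ns=1_000_000):
    """Pair grains from two flows whose PTP timestamps agree within 1 ms."""
    pairs = []
    for a in flow_a:
        closest = min(flow_b, key=lambda b: abs(b.pts_ns - a.pts_ns), default=None)
        if closest and abs(closest.pts_ns - a.pts_ns) <= tolerance_ns:
            pairs.append((a, closest))
    return pairs
```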

It encompasses a means to capture feeds directly from cameras. At the moment there are no IP-connected cameras, though Sony and Grass Valley are working on them. In the interim, for the CWG experiment, the BBC is using existing connectivity, taking quad-SDI out of up to four Sony F55s directly into multicore PCs (one per camera). Each PC is loaded with an IO card that processes the video into IP, performing a similar function to the BBC’s commercially available HD-only product, Stagebox.
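
The article does not detail the packing format the IO cards use, but the essential job is to chop each captured frame into network-sized packets that a receiver can reassemble in order. A minimal, purely illustrative sketch (the header layout here is an assumption, not the trial's actual wire format):

```python
import struct

MTU_PAYLOAD = 1400  # bytes of essence per packet, leaving headroom for IP/UDP headers

def packetise(frame: bytes, frame_number: int) -> list:
    """Split one captured video frame into numbered packets for IP transport.

    Each packet gets a header of (frame number, sequence number, payload
    length) so the receiver can detect loss and reassemble in order.
    """
    packets = []
    for seq, offset in enumerate(range(0, len(frame), MTU_PAYLOAD)):
        chunk = frame[offset:offset + MTU_PAYLOAD]
        header = struct.pack("!IIH", frame_number, seq, len(chunk))
        packets.append(header + chunk)
    return packets
```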

“Stagebox is similar in concept and IP Studio is developed from the same line of R&D,” explained John Zubrzycki, BBC R&D section lead. “The project will support multiple vendor IO cards. We don’t have to be fixed to one.”

The IP streams can be transferred and received at various points across a nationwide fast broadband network. In the case of the CWG trial this is a series of 100Gb/s interfaces spanning Virgin Media broadband and the Joint Academic NETwork (JANET), linking the BBC’s Glasgow production base with each of the venues around the city via Cisco switches.

The production format uses H.264 encoding “as the best way of achieving a balance between bitrate and quality at the sort of production quality levels required,” added Zubrzycki. “The codec choice is an implementation detail for this test.”

Once received, the feeds can be displayed on a multiview monitor with the ability for the vision mixer to switch between them. “It’s a simple vision mixer which makes clean, frame-accurate cuts to create a new output stream, driving delivery to the home over an HEVC-encoded IP distribution network,” he explained.
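
A clean cut of this kind is conceptually simple once every source shares PTP-aligned frame timestamps: the mixer only ever switches on a frame boundary. A sketch of the idea, reusing the hypothetical Grain structure above:

```python
def clean_cut(sources: dict, cut_list: list) -> list:
    """Build an output flow by switching between synchronised source flows.

    sources: maps a source name to its list of Grains; all sources are
             assumed genlocked via PTP, sharing identical frame timestamps.
    cut_list: (pts_ns, source_name) pairs sorted by time; the first entry
              must start at or before the first frame.
    """
    timeline = [g.pts_ns for g in next(iter(sources.values()))]
    output = []
    for pts in timeline:
        # The source on air is the latest cut at or before this frame,
        # so a switch always lands exactly on a frame boundary.
        active = max((c for c in cut_list if c[0] <= pts), key=lambda c: c[0])[1]
        output.append(next(g for g in sources[active] if g.pts_ns == pts))
    return output
```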

Testing distribution

The HEVC delivery portion is being tested from Brazil next week, when the BBC takes the FIFA- and Sony-produced 4K feed into a DTT and IP chain back to closed sites in the UK. This same distribution pattern will be re-tested from Glasgow, although there will be opportunities for the public to view the output – presented as if delivered to the home – from within the Glasgow Science Centre (also home to the Games’ International Broadcast Centre).

The requirements of an all-IP studio are demanding: among them are high bit-rate low delay streaming, timing and synchronisation, distributed configuration and control, real-time data, flexible reuse of processing, and access to production content and information. Crucially, BBC R&D needs the IP Studio system to work in multiple resolutions.

“Although the source media is 4K one of the things we want to do is provide the right form of media to particular production devices,” said Phil Tudor, principal technologist on the project. “It’s not always appropriate to send a 4K signal to a multiviewer, for example, or a tablet where the size of screen is a fraction of full screen Ultra HD. So IP Studio will create a set of streams. It will be 4K in and on the network, but there will also be an HD feed and a sub-HD proxy so that content is suitable for different devices. Working in IP Studio, production staff can access any of those streams.”
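
The article names three tiers (4K, an HD feed and a sub-HD proxy) without giving exact figures, so the ladder below is an assumption; the selection rule simply picks the smallest rendition that still covers the target screen:

```python
# Illustrative rendition ladder; the resolutions for the HD and proxy
# tiers are assumptions, not figures from the trial.
RENDITIONS = [
    {"name": "uhd",   "width": 3840, "height": 2160},
    {"name": "hd",    "width": 1920, "height": 1080},
    {"name": "proxy", "width":  960, "height":  540},
]

def rendition_for(device_width: int) -> dict:
    """Pick the smallest rendition that still covers the device's screen."""
    for r in sorted(RENDITIONS, key=lambda r: r["width"]):
        if r["width"] >= device_width:
            return r
    # Nothing covers the screen, so fall back to the largest available.
    return max(RENDITIONS, key=lambda r: r["width"])
```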

For example, if a producer is working from a tablet to make first-assembly edits, they will require a lower-resolution preview suitable for a smaller screen. Once the edit decision is made, it is pushed back into the system and the 4K signal is cut in a corresponding way.
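
Because every rendition of the same content would share the same timestamps, an edit decided on the proxy can be conformed to the 4K flow simply by applying identical in/out points. A sketch under that assumption:

```python
def conform(edit_decisions: list, flow: list) -> list:
    """Apply an edit decision list, made on the proxy, to any rendition.

    edit_decisions: (in_pts_ns, out_pts_ns) spans chosen on the low-res
    preview. Since all renditions of the same content share PTP
    timestamps, the same spans cut the 4K flow in a corresponding way.
    """
    return [g for g in flow
            if any(i <= g.pts_ns < o for i, o in edit_decisions)]
```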

The ability to assign production staff in different locations – or in a central studio – is one cost-saving benefit of live IP production. Another is the capacity to produce more content, something the BBC has long wanted to do but has been constrained by the cost of additional satellite uplink capacity.

These are well-worked arguments and it remains to be seen whether the economies of scale of a migration to IP are borne out. Much more intriguing are the long-term ideas the BBC has for what it terms this ‘future IP-based broadcast system’.

“It will be able to provide new experiences based around the concept of broadcasting data sets rather than whole programmes packaged in video and audio,” says Tudor.

Transforming storytelling?

Object-based broadcasting is a hot topic in BBC R&D and the team behind IP Studio is its focal point. Researchers have been thinking for some time about how the flexibility and power available in a software-based production environment could be used to transform storytelling.

Conceptually, audio and video elements can be split into separate media flows at the point of entry into the IP Studio. Where audio and video tracks belong together this relationship is noted (tracked in metadata), but the audio and video are moved around, processed and stored as separate entities. According to the BBC, this affords maximum flexibility in how and where in the production/broadcast chain they are combined.
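
One plausible shape for that metadata, sketched in Python (the article says only that the relationship is “noted”, not how it is modelled, so the structures below are invented for illustration):

```python
from dataclasses import dataclass, field
import uuid

@dataclass
class Flow:
    """A single sequence of audio-only or video-only grains."""
    kind: str  # "video" or "audio"
    flow_id: str = field(default_factory=lambda: str(uuid.uuid4()))

@dataclass
class SourceGroup:
    """Metadata tying separately-stored flows back to one source."""
    label: str
    flow_ids: list

camera2_video = Flow(kind="video")
camera2_audio = Flow(kind="audio")
group = SourceGroup("camera 2", [camera2_video.flow_id, camera2_audio.flow_id])
# The flows travel, are processed and are stored independently; anything
# downstream that needs picture and sound together recombines them by
# looking up the group metadata.
```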

“Traditionally we link a lot of assets together to produce one final rendition which we call a programme,” says Zubrzycki. “The viewer cannot alter it, the device it is received on cannot alter it. It is ‘one size fits all’ and we have to live with it. But with the ability to process information as data at source we can break down the programmes into assets.”

That’s the theory – but in practice what does this mean? “We can make content more responsive to the reception device,” says Tudor. “If you’ve a large screen you might have graphics, tickers or captions in different locations, but if you’re watching on a very small screen you may want those overlays to be rearranged or even not to appear at all. You could broadcast just the foreground or background objects if desired. You could tune out certain noises [the vuvuzela trumpets which drowned much of South Africa’s World Cup in 2010 for example]. The aim is to adapt the media without the viewer explicitly having to interact with it.” The BBC calls it Perceptive Media.
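
A toy illustration of that idea: if each overlay were described as an object with a minimum useful screen size, the receiving device could decide for itself what to draw. The object model and thresholds here are assumptions made for the example, not part of Perceptive Media:

```python
def compose(objects: list, screen_width: int) -> list:
    """Keep only the broadcast objects this device's screen should render."""
    return [o for o in objects if screen_width >= o.get("min_width", 0)]

layout = compose(
    [{"kind": "video"},
     {"kind": "ticker", "min_width": 1280},
     {"kind": "caption", "min_width": 720}],
    screen_width=960,
)
# A 960px-wide tablet keeps the video and captions but drops the ticker.
```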

Another suggestion: “All camera angles might be virtual camera angles for viewers to choose the edit they wish to see in one of our shows,” notes Tudor. “We might incorporate a user’s personal Facebook data or information about the local weather to enhance the storytelling based on the user’s current environment.”

Beyond even that there is the concept of object-based storytelling, which treats narrative blocks as things freed from being locked into one presentation. “These are all advances enabled by the basic foundations laid by IP Studio and tested in these trials,” says Tudor.
