Live from the Commonwealth Games: BBC in object-based media trial

The BBC is trialling the technique at the Opening and Closing Ceremonies of the Commonwealth Games

As part of its investigation into object-based broadcasting, BBC R&D is trialling its Venue Explorer web application at the Opening and Closing Ceremonies of the Commonwealth Games. “We see object-based audio and video media formats as a core element of future media services,” explains Steve Bailey, general manager, BBC R&D.

Applied to live events, the starting point is that current broadcast coverage provides an experience very different from actually being there: the views are chosen by camera operators and the director, rather than by viewers looking wherever they want. Venue Explorer is an example of how a broadcaster might give viewers the ability to look around a multi-sport event more freely. In addition, it provides separate audio mixes corresponding to the part of the scene the viewer chooses to look at.

The BBC is not alone in following the audio angle. Dolby is bringing its object-based sound system, Atmos, to the home environment.

Roland Vlaicu, VP, Consumer Imaging, Dolby Labs, says that one application of Atmos would be to let a viewer watching a live soccer game select a soundtrack corresponding to a particular stadium view, augmenting the fan’s sense of being present.

“Another idea could be to have audio from your local pub sent to accompany your home view of the game, as if you were in the pub with friends,” says Vlaicu.

He stresses that such an application is not in Dolby’s plans, but that there is no reason why operators could not exploit the technology’s potential in that way.

Dolby’s system, now rolled out to over 600 cinemas worldwide, delivers the soundtrack as a set of audio objects plus metadata, which is automatically rendered to the sound system installed (the number and layout of speakers).
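As a rough illustration of that principle (and not Dolby’s actual renderer), the sketch below maps a hypothetical audio object’s position metadata onto whatever speaker layout is present, using simple distance-weighted gains; the interfaces and numbers are assumptions for illustration only.

```typescript
// Hypothetical sketch of the object-plus-metadata idea: each audio object carries
// a position, and a renderer maps it onto whatever speaker layout is installed.
// Real Atmos rendering is far more sophisticated; this only illustrates the
// "adapt to the installed speakers" principle with distance-weighted gains.
interface Speaker { name: string; x: number; y: number }          // layout discovered at setup
interface AudioObjectMeta { x: number; y: number; gain: number }  // per-object positional metadata

function speakerGains(obj: AudioObjectMeta, layout: Speaker[]): Map<string, number> {
  // Weight each speaker by inverse distance to the object's position, then
  // normalise so the same object renders sensibly on any layout.
  const weights = layout.map((s) => 1 / (Math.hypot(s.x - obj.x, s.y - obj.y) + 0.1));
  const total = weights.reduce((a, b) => a + b, 0);
  return new Map(layout.map((s, i): [string, number] => [s.name, (obj.gain * weights[i]) / total]));
}

// The same object metadata drives a stereo pair or a 5.1 layout without re-authoring.
const stereo: Speaker[] = [{ name: "L", x: -1, y: 0 }, { name: "R", x: 1, y: 0 }];
speakerGains({ x: 0.5, y: 0, gain: 1.0 }, stereo); // "R" receives more level than "L"
```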

The BBC’s object-based audio for Venue Explorer is derived from broadcast feeds relating to individual events within the venue, combined with an ambience feed captured at the camera position.

“When viewing a wide shot, the audio will convey the ambience of the stadium, similar to what would be heard by someone in the audience,” explains a BBC R&D blog. “As the viewer zooms in to an area, the audio is remixed to a broadcast feed corresponding to what is being seen.”
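A hypothetical sketch of that zoom-driven remix, using the standard Web Audio API, might look like the following; the stream URLs and the linear crossfade are assumptions rather than the Venue Explorer implementation.

```typescript
// Sketch of zoom-driven audio remixing with the Web Audio API. Stream URLs and
// the linear crossfade curve are illustrative assumptions.
const audioCtx = new AudioContext();

// Helper: wrap a stream URL as a Web Audio source node (in practice, playback
// must be started from a user gesture because of browser autoplay policies).
function sourceFor(url: string): MediaElementAudioSourceNode {
  const el = new Audio(url);
  el.crossOrigin = "anonymous";
  void el.play();
  return audioCtx.createMediaElementSource(el);
}

const ambience = sourceFor("/streams/stadium-ambience");  // heard on the wide shot
const eventFeed = sourceFor("/streams/event-feed");       // broadcast feed for the zoomed area

const ambienceGain = audioCtx.createGain();
const eventGain = audioCtx.createGain();
ambience.connect(ambienceGain).connect(audioCtx.destination);
eventFeed.connect(eventGain).connect(audioCtx.destination);

// zoom runs from 0 (full wide shot) to 1 (fully zoomed into one event area).
function remixForZoom(zoom: number): void {
  const z = Math.min(1, Math.max(0, zoom));
  ambienceGain.gain.value = 1 - z;  // stadium ambience fades out as the viewer zooms in
  eventGain.gain.value = z;         // the event-specific feed fades in
}
```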

Data capture

Key to the experiment is the capture of high-quality data, ranging from background information on athletics events to live information such as the latest results. This metadata is required to allow audiences to search, discover and personalise content.

“We are developing techniques to address the challenges of creating this metadata, whether from today’s content or that taken from the vast wealth of content held in archives,” explains Bailey.

BBC R&D is working on the COMMA (Cloud Marketplace for Media Analysis) project with partners Kite and Somethin’ Else, with funding from the UK’s Technology Strategy Board.

“The user can choose to overlay this information on the image, aligned with the corresponding location, providing an ‘augmented reality’ display. This approach could in future be coupled with even richer sources of data, for example from object and face recognition systems, to give information on individual athletes as the user zooms in to them.”
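How such an overlay might track its location as the user pans and zooms can be sketched as below; the coordinate conventions and function names are illustrative assumptions, not the trial’s actual code.

```typescript
// Sketch, under assumed coordinate conventions, of anchoring an overlay to a
// point in the full scene and positioning it on screen for the current view.
interface ScenePoint { x: number; y: number }                            // full-scene pixels
interface View { x: number; y: number; width: number; height: number }  // visible scene region

function overlayScreenPosition(
  anchor: ScenePoint,
  view: View,
  screenWidth: number,
  screenHeight: number,
): { left: number; top: number } | null {
  // Overlays anchored outside the current view are simply not drawn.
  if (
    anchor.x < view.x || anchor.x > view.x + view.width ||
    anchor.y < view.y || anchor.y > view.y + view.height
  ) {
    return null;
  }
  // Map the scene-space anchor into screen space for the current view.
  return {
    left: ((anchor.x - view.x) / view.width) * screenWidth,
    top: ((anchor.y - view.y) / view.height) * screenHeight,
  };
}

// e.g. a results panel anchored over the long-jump pit tracks that area as the user pans.
```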

As SVG Europe explains here, an object-based approach to broadcasting divides audio-visual programme content into separate ‘objects’: the video is split into tiles; the audio is sent as a number of separate streams relating to particular picture areas; and overlay data is sent separately, with information about where in the image it belongs and what kind of data it is (such as results or the schedule). The device at the user’s end automatically assembles these objects according to the view the user selects.
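A hedged sketch of what such an object ‘manifest’ could look like, and how a client might assemble only the objects relevant to the chosen view, is shown below; the field names and object kinds are assumptions for illustration.

```typescript
// Illustrative object manifest and client-side assembly. Field names and the
// union of object kinds are assumptions, not the BBC's actual format.
interface Rect { x: number; y: number; width: number; height: number }

interface VideoTile   { kind: "video";   region: Rect; streamUrl: string }
interface AudioStream { kind: "audio";   region: Rect; streamUrl: string }
interface DataOverlay { kind: "overlay"; region: Rect; dataType: "results" | "schedule"; payload: unknown }

type MediaObject = VideoTile | AudioStream | DataOverlay;

function intersects(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

// The user's device pulls only the objects whose regions fall inside the selected view.
function objectsForView(manifest: MediaObject[], view: Rect): MediaObject[] {
  return manifest.filter((obj) => intersects(obj.region, view));
}
```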

For the closed trial in Glasgow, the BBC positioned a fixed wide-angle 4K camera overlooking the opening ceremony at Celtic Park, then relocated it to Hampden Park to cover the athletics and the closing ceremony.

In theory, the video could be delivered to viewers over a high-bandwidth link and displayed on a large UHD screen, but with such networks and displays expensive and not yet widely available, the BBC has developed a way of displaying the image on a tablet while allowing the user to pan and zoom around the scene, much as they would in a map application.

“This means that we only have to transmit to them the portion of the scene that they are looking at, significantly reducing the bandwidth requirements,” explains BBC R&D. “By focusing on a tablet, we are hoping to be able to test out the way that this type of experience could work as either a stand-alone or second screen experience (i.e. how it fits in or contrasts with conventional broadcast footage), and a tablet is an obvious starting point for second screen applications.”
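One way a tiled client could decide which parts of the scene to request, so that only the visible portion is transmitted, is sketched below; the tile size and grid dimensions are assumed values, not those used in the trial.

```typescript
// Minimal sketch of viewport-to-tile selection for a tiled 4K scene.
// Tile and grid sizes are assumptions chosen for a 3840x2160 source.
const TILE = 640;      // tile width/height in source pixels (assumption)
const GRID_COLS = 6;   // 3840 / 640
const GRID_ROWS = 4;   // 2160 / 640, rounded up

interface Viewport { x: number; y: number; width: number; height: number }  // in source pixels

function tilesForViewport(v: Viewport): Array<{ col: number; row: number }> {
  // Clamp the viewport to the grid and find the covered range of rows/columns.
  const firstCol = Math.max(0, Math.floor(v.x / TILE));
  const lastCol = Math.min(GRID_COLS - 1, Math.floor((v.x + v.width - 1) / TILE));
  const firstRow = Math.max(0, Math.floor(v.y / TILE));
  const lastRow = Math.min(GRID_ROWS - 1, Math.floor((v.y + v.height - 1) / TILE));

  const tiles: Array<{ col: number; row: number }> = [];
  for (let row = firstRow; row <= lastRow; row++) {
    for (let col = firstCol; col <= lastCol; col++) {
      tiles.push({ col, row });
    }
  }
  return tiles;
}

// A view zoomed into the top-left quarter of the scene needs only 6 of the 24 tiles.
tilesForViewport({ x: 0, y: 0, width: 1920, height: 1080 });
```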

In addition to developing an HTML5-based application as part of BBC R&D’s Commonwealth Games Showcase, the BBC is also partnering with the Dutch institute for applied research TNO to test TNO’s iPad application on the open internet.
