PyeongChang perspective: How NEP supported Discovery and Eurosport for Winter Games 2018
SVG Europe sat down with Donald Begg, Director of Technology, NEP; Casper Choffat, R&D Manager and Lead System Architect, NEP; and Terje Røijen-Hammer, MD of NEP Media Services, to discuss the massive glass-to-glass service offering provided by NEP to Discovery and Eurosport for the PyeongChang Winter Games in February.
During the Games, 14 NEP divisions provided facilities and solutions to enable Discovery to deliver its first-ever coverage of the Games across 48 countries in Europe on free-to-air and pay-TV, and every minute online across the continent. More than 20 venues were supported by 300 NEP technical experts on the ground in South Korea and 100 more at NEP locations in Europe, including 17 studios.
NEP’s support of Discovery as a managed service provider included: a custom buildout of the International Broadcast Centre, integrating more than 85 racks of equipment that underpinned the operations hub, production control rooms, audio suites and edit facilities supplied by NEP; centralised sourcing of 50 remote commentary feeds, synchronising multiple video and audio feeds simultaneously, including automated audio mixing, between remote commentator booths at events in Korea and commentary booths throughout Europe; a virtual studio based in Hilversum for the ‘Winter Games Today’ live daily show on Eurosport 1 (Eurosport Benelux), also available for download through the Eurosport Player; an Oslo-based broadcast centre built for Discovery; and Mediabank, NEP’s cloud-based toolbox of workflow solutions, used to ingest, tag, review, edit and distribute Discovery’s content across a dozen European viewing markets.
SVG Europe and the NEP executives focussed on three of those elements during our call: the turnkey Discovery operation at the IBC in PyeongChang; the extent of the cloud-based Mediabank SaaS operation; and the remote commentary set-up that in many ways was the ‘secret sauce’ behind how Eurosport managed to deliver multiple Olympic events in multiple languages simultaneously around European markets in February.
Discovery at the International Broadcast Centre in PyeongChang
Donald Begg: “The fundamental thing to say is that Discovery has never done this before. Nobody has ever done this before! I don’t mean that in a big-headed way: I mean nobody has ever done an Olympics before with so many variables and multiple languages, all out of one system.
“Initially we obviously went through an extensive procurement process where we put our proposal to Discovery and thankfully we won it. We then embedded two people into the Discovery operation, one on the vision side and one on the audio side of things – Simon Jones doing video and Stuart Cruice for audio.
“There was a lot of dialogue and I think it’s worth noting that you get some clients that come and say ‘this is what we want you to do’ and it’s a very defined list. The dialogue here was right at the core of the planning process, and Simon and Stuart, along with the Discovery team of David Roulson, Jon Sweeney and Neil Cooper, were all very much involved in creating workflows and coming up with suggestions.
“We set up a test at NEP in Bracknell where we had some edits, some EVSs and a 1 petabyte SAN which colleagues in Holland supplied to us. We got to a point where, within reason, the hardware was 90% solidified, probably around July/August last year,” he said.
“Running in parallel with that we had a team of people, led by Paul Flook, doing the system integration work. Paul was also embedded with Discovery and with the help of his SI team they were turning the plans into actual, physical drawings and cable schedules.
“We brought everything together in Bracknell and built the Central Technical Area that was going to be in PyeongChang, along with one of the production control rooms so that we could invite Discovery along to have a look. It meant we had tested the technology — although of course you can never fully test until you’re on site and get all the right feeds.
“We then put everything we could in sea containers. Some of the expensive items like EVSs and cameras were sent by air freight, but all the racks went by sea. A team of people including me flew out in early November. We started out with 13 people, mostly wiremen, and one week later were joined by another dozen engineers,” said Begg.
“We built the IBC system for Discovery before Christmas. David Roulson was there in late October to effectively accept the space and do all the power and air conditioning testing. David was also pretty much the last person still there in March – in fact we flew home together.
“In terms of the hardware build we probably got 90% of it done before Christmas. We came back in early January and finished the build by installing the EVSs and the cameras, and then very much worked with Discovery to finesse the workflows and check and double-check latency and so on. Also at that stage we were getting the connectivity from the venues.
“We had a hybrid SDI and IP infrastructure where SDI was at the core and IP was used for contribution between the venues and the IBC, and again from the IBC to Europe.
“We also used IP to allow the Discovery German team to have a remote studio at the German National Olympics Committee venue (Germany House), where we had three cabled cameras and one radio camera, along with intercom, IFB and audio circuits plus Discovery’s corporate network, all linked back to the IBC on Ethernet circuits.
“That configuration became quite intense, but it paid off as it didn’t miss a beat on the job. We had all kinds of monitoring systems; we could even tell if someone closed a lid on a laptop they shouldn’t have closed. We would get an alarm to say ‘could you open that laptop please’!
“All in all it was a very successful first outing. There are definitely things we learned that we would do differently if we were invited back to do it again. I don’t want to put words in their mouths, but the vibe we got from Discovery was that we had done a good job,” said Begg.
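To picture the hybrid set-up Begg describes, with an SDI core inside the IBC and IP used for contribution from the venues and onward transmission to Europe, the minimal Python sketch below may help. It is purely illustrative: the Feed class, the conversion functions, the venue feed name and the “PCR1” destination are hypothetical and do not reflect NEP’s actual system.

```python
# Illustrative sketch only: models the hybrid architecture described above,
# where IP carries venue-to-IBC contribution and IBC-to-Europe trunks,
# while routing inside the IBC stays on an SDI core. All names are hypothetical.

from dataclasses import dataclass


@dataclass
class Feed:
    name: str        # e.g. a venue world feed
    transport: str   # "IP" or "SDI"


def ip_contribution_to_sdi(feed: Feed) -> Feed:
    """Venue feeds arrive over IP circuits and are converted to SDI at the IBC edge."""
    assert feed.transport == "IP"
    return Feed(feed.name, "SDI")


class SDICore:
    """The SDI router at the heart of the IBC: sources in, destinations out."""

    def __init__(self):
        self.routes = {}  # destination -> Feed

    def route(self, feed: Feed, destination: str):
        assert feed.transport == "SDI"
        self.routes[destination] = feed


def sdi_to_ip_trunk(feed: Feed) -> Feed:
    """Outputs bound for Europe are re-encoded onto IP trunk circuits."""
    assert feed.transport == "SDI"
    return Feed(feed.name, "IP")


if __name__ == "__main__":
    core = SDICore()
    venue_feed = Feed("Alpine skiing venue feed", "IP")   # hypothetical contribution feed
    sdi_feed = ip_contribution_to_sdi(venue_feed)         # IP contribution converted to SDI
    core.route(sdi_feed, "PCR1")                          # routed to a production control room
    europe_feed = sdi_to_ip_trunk(sdi_feed)               # trunked onward to Europe over IP
    print(core.routes, europe_feed)
```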
Massive traffic and searchable clips with Mediabank
Terje Røijen-Hammer: “Mediabank is a cloud-based software-as-a-service MAM. There were up to 50 circuits (and 14 backup feeds on satellite) sent to Europe with content [during the Games], ranging from the unilateral feeds to the distribution feeds from the venues. All those feeds were ingested in Mediabank together with metadata, generated both automatically and manually, which made it possible for all the European markets, as well as PyeongChang itself, to start getting clips and edits for their programming.
“Mediabank really made sense for Discovery, not only for all the live signals but for all the different markets working together on the same content that was stored only once and then linked with different audio and metadata to individual accounts for each of the countries. So one overall account for the international feeds with all the metadata and then linked to unique accounts for each market – and they could work on their stories or publish to their platforms in an easy way.
“Mediabank automatically generates clips based on external metadata feeds synced with the video feeds or by people who are live plotting [logging] in Mediabank. The externally generated metadata came from OBS. We took the OBS metadata feed, together with the video and audio feeds, and ingested that into Mediabank – which made all the material searchable, in combination with the plotted metadata by the Eurosport team in Paris.
“Discovery created 50,000 assets, or 12,000 hours of high-res content, in Mediabank during the Games, and we received around 1,000,000 push messages from OBS that generated more than 30,000 auto clips to multiple platforms – and everything was searchable from Mediabank for each of the markets.
“It was massively used by Discovery. There were around 800 users working on the system at one time, in PyeongChang and the different markets around Europe, plus people sitting in New York publishing to social media based on search and auto clips.
“It was certainly a challenge, because you don’t have the possibility to test all workflows for such a system before the Olympics. And just before the Games some of the metadata was changed, like the ‘Olympic Athletes from Russia’ designation and North and South Korea joining together in a combined team, which made the final days before the Games challenging.
“Around 7,000 files were delivered between the IBC and Mediabank, and around 130 terabytes of clips were produced at the IBC alone and shared with the rest of the markets. I think we managed quite OK, and Discovery were happy with the system as far as we know.
“I believe it was the first time that a cloud-based MAM was used on such a scale for an Olympics. When you are serving several countries like this, it makes sense to store it only once and then link it to each of the markets with the correct audio,” said Røijen-Hammer.
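The ‘store once, link per market’ model and the metadata-driven auto clipping Røijen-Hammer describes can be sketched conceptually as below. This is an illustrative assumption of how such a data model might look, not Mediabank’s actual API; the field names, push-message format and five-second clip padding are all hypothetical.

```python
# Conceptual sketch of "store the essence once, link per market" plus
# metadata-driven auto clipping, as described above. Field names, the
# message format and the clip padding are assumptions, not Mediabank's API.

from dataclasses import dataclass, field


@dataclass
class Asset:
    asset_id: str
    video_path: str                                     # high-res essence stored only once
    market_links: dict = field(default_factory=dict)    # market -> {audio, metadata}

    def link_market(self, market: str, audio_track: str, metadata: dict):
        """Each market gets its own audio and metadata against the same essence."""
        self.market_links[market] = {"audio": audio_track, "metadata": metadata}


def auto_clip_from_push(asset: Asset, push_msg: dict, pad_s: float = 5.0) -> dict:
    """Turn a timed metadata push message into clip in/out points on the asset."""
    t = push_msg["event_time_s"]
    return {
        "asset_id": asset.asset_id,
        "in": max(0.0, t - pad_s),
        "out": t + pad_s,
        "label": push_msg.get("description", ""),
    }


if __name__ == "__main__":
    feed = Asset("ASSET-001", "/storage/luge_run1.mxf")          # hypothetical asset
    feed.link_market("DE", "audio_de.wav", {"title": "Luge run 1"})
    feed.link_market("NO", "audio_no.wav", {"title": "Luge run 1"})
    msg = {"event_time_s": 312.4, "description": "Fastest run so far"}  # hypothetical push message
    print(auto_clip_from_push(feed, msg))
```

In a set-up like this, each market account would search and publish against its own metadata while the underlying video file exists only once, which is the storage saving Røijen-Hammer points to.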
Remote commentary and automated audio mixing
Casper Choffat: “For all the feeds sent to Europe, NEP acted as the hub, in both Oslo and the Netherlands, on the Discovery WAN in Europe to which all the locations were connected. My understanding is that 26 locations around Europe were connected to it, a network set up for both contribution and distribution of live feeds as well as file transfers.
“Of the up to 50 fibre feeds coming from PyeongChang (and 14 backup feeds on satellite), we were sourcing 13 simultaneous event feeds for language versioning. These included commentary audio as mono feeds produced in different languages at the venue commentary booths. In addition, these contributed feeds went to all the Eurosport markets, so if commentary was not produced in a language at the venue, Eurosport was able to add commentary from its European markets.
“All those different feeds were then sourced into NEP, where we did the automated syncing of all the audio feeds with the video as well as an automated audio mix. That meant we were doing 13 simultaneous event feeds with nine different languages per event feed, adding up to 117 language feeds — of which 50 were basically unique,” said Choffat.
“We had to build a system in which 50 languages could be mixed: this complete system was controlled by only two operators. Before this you needed to have an audio engineer per commentary feed, but by doing it in this automated way we were able to reduce from 50 people to two people mixing the audio feeds.
“Once this was completely automatically mixed and synchronised, we encoded the video and audio feeds in order to distribute them onto the Eurosport Player platform. The languages involved were Swedish, Norwegian, Finnish, Danish, Dutch, German, English, Polish and Italian.
“The Eurosport Player itself was not delivered by us, but we were the source of the live event feeds (which we also are during non-Olympic times). Within the Player there is the functionality to select the language in which you would like to listen, and in this way Discovery was able to localise its content,” said Choffat.
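The automated commentary workflow Choffat outlines, aligning remote commentary audio with the venue video and then mixing it automatically over the international sound, could conceptually resemble the sketch below. The delay handling, the crude level-based speech detection and the ducking gain are assumptions for illustration only; they are not NEP’s implementation, where the offset would more likely come from timing data on the contribution circuits than from a fixed value.

```python
# Conceptual sketch of automated commentary sync and mixing, as described above.
# The delay measurement, gain values and data model are illustrative assumptions.

import numpy as np


def align_commentary(commentary: np.ndarray, delay_samples: int) -> np.ndarray:
    """Delay (or advance) the commentary so it lines up with the venue video timeline."""
    if delay_samples >= 0:
        return np.concatenate([np.zeros(delay_samples), commentary])[: len(commentary)]
    return np.concatenate([commentary[-delay_samples:], np.zeros(-delay_samples)])


def auto_mix(international_sound: np.ndarray, commentary: np.ndarray,
             duck_gain: float = 0.4, threshold: float = 0.02) -> np.ndarray:
    """Duck the international sound whenever the commentator is speaking."""
    n = min(len(international_sound), len(commentary))
    intl, comm = international_sound[:n], commentary[:n]
    speaking = np.abs(comm) > threshold             # crude voice-activity estimate
    bed_gain = np.where(speaking, duck_gain, 1.0)   # lower the bed under speech
    return np.clip(intl * bed_gain + comm, -1.0, 1.0)


if __name__ == "__main__":
    sr = 48_000
    t = np.linspace(0, 1, sr, endpoint=False)
    intl = 0.3 * np.sin(2 * np.pi * 220 * t)        # stand-in for venue atmosphere
    comm = np.zeros(sr)
    comm[sr // 2:] = 0.5                            # stand-in for a commentator speaking
    mixed = auto_mix(intl, align_commentary(comm, delay_samples=480))
    print(mixed.shape, float(mixed.max()))
```

Automating the alignment and ducking per language feed is what, in Choffat’s account, allowed two operators to supervise what would otherwise have needed an audio engineer per commentary feed.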
Commenting on the unique nature of the Discovery Eurosport Winter Games project and the massive scale of facilities required for the operation, Keith Lane, NEP UK Vice President of Client Services for Discovery Eurosport, said: “It has been a tremendous opportunity and privilege to support Discovery on their inaugural Olympic Games. NEP is unique in the industry in that we have the depth of technical talent and diversity of service offerings to be able to deliver this kind of large-scale support.
“NEP is a very people-driven organisation. Anyone can buy hardware and software, but NEP’s people make the difference because they develop unique workflows for the creation, management and distribution of content.
“This event was special in comparison to anything we’ve done before – the number of countries, regions, cultures and languages involved, plus the complexity of our solutions, was staggering. We have some of the best people in the business, and I’m proud of what they did and how much they accomplished for our clients. We are very much looking forward to working with Discovery’s team again,” said Lane.