Euro 2024: How AE Live is supporting BBC Sport with its LED-based XR presentation studios in Berlin
For its Euro 2024 output, BBC Sport has two mixed reality presentation studio areas opposite the Brandenburg Gate. Both make use of extended reality (XR), augmented reality (AR) and LED screens.
One of the key partners in the project is AE Live. The virtual set designs, created by Paul Kavanagh Studio, were realised by AE Live’s Virtual Art Department.
The company is providing overall technical integration, implementation and operation of the real-time LED and AR set extensions, as well as presentation graphics and a match clock across 27 matches, plus graphics for highlight programmes.
In a special Q&A, Chris Izatt, the company’s director of innovation and virtual, gave SVG Europe the lowdown on AE Live’s involvement and how the studios went from idea to reality.
How long have you and AE Live been involved with the BBC’s Euro 2024 studio project?
The first time we heard about it was at IBC 2023 when we briefly caught up with BBC Sport design director John Murphy, BBC Sport senior director Colm Harty, and BBC Sport executive producer, football, Phil Bigwood. They casually dropped into a conversation that ‘we want to do LED panels, in daylight!’ They were very keen and had already established a relationship with ROE Visual, making it possible for them to use the LED technology that they needed.
Had you done anything like this before?
The BBC Sport Russia 2018 FIFA World Cup project used two large LED walls with camera tracking to provide XR and AR set extensions. The content produced for those LED extensions was more graphical in style, and therefore did not need as much fine-tuning as the Euro 2024 project. This is the first time we have worked with camera tracking and LED floor extensions in an open studio, which required significant R&D and planning to ensure smooth execution. There were quite a few unique challenges we had not faced before.
What happened after the initial meeting at IBC?
At the turn of the year, we started having meetings with all the third parties involved, and it became clear that there was a variety of creative and technical requirements to be figured out. These included what kind of camera lenses to use, how to test for camera moiré, and how to provide accurate camera tracking on a rooftop terrace that could withstand all weather conditions. We also needed to know whether the set design would work with the proposed camera positions and whether the screens would go bright enough for daylight. The list of questions that needed answering was significant.
It’s a hybrid remote production too. Did that make it more difficult?
It’s a very complex production in that regard, yes, given the locations and the various places from which you can direct and produce a transmission. Presenters and a core production team, including an AE Live virtual technician, are in Berlin, while the remaining production team, including a roster of seven AE Live graphics operators, and the core OB infrastructure are based out of MediaCity in Salford.
What technology did you deploy?
For camera tracking and real-time graphics, we chose stYpe technology. We typically choose stYpe for camera tracking, and more recently, because they have been developing their StypeLand Unreal Engine plugin, we have started to use that too: it makes a lot of sense to have one partner providing as many services as possible. stYpe RedSpy is being used to track three cameras in the internal studio, and stYpe Follower is being used to track two cameras on the terrace.
We are also using several products built by our in-house development team to provide the bespoke functionality required for a unique project like this. We have a GPI gateway that manages and directs the GPI triggers we receive from the various galleries.
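For illustration only, here is a minimal sketch of how a gateway of this kind might map incoming GPI pulses from each gallery to named actions; the class, pin numbers and routing below are hypothetical, not AE Live’s actual implementation.

```python
# Hypothetical sketch of a GPI gateway: map incoming contact-closure
# pulses from each gallery to named actions and forward them on.
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class GpiEvent:
    gallery: str   # which gallery raised the pulse, e.g. "berlin"
    pin: int       # which GPI pin closed

class GpiGateway:
    def __init__(self) -> None:
        self._routes: dict[GpiEvent, Callable[[], None]] = {}

    def route(self, gallery: str, pin: int, action: Callable[[], None]) -> None:
        # Register an action to run when a given gallery closes a given pin.
        self._routes[GpiEvent(gallery, pin)] = action

    def on_pulse(self, gallery: str, pin: int) -> None:
        # Dispatch an incoming pulse to its registered action, if any.
        action = self._routes.get(GpiEvent(gallery, pin))
        if action:
            action()

# Example routing: pin 3 from the Salford gallery pre-selects camera 2.
gateway = GpiGateway()
gateway.route("salford", 3, lambda: print("pre-select camera 2 on LED engine"))
gateway.on_pulse("salford", 3)
```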
Our next generation of control software for Unreal Engine (ae CUE) also makes its debut, providing our AR operators with a modern and highly responsive interface to build and play out AR graphics sequences within the LED and AR virtual environments. CUE integrates with our data platform Aether, allowing match stats, tables and team line-ups to be driven by live data sources.
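As a rough illustration of live-data-driven graphics of this kind, the following sketch polls a stats source and redraws an on-screen table only when the data changes; none of these function names or data shapes come from CUE or Aether.

```python
# Hypothetical sketch of data-driven graphics: poll a live stats feed
# and re-populate a graphics template whenever the data changes.
import time

def fetch_group_table() -> list[tuple[str, int]]:
    # Stand-in for a live data source such as a stats API.
    return [("Germany", 7), ("Switzerland", 5), ("Hungary", 3), ("Scotland", 1)]

def render_table(rows: list[tuple[str, int]]) -> None:
    # Stand-in for pushing fields into an on-screen graphics template.
    for team, points in rows:
        print(f"{team:<14} {points} pts")

last = None
for _ in range(3):                 # would loop indefinitely on air
    rows = fetch_group_table()
    if rows != last:               # only redraw when the data has changed
        render_table(rows)
        last = rows
    time.sleep(1)
```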
In this virtual world, cutting between cameras is very different to what we are used to in live sports production. Can you explain how it works?
Normally, of course, a vision mixer will have buttons that cut between sources. Pretty simple. But how you cut cameras is quite complex in this project.
The LED surfaces need to show the virtual extension from the camera that is currently live, so we must be able to draw that virtual environment from any of the cameras that can see the LED panels. When you want to cut to another camera, the first thing that happens is that AE Live receives a trigger telling us which camera we are going to next. That trigger is passed into the engine driving the LED wall or floor content, which changes the perspective to match the camera we are cutting to, so the incoming camera sees that change. And bear in mind, if you are doing this from the Berlin gallery, you push a button, that message goes back to Salford, to the equipment housed there, and the command then comes back to Berlin to give us the trigger: before we even get the trigger, it has gone to Salford and back to Berlin again! Then the change happens, the camera sees it, and the signal has to travel all the way back to Salford before it reaches the desk for the actual cut.
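To make the sequencing concrete, here is a minimal sketch of the engine side of that two-phase cut, with entirely hypothetical names: the “next camera” trigger arrives ahead of the vision mixer’s cut, and the LED content is redrawn from that camera’s tracked perspective.

```python
# Hypothetical sketch of the engine side of a cut: the trigger names the
# incoming camera, and the LED wall/floor extension is redrawn from that
# camera's tracked perspective before the cut is actually taken.

class LedExtensionEngine:
    def __init__(self, tracked_cameras: dict[str, str]) -> None:
        # camera id -> tracking system feeding that camera's pose
        self.tracked_cameras = tracked_cameras
        self.active_camera: str | None = None

    def on_next_camera_trigger(self, camera_id: str) -> None:
        # Called when the trigger (having travelled Berlin -> Salford ->
        # Berlin) reaches the engine, ahead of the cut itself.
        self.active_camera = camera_id
        self.redraw_extension()

    def redraw_extension(self) -> None:
        # Stand-in for re-rendering the virtual extension in Unreal
        # Engine from the incoming camera's viewpoint.
        print(f"LED extension redrawn for {self.active_camera} "
              f"(tracked by {self.tracked_cameras[self.active_camera]})")

engine = LedExtensionEngine({"cam1": "RedSpy", "cam2": "RedSpy", "cam3": "RedSpy"})
engine.on_next_camera_trigger("cam2")   # perspective switches before the cut
```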
Are they actually cuts?
Not really. They’re essentially macro recalls. The first part of the macro sends the message to us conveying which camera we are going to, and the end of the macro performs the actual cut, which is a timed-in event. So there is frame-accurate, multi-country message-passing going on that has to be perfect. The director and vision mixer have to take a 340ms delay into account.
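Expressed as a sketch of the mixer side of that macro (the names and frame rate are assumptions, not details from the production), the sequencing looks roughly like this:

```python
# Hypothetical sketch of the vision mixer macro: announce the target
# camera first, wait out the measured round-trip/render delay, then cut.
import time

ROUND_TRIP_DELAY_S = 0.340   # the 340ms the director and mixer allow for
                             # (17 frames at an assumed 50 fps)

def send_camera_trigger(camera_id: str) -> None:
    # Stand-in for the trigger message that goes Berlin -> Salford ->
    # Berlin to tell the LED engine which camera is next.
    print(f"trigger: next camera is {camera_id}")

def take_cut(camera_id: str) -> None:
    # Stand-in for the vision mixer taking the actual cut.
    print(f"cut to {camera_id}")

def cut_macro(camera_id: str) -> None:
    send_camera_trigger(camera_id)    # first step of the macro recall
    time.sleep(ROUND_TRIP_DELAY_S)    # timed-in event: the perspective
                                      # change must land before the cut
    take_cut(camera_id)               # final step: the actual cut

cut_macro("cam2")
```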
Have you worked this way before, where the cuts have been so prescriptive?
Yes, but not in a hybrid, remote scenario. We’ve done it in a self-contained location, but not via this many fibre links and across countries.
With that in mind, did you do a lot of testing ahead of deployment?
It was a big test in the end. It took place at Production Park in Wakefield. During the testing, we changed some things, particularly how the triggers are handed off to us. We were assisted in this process by Timeline Television, and we ended up handing off to them in a different way, via a different route, which made more sense. Thanks to the testing, there have been no technical surprises since the tournament started.
Are there any learnings you can take from the project?
It’s an incredible experience to be part of a project of this scale, and it’s always surprising when we reflect on how much we have learned during the process. This project has allowed us to work with several incredibly talented industry veterans across all fields, which has really helped to refine our knowledge in a variety of disciplines. Almost all projects these days seem to involve a significant amount of research and development, and I think the client’s decision to hold a reasonably large-scale test event was key to ensuring we were able to solve all the key challenges as a group, enabling us to understand precisely how everything would come together once we got on-site.
Regular group meetings to keep on top of any outstanding tests and tasks were also crucial to ensuring we could get on-site with as few surprises as possible. When a project requires such a significant amount of R&D, you are taking the decision to go on air with solutions that have not had the luxury of being battle-tested on air before, so ensuring we had a dedicated testing environment within our London HQ was also critical, enabling us to capture and resolve any unexpected issues with the overall system design. We are certainly coming away from this project with a lot of ideas for new tools, techniques and processes to improve the various services we provide in this sector.
Overall, are you happy with how it’s gone?
We are very pleased. It’s a good representation of the experience AE Live has gained in providing virtual services over the years. We got on-site and ready for rehearsals in just seven days, which is amazing.
We’ve always worked with lighting and vision up to a point, but never as much as on this project. The fact that these are not controlled studio environments is the really big thing here: the weather and light change constantly. You can’t just pre-define colour calibrations and adjustments; you have to make them on the fly. This has involved creating new workflows with the lighting and vision supervisors, which has been a great experience.
Everyone involved has been so good, from all sides: within our company, within the third parties, the client. It’s been fantastic. With jobs like these, you know quite early on when they’re special, and this one felt special from the start.
BBC Sport’s coverage of Euro 2024 continues on BBC1, BBC2 and BBC iPlayer until (and including) the Final on 14 July.
AE Live credits:
- Chris Izatt, director of virtual and innovation
- Scott Marlow, head of production – virtual
- Gavin McCandlish, head of engineering
- Brice Beauvillain, director of software dev
- Lewis Phillips, production director
- Peter Wedderburn, senior environment artist
- Darren O’Neill, senior technical artist
- Daniel Hackman, technical artist
- Richard Faulkner, graphics supervisor
- Sven Grbec, stYpe engineer
- Luka Crnelc, stYpe engineer
- Karar Sunny, virtual technician
- Chris Barker, AR graphics op
- Mike Major, AR graphics op
- Oswyn Williams, AR graphics op
- Becky Gregory, match graphics op
- Christina Saunders, presentation graphics op
- Tom Hanson, product manager
- Liam Cook, junior Vizrt designer
- Rob Field, software programmer
- Josh Yee, broadcast engineer
Read more: Inside the Development of BBC Sport’s EURO 2024 Studio Operations with AE Live