Elevated view: Aurora Media Worldwide on developing fresh graphics with AR for the E1 World Electric Powerboat Series
Graphics are a key aspect of understanding any sport, and they come into their own when that sport is completely new. When host broadcaster Aurora Media Worldwide began looking at how to translate the action of a Union Internationale Motonautique (UIM) E1 World Electric Powerboat Series race for viewers, graphics were therefore central to the plan.
Aurora has cut its teeth on elevated graphics packages before, including on Formula E and Extreme E, and this time around it wanted to push the boat out even further, says Mike Scott, Aurora executive producer on E1.
“We wanted the graphics to look as if they were within the environment, the 2D as well as the augmented reality (AR) stuff; not a stamp on top”
Before the graphics plan was put on paper, Scott – who, like the rest of the world, had never seen an E1 race – had to try to understand how the boats and the race would actually work so he could visualise what the graphics needed to show viewers.
He notes: “I’ve done a few of these, and this is the most complicated one because of the amount we wanted to achieve. I come from motor racing. I’ve only ever done cars in circles, so this is my first look at boats in circles. That helped me in some ways because I came at it like, I don’t know anything about it, don’t know how they work, don’t understand what makes one faster than another; didn’t know anything. What was very helpful was to spend a lot of time with Mark Sheffield, broadcast and innovation director at E1, and other people at E1 like Roddie [Basso, E1 CEO], to explain to me how these boats and races work.
“Then I went to the E1 test and spoke to the test pilots to find out what they were doing, then I went out on the boat and they showed me more, because then I could work out, okay, so what is it we want to inform the viewer, like me, who hasn’t got a clue how a boat works? That was a starting point, to try and work out what you’ve got.”
Reframing the shot
The sporting data coming from the boats during a race is the keystone for the graphics package, says Scott. He says that the basics required are results, race order and timing, yet a lot more was needed. “From the start, the idea with the graphics was to make it look like nothing else [already out there]. I tried to get away from the traditional motorsport look, which I thought was important. I’d never thought about it before, but the coverage of boats is more of a landscape; if you think about motor racing, it’s more a portrait view because you’ve got a track, which is a straight and it has edges and white lines and fences, or it’s more you looking down at things, whereas in boats, it’s all very much a wide.”
“Formula E, Extreme E, E1; they are a vanguard of changing mobility using different technologies and we are invested in that. This technical playground of opportunity in developing technology in sports media and media overall is similarly progressive”
Once he had come to that conclusion, Scott and the team at Aurora, together with partner NEP The Netherlands, which Aurora also works with on Extreme E graphics, reframed how E1 would be shot in order to fit in all the action. The race order was put on a carousel graphic at the top of the screen; other elements to fit into this layer of graphics included visuals of the boat with identifiers, pilots in the cockpits and a way to identify them, the celebrity team owners on shore, plus all the celebrations and emotions that came with this high-speed race.
Then eight different graphics engines – which Scott says is the most he’s ever come across in one show at one time – were employed. He says: “We wanted the graphics to look as if they were within the environment, the 2D as well as the augmented reality (AR) stuff; not a stamp on top. We had two engines using Xpression, using the standard sport graphics, 2D ones. Then after that, we had five Unreal Engines, each doing specific things. We wanted a heads-up display, which is like a circle with all the data around it, which we used for the new Agile cameras; that was one Unreal Engine. We had another two doing HIT tracking and two doing HUD.
“After that, we had the AR,” Scott continues. “Now the AR was brilliant; so hard to do, I can’t even explain. I got a few heart attacks over that thing, over the last few weeks.”
The course layout contributed to the AR: normally a buoy in the water would mark the layout of a race, but E1 uses smart marks, which can be remotely moved to another location. The GPS signal and other data coming from the smart marks were used by Aurora to create maps of the race environment and lock the AR scene onto the water. “So that was quite an interesting thing,” says Scott. “We’re hoping in the next stage of E1 we’ll have a drone with the same AR on it, which will be another level on top. So that’s our next development on that part.”
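To make that concrete, here is a minimal sketch of the kind of transform involved: projecting smart-mark GPS fixes onto a flat local plane that an AR engine could anchor a scene to. The coordinates, names and equirectangular approximation are all illustrative assumptions, not Aurora’s or NEP’s actual pipeline.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def gps_to_local_xy(lat, lon, origin_lat, origin_lon):
    """Project a smart mark's GPS fix onto a flat local plane (metres)
    around a course origin, using an equirectangular approximation --
    adequate over a race course a few hundred metres across."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    x = EARTH_RADIUS_M * d_lon * math.cos(math.radians(origin_lat))  # east
    y = EARTH_RADIUS_M * d_lat                                       # north
    return x, y

# Hypothetical smart-mark fixes defining a course (lat, lon)
marks = {
    "turn_1": (21.54312, 39.15780),
    "turn_2": (21.54290, 39.16102),
}
origin = (21.54250, 39.15900)  # arbitrary course origin

anchors = {name: gps_to_local_xy(la, lo, *origin)
           for name, (la, lo) in marks.items()}
print(anchors)  # plane-space anchor points an AR engine could lock onto
```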
Scott adds on the AR: “We were trying to put graphics on top of water that look as if they’re real to the viewer, which is quite a challenge. Clearly, they’re never going to look perfectly real, but [the idea was to make it] feel like it doesn’t stand out as something that shouldn’t be there. I think we achieved that. We can still go further with it, but I think we really got to a good place.”
GFX features developed for E1 include: 2D sporting GFX; Race Bird status; two Human Interface Tracking (HIT) displays; two heads-up displays (HUD); one ‘live’ virtual map; and AR on a hoist and a drone.
These features run on two Ross Video Xpression Engines, five Unreal Engines and one data hub.
Operating all these features live are seven GFX operators based at NEP in Hilversum, one tracking engineer on site, and one data engineer.
This runs on NEP’s custom-built GFX operating platform, NEP Cube, with NEP’s custom fork of Unreal Engine, NEP Surreal. All the data from site (boat telemetry, sporting data, timing) comes to NEP GFX via Alkamel Systems. NEP GFX has built a data hub to digest all the data and act as a proxy for the GFX engines.
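As a rough illustration of what such a hub does, the sketch below digests a mixed on-site feed into one normalised schema and fans it out to subscribed engines. The field names and schema are hypothetical; NEP’s actual data hub is not publicly documented.

```python
import json
from typing import Callable

class DataHub:
    """Minimal sketch of a hub that digests mixed on-site feeds
    (boat telemetry, sporting data, timing) and acts as a single
    proxy for downstream graphics engines."""

    def __init__(self):
        self._subscribers: list[Callable[[dict], None]] = []

    def subscribe(self, engine_callback: Callable[[dict], None]) -> None:
        # e.g. one callback per Xpression or Unreal engine
        self._subscribers.append(engine_callback)

    def ingest(self, raw: str) -> None:
        record = json.loads(raw)
        message = {                      # normalise to one schema
            "boat": record.get("boat_id"),
            "lap": record.get("lap"),
            "speed_kn": record.get("speed_knots"),
            "gap_s": record.get("gap_to_leader_s"),
        }
        for push in self._subscribers:   # fan out to every engine
            push(message)

hub = DataHub()
hub.subscribe(lambda m: print("xpression <-", m))
hub.subscribe(lambda m: print("unreal    <-", m))
hub.ingest('{"boat_id": 7, "lap": 3, "speed_knots": 48.2, "gap_to_leader_s": 1.4}')
```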
The AR tracking data generated on site is transported to Hilversum via an audio stream, then merged in Hilversum with a clean camera feed inside Unreal Engine to provide the AR.
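Sending tracking data down an audio channel essentially means treating the tracking packet as sample data. The sketch below shows the idea under stated assumptions: a hypothetical six-float camera-pose layout packed into 16-bit PCM values and recovered at the far end. The real on-site encoding is not public.

```python
import struct

POSE_FMT = "<6f"  # pan, tilt, roll, x, y, z -- hypothetical packet layout

def pose_to_samples(pose):
    """Pack one camera-tracking packet into 16-bit PCM sample values
    so it can ride an ordinary audio channel alongside the camera feed."""
    payload = struct.pack(POSE_FMT, *pose)
    # pair bytes into little-endian int16 "samples"
    return list(struct.unpack(f"<{len(payload) // 2}h", payload))

def samples_to_pose(samples):
    """Reverse step at the remote gallery: recover the packet and hand
    the pose to the Unreal render, where it meets the clean feed."""
    payload = struct.pack(f"<{len(samples)}h", *samples)
    return struct.unpack(POSE_FMT, payload)

# Values chosen to be exactly representable as float32, so the
# round trip is bit-exact in this toy example.
pose = (12.5, -3.0, 0.125, 4.25, 1.75, 6.0)
assert samples_to_pose(pose_to_samples(pose)) == pose
```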
It was a tense time leading up to the first E1 race, which took place in Jeddah on 2 and 3 February, because, as Duffy says, no one knew if the data and the graphics would actually work. He explains: “We didn’t really know if it would work till quite late. It didn’t work till we were there on site, because we didn’t have the real data from the real boats, in the real environment. So you spend nine months in development, planning something, and you don’t know it’s all going to work until you get the real live data, in the real environment, with the real boats, a couple of days before you go live.”
Duffy adds: “Even late in the day, there was a debate about whether it would be safer to have trucks on site, but if you have trucks or a flyaway on site, we didn’t think we could use the NEP graphics pack in the same manner. So we decided to go for it and just do everything remote, and it came off.”
He continues: “A major benefit of being remote, apart from a sustainable footprint and cost, is from a graphics point of view you can use Unreal in a way that you wouldn’t be able to if you’re on site because [the engines are] too powerful to take and too expensive to take and too cumbersome to take to site. So you get the benefit of the ability to use Unreal by doing it remotely, which is why, creatively, we’re always pushing [our graphics output]. Having seen it work time over time now, we’re always pushing to have remote production so we can engender those productions with more powerful and more creative graphics.”
Making a scene
Continues Duffy: “The thing that’s important for us is that we’re not even working with graphics. We would call them scenes, so that when the director, Ben, is cutting, he knows he’s cutting a camera and he knows that the graphic on AR is going to be on that camera. We’ve created these different scenes in the show that are a composite of the graphic within the camera output, that really tell the story of what’s going on in the race at that particular time.
“Each [Unreal] Engine creates a scene, so we want to maximise the scenes in the show,” he continues. “We’re even just slightly inventing terminology as we go along in relation to how we make those shows, but that’s here now and we’ve done it time over time. It’s not necessarily right for every sport, but when I look at some sports’ graphics packages, which are flat 2D, I’m just thinking ‘they could benefit from Unreal’.”
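A minimal sketch of the ‘scene’ idea, assuming a simple lookup from camera to composite: cutting a camera implicitly cuts to the graphic layer that belongs to it. All camera and engine names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Scene:
    """A 'scene' in Aurora's sense: a camera output composited with
    the graphic that belongs to it, so cutting the camera cuts the
    graphic too. Names below are illustrative, not the real rig."""
    camera: str
    engine: str       # which GFX engine renders this scene's layer
    graphic: str      # the layer composited onto the camera output

SCENES = {
    "cam_agile_1": Scene("cam_agile_1", "unreal_hud_1", "heads-up display"),
    "cam_chase":   Scene("cam_chase",   "unreal_ar_1",  "on-water AR"),
    "cam_wide":    Scene("cam_wide",    "xpression_1",  "2D race order carousel"),
}

def on_cut(camera: str) -> Scene:
    """When the director cuts to a camera, the matching composite
    (camera plus its graphic) is what actually goes to air."""
    return SCENES[camera]

print(on_cut("cam_chase"))
```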
Duffy concludes that cutting edge sports lead to cutting edge use of technology in broadcasting: “Formula E, Extreme E, E1; they are a vanguard of changing mobility using different technologies and we are invested in that. This technical playground of opportunity in developing technology in sports media and media overall is similarly progressive. Unreal is just the best example of that.”
Aurora’s core team, including Scott and the director, worked from a specially developed end-to-end gallery and production space at Presteigne Charter in Crawley, UK, while seven graphics operators and support engineers worked in another gallery at NEP in Hilversum, The Netherlands.
Watch E1 race two in Venice on 12 May.