
Augmented upgrade: AR graphics enhanced with AI bring Formula E’s 2025-2026 season new detail as well as challenges

On 21 March 2026, Formula E made its Madrid debut with the Madrid Mundial de Formula E. Marking the partnership between the FIA Formula E World Championship and the Real Automóvil Club de España (RACE), the cars took to the track at the Circuito de Madrid Jarama-RACE with new augmented reality (AR) graphics enhanced with artificial intelligence (AI), bringing more information to viewers.

Formula E works closely with its host broadcast production partners, Aurora and Whisper, as well as its technical services provider Gravity Media, alongside partners for its onboard cameras Timeline TV and Domo Broadcast Systems, and Tata Communications, its broadcast distribution partner.

Aurora’s technical director, Lee Flay, spoke to SVG Europe from the Circuito de Madrid Jarama-RACE prior to the weekend. “It’s a new venue for us. Formula E has been rather inundated with demand for tickets and they’ve had to change the whole layout of the venue to accommodate more people, so it’s all changed at the last minute, but we’re working around it. The cars were on track on Friday afternoon for the first time.”

The track has enabled a fast race, says Flay: “It’s a wonderful circuit; it’s all undulating and flowing, so it looks fantastic, but there has been a lot of work to do to be ready for it.”

Flay adds that as the Formula E cars are now much faster than they used to be, they are becoming better suited to racing on permanent circuits than in city centres, the historic home of Formula E.

He notes: “The speed of the cars now, they really do lend themselves to proper circuits. We used to race in city centres, but the cars are getting too fast for the size of runoffs that you’re able to build in city centres now. We’re outgrowing that ability a little bit.”

Augmented upgrade

Formula E has been using established AR tracking technology from Ncam, with Vizrt in the back end, for the last eight seasons, including for its ‘Attack Mode’ power-up feature. Attack Mode is a mandatory race feature where drivers drive off-line through a designated ‘activation zone’ to receive a temporary 50kW power boost, taking the Gen3 Evo cars to a total of 350kW, along with all-wheel drive, to aid overtaking. It generally lasts for eight minutes per race and is often split into two activations.

However, Formula E is now branching out into the latest AR technology purely with Vizrt, which is enabling it to do more on screen, as well as having significant other benefits for the productions.

Explains Flay: “We’ve been using Ncam, which does the Attack Mode tracking, historically. We use Ncam tracking with Vizrt graphic engines to apply those graphics. But we’ve expanded that and have started getting more adventurous with the AR capabilities. When we looked to expand our capabilities out further, Al Kamel, our graphics partner, proposed the use of the new Vizrt Arena engine, a tracking and AR-capable machine. It’s still a Vizrt engine in the backend, but with ostensibly the ability to switch between cameras and detect the cut sequence and pull the graphics in where required.”

Playing with workflows

Madrid marks the first time Aurora is using the Arena engine on site, with the operators working remotely back in the Gravity production centre in Westworks. Until the Madrid race, both the operators and the Arena engine had been in London; however, latency issues meant it was time to try a new approach.

Explains Flay: “We’ve been playing around with the Arena workflow set up over the last couple of events. The workflow’s a little bit complicated because we want to be able to put the AR sources in at source on site, because we create a track cut on site. There’s a racing cut produced with just track cameras and mini cams, which is then layered on top in London with onboards and graphics and RF cameras in the pit lane; all the rest of the assets are layered on top in London. So we needed the AR sources with very low latency at source on site with us here. Hence we’re playing around with the workflows a little bit at the moment, because we found it difficult to apply them in London effectively because of the round trip; we’ve got to send the camera to London, the AR put on, and to send it back here. The latency was just unworkable in that workflow.
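The round-trip problem Flay describes comes down to simple arithmetic: every leg of the contribution path adds delay before the AR-keyed camera can be cut into the on-site track cut. The sketch below illustrates that budget; every figure is an invented placeholder for illustration, not a measured Formula E or Tata Communications number.

```python
# Illustrative latency budget: remote AR insertion vs applying AR at source.
# All millisecond figures are assumptions for the sketch, not measured values.

def total_latency(legs):
    """Sum per-leg latencies (milliseconds) along a video path."""
    return sum(legs.values())

# Remote workflow: the camera feed travels to London, AR is applied there,
# and the keyed output is sent back for the on-site track cut.
remote = {
    "encode_on_site": 50,
    "contribution_to_london": 120,   # one-way transport (assumed)
    "ar_render_in_london": 40,
    "return_to_site": 120,
    "decode_on_site": 50,
}

# On-site workflow: the Arena engine sits with the track gallery, so only
# the render delay stands between the camera and the cut.
on_site = {"ar_render_on_site": 40}

print(total_latency(remote))   # 380 ms round trip
print(total_latency(on_site))  # 40 ms
```

With numbers anywhere in this ballpark, the remote path leaves the AR camera several frames behind the rest of the track cut, which is why the engine now travels with the on-site kit while the operator stays remote.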

“For this event, we’ve brought the Arena engine onsite with us here in Madrid. The idea is it’s still being remotely operated, so the operator doesn’t have to travel and we don’t have the flight impact and the cost and the environmental impact of having a person on site, so it’s more sustainable. But the equipment travels with us, applying the graphics at source. That way we can get it in as if it was any other camera cut into the sequence with virtual branding or presentation graphics or laps to go or fastest lap or anything that we want to apply to it, and we do all sorts of different things with it. We can put them on source now and get them in a lot easier; that’s the plan anyway.”

Says Flay: “It’s a slight deviation from where we started, but it should give us a more adapted workflow and enable us to get the cameras cut in at that source a lot easier.”

Over the first few months of using the Arena engine, the team tried out different workflows to get the best results. Flay comments: “We had some varying workflows due to the availability of the servers where we wanted them to be. In Miami, we got a server from Viz themselves in the States.

“It’s all come very thick and fast, this development in the last few rounds [of the Formula E season], and we’ve had to rely on the goodwill of people like Viz to support us. They’ve done a great job in making kit available to us. Sometimes that’s been onsite and we’ve worked a workflow around that with operators on site; we did that in Miami, but that was forced by circumstance more than a planned workflow.

“Now we’re getting to a point where we’re fine-tuning,” Flay continues. “We’re in Europe, so it’s easy to get equipment and people moved around. Al Kamel is based in Spain, so they’ve got support where we need it, but ironically, we’re now moving [the Arena engine] to work from our facility in London. That would be the standard workflow for us – having Arena in London – and that’s what we’re trying to establish, to get solidity of solution so that we can roll that out wherever we are in the world, and it will be exactly the same.”

Faster setup

On why Arena was chosen, Flay says it is because it has multiple benefits for the style of Formula E production. He says: “Does Arena do anything which another engine won’t do? No, not really. In all reality, Unreal Engine is more hyperrealistic, but we are not doing that sort of AR; we’re not doing virtual studios, we’re not putting people in a false background and trying to make it look real. We’re presenting graphics on screen to give information to the viewer and they’re stylised, they’re not necessarily hyper-realistic. So Viz is well suited to doing that sort of stuff. And at that point, it’s down to the skills of the operators in London who are making it look realistic and repeatable. We give graphics designs to Al Kamel that we’ve come up with and want to put on screen. They translate those into a Viz model and then adjust them throughout the course of the week.”

Flay continues: “But the tracking technology itself is remarkable in its ability to recognise the image and place a graphic within it repeatedly, and that’s including cameras zooming in past it and tracking a car, and it’ll just appear on the edge of frame and it knows exactly what the image is, knows how to place it within that environment, and gives us a lot of toys to play with.”
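Once a tracker knows how the camera sees the track, placing a flat graphic “within” the image is, in simplified form, a homography: the graphic’s corners in track coordinates are projected into pixel coordinates. The sketch below shows that projection with a made-up 3x3 matrix; it is a toy illustration of the geometry, not anything the Arena engine actually exposes.

```python
import numpy as np

# Toy sketch of planar graphic placement via a homography. The matrix H
# below is an assumed example, not calibration data from any real tracker.
H = np.array([[1.2, 0.1, 30.0],
              [0.0, 1.1, 15.0],
              [0.0, 0.0, 1.0]])

def project(points, H):
    """Map Nx2 planar points through a 3x3 homography (homogeneous divide)."""
    pts = np.hstack([points, np.ones((len(points), 1))])
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Corners of a 10x4 graphic panel laid out in track-plane coordinates.
corners = np.array([[0, 0], [10, 0], [10, 4], [0, 4]], dtype=float)
print(project(corners, H))  # pixel positions of the panel's four corners
```

In this picture, “learning the shot” amounts to continuously re-estimating that matrix from the image itself, which is why the graphic can reappear correctly even when a car is tracked past the edge of frame.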

The new system provides optical tracking. Flay explains why that is useful: “[Optical tracking] has some real benefits. The speed it takes for the system to learn a camera is incredibly quick. You’ve literally got to pan and zoom the camera for two minutes, and the AR engine learns the shot and calibrates very, very quickly. So for setup for us on site, it’s far less invasive than building a camera with sensors on it and tracking bars, like the Ncam tech. The Ncam tech is relatively old, but it’s reliable. It works. We invested in [Ncam], but it’s now almost end of life; it’s not supported by Ncam directly anymore; they’ve moved on, product-wise. So we looked at the Arena solution over Christmas, and we’ve only been using it for four months.”

Hamish Harris, Formula E’s broadcast and media technology director, comments on Arena: “It probably gives us a bit more flexibility in terms of where we can put the cameras as well. If you think about it, some of our cameras are up on hoists and cherry pickers, and if that cherry picker is not in exactly the same spot the next day, then you’ve got to recalibrate that whole shot because the system has to learn where it is again. With the optical tracking, you can just do another quick recalibration to teach it where the camera is and then it can crack on.”

Flay says the main benefit of using the new system is a faster setup for the Aurora and Gravity Media crews on site. He explains: “A major benefit to us is setup; the speed of deployment is a real asset.”

“We’ve modified the way we deploy our cameras as well,” says Flay. “We use a lot of remote camera heads rather than have an operator up in a hoist for eight hours on a race day; instead we might deploy a remote camera in that position. It can stay there all day long. If you do need to move the hoist it’s generally for things like lens cleaning and adjusting the actual camera itself. We try to avoid that even; we’ve got rain blowers on the camera lenses now. We’ve got kits that we remotely operate and if a lens gets wet or dirty, we can clean them remotely to stop us having to bring those cameras down.

“But the Arena engine really helps us in that because if we do have to bring it down for any reason and put it back up, we haven’t got to get it inch-critical; the recalibration and the relocking of that is so quick that it means that we’re not losing four or five laps of a race while they’re recalibrating it. It’s done on the fly, and done quite quickly.”

AI keying

The other AI-driven element within the AR is Arena’s keying ability. A keyer driven by AI removes the need for constant manual adjustments if, for instance, weather conditions on a racetrack change.

Comments Flay: “We haven’t utilised the AI keyer properly yet, but it’s on our list of things we want to play with. One of the limitations we’ve got with the old Ncam and Vizrt solution for Attack Mode, for instance, is it’s a traditional key. You’ve got to key onto the track and then you’re at the mercy of the light, the track conditions, and everything else. If the track gets wet and shiny, it changes the keying.

“This AI keying engine will take all that away. It literally looks at the position and turns off X pixel because, for instance, the car should be passing over it at that time. It’s not keying in a traditional sense; it’s turning on and off pixels to make it look like the car’s either driven over, under, or around whatever graphic element you want, so it opens up a lot of avenues for us to play with AR graphics and put them all over the place,” concludes Flay.
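The pixel-switching idea Flay describes can be sketched as mask-driven compositing: rather than keying on the track colour, a per-pixel occlusion mask, such as an AI segmenter might produce for the car, decides where the graphic is switched off. The toy example below uses a hand-made mask standing in for that segmentation; none of it reflects the actual Vizrt implementation.

```python
import numpy as np

# Toy mask-driven compositing: where the mask says "car", show the camera
# pixel; elsewhere, show the AR graphic. Shapes and values are assumptions.
H, W = 4, 6
frame   = np.zeros((H, W, 3), dtype=np.uint8)      # camera image (dark car)
graphic = np.full((H, W, 3), 255, dtype=np.uint8)  # flat white AR overlay

# Occlusion mask: True where the car covers the graphic. In the real system
# this would come per-frame from the AI keyer, not be hard-coded.
car_mask = np.zeros((H, W), dtype=bool)
car_mask[1:3, 2:4] = True

# Per-pixel switch: camera pixels under the car, graphic pixels elsewhere.
out = np.where(car_mask[..., None], frame, graphic)

print(out[0, 0])  # graphic pixel  -> [255 255 255]
print(out[1, 2])  # occluded pixel -> [0 0 0]
```

Because the decision is per pixel rather than per colour, a wet and shiny track changes nothing: the mask, not the track’s appearance, determines where the graphic appears.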
