The advance of AR: Racelogic on meeting the demand for more immersive viewing experiences

By Julian Thomas, Racelogic managing director.

As I look back at 2022, I find it inspiring to think how much progress has been made bringing sport back to the masses. At the start of the year there were still question marks over which events would proceed; however, the industry has shown its strength and put on more top-level events than would have been imaginable at the height of the pandemic.

As our attention turns to the year ahead, with the industry thriving and viewers keen to experience more live action in person and at home, I am drawn to the technology that can elevate the experience further. In 2022, broadcasters dipped their toes into the possibilities of using augmented reality to enhance the viewer experience and deliver more value to advertisers. 2023 is set to take this to the next level with more widespread use of the latest graphics and tracking technology.

What we loved in 2022

2022 saw more use of augmented reality (AR) elements in live sports broadcasts, leveraging the latest developments in real-time graphics engines from the likes of Unreal Engine and the companies that build upon that and other platforms. Camera tracking technology continued to improve in both quality and quantity, with several exciting new propositions entering the market. Wide area tracking across entire stadiums or events was demonstrated and looks likely to continue to grow in popularity and capability in the coming years.

Static graphic overlays have been common in broadcasting for many years, but where we really see growth is in dynamic graphics that are aligned to the venue, athlete, or other real-world elements. Achieving these often-challenging alignments takes collaboration between camera teams, broadcasters, graphics teams and tracking providers. The challenge is even greater when dealing with highly mobile cameras such as those on cables or mobile Steadicams, so we see these cross-industry collaborations as essential to delivering these improved experiences.

An important part of virtual production is camera tracking. To align virtual elements with live footage, camera data such as position and pan-tilt-roll, as well as the focus, iris and zoom values of the real camera, need to match the virtual camera. We see a multitude of tracking technologies out there. Many virtual production studios use marker-based technologies: a small optical device on the camera calculates the relevant data by tracking markers placed in the studio space.
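To make that alignment concrete, here is a minimal sketch in Python – with entirely hypothetical names, not any real tracking vendor's API – of the per-frame data a tracking system delivers and how it is copied onto a virtual camera so rendered AR elements line up with the live footage:

```python
from dataclasses import dataclass

@dataclass
class CameraTrackingSample:
    """One frame of tracking data for the real camera (field names are illustrative)."""
    # Camera position in the studio/venue coordinate frame, in metres.
    x: float
    y: float
    z: float
    # Camera orientation, in degrees.
    pan: float
    tilt: float
    roll: float
    # Lens state: normalised focus and iris, zoom as focal length in mm.
    focus: float
    iris: float
    zoom_mm: float

def apply_to_virtual_camera(sample: CameraTrackingSample, virtual_cam: dict) -> dict:
    """Copy every tracked parameter onto the virtual camera so AR elements
    rendered through it match the real camera's view."""
    virtual_cam.update(
        position=(sample.x, sample.y, sample.z),
        rotation=(sample.pan, sample.tilt, sample.roll),
        focus=sample.focus,
        iris=sample.iris,
        focal_length=sample.zoom_mm,
    )
    return virtual_cam
```

The key point is that every one of these parameters must be updated on the virtual camera every frame; a mismatch in any single value (a stale zoom reading, say) is enough to make the AR graphics visibly drift against the footage.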

Markerless technologies are also available. These systems rely on interpreting video data using artificial intelligence (AI). Alongside marker-based and markerless systems, some studios use more traditional systems that are also used for human-body motion capture. These require a number of motion capture cameras spread around the recording volume that track the actual recording camera.

Finally, there are entry-level radio-based systems that can track a camera in a relatively small volume. What we find is that all these existing tracking systems typically work well in studios and other controlled environments, but do not easily scale.

During 2022 we have seen increasing demand from broadcasters wanting to add advanced AR elements to the footage they shoot at the sports venue, both indoors and outdoors. For many of the existing tracking systems it is a challenge to operate reliably at a sports venue, due to the sheer volume that needs to be tracked or the changing lighting conditions that interfere with optical signals.

Most of the AR elements we see nowadays on top of footage from larger environments are generated using tracking data from cameras in fixed positions, mounted on encoded pedestals that can provide reliable tracking data. A good solution, but what if we want to move the camera around?

2023 will remove the restrictions to AR

We believe the big challenge ahead of us is to be able to track moving cameras in large environments both indoors and outdoors, i.e. being able to track cable-mounted cameras, Steadicams, shoulder-mounted cameras, drones or helicopter cameras in stadiums, ice rinks, race tracks, etc. We believe there is no single ‘silver bullet’ technology to solve this problem, but expect the solution to be a combination of several camera tracking technologies fused in real time.

For indoor tracking we expect broadcasters will use a combination of a ‘wide area’ radio technology (call it ‘indoor-GPS’) in combination with inertial sensors (IMUs) and potentially markerless AI-based interpretation of video data. In April 2022 during the NCAA Final Four basketball games, WarnerMedia ran a first successful test with such a system, tracking a cablecam in a large indoor stadium (Caesars Superdome in New Orleans, LA). It enabled WarnerMedia to add AR to its cablecam footage. More testing has been done by several parties in 2022 and we expect to see more AR on screen in 2023 coming from moving cameras.

Tracking cameras outdoors would ideally be done using a global navigation satellite system (GNSS) such as GPS – the latest technology offers sub-2cm tracking accuracy – with further improvements possible by fusing inertial sensors and potentially AI-based video interpretation. With the development of high-precision real-time kinematic (RTK) GNSS modules, high-accuracy outdoor camera tracking will be possible in the very near future. We expect to see the first tests and trials with this new technology in 2023.
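As a simplified illustration of that kind of sensor fusion, the sketch below (hypothetical, single-axis, in Python) dead-reckons position from IMU acceleration between fixes and blends each absolute GNSS fix back in to cancel the drift that pure integration accumulates. This is a basic complementary filter, far simpler than the Kalman-style fusion a production tracking system would use:

```python
from typing import Optional

class ComplementaryPositionFilter:
    """Illustrative single-axis position fuser: the IMU gives smooth,
    high-rate motion updates, while occasional GNSS fixes provide an
    absolute reference that removes the IMU's accumulated drift."""

    def __init__(self, alpha: float = 0.98):
        self.alpha = alpha  # weight on the IMU dead-reckoned estimate
        self.pos = 0.0      # metres
        self.vel = 0.0      # metres per second

    def step(self, accel: float, dt: float,
             gnss_pos: Optional[float] = None) -> float:
        # Dead-reckon: integrate acceleration to velocity, velocity to position.
        self.vel += accel * dt
        self.pos += self.vel * dt
        if gnss_pos is not None:
            # A GNSS fix arrived: pull the estimate toward the absolute position.
            self.pos = self.alpha * self.pos + (1.0 - self.alpha) * gnss_pos
        return self.pos
```

With alpha close to 1, the filter trusts the IMU's smooth short-term motion while the small GNSS share steadily corrects long-term drift; the same idea extends to three axes and orientation in a real tracker.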

Think of AR elements on top of footage shot from any camera – even a helicopter – at outdoor venues.

I strongly believe that the technology that is currently being developed and will be introduced at events in 2023 will set new standards and probably be the norm for the years to come.


