IBC 2023: Behind the scenes on the real-time XR sport edge accelerator project

In the final instalment of our sports-focused IBC Accelerator series, we look at how extended reality is making waves in mixed martial arts.

“This is living R&D, it’s warts and all,” said Muki Kulhan, innovation co-lead for the IBC Accelerators, introducing this last Accelerator of IBC 2023, the Real-Time XR Sport Edge project.

The project aimed to turn live mixed martial arts (MMA) into a digital twin extended reality (XR) broadcast, combining immersive high-end graphics, spatial and social audio, and real-time edge-compute delivery to fans in a 3D virtual world via virtual reality (VR) headsets, computers and mobile devices, and simultaneously to an outdoor location-based experience venue, The Outernet in London.

The session covered everything from the challenges of bringing spatial audio and motion capture into game engines to rendering 8K in real time and the ethics of digital twins.

“The original plan for our project was to capture real-time sports, specifically martial arts,” said Christine Marsh, CTO, CCO and co-founder at Bowie State University and Prima Terra Labs.

“And the challenge there for us is that it’s a very fast-moving sport. But the additional level of difficulty is to relay that performance to audiences, also in real time. So it’s a lot of information going to multiple destinations, to location-based entertainment, as well as VR headsets, computers and mobile devices. We were also looking at bringing in an extra level of fan engagement. But the main thing that we’re trying to do is capture the likeness of the fighters and put them in a virtual reality environment.”

“We wanted to capture that with motion capture,” said Andy Hook, technical solutions director for White Light and D&B Solutions. “We were looking at what we could do with volumetric capture too, and compare the technologies. We were also looking to see if we could capture spatial audio.”


Moderator:

Muki Kulhan, innovation co-lead for IBC Accelerators

Panel:

  • Christine Marsh, CTO, CCO and co-founder at Bowie State University and Prima Terra Labs
  • Andy Hook, technical solutions director for White Light and D&B Solutions
  • Rob Oldfield, co-founder and CEO at Salsa Sound
  • Mike Whittaker, CTO at Outernet Global

Champions: 

King’s College London, Trinity College Dublin, Prima Terra Labs, Outernet Global, Vodafone Group, University of Southampton, RVPA, Bowie State University

Participants: D&B Solutions, Salsa Sound, Sparkup, Hand, AMD, HearMeCheer, Movrs


“Audio is as important as the visual experience,” he continued. “So we were making sure that we tie those things together. We wanted to take all of that data, push that into a real-time engine and create a digital twin of that experience – placing those fighters inside a virtual world rendered in real time. Then we could bring commentators in. We also wanted to bring a fan engagement platform into that so that users on different devices around the world can interact with that experience.”

The project was also lucky to have an immersive audio expert on board, in the shape of Rob Oldfield, co-founder and CEO at Salsa Sound. “We realised that audio is what can bring the emotion,” said Oldfield. “There are so many things that can be enabled with good audio, that tick the boxes of what we wanted to do as part of this project. So yes, it’s immersive, but we also wanted to bring the accessibility angle in, the ability to alter the different sources and improve the commentary to make things clearer for people who are struggling with hearing, maybe simplifying a scene so they’re not bombarded with a gazillion different sources if that’s a struggle for them.

“We wanted to bring in personalisation, to have different audio elements depending on the location that I was currently based in. Also to bring in interactivity, to have an experience where people can totally engage with it wherever they are. It became apparent that to facilitate all of those things, we needed an object-based audio engine,” Oldfield commented.
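To make that concrete: in an object-based pipeline, each source (commentary, crowd, strike impacts) stays a separate stream with its own metadata until render time, so the mix can be personalised per listener. The sketch below shows the idea in Python; the source names, gain values and API are purely illustrative assumptions, not Salsa Sound’s engine.

```python
# Minimal sketch of object-based audio mixing: each source stays a separate
# "object" with metadata, and the final mix is rendered per listener.
# All names and gain values here are illustrative, not the project's code.
from dataclasses import dataclass
import numpy as np

@dataclass
class AudioObject:
    name: str            # e.g. "commentary", "crowd", "impacts"
    samples: np.ndarray  # mono PCM, float32 in [-1, 1]
    base_gain: float = 1.0

def render_mix(objects, listener_gains, block_size=1024):
    """Mix all objects into one buffer, applying per-listener gain overrides.

    listener_gains lets each user personalise the scene, e.g. boosting
    commentary for intelligibility or attenuating the crowd entirely.
    """
    out = np.zeros(block_size, dtype=np.float32)
    for obj in objects:
        gain = obj.base_gain * listener_gains.get(obj.name, 1.0)
        out += gain * obj.samples[:block_size]
    return np.clip(out, -1.0, 1.0)  # simple peak limiting

# Example: an accessibility-focused preset that clarifies speech.
rng = np.random.default_rng(0)
scene = [
    AudioObject("commentary", rng.standard_normal(1024).astype(np.float32) * 0.1),
    AudioObject("crowd", rng.standard_normal(1024).astype(np.float32) * 0.1),
]
clear_speech_mix = render_mix(scene, {"commentary": 1.5, "crowd": 0.3})
```

A preset like the “clear speech” example above is one way accessibility options can be offered without producing separate broadcast mixes.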

“I’m sure most people [at IBC] agree you can have great video but if you have terrible audio, you lose your audience,” said Marsh. “But we were trying to do more than that. We were trying to create a sense of presence and the audio part of that really is key, so that it feels like you’re really there.”

Mike Whittaker, CTO at Outernet Global, talked about the challenges of bringing the virtual world to the Outernet, a massive outdoor digital expo space in London.

“It’s a 16k resolution, immersive experience, and if it’s pre-rendered it’s too heavy, but we like real-time,” he said. “As for audio, you’re going from personalised devices and a pair of headphones to 98 channels of audio. There were real challenges which needed real intellectual grunt and real processing grunt behind it.”
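Moving the same object-based scene from a pair of headphones to a venue array means re-rendering every object against the speaker layout rather than shipping a fixed mix. Below is a hypothetical sketch of one simple approach, distance-weighted speaker gains; the layout and weighting scheme are assumptions for illustration, not Outernet’s actual renderer.

```python
# Illustrative sketch of rendering one audio object to a large speaker array
# instead of headphones: distribute its signal across speakers with gains
# based on proximity to the object's virtual position.
import numpy as np

def speaker_gains(obj_pos, speaker_pos, rolloff=2.0):
    """Inverse-distance gain per speaker, normalised to constant power."""
    d = np.linalg.norm(speaker_pos - obj_pos, axis=1) + 1e-6
    g = 1.0 / d**rolloff
    return g / np.sqrt(np.sum(g**2))  # constant-power normalisation

# 98 speakers spread around a venue (random here; fixed positions in reality).
rng = np.random.default_rng(1)
speakers = rng.uniform(-10, 10, size=(98, 3))
gains = speaker_gains(np.array([2.0, 0.0, 1.5]), speakers)
# channel_out[i] = gains[i] * object_samples, for each of the 98 channels
```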

Hook described how the mocap session used Movrs, a markerless skeleton-tracking system, to capture the live combat at very high frame rates, alongside Salsa’s audio capture solution.

“AMD provided hardware for us to render that into Unreal Engine,” continued Hook. “We took all the data in from an audio and video perspective into Unreal Engine, recreating that entire experience as a scene produced inside a virtual world.”
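At its simplest, that pipeline streams per-frame joint data from the tracker into the engine’s animation system. Below is a hypothetical Python sketch of the ingest side; the packet format, joint names and port are assumptions rather than the Movrs protocol, and in Unreal Engine this binding is typically handled through Live Link rather than hand-rolled code.

```python
# Hypothetical sketch of ingesting live skeleton frames from a markerless
# tracker over UDP and handing them to a render loop. The packet format,
# joint names and port are illustrative; this is not the Movrs protocol.
import json
import socket

JOINTS = ["head", "spine", "l_hand", "r_hand", "l_foot", "r_foot"]  # assumed rig

def apply_to_avatar(pose):
    # Placeholder for the engine binding: in Unreal this step is typically
    # handled via Live Link. Here we just print the tracked joints per frame.
    print(pose)

def run_receiver(port=9000):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        packet, _ = sock.recvfrom(65535)
        frame = json.loads(packet)  # e.g. {"t": 0.016, "head": [x, y, z], ...}
        pose = {j: frame[j] for j in JOINTS if j in frame}
        apply_to_avatar(pose)       # push into the engine's animation system
```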

Kulhan pointed out that the virtual fighters had to look authentic. Marsh agreed: “Part of the goal here is for the fans to engage not just with each other but with the fighters. So they want to see that the fighter looks like the avatar and vice versa. If they can’t recognise the fighter as the person they’re rooting for, it breaks the illusion and it doesn’t give the experience that we’re trying to deliver. We spent a fair amount of time working on making those characters really look like those fighters.”

This prompted Kulhan to discuss how an athlete or the talent can protect ownership of their avatar, before bringing the session to a close.
