Revolutionary viewing: 3D volumetric video streaming set to become reality for sports fans
Condense Reality, an 18-month-old virtual reality (VR) startup that develops technology to transform how viewers watch live entertainment and sports, is banking on 3D holograms as the future of live sports engagement.
The company has developed a system for streaming hologram-style 3D volumetric video of live events alongside a normal television broadcast in real-time, with the potential to bring entertainment and sport to life on the tabletop and elsewhere.
Next-generation volumetric video
Until now, capturing volumetric video, which creates a three-dimensional image that can be viewed by multiple people from different angles, has required fixed studios with green screens and many precisely calibrated cameras. Additionally, the processing power required meant it could take days to process just minutes of content for streaming.
Condense Reality has developed a next-generation solution that enables broadcasters and content creators to capture and stream volumetric video in real time, outside the confines of a studio, and with far fewer cameras.
Condense Reality’s modular approach combines proprietary software with off-the-shelf hardware. Its CR Capture platform uses state-of-the-art computer vision and deep learning to accurately reconstruct the contents of a scene in seconds, while CR Stream enables broadcasters to stream that content to viewers via their own augmented or VR headsets – including Oculus, Vive, Microsoft Hololens, and Magic Leap. The multi-platform CR Playback app gives viewers control of their experience through an intuitive 3D UI.
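The three product names above are from the article, but Condense Reality has not published its internals. As a purely illustrative sketch of what such a modular capture-stream-playback pipeline might look like, with all data structures, stage behaviour and numbers invented:

```python
from dataclasses import dataclass

# Hypothetical three-stage volumetric pipeline: capture -> stream -> playback.
# Stage names mirror CR Capture / CR Stream / CR Playback; internals are invented.

@dataclass
class VolumetricFrame:
    timestamp_ms: int
    mesh_vertices: int   # reconstructed geometry size
    texture_bytes: int   # photo-real texture payload

def capture(timestamp_ms: int) -> VolumetricFrame:
    """Reconstruct one frame of the scene (stand-in for CR Capture)."""
    return VolumetricFrame(timestamp_ms, mesh_vertices=50_000, texture_bytes=2_000_000)

def stream(frame: VolumetricFrame, bandwidth_bytes: int) -> VolumetricFrame:
    """Downscale the frame to fit the viewer's bandwidth (stand-in for CR Stream)."""
    if frame.texture_bytes > bandwidth_bytes:
        ratio = bandwidth_bytes / frame.texture_bytes
        frame = VolumetricFrame(frame.timestamp_ms,
                                int(frame.mesh_vertices * ratio),
                                bandwidth_bytes)
    return frame

def playback(frame: VolumetricFrame) -> str:
    """Summarise what the viewer's device receives (stand-in for CR Playback)."""
    return f"t={frame.timestamp_ms}ms verts={frame.mesh_vertices} tex={frame.texture_bytes}B"

if __name__ == "__main__":
    f = stream(capture(0), bandwidth_bytes=1_000_000)
    print(playback(f))  # prints "t=0ms verts=25000 tex=1000000B"
```

The point of the sketch is the separation of concerns the article describes: reconstruction, delivery and viewing are independent stages, so each can target different hardware.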
“What we’re building is the capability of capturing these photo-realistic holograms, and then what we want to do is take these holograms and allow the broadcasters to do what they want with them”
Speaking to SVG Europe, Nick Fellingham, CEO at Condense Reality, says: “Our technology integrates with Unity and Unreal [Engine]. These game engines are taking over the world. They’re doing massively well, not just because they create games, but because they create these 3D immersive experiences. So we’re betting less on augmented reality and more on game engines, and if you bet on game engines you naturally bet on augmented reality, because you’re creating augmented reality experiences inside a game engine.”
Condense Reality’s system is a key part of a project to bring this engaging technology to the masses; BT recently became one of the winners of funding from the UK government’s Department for Digital, Culture, Media and Sport (DCMS) under the 5G Create competition. Named 5G Edge-XR, the consortium, led by BT’s Media and Research teams alongside Condense Reality, TheGridFactory, Bristol University, Dance East, and Salsa Sound, will develop virtual and augmented reality experiences for live sport, working closely with BT Sport.
Fellingham comments: “We contacted a number of broadcasters, and when we spoke to the guys at BT Media and Research about eight months ago, they said they’d been thinking about this exact idea. So it just turned out that our visions aligned quite well. It’s such a nice coincidence.”
5G Edge-XR will enable people to view immersive sporting events from all angles, across a broader range of devices including smartphones, tablets, AR and VR headsets and TVs. Condense Reality will initially focus on optimising its technology for boxing, with other sports to follow.
Getting creative with photo-realistic holograms
However, adds Fellingham, “this technology we’re building isn’t just designed for AR”. He goes on: “What we’re building is the capability of capturing these photo-realistic holograms, and then what we want to do is take these holograms and allow the broadcasters to do what they want with them.
“We are really interested in a tabletop experience, but that’s not the only way you can enjoy them. You can see them in VR and make them life-sized, so you can almost stand in the action, but you can also, for instance, increase their size massively. So let’s say you captured a performer or musician; you could then place that performer at 20 times the size in the centre of the stadium and people could watch that through their phones.”
He notes: “It may turn out that tabletop AR isn’t the thing that people really jump on and in fact what they want is to be able to control the cameras using an Xbox controller on a standard TV. The way that would work is you can move your camera around, or set it to follow your favourite boxer, or rotate it around yourself so [it would be like you are] flying a virtual drone around the boxing match.”
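The “virtual drone” idea above amounts to orbiting a virtual camera around a point in the captured scene, with a controller stick driving the angles. A minimal sketch of that orbit calculation, with all names and coordinate conventions assumed for illustration:

```python
import math

# Hypothetical "virtual drone" camera: orbit a point in the captured 3D scene
# at a fixed radius, with azimuth/elevation mapped to controller stick input.

def orbit_camera(center, radius, azimuth_deg, elevation_deg):
    """Return the (x, y, z) camera position orbiting `center` at the given angles."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = center[0] + radius * math.cos(el) * math.cos(az)
    y = center[1] + radius * math.sin(el)           # y is "up" in this sketch
    z = center[2] + radius * math.cos(el) * math.sin(az)
    return (x, y, z)

if __name__ == "__main__":
    # Stick held right for one second at 90 deg/s: camera swings a quarter-turn
    # around the ring centre while staying level.
    print(orbit_camera(center=(0.0, 0.0, 0.0), radius=5.0,
                       azimuth_deg=90.0, elevation_deg=0.0))
```

Because the whole scene is captured volumetrically, the same maths works whether the viewpoint is a tabletop AR view, a VR headset, or a conventional TV feed rendered from the virtual camera.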
Fellingham states: “We’re building the tools that allow people to do really creative stuff with this.”
From boxing to pole vaulting
Right now the focus for the company is all about sport, but this could broaden out with time. Fellingham states: “We want to capture sports events. Eventually we want to capture everything, but when you’re a start-up you have to focus, and we saw this need in sports events, and in particular, boxing. Other things that are interesting [for this technology] are [sports like] climbing, and athletics I think is going to be really popular in this kind of format; being able to watch the pole vault on your table [in 3D] will be absolutely fascinating.”
The reason these specific events are good targets for Condense Reality is due to the technological restrictions it is currently facing, explains Fellingham: “We can only capture events of a certain size. At the moment we’re working towards this boxing ring size capture area. It’s really difficult to capture larger areas and stream them in real time. There are systems that can do very big areas, but they require a lot of time and an awful lot of processing to get the footage out.”
“The difficulty with this kind of content is it’s always going to be quite large; it’s larger than a 4K video”
Sports suited to this boxing ring-sized capture area include table tennis, sumo wrestling and fencing, all of which are also – crucially, according to Fellingham – dynamic.
He explains: “You want things that are dynamic. You want the kind of sport that if you’re watching, you’re moving your head around to try and get that slightly different angle; if you watch people watching boxing, they’re constantly shifting from left to right [to get a better view]. With our content placed on the table [as a hologram], if you’re a giant [in comparison], you can shift left to right much easier because you can change your entire perspective by moving your head.”
5G the secret sauce for making this a reality
But is this technology actually working right now? Fellingham says yes: “Our prototype, as of today, can capture two human beings and stream them in the format [of a tabletop sized hologram viewed using a smartphone].”
However, he adds: “The difficulty with this kind of content is it’s always going to be quite large; it’s larger than a 4K video. So the reason that 5G is important for this is that firstly, you can get more data over the network, and secondly, the phones themselves actually need to render the content, almost like we’re viewing a video game. Video games stress out the phone. It’s hard work for them; the battery heats up and gets used up. What’s nice about this project [with BT], and what’s nice about 5G, is [5G] has this unique property which allows us to shift the hard rendering jobs to the 5G masts, essentially the towers, and that way the phone can just be the portal to view the content. All of the hard work shifts over to the Edge.”
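Fellingham’s point is a split-rendering architecture: when 5G edge compute is available, the heavy volumetric rendering happens at the mast and the phone receives an ordinary 2D frame for its current viewpoint; otherwise the device does the work itself. A purely illustrative sketch of that decision, with all numbers and names invented:

```python
# Illustrative edge-offload split: render the volumetric scene at the 5G edge
# and send the phone a small 2D frame, or fall back to on-device rendering.
# The 100:1 payload ratio is an invented placeholder, not a measured figure.

def render_on_edge(volumetric_bytes: int) -> int:
    """Edge server rasterises the 3D scene for one viewpoint; returns the
    size of the resulting 2D frame, far smaller than the full volumetric payload."""
    return volumetric_bytes // 100

def frame_for_phone(volumetric_bytes: int, edge_available: bool) -> tuple[int, str]:
    """Return (payload_bytes, where_rendered) for one displayed frame."""
    if edge_available:
        return render_on_edge(volumetric_bytes), "edge"
    # Fallback: ship the whole volumetric frame and render on the device,
    # at the cost of bandwidth, battery and heat.
    return volumetric_bytes, "device"

if __name__ == "__main__":
    print(frame_for_phone(10_000_000, edge_available=True))   # prints "(100000, 'edge')"
    print(frame_for_phone(10_000_000, edge_available=False))  # prints "(10000000, 'device')"
```

The sketch shows why the article calls the phone “just the portal”: with the edge path, per-frame payload and device workload both shrink dramatically, which is what makes phone battery life and thermals workable.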
Fellingham notes that this content is not only consumable via a smartphone. “Spatial computing is the best way to view this content. Spatial computing is a catch-all term for interfaces which allow you to shift your location, so mixed reality, cross reality; VR is a spatial computing device, and phones are now spatial computing devices because of AR. Apple has invested enormous amounts of money into AR on phones, and the reason they’re doing that is because they are working on AR headsets. Our plan is to continue to develop our technology so that when Apple releases its headset and everyone buys it because it’s Apple, we can say, ‘what are you going to watch on it?’ and then, ‘tadaar!’.”
Fellingham says: “Our technology aims to bridge real and virtual worlds by enabling broadcasters to record live events as volumetric video and instantly stream them to viewers. Our initial focus has been on recording sports, in particular boxing, so to be working on this project with BT, one of the biggest boxing broadcasters in the world, is a huge opportunity.
“At a time when many sporting events cannot be viewed in stadiums, enhancing the communal viewing experience in the comfort of your own home is more timely than ever,” continues Fellingham. “We see a bright future for volumetric video, with modern augmented and VR technology providing consumers with the necessary hardware to enjoy volumetric content to its fullest. This hardware continues to improve at a staggering rate, and the mooted launch of an Apple AR headset in 2022 could see adoption explode.”
This also has implications for esports, says Fellingham. The technology can enable viewers to watch gameplay from above, seeing the entire virtual course with the esports athletes rendered as 3D assets, so the viewer is able to physically move around the table to get any perspective they desire. Even nature can benefit, Fellingham says; natural history documentary makers could use this to capture activity at a watering hole or elsewhere for an amazing viewing experience.
In five years’ time, Fellingham says, the vision is for self-configuring, 5G-connected cameras on sticks, placed around an event, which begin filming automatically and output the content as a 3D model that can be viewed in any way possible.
He adds: “I actually expect that eventually the production crew won’t need to be on site. The camera person could control the position of the camera using an Xbox controller, and they wouldn’t be moving a real world camera, they’d be moving a virtual camera because we would have captured the whole scene [before] and we can then decide where we want our viewpoint.”
“I see this as being something that will be expected by viewers,” he notes.
Rapidly growing company
Additionally, Condense Reality has just announced that it has raised a seed round of more than £800,000. The investment will see Condense Reality increase its R&D capabilities and commercialise its technology over the next 12 months, and it is currently in the process of taking on seven new employees, which will more than double the headcount to 12.
The founding team consists of four members. Fellingham has several years’ experience leading technical teams as an engineer and product owner. With expertise ranging from machine learning to UX design and sales, he has a passion for immersive technology, the skills required to build it, and the commercial insight to sell it.
The CTO of the company, Dan Fairs, is a veteran of multiple start-ups. Fairs was CTO and cofounder of SecondSync, which was acquired by Twitter in 2014; at Twitter, Fairs went on to lead the backend team that delivered live streaming video globally. He is an experienced software architect and team builder, used to working at scale, and brings two decades of experience in software, start-ups and scale-ups.
Chief scientific officer Dr Ollie Feroze has expertise in applied machine learning and artificial intelligence, and a PhD in computer vision. Feroze has led research in industrial applications ranging from drone navigation to medical image analysis.
Meanwhile, director of operations Andy Littledale is a highly experienced entrepreneur with 20 years’ industry experience, including a high-profile exit as CEO of SecondSync when it was acquired by Twitter in 2014. He has previously held innovation roles at the BBC and Twitter, and for several years ran a digital agency specialising in creative prototyping, with clients including Adobe, HP, the BBC and Aardman.
Watch this space for more on both Condense Reality and 5G Edge-XR. The future is coming…