Football Summit 2019: Using AI and automation to improve storytelling in production

A session at SVG Europe’s Football Summit in Paris explored different use cases for artificial intelligence (AI) in football TV storytelling, including VAR at the high end and automated production for lower-league coverage.

AI panel: (L/R) Moderator Martin Stanford with Olivier Barnich, EVS; Gal Oz, Pixellot; and Michel Bais, Mobile Viewpoint

Can AI help rights holders deliver more (and better) live content to an increasingly diverse and connected audience? How much can AI and machine learning aid high-end production? How much can it enable lower-end production – and can it create a new market for tier 3, 4 and 5 football, which has never been able to televise matches in a commercially viable way?

How do we make machines really work for us? And will we always need to have a highly skilled human or two around, to supervise the machines?

Broadcaster and event host Martin Stanford chaired the session, involving Michel Bais, CEO of Mobile Viewpoint; Olivier Barnich, EVS engineering manager; and Gal Oz, CTO and co-founder of Pixellot. UEFA transmission specialist Nicolas Deal was due to join the session, but was unfortunately unable to travel to Paris due to illness.

Stanford asked Olivier Barnich how EVS is engaging AI to help professionals in football production to improve their coverage. “For a few years now we’ve been helping directors and operators to do their storytelling for live production. We’ve worked on automatic camera selection. The idea is to train artificial intelligence to choose the most relevant shot, depending on what’s happening on the pitch – in real time,” he said.

“If you have say 20 cameras, maybe you want to only show a subset for the director to choose, or to suggest camera angles. With AI we can know whether an image is interesting to show on screen, in order to make a decision,” said Barnich. “In order to make no compromise with image quality we use wide angle cameras and stitching to get a full view of the pitch. We analyse this view with AI to know where the action is happening.

“We train the system with footage from human operators. We have multiple robotic cameras at different positions, that replicate the framing done by the human operators. That way we get an image of full quality as we don’t resort to cropping, and we can also see the action from multiple points of view – which is something that viewers are accustomed to. They see the action from multiple points of view around the pitch. You can feed this stream to someone that does the automatic camera selection.”
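A minimal sketch of how such automatic camera selection could work. Here, frame-difference motion energy stands in for the learned “interestingness” model Barnich describes, and a switching margin prevents rapid flicking between angles; the function names and heuristic are illustrative assumptions, not EVS’s actual system:

```python
import numpy as np

def motion_energy(prev, curr):
    """Proxy for 'interestingness': mean absolute pixel change between frames."""
    return float(np.mean(np.abs(curr.astype(int) - prev.astype(int))))

def select_camera(prev_frames, curr_frames, current_cam, switch_margin=5.0):
    """Pick the feed with the highest motion energy, but only switch away
    from the current camera if another feed beats it by switch_margin."""
    scores = [motion_energy(p, c) for p, c in zip(prev_frames, curr_frames)]
    best = int(np.argmax(scores))
    if best != current_cam and scores[best] < scores[current_cam] + switch_margin:
        return current_cam
    return best

# Toy example: three 8x8 greyscale feeds; only feed 2 sees movement.
rng = np.random.default_rng(0)
prev = [rng.integers(0, 256, (8, 8)) for _ in range(3)]
curr = [p.copy() for p in prev]
curr[2] = (curr[2] + 80) % 256  # large change on feed 2
```

In a production system the scoring function would be a trained network, and the training signal is exactly what Barnich outlines: human operators’ framing choices replicated by robotic cameras.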

What about automated production and AI for VAR, wondered Stanford. Is there a role here for AI to help VAR officials make key decisions? “Could we use technology to aid decision-making? We’ve been working on the calibration of the cameras – not a manual set-up. In the context of live production, we’ve been working on improving slow motion using artificial intelligence by generating frames in-between two frames that are acquired by the camera. This is useful when you want to do slow motion and you don’t have a super motion camera.

“When you are doing augmented reality, you are basically working with a 3D model of the game in front of you. If you want precision you have to make this model very complex. The more complex the model, the more work is needed to calibrate the system when you want to do it manually. Using AI ultimately enables a better precision for the offside line.

“Here’s something we’ve been thinking about: currently we are doing VAR using cameras that are dedicated to live production. Camera operators are working to produce images that will lead to enjoyable content for the viewers.

“But they may not produce the most suitable images to make refereeing decisions. In that context it could be helpful to have additional cameras that are looking at the action specifically for refereeing – and it could be made cost effective using automated robotic cameras,” said Barnich.
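The slow-motion technique Barnich mentions generates synthetic frames between two captured ones. The naive baseline below is a linear cross-fade; learned interpolators instead estimate motion, so moving objects stay sharp rather than ghosting. This is an illustrative sketch, not EVS’s method:

```python
import numpy as np

def interpolate_frames(a, b, n_between):
    """Insert n_between linearly blended frames between frames a and b.
    A trained model would predict pixel motion instead of cross-fading."""
    out = []
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)
        out.append(((1 - t) * a + t * b).astype(a.dtype))
    return out

# Toy example: blend between a black and a bright 4x4 frame.
a = np.zeros((4, 4))
b = np.full((4, 4), 100.0)
mid = interpolate_frames(a, b, 3)  # 3 in-between frames: 4x the frame rate
```

Inserting three frames between every captured pair turns, say, 50 fps material into the equivalent of 200 fps for slow-motion playback.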

40,000 hours of automated production every month

Pixellot has made an impact in sports production in recent years by empowering leagues, rights holders and media companies to affordably capture, produce and distribute games while increasing monetisation. The Pixellot solution uses computer-vision AI algorithms to dynamically track on-field action, providing viewers with a seamless, TV-like experience that includes automated highlights and graphics, remote/local commentary, scoreboard, and game clock.

The system includes a proprietary camera unit installed on the field. After a short set-up process, it will automatically capture sporting events, stream them over the web, and automatically edit video highlights in real time. Its computer vision algorithms are capable of following football, soccer, basketball, hockey, volleyball, lacrosse, baseball, field hockey and futsal.
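The “virtual camera” idea underlying such fixed-panoramic systems can be sketched as cropping a window from the stitched panorama, centred on the tracked action and clamped to the image bounds. This is a simplified illustration under assumed dimensions; Pixellot’s actual production logic is proprietary:

```python
def virtual_camera(panorama_w, panorama_h, action_x, action_y, zoom=3):
    """Return the (x, y, w, h) crop window of a virtual PTZ camera inside
    a stitched panorama, centred on the tracked action point and clamped
    so the window never leaves the panorama."""
    w, h = panorama_w // zoom, panorama_h // zoom
    x = min(max(action_x - w // 2, 0), panorama_w - w)
    y = min(max(action_y - h // 2, 0), panorama_h - h)
    return x, y, w, h

# Action near the top-left corner of an assumed 7680x2160 panorama:
# the window clamps to the edge instead of going out of frame.
corner = virtual_camera(7680, 2160, 100, 100)
```

Smoothing the window’s motion over time (so the virtual camera pans like a human operator) is where the trained models do the real work.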

“Pixellot uses the concept of taking fixed camera settings with a panoramic view which is stitched together,” said Gal Oz. “We’ve been doing it for the last 4-5 years and today we generate more than 40,000 hours of production per month. So possibly we are the biggest platform for generating [automated] sport content.

“Will it replace cameramen? Out of the 40,000 hours we produce, almost none were produced before [the availability of AI]. So the most important thing is that we are democratising sport; we’re generating much more content. The channels are not the issue any more, with all the new OTT platforms available. Generating high quality content is what we do.

“The most important thing is that we are democratising sport; we’re generating much more content” – Gal Oz, Pixellot

“What we say is that the traditional broadcasters are broadcasting famous people. What we are doing is broadcasting the important people. If it’s a town in Germany with 20,000 people with a team in the fifth league, then you have maybe 10,000 fans. They want to see the game in the same way as Bayern Munich is covered. So yes, there is a market for this.

“There are more leagues being covered around the world – not the premier leagues. We think this market is 100 times bigger than the traditional market. If you have 40,000 hours of production every month and you aggregate the eyeballs, then it all adds up.

“Our solution is end to end,” said Oz. “We start with the cameras and processing of course, but we build a full platform to serve our partners. Usually we don’t have fibre connection from the grounds, so we need to rely on the public internet. We’re talking about enrichment with graphics and monetisation tools, at low cost. The monetisation tools are part of the solution – even the automated replay and the OTT platform distribution,” he said.

Football and ice hockey coverage and player tracking with AI

The innovation team at Ajax FC has recently partnered with Hikvision, Mobile Viewpoint and TNO for a technology collaboration. Dutch research foundation TNO and Chinese video surveillance company Hikvision have developed an ecosystem for live sports production and analysis using new technologies to automate the capture of live sports, based on AI algorithms.

Mobile Viewpoint has contributed by productising part of this ecosystem, making it ready for broadcasters and editors to use in online streaming and live television. Its IQ Sports Producer is an AI-driven production suite that supports the capture and delivery of live sports streaming from any location.

The combination of Hikvision’s four-lens camera with Mobile Viewpoint’s AI technology captures matches by tracking players, balls or other objects and live streams the resulting content – removing the need for production facilities, camera crews or directors.

“Ice hockey games look very good on a 180-degree camera, so you can have one camera covering the game,” said Michel Bais. “Watching ice hockey with this panoramic view is not bad at all. The middle camera has auto-tracking, and you have to build in a set of basic rules about the sport – the celebration moments and the fighting. You have to train your AI so that you capture those moments.

“What we try to do, with a single camera, is to automate the recording of the game. With our system it’s a four times 4K camera that we stitch together into one 180 degree image. With football you can focus on the game: with ice hockey we focus on a group of players, as the puck is too small and moves too fast.

“The success rate is about 90-95%. Today the AI is assisted. If you want full automation you need a little patience, we think. In our case the automatic switching between cameras is basic AI. But for a lower level of matches it’s a very interesting option.

“Ajax TV wanted the solution we’re talking about. It’s used more for player analytics, so the camera can feed multiple streams. This is the Under 17s, which is shown on YouTube. And you can give your sponsors more visibility as well.

“The key with all AI is training, training, training,” Bais told the Football Summit audience in Paris. “Then AI really starts to learn what to do. You still need someone to watch the game, and to feed back to the AI that the wrong shot was chosen. You need people to annotate games, and then it evolves. It’s ongoing.”
