Sports Content Management: Inside AI and machine-learning developments

Despite headline-making applications, the technology is just getting started

Artificial intelligence (AI) and machine learning are among the industry’s hottest topics, and, for those involved in storing content, the technology’s potential has led to some sexy projects. Automated cutting of highlights, pattern and facial recognition to automate logging, and plenty of other applications have made headlines. But questions remain, and the work is still very much in its infancy.

“There are ways to wade into this and understand it, but you have to come armed with a purpose,” advised Mike Arthur, SVP, sports and licensing, Veritone, during a panel discussion on the topic at the recent SVG US 2019 Sports Content Management Forum at the Westin New York in Times Square. “You need to start there and then decide where to move on.”

Veritone’s operating system for AI aggregates cognitive engines, such as for transcript production and facial and object recognition, from providers like Microsoft, IBM, and others. “We will put three or four engines against a project to get the accuracy as high as possible,” Arthur said.
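The multi-engine approach Arthur describes can be pictured as a simple vote: run several cognitive engines against the same content and keep only the labels that enough of them agree on. This is an illustrative sketch, not Veritone's actual implementation; the engine outputs below are invented.

```python
from collections import Counter

def combine_engine_results(engine_labels, min_votes=2):
    """Keep only the labels that at least `min_votes` engines agree on."""
    votes = Counter(label for labels in engine_labels for label in set(labels))
    return {label for label, count in votes.items() if count >= min_votes}

# Invented outputs from three recognition engines run against the same frame
engine_labels = [
    {"goal", "player", "crowd"},   # engine A
    {"goal", "player"},            # engine B
    {"goal", "referee", "crowd"},  # engine C
]

print(sorted(combine_engine_results(engine_labels)))  # ['crowd', 'goal', 'player']
```

Raising `min_votes` trades recall for precision, which is the basic accuracy lever when stacking engines this way.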

He said defining a purpose begins with figuring out the priorities: are they operational, technical, or monetisation?

“Ultimately, it has to be something that is consumable at an operational level first,” he explained. “And, in a lot of those cases, there may not be monetisation. So you start on the frontend to support pain points.”

Ethan Dreilinger, client solutions engineer, IBM Watson Media, opined that AI and machine learning vendors need to start from the ground up with a client and understand the client’s objectives.

“You can deploy [AI or machine learning] to save money, but, if no one uses it, you can’t monetise it,” he said. “Know your objectives and then build things like search and discovery on top of that. Automated workflows for tagging are a matter of understanding and building a solution that works.”

Xena Ugrinsky, partner, applied data science, Genre-X Consulting, said that one challenge in building a business case is that those who get involved often don’t understand the technology. “A CFO who signs off on a project and doesn’t understand the premise or investment is not going to make an informed decision.”

Scott Bounds, media industry lead, Microsoft, noted that organisations like Zone.TV and Virgin Media are bringing AI to video clips tagged with metadata and putting together linear channels of clips based on things trending on social media or personal preferences. Metadata tagging will open up new opportunities to bring personalization to media-asset–management systems.
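The trending-driven channel assembly Bounds describes can be reduced to a toy example: score each tagged clip against currently trending topics, then fill a time budget with the best matches. The clip metadata and scoring rule here are hypothetical, not how Zone.TV or Virgin Media actually build channels.

```python
def build_channel(clips, trending, max_minutes=30):
    """Rank clips by overlap with trending topics and fill a time budget."""
    scored = sorted(clips, key=lambda c: -len(set(c["tags"]) & set(trending)))
    playlist, total = [], 0
    for clip in scored:
        relevant = set(clip["tags"]) & set(trending)
        if relevant and total + clip["minutes"] <= max_minutes:
            playlist.append(clip["id"])
            total += clip["minutes"]
    return playlist

# Hypothetical clip metadata and trending topics
clips = [
    {"id": "c1", "tags": ["goal", "messi"], "minutes": 5},
    {"id": "c2", "tags": ["red-card"], "minutes": 10},
    {"id": "c3", "tags": ["goal"], "minutes": 20},
]

print(build_channel(clips, trending=["goal", "messi"]))  # ['c1', 'c3']
```

The same scoring could be swapped for personal preferences instead of social trends, which is the personalisation opportunity Bounds points to.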

“AI is a tool to make things more efficient, not to replace someone but instead to make turnaround time so much faster,” he explained. The production team, for example, can much more quickly turn around an on-air promo. And there are things like going into an archive of video clips and automating the process of changing out thousands of old logos for new ones.

Dreilinger pointed to the Fox Sports Highlight Machine, which was used for the 2018 FIFA Men’s World Cup. Fans could search for specific play types (goals, penalty kicks, red cards) from any past World Cup game and instantly bring up all of the appropriately tagged clips. This year, Fox Sports wanted it tied into the broadcast, and IBM went below the surface to find nuggets of video that could be tied together to tell a story.

“Tagging video at the front-end allows people to generate their own highlights,” he pointed out. “It gave the producers and analysts a chance to look at things like how a player going down would change the way the team played.”
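At its core, a Highlight Machine-style search is a filter over well-tagged clips. A minimal sketch, with invented clip metadata and field names:

```python
def search_highlights(clips, play_type=None, year=None, player=None):
    """Return the ids of clips that match every supplied criterion."""
    def matches(c):
        return ((play_type is None or c["play_type"] == play_type)
                and (year is None or c["year"] == year)
                and (player is None or player in c["players"]))
    return [c["id"] for c in clips if matches(c)]

# Invented clip metadata
clips = [
    {"id": "h1", "play_type": "goal", "year": 2018, "players": ["Mbappé"]},
    {"id": "h2", "play_type": "penalty", "year": 2018, "players": ["Kane"]},
    {"id": "h3", "play_type": "goal", "year": 2014, "players": ["Götze"]},
]

print(search_highlights(clips, play_type="goal"))  # ['h1', 'h3']
```

All of the intelligence lives in the tagging done up front; once the metadata exists, fan-facing search is straightforward.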

Applied data science is being used in new ways every day. Ugrinsky mentioned how a firm came up with an algorithm that could figure out whether an NFL player was being overpaid.

“It’s a seismic shift,” she explained of the ability to make real data-driven decisions that base a player’s value on revenue generation rather than just popularity.
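Ugrinsky did not describe the firm's algorithm, but one simple way to frame the idea is to compare each player's salary against the revenue attributed to him and flag outliers relative to the league median. The figures and threshold below are purely illustrative.

```python
def overpay_flags(players, threshold=1.5):
    """Flag players whose salary-per-revenue ratio exceeds `threshold` x the median."""
    ratios = {p["name"]: p["salary"] / p["revenue"] for p in players}
    median = sorted(ratios.values())[len(ratios) // 2]
    return [name for name, ratio in ratios.items() if ratio > threshold * median]

# Purely illustrative figures (salary and attributed revenue, in $M)
players = [
    {"name": "QB A", "salary": 10, "revenue": 10},
    {"name": "WR B", "salary": 30, "revenue": 10},
    {"name": "RB C", "salary": 12, "revenue": 10},
]

print(overpay_flags(players))  # ['WR B']
```

The hard part in practice is attributing revenue to an individual player, which is where the data science actually lives.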

Plenty of new opportunities are on the horizon. Applying AI and machine learning to tasks like equipment maintenance can accelerate the move to equipment that doesn’t need constant monitoring, with automated failover protection.

“Predictive maintenance automation could transform everything you’re doing in five years,” said Ugrinsky.

Chris Witmayer, director, broadcast, production, and new media technology, NASCAR Productions, said that open-source tools can be useful in finding the best ways to provide such functions as object recognition, transcription services, and language translation. “To scale it out,” he explained, “we looked to cloud providers like Microsoft and AWS as we build our own specific models.”

One reason was the need to customise the objects being looked for: many AI tools, for example, cannot easily recognise the face of a race-car driver. In February, look for a new system to be unveiled that will have driver faces, sponsor logos, and car numbers fed back in.

“I can tell you that the machine is more efficient than a human, which doesn’t lead to many friendships,” Witmayer said. “But the machines are very robust.”

NASCAR is also looking to go directly to the cloud.

“When it comes to churning video,” he said, “the Mac mini is efficient, but it is nowhere near as scalable, so we’re looking at the cloud. And there are other services, like translation and transcription, where there are companies that can do that. So we’re going to move all of this to the cloud.”

The end result of moving to the cloud might be smaller production facilities.

“When you look at the amount of content that we acquire, about 300 hours of content per week,” Witmayer explained, “we use only about 1% of it, and then it is archived. When you think about these deep archiving solutions, I don’t know if you need to have much infrastructure other than for serious editing.”
