It’s complicated: M2A Media asks if sports broadcasters are ready to move acquisition and distribution workflows into the cloud

By Kevin Heavey, M2A Media pre-sales technical architect.

Public cloud and internet acquisition and/or distribution workflows have become commonplace for many sports broadcasters and service providers over the past few years. These cloud-based workflows have helped sports broadcasters exercise rights to sporting events when on-prem or dedicated infrastructure was not available. What started as an ad-hoc resource, flexed on demand when on-prem capacity ran out, has become the primary path for some broadcasters.

Typically, cloud-based workflows have been a huge win for broadcasters and service providers alike: they remove many of the complex capacity planning challenges and, if resources are managed carefully, can work out significantly cheaper than their on-prem counterparts.

That said, it is widely assumed that cloud workflows come with the caveat that they are great so long as you don’t need to perform complicated transformations to your acquired content.

Hybrid on-prem/cloud workflows have been adopted by some to circumvent the cloud’s shortcomings in audio commentary, dynamic graphics, frame rate conversion, live capture, and routing. These hybrid workflows are vulnerable to hardware availability, on-prem capacity quotas and, more recently, supply chain issues. Another issue is the added latency and the extra network hops that hybrid workflows introduce into the overall video pipeline. While this is fine for some broadcast use cases, sports broadcasters typically want low latency for their viewers.

It’s been hard for the cloud to compete with the symphony of tangible cables, boxes, and control panels that operators have relied on for many years to deliver live events, but it’s now possible to do many of the complicated stream transformations in the cloud. A new breed of virtual systems integrator has turned the intangible tangible, enabling complex cloud workflows that address live graphics, frame rate conversion and virtual routing from a single web console.

A virtual flood

If we start with dynamic graphics, this was once the sole purview of dedicated GPU-laden servers running reassuringly expensive software. This approach is still the only way to put a weather reporter into a virtually flooded news studio.

But what if you only need a dynamic scoreboard and perhaps a ticker feed? Some clever engineers worked out that HTML5 can be used to generate beautiful video graphic overlays with no GPU and very little compute. These graphics applications can be consumed as a SaaS offering: you simply paste an output URL into your cloud or source encoder and control the application from a separate control URL, accessible from anywhere over the internet. The data for the graphics can be driven via a web interface or via an external data source such as an XML feed.
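To make the idea concrete, here is a minimal sketch of a data-driven score bug. The field names and markup are hypothetical (each SaaS graphics platform defines its own), but the pattern is the same: the overlay is just HTML rendered over a transparent background, driven by a data record rather than a GPU pipeline.

```python
# Hypothetical sketch: an HTML5 scoreboard overlay driven by external data.
# A browser-based renderer turns this markup into video frames; the encoder
# composites it over the programme feed via the graphics platform's output URL.

def render_scoreboard(data: dict) -> str:
    """Render a score bug as an HTML fragment with a transparent background."""
    return (
        '<div class="scorebug" style="background: transparent">'
        f'<span class="home">{data["home"]} {data["home_score"]}</span>'
        ' &ndash; '
        f'<span class="away">{data["away_score"]} {data["away"]}</span>'
        f'<span class="clock">{data["clock"]}</span>'
        '</div>'
    )

# The data record could come from a web form or an external XML/JSON feed.
html = render_scoreboard(
    {"home": "ARS", "home_score": 2, "away": "CHE", "away_score": 1, "clock": "74:12"}
)
```

Because the renderer only regenerates markup when the data changes, a score update or ticker refresh costs almost nothing in compute, which is why these overlays run comfortably on non-GPU cloud instances.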

Frame rate conversion is another hardware-intensive, CAPEX-heavy on-prem activity. Frame rate converters typically provide baseband inputs and outputs, which presents further resource problems when you need to convert SDI to IP or vice versa. Contrast this with cloud-native frame rate converters that natively ingest an IP source and outgest a TX emission-quality IP output, reducing the need for an additional encoder. Being able to magic a frame rate converter out of thin air in the cloud really is a game changer compared with the complex capacity planning headaches of old.

Remote audio commentary took off during lockdown, when commentators who were unable to travel to studios needed a way to add commentary to sports events from home over a domestic internet connection. Many ingenious hybrid on-prem/cloud workflows were devised to provide remote commentary. This too has been turned into a SaaS offering; some of these services have also rolled in fan engagement, allowing fans to interact with sporting events while broadcasters poll live fan sentiment and sell digital merchandise.

Live capture servers have traditionally added a Russian roulette element to a broadcast engineer’s shift: they are a constant source of pain from failed disks and disks running out of space. This is where the cloud comes into its own; cloud live capture services can record directly to object storage, which typically keeps three copies of the recorded media across multiple data centres and removes the need for constant, time-consuming disk housekeeping. Once the stream has been recorded it’s easy to share the media with other partners or apply transform rules to encode it into other formats.
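Sharing a captured recording with a partner is typically done with a time-limited signed URL rather than by copying the media. The sketch below shows the underlying idea using Python’s standard-library HMAC support; real object stores (for example S3 presigned URLs) follow the same pattern, though the URL format and key names here are hypothetical.

```python
import hashlib
import hmac
import time

SECRET = b"partner-shared-secret"  # hypothetical signing key

def sign_share_url(object_key: str, expires_in: int = 3600) -> str:
    """Build a time-limited signed URL for a captured recording.

    Anyone holding the URL can fetch the media until the expiry time,
    after which the signature no longer validates.
    """
    expires = int(time.time()) + expires_in
    message = f"{object_key}:{expires}".encode()
    sig = hmac.new(SECRET, message, hashlib.sha256).hexdigest()
    return f"https://media.example.com/{object_key}?expires={expires}&sig={sig}"

def verify_share_url(object_key: str, expires: int, sig: str) -> bool:
    """Check the signature and reject links past their expiry time."""
    message = f"{object_key}:{expires}".encode()
    expected = hmac.new(SECRET, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig) and time.time() < expires
```

The appeal for operators is that no credentials change hands: a link can be issued per partner, per event, and simply expires on its own.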

Streamlined management

The introduction of scheduling, routing and switching in the cloud has been a big win for a crop of recent cloud transport stream acquisition and distribution SaaS platforms. It has streamlined the management of multiple concurrent sporting events in the cloud and gives operators a single-page view of many individual live event handoffs. This tangible view of events allows extra taker destinations to be added or removed without disrupting existing running handoffs.

These platforms have also introduced centralised scheduling APIs that allow broadcasters and service providers to perform data-driven scheduling, reducing event setup time and mitigating the configuration errors that can creep in when setup is done manually via a graphical user interface. This ‘operation by exception’ lets operators concentrate on monitoring events rather than performing time-consuming event setup.
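Data-driven scheduling boils down to translating a fixture record straight into an API booking instead of keying it into a GUI. The sketch below shows the shape of that translation; the payload fields are hypothetical, as each SaaS platform defines its own scheduling schema, but the pre/post-roll padding and pre-attached taker list are typical of how such bookings are built.

```python
from datetime import datetime, timedelta

def build_event_schedule(fixture: dict) -> dict:
    """Translate a sports fixture record into a scheduling-API payload.

    Deriving the booking from fixture data keeps setup fast and consistent,
    and removes the manual keying that causes configuration errors.
    """
    kickoff = datetime.fromisoformat(fixture["kickoff"])
    return {
        "event_name": f'{fixture["home"]} v {fixture["away"]}',
        # Book the route with pre/post-roll padding around the match window.
        "start": (kickoff - timedelta(minutes=15)).isoformat(),
        "end": (kickoff + timedelta(hours=2, minutes=15)).isoformat(),
        "source": fixture["source_id"],
        "destinations": fixture["takers"],  # extra takers can be added later
    }

booking = build_event_schedule({
    "home": "Arsenal", "away": "Chelsea",
    "kickoff": "2022-11-05T17:30:00+00:00",
    "source_id": "srt://contribution-1",
    "takers": ["rights-holder-uk", "rights-holder-de"],
})
```

A whole weekend’s fixtures can be booked in one pass over the fixture feed, leaving operators to intervene only when something deviates from plan.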

It’s important to acknowledge the challenges of the first and last mile of cloud acquisition and distribution workflows. Typically, costly contribution encoders and decoders are required to encode and decode live sports events. In recent years lower-cost options have come onto the market: some vendors provide software encoders and decoders that run on readily available off-the-shelf servers and small form factor computers, while others use mobile phone chipsets, offloading the video processing to dedicated video encode and decode silicon. Both options give broadcasters a lower-cost alternative with reduced lead times. Remote control of encode and decode resources across multiple sites is another advantage of these new software- and hardware-based solutions, many of which offer SaaS fleet management functionality.

In summary, 2022 has been a very exciting time for sports broadcasting technology; it feels like we are on the cusp of big shifts in the way workflows are designed and deployed. I think it’s fair to say the cloud is now beating some current on-prem solutions, and sports broadcasters really are ready to move complicated acquisition and distribution workflows into the cloud. I’m excited to see how this trend plays out in 2023.