The long road to Paris: The rise of remote production in broadcasts of the Games
By John Mailhot, senior vice president, product management at Imagine Communications.
With Paris 2024 just around the corner, I can’t help looking back on the 2010 Winter Games in Vancouver, where I had the privilege of working with one of Imagine Communications’ broadcast customers.
In all the excitement, what struck me most was the sheer size and complexity of the International Broadcast Centre (IBC). Broadcasters from all around the world had packed the space with a massive amount of equipment, including multiple control rooms and dozens of full editing suites, and thousands of people were working tirelessly to produce their programmes. It was a spectacle on par with the Games themselves.
As in all televised Games before it, show production in Vancouver was performed on-site, with the finished product being sent back to each broadcaster's home country. The reason for this was simple: telecom (or even satellite) links at the time were more expensive than sending personnel and equipment to the host city.
Over the last 14 years, however, there's been a gradual shift towards a mix of local and remote production, with camera operators and commentators working on-site to cover the Games, and some editing, graphics, and other finishing touches being applied in the studio at home.
This trend has become more pronounced with each biennial event and was accelerated dramatically by the pandemic. During the 2021 Summer Games in Tokyo, there was a significant push to increase remote production, with far more staff and equipment than ever before remaining at home. The following year, the Winter Games in Beijing provided a bellwether of how much production could be performed remotely and which aspects had to be conducted on-site.
Today, as the world gets ready to watch the 2024 Summer Games, most broadcasters have applied these remote production technologies to all manner of sport content and figured out how to achieve the right immediacy and event feel by balancing local and remote production.
So, while the IBC in Paris will no doubt remain an impressive, bustling hub of activity, it will serve more as a point for handing off pool feeds to each broadcaster's on-site team and less as a full production compound than in years past.
Higher bandwidth, lower cost
From a technological standpoint, three key factors have brought the broadcast industry to the point where remote production is suitable even for such high-profile events.
The first is the availability of bandwidth for transmitting signals. Compared to years ago, the amount of carrier bandwidth accessible to broadcasters has increased significantly, while the associated costs have become far more reasonable. Both of these factors are vital, because the bandwidth demands for remote/split production are significantly higher than those of simply forwarding a finished show home.
With on-site production, broadcasters really only need one link back to their home country for the finished channel — at most, two or three paths for redundancy. With a remote or split production process, however, they might be sending back 20 or 30 signals from various cameras, at production quality levels, which requires dozens of links between the sites. The increase in availability and the reduction in the price of bandwidth are key enablers of remote/split production.
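To put that difference in rough numbers, the back-of-envelope sketch below compares the two models. All of the figures are assumptions for illustration: roughly 2.5 Gb/s of active video for an uncompressed 1080p50 10-bit 4:2:2 camera feed, the 8:1 JPEG XS compression discussed later in this article, a finished programme contribution of around 50 Mb/s, and 25 camera feeds as a mid-point of the 20 to 30 mentioned above. Actual rates vary with format and codec settings.

```python
# Back-of-envelope comparison of on-site vs. remote/split production bandwidth.
# All figures are illustrative assumptions, not measurements.

UNCOMPRESSED_CAMERA_GBPS = 2.5   # ~1080p50 10-bit 4:2:2 active video per camera
JPEG_XS_RATIO = 8                # 8:1 compression, as cited later in the article
FINISHED_PROGRAMME_MBPS = 50     # assumed contribution bitrate for one finished show
CAMERA_FEEDS = 25                # mid-point of the 20-30 signals mentioned above

remote_total_gbps = CAMERA_FEEDS * UNCOMPRESSED_CAMERA_GBPS / JPEG_XS_RATIO
onsite_total_gbps = FINISHED_PROGRAMME_MBPS / 1000

print(f"On-site production, one finished feed home: ~{onsite_total_gbps:.2f} Gb/s")
print(f"Remote/split production, {CAMERA_FEEDS} JPEG XS camera feeds: ~{remote_total_gbps:.1f} Gb/s")
print(f"Ratio: roughly {remote_total_gbps / onsite_total_gbps:.0f}x more bandwidth")
```

Even with compression applied to every camera feed, the remote model in this sketch needs on the order of a hundred times more bandwidth than sending home a single finished programme, which is why cheaper and more plentiful carrier capacity had to come first.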
HDR workflows
The 2024 Paris Summer Games are expected to mark a significant milestone in live HDR broadcasting, with several world broadcasters planning to incorporate HDR productions to varying extents. This brings the second factor — well-documented HDR workflows — into play.
Until recently, broadcasters were limited to locally producing live events in HDR, as workflows required careful visual coordination between camera shaders and producers looking at the same monitor.
That has changed over the last few years as broadcasters have developed and documented their HDR workflows across major events, including standardised LUTs for conversion and for shader checking of the standard dynamic range (SDR) output. Today, these standardised workflows are capable of supporting local and mixed/remote production, including the creation of very high-quality SDR work products, a requirement for the all-important legacy distributions.
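As a simple illustration of where a standardised conversion LUT sits in such a workflow, the sketch below applies a 3D LUT to an HDR frame to derive the SDR work product. The file names, the specific LUT, and the use of the open-source colour-science package are my own assumptions for illustration; real productions apply the LUTs agreed for the event inside their conversion and production equipment.

```python
# Minimal sketch: deriving an SDR work product from an HDR frame with a
# standardised conversion LUT. File names here are hypothetical.
import colour  # open-source colour-science package

# Load the agreed HDR-to-SDR conversion LUT (e.g. a .cube file).
hdr_to_sdr_lut = colour.read_LUT("event_hlg_to_sdr.cube")

# Read an HDR frame as floating-point RGB. The frame is assumed to already be
# in the encoding the LUT expects on its input (e.g. HLG values in 0-1).
hdr_frame = colour.read_image("camera_42_hdr_frame.exr")

# Apply the LUT and write out the SDR version that shaders check and that
# feeds the legacy SDR distributions.
sdr_frame = hdr_to_sdr_lut.apply(hdr_frame)
colour.write_image(sdr_frame, "camera_42_sdr_frame.exr")
```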
Reducing latency
Finally, the relatively new JPEG XS codec tackles the issue of latency, which has traditionally been a stumbling block in remote production, especially when it comes to communication between on-site camera operators and technical directors in the studio. With traditional codecs, it may take the director a few seconds or longer to see the result after they’ve asked the camera operator to adjust something, such as zooming in or panning left. This can lead to a frustrating and disjointed process that hinders cohesive team interaction.
When the latency of the signals transmitted between the sites is reduced, the whole team feels as though they are working together more naturally. JPEG XS cuts that latency to the bare minimum while maintaining production picture quality.
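A rough latency budget shows why this matters for the interaction between director and camera operator. The numbers below are assumptions chosen only to make the comparison concrete: a long-GOP contribution codec that buffers on the order of a second and a half for encoding, rate control and decoding, versus JPEG XS operating on a small fraction of a frame, with a 30 ms one-way network path in both cases.

```python
# Illustrative round-trip "ask the operator, see the result" budget (milliseconds).
# All figures are assumptions for the sake of the comparison, not measurements.

NETWORK_ONE_WAY_MS = 30                 # assumed fibre path between venue and home studio
TALKBACK_TO_OPERATOR_MS = NETWORK_ONE_WAY_MS   # director's instruction travelling out
FRAME_MS = 1000 / 50                    # one frame at 50 fps

# Long-GOP contribution codec: multi-frame buffering for encode plus decode.
long_gop_codec_ms = 75 * FRAME_MS       # ~1.5 s of GOP and rate-control buffering
# JPEG XS: line-based coding, a small fraction of a frame for encode plus decode.
jpeg_xs_codec_ms = 0.5 * FRAME_MS       # well under one frame

for name, codec_ms in [("Long-GOP codec", long_gop_codec_ms), ("JPEG XS", jpeg_xs_codec_ms)]:
    round_trip_s = (TALKBACK_TO_OPERATOR_MS + codec_ms + NETWORK_ONE_WAY_MS) / 1000
    print(f"{name}: director sees the adjusted shot after ~{round_trip_s:.2f} s")
```

With the codec's contribution reduced to a fraction of a frame, what remains is essentially the propagation delay between the sites, which is what makes the collaboration feel natural again.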
At Imagine Communications, many of our customers have found that the JPEG XS codec offers the ideal combination of high picture quality and ultra-low latency, with an 8:1 bandwidth saving over uncompressed video, allowing them to achieve the look they want while enjoying the benefits of remote/split production. So, with its support for JPEG XS alongside its complement of UHD and HDR conversion capabilities, our Selenio Network Processor (SNP) has become an integral part of their remote production workflows.
There are more than 3,500 SNP units actively deployed around the globe, handling more than 100,000 channels' worth of video processing, and many of them will be on the ground this summer in Paris. It's going to be a watershed event for remote production, and we are thrilled to be a part of it.