SVGE Analysis: Sony promises unified IP infrastructure for live production

Nicolas Moreau: Sony’s IP Live System Manager will be central to its vision of live IP production

“Live production is the last part of the industry that hasn’t moved to IP.” Although there are news and studio set-ups that do use file-based infrastructure, they tend to be more constrained than recorded programmes, according to Nicolas Moreau, Product Marketing Manager, IP Live Production & Workflows, speaking at a recent Sony Technology Day held at Pinewood Studios.

He sees “a huge advantage to going for an IP-based solution, as you can manage your resources way more easily; you can share the resources and you can expand the scale of your production system much more easily.”

One of the problems with live production not yet using IP, while post production already does, is that broadcast infrastructure costs are duplicated by running real-time and file-based systems in parallel. While he admits there are good reasons for sticking with SDI, such as its constant latency, constant bitrate and seamless switching, it ties broadcasters to a dedicated router and format, which makes the system difficult to expand.

However, broadcasters are now looking for scalable systems. “They don’t want to have to buy an SDI matrix and be stuck with a certain amount of inputs and outputs anymore. They want to be able to re-shape their system as much as possible, and make sure they are future proof,” to simplify the move from HD to Ultra HD, and beyond.

That should be a lot easier with IP, which promises to reduce costs through the use of commercial off-the-shelf (COTS) hardware, allowing easy expansion using low-cost IP switches. An IP system should also be format agnostic and fully scalable, built on a common platform that makes it simpler to share resources between multiple studios and remote productions, and that streamlines set-up, maintenance, device management and status monitoring.

However, until recently there was no guaranteed latency, leading to synchronisation issues. “But, nowadays, we have several standards and tools that we can use,” said Moreau. “But, for a live sports broadcaster, I don’t see them being OK with a latency of more than five milliseconds.”

Systems using COTS hardware should “minimise the overall cost of an IP production and infrastructure,” and “we want to avoid, as much as possible, our partners having to be locked in with a specific vendor,” which means also ensuring full interoperability between all the products.

Reducing CapEx and OpEx

“In the past, when you wanted to do some maintenance on the camera, you had to be in front of the camera,” but as every device will be connected via IP, anything that needs maintenance will be accessible over the network, so the service centre can be located far from the equipment.

Operating expenses can be reduced even further thanks to new workflows, such as remote production. “You don’t have to send all that crew anymore,” as fewer people are needed to cover an event. If all the camera feeds can be brought back from the location, then mixing, graphics, replays and the rest can be done back at base.

A key reason for outside broadcasts to switch to IP, at least for UHD production, will be the opportunity to reduce the number of cables, from four 3G-SDI cables per device to just a single fibre or copper cable, significantly reducing the weight and complexity of cabling.

Joint Taskforce

Based on the requirements list for standards and interoperability put forward by the EBU, SMPTE and Video Services Forum Joint Taskforce on Networked Media, Sony has developed two core technologies: an LSI chipset and an IP Core running on FPGAs; and its Network Manager.

“The idea was to aggregate all the tools, all the standards, all the methods, the techniques, that are needed to do proper IP live production, and to make that available to the industry either through offering an LSI chip that manufacturers can integrate into their system, or offering an IP Core that can run on a reprogrammable FPGA,” said Moreau.

Sony intends to adopt this technology on all of its professional products for live production and, through a third-party alliance, work with other manufacturers (such as graphics vendors) to ensure it is also available in a very wide range of other products.

The general architecture of the system, as proposed by the Joint Taskforce, is made up of three layers: the device layer (from cameras to switchers and monitors, connected together using IP switches rather than SDI routers); the service layer (tools managing AV streams, devices and network resources); and the application layer (AV stream router, remote control, multi-viewer and so on, which allow the operator to interact with the system).
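As a rough illustration of that layering, the responsibilities might be grouped as in the sketch below; the component names are examples for illustration only, not a definitive map of Sony’s products.

```python
# Illustrative grouping of the three-layer architecture described above.
# The component names are examples, not an exhaustive or official list.
ARCHITECTURE = {
    "device layer": [
        "camera", "production switcher", "monitor",
        "IP switch (in place of an SDI router)",
    ],
    "service layer": [
        "AV stream management", "device management", "network resource management",
    ],
    "application layer": [
        "AV stream router panel", "remote control", "multi-viewer",
    ],
}

def layer_of(component: str) -> str:
    """Return the layer a named component belongs to, or 'unknown'."""
    for layer, components in ARCHITECTURE.items():
        if component in components:
            return layer
    return "unknown"

print(layer_of("multi-viewer"))  # -> application layer
```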

To meet these requirements, Sony designed its Networked Media Interface Framework, which comprises 12 systems:

– Video/audio mapping, covering how to map SDI signals into a packetised signal. This could use ST 2022-6 (although this only applies to SD and HD and is not essence independent), AES67, Sony’s Essence Independent Mapping, or others;

– Compression, which will be necessary for UHD, as most networks run at 10Gb while uncompressed UHD requires around 12Gb (a rough calculation is sketched after this list);

– Synchronisation (ST 2059, PTP);

– High Availability (ST 2022-7, etc);

– Clean Video Switching (destination timed switching);

– Device Management Workgroups, managing production settings by device configuration list;

– Device Management Control – Sony’s networked device control protocol (and, in future, ST 2071);

– Device Management Device Recovery (Session Initiation Protocol architecture);

– Security for live production (TLS);

– Network Management Switch Configuration (VLAN, Netconf, SDN system design tool);

– Network Management Bandwidth Reservation (DiffServ, RFC 2474);

– RESTful API and plug-in SDK – the control API for third-party applications, allowing new devices to be added and controlled.
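To put rough numbers on the compression point above, here is a minimal back-of-the-envelope calculation, assuming 3840×2160 at 50 frames per second, 4:2:2 sampling and 10 bits per sample (a common UHD production format); the figures are indicative only.

```python
# Back-of-the-envelope bitrate for uncompressed UHD video, assuming
# 3840x2160 at 50 frames/s, 4:2:2 chroma subsampling and 10 bits per sample.
width, height, fps = 3840, 2160, 50
samples_per_pixel = 2      # 4:2:2 -> one luma plus (on average) one chroma sample
bits_per_sample = 10

active_video_gbps = width * height * fps * samples_per_pixel * bits_per_sample / 1e9
print(f"Active video only: ~{active_video_gbps:.1f} Gbps")   # ~8.3 Gbps

# Carried as quad-link 3G-SDI, the full data rate including blanking is
# roughly 4 x 2.97 Gbps, close to 12 Gbps, which is why a single 10Gb
# Ethernet link is not enough without compression.
print("Quad-link 3G-SDI equivalent: ~12 Gbps")
```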

“Most of the industry right now is very focused on this audio/video mapping, which is very important, but it’s only a tiny bit of what it means to do IP live production,” and all 12 aspects need to be taken care of before any switch to IP, he said. The synchronisation of equipment is equally important, but it needs a very accurate timing protocol spread throughout the network. “There are some standards that have been defined (such as SMPTE ST 2059) to make sure that all devices get the very same time.”
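For context on how that common time is derived: PTP (IEEE 1588, profiled for broadcast by ST 2059) has each device exchange timestamped messages with a grandmaster clock and compute its offset from four timestamps. The sketch below shows the generic two-step calculation; it is the textbook formula, not Sony-specific code, and it assumes a symmetric network path.

```python
def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """Classic IEEE 1588 offset/delay calculation.

    t1: Sync message sent by the master (master clock time)
    t2: Sync message received by the slave (slave clock time)
    t3: Delay_Req sent by the slave (slave clock time)
    t4: Delay_Req received by the master (master clock time)

    Assumes the network path delay is the same in both directions.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way mean path delay
    return offset, delay

# Example: a slave clock running 1.5 us ahead over a 10 us network path.
offset, delay = ptp_offset_and_delay(t1=0.0, t2=11.5e-6, t3=20.0e-6, t4=28.5e-6)
print(f"offset = {offset * 1e6:.1f} us, delay = {delay * 1e6:.1f} us")
```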

Ultimately, for live production, the system will have to be able to recognise a new device as it is added, assign an IP address, and make sure it is visible on the management systems — all of which is taken care of by the NMI system. However, if you are going to be able to add a device, it is essential to make sure the network has the capacity available, “so you need to have a network management bandwidth controller.”
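That bandwidth check amounts to simple admission control: before a new device’s stream is allowed onto the network, the controller verifies that the link it will use has enough unreserved capacity. The sketch below is a toy model to illustrate the idea; the class and the figures are hypothetical, not Sony’s LSM or NMI API.

```python
class LinkBandwidthController:
    """Toy admission-control model: reserve capacity per link before
    allowing a new stream onto the network. Illustrative only."""

    def __init__(self, capacity_gbps: float):
        self.capacity_gbps = capacity_gbps
        self.reserved_gbps = 0.0

    def can_admit(self, stream_gbps: float) -> bool:
        return self.reserved_gbps + stream_gbps <= self.capacity_gbps

    def admit(self, stream_gbps: float) -> bool:
        if not self.can_admit(stream_gbps):
            return False            # refuse: the link would be oversubscribed
        self.reserved_gbps += stream_gbps
        return True

link = LinkBandwidthController(capacity_gbps=10.0)
print(link.admit(3.0))   # True  - a 3 Gbps stream fits
print(link.admit(3.0))   # True
print(link.admit(3.0))   # True
print(link.admit(3.0))   # False - a fourth 3 Gbps stream would exceed 10 Gbps
```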

If you want to handle both recorded and live programmes, “you want to make sure your network controller is going to give the priority to the live streams and lower priority to offline content and filing.” In Sony’s system, this is taken care of by the IP Live System Manager (LSM), which interacts with third-party systems via the API layer, and takes care of security, bandwidth reservation, access control, and quality of service.
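One common way to give live streams priority over offline filing is DiffServ marking, the RFC 2474 mechanism listed earlier: live media packets carry a high-priority code point such as EF, while file transfers stay in the best-effort class, so switches can queue them differently. The sketch below shows the general technique on a Linux-style socket API; the addresses and port are illustrative, and this is not a description of how the LSM itself applies QoS.

```python
import socket

# DiffServ code points (RFC 2474 / RFC 3246)
DSCP_EF = 46            # Expedited Forwarding: low-latency live media
DSCP_BEST_EFFORT = 0    # default class for file transfers / offline content

def make_marked_udp_socket(dscp: int) -> socket.socket:
    """Create a UDP socket whose outgoing packets carry the given DSCP.

    The DSCP occupies the top six bits of the IP TOS byte, hence the
    shift by two. Switches can then map EF-marked live traffic into a
    higher-priority queue than best-effort file traffic.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)
    return sock

live_sock = make_marked_udp_socket(DSCP_EF)              # camera/mixer feeds
filing_sock = make_marked_udp_socket(DSCP_BEST_EFFORT)   # offline file copies
live_sock.sendto(b"video packet", ("239.1.1.1", 5004))   # illustrative multicast group/port
```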

Another crucial aspect is the ability to switch feeds or streams cleanly, without jitter between frames. There are three techniques in use, each with its own problems. Sony prefers the Destination Timed Switch, which switches at the receiving device, but this does mean that for a very short time both streams are connected to the same IP switch, which requires double the bandwidth. A Source Timed Switch avoids this, but requires very accurate sync control and is less easily scalable, whereas a Switch Timed Switch relies on non-standard, vendor-dependent switch behaviour and may have problems delivering clean switching, particularly on larger networks.
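In practice, a destination-timed switch at the receiver follows a simple sequence: join the new source’s multicast group while still receiving the old one, swap streams on a frame boundary, then leave the old group; the brief overlap is where the double bandwidth comes from. The sketch below illustrates that sequence with standard IGMP join/leave calls; the group addresses, port and frame-boundary hook are illustrative assumptions.

```python
import socket
import struct

def join_group(sock: socket.socket, group: str) -> None:
    """IGMP join: ask the network to deliver this multicast group."""
    mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

def leave_group(sock: socket.socket, group: str) -> None:
    """IGMP leave: stop receiving the group, freeing its bandwidth."""
    mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_DROP_MEMBERSHIP, mreq)

def destination_timed_switch(sock, old_group, new_group, wait_for_frame_boundary):
    """Clean switch at the receiver: overlap both streams briefly,
    swap on a frame boundary, then drop the old stream."""
    join_group(sock, new_group)    # both streams now arrive: ~2x bandwidth
    wait_for_frame_boundary()      # e.g. derived from PTP/ST 2059 time
    # ...start taking video from new_group's packets here...
    leave_group(sock, old_group)   # back to a single stream

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 5004))              # illustrative RTP port
join_group(sock, "239.1.1.1")      # initial source
```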

One looming problem is the fact that Sony, like many companies in the industry, uses its own software-defined network (SDN) and “at the moment, there is absolutely no way to have several SDNs to communicate with each other on the very same network,” he said. There is work being done on getting round this issue, such as the OpenFlow protocol, but it isn’t sufficient yet, and he feels that broadcasters will need to “plan very strongly” to avoid these incompatibilities.

Although it is possible to install a live IP production system now, some components people might want to use are not yet IP enabled. The biggest issue is probably the gateway between audio and video. “There are many protocols and standards in the audio world. Sony has chosen first to implement the Dante protocol and AES67, but there are still other protocols, like Ravenna, which don’t talk together and are not very well connected to the video world.”

Proof of concept

Sony currently has 42 partner companies in the Networked Media Interface Alliance, including: Abekas, Avid/Orad, ChyronHego, Evertz, EVS, Harmonic, Imagine Communications and Snell Advanced Media. These are all developing products that will be compatible with NMI, some of which are already available. Altera, Macnica and Xilinx are also part of the Alliance, as FPGA builders, making the chips available to any manufacturer who wants to use the Networked Media Interface.

Sony’s goal is to enable complete IP live production, using both its LSM-based equipment and third-party systems, such as new PCIe cards from Advantec, Macnica and Matrox. Sony is also launching various IP-based models, including the new NXL-FR318 SDI-to-IP and IP-to-SDI gateway (to allow continued use of SDI cameras or monitors) and its latest “world first” 4K XVS-8000 hybrid SDI/IP production switcher. Moreau hopes it will soon IP-enable a wider range of equipment, such as mid- or lower-range cameras and monitors.

It has also worked on several proof of concept tests, including one for Brazil’s Globo TV and another for football production.

It has also done two 4K/HD tests: one in collaboration with Imagine, using LSM/Magellan integration, with a Cisco switch; the other with Evertz, with Magnum and LSM hybrid router control, using an Evertz 3080IPX switch and ASPEN audio/video mapping.

The OB market is likely to be the first mover for IP live production, and Sony is hoping to be involved with several new OB trucks in Europe this year, including one it is bidding for that is wanted for delivery by June. Moreau mentioned that Sony is also in talks about IP-based studios, possibly by the end of the year, although those projects have no firm deadlines.