SVGE Analysis: Wireless workflows, content management and quality of service

Robbie Fleming: “We want our workflow to be as efficient as possible”

New workflows, particularly wireless workflows, will be key to gaining maximum efficiency from any move to IP-based production, according to Sony. “We want our workflow to be as efficient as possible, using our resources to the best of their abilities,” Robbie Fleming, Product Marketing Manager (ODA), Sony Professional Solutions Europe, told attendees at a recent Sony Technology Day at Pinewood Studios.

This will involve: working faster, and better; using automation where practical; reducing unnecessary manual operations; minimising duplication of effort; sharing resources; improving quality while keeping incremental costs low; reducing inventory; and responding to business needs more quickly.

This change is centred on an IT infrastructure “very much driven round a main data core” built using normal, off-the-shelf IT equipment. Using virtual machines means it can be scaled more easily, as can connectivity to cloud services for extra capacity.

Content management

“You need content management to be really simple, easy to use, scalable, and also affordable,” he added. Although Sony already had a system, it wasn’t particularly affordable, which is why it has developed Media Navigator, which starts at about €2,500 and scales up to 50 seats. Users with several 50-seat bureaux can interlink them all. “It can network really well together,” he claimed.

Fleming believes that most people’s problems are storing content, finding it, and coping with multiple databases, which is where Navigator (a software product) comes in. It can handle ingest (from tape, live, or files, automatically transcoding to a common format if required), do shot logging, add metadata, storyboard, send to edit, and so on. It also handles distribution, outputting to required formats, and archiving; it is particularly well integrated with Sony’s Optical Disc Archive.
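
As a rough illustration of the ingest-and-normalise step described above (not Navigator’s actual internals), a pipeline that transcodes only when the source doesn’t already match the house format might look like this minimal Python sketch, using the generic ffprobe/ffmpeg command-line tools and hypothetical paths and helper names:

```python
# Hypothetical ingest step: land incoming material on the central store in a
# common house codec, transcoding only when the source doesn't already match.
# ffprobe/ffmpeg stand in for whatever transcoder a MAM actually wraps.
import json
import shutil
import subprocess
from pathlib import Path

HOUSE_CODEC = "prores"        # assumed house mezzanine format

def source_codec(path: Path) -> str:
    """Read the first video stream's codec name with ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", str(path)],
        capture_output=True, text=True, check=True).stdout
    return json.loads(out)["streams"][0]["codec_name"]

def ingest(path: Path, store: Path) -> Path:
    """Normalise one file onto the central data core."""
    dest = store / path.name
    if source_codec(path) == HOUSE_CODEC:
        shutil.copy2(path, dest)                      # already conformant: copy
    else:
        dest = dest.with_suffix(".mov")
        subprocess.run(["ffmpeg", "-i", str(path),    # transcode to house format
                        "-c:v", "prores_ks", "-c:a", "pcm_s16le",
                        str(dest)], check=True)
    return dest   # downstream steps: shot logging, metadata, storyboard, edit
```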

ODA appears to the system like a hard drive, works with any format or file type, and will soon offer 3.3TB per cartridge (1.5TB now). It offers a 50-year life span (rated at 42ºC, so in normal office conditions it should do even better) at half the cost of disk-based systems, with an 80% reduction in overall running costs. It is based on the open UDF format, so it is fully backward compatible. Users include Facebook. Systems range from a networkable desktop unit up to a 1.8 petabyte robot (in the next version). The second generation will use Archival Disc, a joint standard with Panasonic, with Sony’s 11-disc cartridges then holding up to 11TB.
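
For a sense of scale, a back-of-envelope calculation shows how the quoted cartridge and robot capacities relate to a large tape library; the per-hour figure and library size below are illustrative assumptions, not numbers from Sony:

```python
# Rough capacity arithmetic using the figures quoted above; the per-hour
# file size is an assumed value for ~50 Mbit/s HD material.
CARTRIDGE_TB = 3.3            # second-generation cartridge capacity
ROBOT_PB = 1.8                # top-end robot capacity
GB_PER_HOUR = 22.5            # assumption: 50 Mbit/s ≈ 6.25 MB/s ≈ 22.5 GB/h

print(f"Robot holds ~{ROBOT_PB * 1000 / CARTRIDGE_TB:.0f} cartridges")

hours = 100_000               # an illustrative large broadcast tape library
total_tb = hours * GB_PER_HOUR / 1000
print(f"{hours:,} hours ≈ {total_tb:,.0f} TB ≈ "
      f"{total_tb / CARTRIDGE_TB:,.0f} cartridges")
```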

“Part of this process is how do you handle all this data, so we support a lot of different MAMs,” with 36 partners so far, he added.

With SR tapes on their way out, Fleming sees ODA as “a very good mastering format” that will take ProRes, DNx, SR, or other mastering codecs. It can also be used for disaster recovery, perhaps as a supplement to LTO tapes, as having two separate formats, as well as locations, is a good idea. ODA is waterproof, and has survived being driven over by a Toyota HiLux. “They are very solid cartridges,” said Tom Crocker, Product Specialist – Broadcast + Cinematography AV & Media Solutions.

Media access

Tom Crocker: Wireless will allow you “to go live from any camera in your fleet at any time”

“One of the challenges for production, which people have, to some extent, stopped thinking about, is moving media around,” said Crocker. It is an area where significant changes are being made, “either through proxy workflows, or more compressed codecs, or more compressed hi-rez codecs being used as proxies.” That is why “how we integrate that into the equipment that we have, and how that lands on a server and is appreciated and then becomes a resource on a central data core and gets out to whoever needs it” is becoming an important issue.

“We are also seeing a lot more remote workers, including from regional offices,” said Fleming, and these also need access to the central data core, and to search for and play material, to collaborate with others in different locations, and to be able to create once but deliver different versions to several platforms.

Playout “has changed massively over the last few years. It’s not just playing out from servers on a schedule, but now people watch TV differently, whether on Netflix or Sky Go.”

Sony now has a suite of workflow products that work together, ranging from its bespoke Media Backbone Conductor (as used by Turner) to Sonaps (for news), both of which have been in use for years. More recently it has introduced Media Navigator, a cost-effective MAM, its wireless products, the Optical Disc Archive, and its Tape Digitizing Station, which Fleming believes is significant as Sony will no longer manufacture any tape products from this year. “Tape digitising will still need to be done, as people have massive libraries of tape,” he said.

“And that [library] is worth a huge amount of money and will need to be transferred into that central hub,” added Crocker, with many broadcasters having to move hundreds of thousands of hours of tape. That will require useful metadata, so that any clips you need are easy to find. “The more media you have, the harder it is to handle,” added Fleming.

Wireless workflows

Developments in wireless networking are making it even easier to create ever more media. Crocker sees wireless workflows bringing production and post-production closer together, and said recent changes have made this more feasible, including: WiFi, 3G/4G connections “where the speed is now over the threshold where video can be usefully transferred”; mobile computing – “we now all have fairly powerful mobile computers in our pockets, which significantly increases the scope of what can be done in terms of shot listing, adding metadata, in the field as it is being recorded. This used to be a separate step, but you can now fold a lot of that into the production”; more efficient codecs; and the use of the cloud.

A few years ago, many broadcasters were wary of the cloud because they didn’t own it and it was untested, “but these days a lot of people are more comfortable about putting things in the cloud. That cloud might be owned by somebody else, but companies like Amazon or Google, or other large-scale cloud hosting providers, now have much more rigorous systems in place, so that people are comfortable with that.” That means there are intermediary layers between a camera and a server, or into a PAM or MAM, where anyone you want, anywhere, can access the material and do useful things with it. This considerably increases the flexibility of the workflow.

Wireless technologies are increasing in speed, and with 5G potentially offering speeds of 1Gbps up and down, Crocker wondered: “Are we even going to need [location recording] media for everything, all the time? Does media then become a backup, and everything gets sent back, becoming your primary master?” While this might not be relevant yet, he feels it is important to be aware of the implications it will have on the workflow in five or ten years, “once you can start streaming Raw back to somewhere.” In the meantime, it certainly offers many opportunities to cut time to air.
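
A quick back-of-envelope calculation suggests why Crocker frames raw streaming as a five-to-ten-year question rather than a present-day one; the resolutions, bit depths and frame rates below are illustrative assumptions:

```python
# Can a 1 Gbit/s 5G uplink carry camera-original material? Rough arithmetic
# only; figures are assumptions for illustration, not Sony specifications.
def raw_rate_gbps(width: int, height: int, bit_depth: int, fps: int) -> float:
    """Uncompressed single-sensor (Bayer) raw data rate in Gbit/s."""
    return width * height * bit_depth * fps / 1e9

print(f"HD 10-bit 25p raw: {raw_rate_gbps(1920, 1080, 10, 25):.2f} Gbit/s")
print(f"4K 12-bit 25p raw: {raw_rate_gbps(4096, 2160, 12, 25):.2f} Gbit/s")
# HD raw (~0.52 Gbit/s) would fit inside 1 Gbit/s; 4K raw (~2.65 Gbit/s)
# would still need at least mild compression, or a faster link.
```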

Sony currently offers wireless workflow-enabled cameras, like the new PXW-X400 (which ships soon) or the PXW-X200. These can stream as MPEG Transport Stream or via Sony’s new QoS, record proxies and upload those, or record and upload hi-rez. There is also a universal add-on box for equipment that doesn’t have wireless built in (cameras, or anything else that outputs an SDI signal); it connects to a phone or tablet so users can view material. Sony also offers its own Ci cloud services platform, with various broadcast features.

QoS (quality of service) “is a very clever thing that we do with our streaming,” explained Crocker, and has “a lot of different streaming options.” It uses “a lot of the very clever technology that’s in H.264, and some clever technology that we’ve built in our servers, and a really good understanding of how networking works, to be able to get a really high quality stream coming out of the camera. Much, much beyond what MPEG-2 Transport Stream can do, over low bandwidth and unreliable bandwidth,” he said.

Quality of Service

“QoS is a way for us to get the best possible quality over a stream, over what might potentially be an unreliable network,” whether WiFi or 3G/4G links.

It includes a range of technologies to improve picture quality, such as automatic intelligent repeat requests and forward error correction, “which allow you to maintain a really good image even though you are losing quite a lot of packets. Even if you are losing up to 30% of the packets being sent, we can still maintain a really good image. Even at bandwidths down to half- to one megabit,” which you might get over 3G.
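
To make the forward-error-correction idea concrete, here is a minimal sketch (not Sony’s implementation) in which one XOR parity packet per group of data packets lets the receiver rebuild a single lost packet with no retransmission at all. Surviving loss rates near the 30% quoted above takes stronger codes, such as Reed-Solomon; this toy scheme only survives one loss per group:

```python
# Minimal FEC sketch: parity = pkt0 XOR pkt1 XOR ... XOR pkt(k-1), so any
# one missing packet equals the XOR of the parity with all received packets.
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets: list[bytes]) -> bytes:
    return reduce(xor_bytes, packets)

def recover(received: dict[int, bytes], parity: bytes, k: int) -> dict[int, bytes]:
    """Rebuild the single missing packet (if exactly one) from parity."""
    missing = [i for i in range(k) if i not in received]
    if len(missing) == 1:
        received[missing[0]] = reduce(xor_bytes, received.values(), parity)
    return received

# Three equal-length data packets; packet 1 is lost in transit.
data = [b"pkt0", b"pkt1", b"pkt2"]
parity = make_parity(data)
got = {0: data[0], 2: data[2]}
print(recover(got, parity, k=3)[1])   # b'pkt1', rebuilt without retransmission
```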

“We send our video over UDP rather than TCP, because TCP involves packet checking and it will continue to check that it has all the packets. But in a video streaming environment you have a playout time, so that might not be the most appropriate way to do it, particularly if you can put an intelligent chip on the end that can analyse the incoming video, possibly receive information from adjacent frames in order to rebuild that picture or, if required, request just the packets that it needs back again, but only if it knows it will get them back in time for playout, as well as doing things like scaling the video throughput based on available bandwidth, and forward error correction — sending instructions on how to rebuild lost frames based on limited information,” said Crocker.
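
The deadline-aware retransmission logic Crocker describes, where a lost packet is re-requested only if it can plausibly return before its frame’s playout time, could be sketched as follows; the names and timing figures are illustrative assumptions, not Sony’s protocol:

```python
# Sketch of deadline-aware selective retransmission over UDP: the receiver,
# not the transport, decides whether a NACK is worth sending.
import time

RTT_ESTIMATE = 0.120      # seconds, measured round-trip time (assumed)
SAFETY_MARGIN = 0.020     # headroom for decode and display (assumed)

def worth_requesting(playout_deadline: float) -> bool:
    """Re-request a lost packet only if it can return before playout."""
    return time.monotonic() + RTT_ESTIMATE + SAFETY_MARGIN < playout_deadline

def handle_loss(seq: int, playout_deadline: float, send_nack) -> str:
    if worth_requesting(playout_deadline):
        send_nack(seq)            # selective repeat: just the packets needed
        return "nack sent"
    return "conceal"              # too late: rebuild from adjacent frames/FEC

# A packet due for playout in 300 ms is worth re-requesting; one due in
# 50 ms is not, so the decoder conceals the loss instead.
now = time.monotonic()
print(handle_loss(42, now + 0.300, send_nack=lambda s: None))  # nack sent
print(handle_loss(43, now + 0.050, send_nack=lambda s: None))  # conceal
```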

“The amount of bandwidth will dictate how much information you can push through, so the quality will drop and increase in a fluctuating bandwidth environment. Now, that said, you have some control over it: if you want you can put the output at a lower fixed point and it will never try to retrieve more than that, which will give you a more consistent picture, or you can try to always shoot for the highest. It depends on the infrastructure, but you can choose a bandwidth from half [a megabit] up to 6Mbps at the moment, but there might be some developments with that, as this is a fairly new product.”
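
The two modes described here, a fixed cap for a consistent picture versus always shooting for the highest quality the link allows, amount to a small rate-control policy. This sketch uses the quoted 0.5–6Mbps limits; the smoothing factor and function shape are assumptions for illustration:

```python
# Sketch of the two rate-control modes: "fixed" never exceeds a chosen cap,
# "adaptive" tracks the estimated bandwidth within the 0.5-6 Mbit/s range.
MIN_MBPS, MAX_MBPS = 0.5, 6.0

def target_bitrate(estimated_mbps: float, mode: str = "adaptive",
                   fixed_mbps: float = 2.0, current: float = 2.0,
                   alpha: float = 0.3) -> float:
    if mode == "fixed":
        return min(fixed_mbps, MAX_MBPS)       # consistent picture at the cap
    # Smooth toward the estimate so quality "drops and increases" gradually
    # rather than oscillating with every bandwidth fluctuation.
    smoothed = current + alpha * (estimated_mbps - current)
    return max(MIN_MBPS, min(MAX_MBPS, smoothed))

print(target_bitrate(8.0))                  # 3.8: climbing toward the ceiling
print(target_bitrate(0.2))                  # 1.46: easing toward the floor
print(target_bitrate(8.0, mode="fixed"))    # 2.0: never retrieves more than that
```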

QoS was launched about a year ago, but “we’ve had some significant updates in version 1.2, including the ability to log into remote cameras and pull files [or proxies] from them and bring those into an editing environment, as well as remote camera control through the PWS-100RX1 interface,” which is a new 1RU blade server. “It’ll decode up to two streams at the same time coming in through its Ethernet port and output those over two pairs of SDI ports (so you get two copies of each output signal). The servers can be muxed together if you want to get more of those off one GUI. You can also sacrifice one of those outputs if you want it to write a file of that stream, instead of sending out two video streams.”

Sony’s current range of ENG cameras (basically any X model) has QoS built in, while an expansion box can add QoS to any of the previous range of XDCAM cameras and, with some exceptions, to any SDI device: it will be able to build a stream, but may not provide full interface access to the device you are streaming from, so some features, such as camera control, might not be possible.

It doesn’t currently include talkback or a return path, and Crocker sees its present iteration primarily as a news production tool. But it has been used for multi-camera events, where streams coming in from widespread locations were edited and output as a webcast, and he sees it as a useful tool for creating second-screen or supplemental content for a main broadcast.

All the wireless technologies “are pretty much in their infancy, so a lot of people have a lot of different ideas about how they use them. What we wanted to do was provide an interface to get images back without a requirement for either multiple SIM cards or a satellite truck. Simply being able to go live from any camera in your fleet at any time, wherever they are, is sort of the idea behind it.”
