Improving file-based workflows: traditional quality control for current-day scenarios
With media organizations worldwide seeking to exploit the opportunities presented by new distribution channels and new media consumption patterns, quality control across all these media forms is becoming a mission-critical issue. In this three-part article, Bruce Devlin, AmberFin’s CTO and co-author of the MXF file format, presents the concept of Unified Quality Control and explores the practical ways in which it can improve file-based workflows. In part 1, Devlin reviews the traditional QC methods most often applied in the broadcast industry and challenges current practices.
The need for quality control has always been a cornerstone of broadcast operations worldwide: no broadcaster ever knowingly transmits sub-standard content or allows bad files to cause black screens. For four decades, from 1940 until 1980, the norm was an environment where VTR-type devices were connected more-or-less directly to large, high-powered TV transmitters. The overriding need was to protect the transmitters from media faults that could knock a transmitter off-air, cause an analogue TV set to lose the signal, or even damage the transmitter itself.
In those days, QC was essentially a range of measurements performed on the PAL, NTSC or SECAM waveform, designed to safeguard the transmitter and the transmitted signal. Most of today’s test and measurement procedures are basically unchanged from decades ago, so you have to ask whether they are still appropriate at every stage of a digital file-based workflow, and whether they blend automation with sufficient human intervention to maintain consistent quality. In most cases, the answer is ‘no’.
In today’s multimedia landscape, different media facilities require different forms of QC because the QC function is protecting different business models (a hypothetical mapping is sketched after the list below):
- Content owners selling to broadcasters and content aggregators have a central need for high-quality content that also interoperates with editing and playout platforms. In this situation, QC is focused on baseband metrics and file syntax.
- Playout facilities require absolute assurance of business continuity, 24/7. Their nightmare scenario is a break in transmission, so assurance that the product within their schedules is fit for broadcast is central to their business model.
- Online portals are an increasingly popular source of media consumption. Compared to other distribution channels, they face a far wider range of playback devices, from PCs to smartphones and media players. Their ability to provide services compatible with the broadest range of these devices will be a key factor in deciding their success in a fiercely competitive market.
- Historical archives need to be certain that any “residue” from a technical operation performed today will not be visible in the future. If processing the media today leaves a footprint from 2012 on content that had been pristine until that point in time, it undermines their entire raison d’être: creating a safe media repository.
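To make the contrast concrete, the sketch below maps each facility type to the kind of checks that protect its business model. It is a minimal, hypothetical illustration in Python; the profile and check names are invented for this example and do not correspond to any specific QC product or standard.

```python
# Hypothetical sketch: one QC profile per facility type. The check names are
# illustrative placeholders, not taken from any real QC tool or standard.
from dataclasses import dataclass, field


@dataclass
class QCProfile:
    """The set of checks a given facility runs on incoming files."""
    facility: str
    checks: list[str] = field(default_factory=list)


PROFILES = {
    "content_owner": QCProfile(
        facility="Content owner selling to broadcasters/aggregators",
        checks=["baseband_video_levels", "audio_loudness", "mxf_file_syntax"],
    ),
    "playout": QCProfile(
        facility="24/7 playout facility",
        checks=["decodability", "duration_matches_schedule", "black_frame_detect"],
    ),
    "online_portal": QCProfile(
        facility="Online portal serving many device types",
        checks=["codec_profile_compatibility", "container_conformance"],
    ),
    "archive": QCProfile(
        facility="Historical archive",
        checks=["checksum_fixity", "lossless_roundtrip_verify", "no_processing_residue"],
    ),
}


def checks_for(workflow: str) -> list[str]:
    """Return the QC checks appropriate to a given workflow."""
    return PROFILES[workflow].checks


if __name__ == "__main__":
    for profile in PROFILES.values():
        print(f"{profile.facility}: {', '.join(profile.checks)}")
```

The point is the structure rather than the specific checks: a single QC engine can apply a different measurement profile per workflow, which is exactly the workflow-focused approach argued for below.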
In today’s media world, money talks like never before. The focus is on financial aspects and business processes rather than on the evolution of technology, as in previous eras. Organizations are looking to make more money from their assets and to insure against the financial losses that arise when media is not fit for purpose. QC, as a business process, costs money to perform, so to ensure a good return on investment (ROI), different file-based measurements should be used in different applications, focused on workflow requirements.