Wikipedia, that fount of all knowledge for our times, defines master control as “the technical hub of a broadcast operation… the final point before a signal is transmitted. Master control is generally staffed with one or two operators around the clock, responsible for monitoring the quality and accuracy of the on-air product, ensuring the transmission meets government regulations, troubleshooting equipment malfunctions and preparing programming for playout. Regulations include both technical ones (such as over-modulation and dead air) as well as content ones (such as indecency).” How much of this is still relevant in our highly automated, machine learning, IP connected age?
The move from analogue to digital more or less removed the need to worry about technical quality at the point of playout. If a piece of content passed QC at the point of ingest then it will be technically identical at the point of delivery (at least in theory). There are no proc amps to drift out of alignment and pour out rather more than one volt.
Digital files can of course be corrupted along the way. If you lose a byte or two, you are liable to get visual disturbances and unpleasantly loud noises. But smart playout systems will include file integrity checks along the way: if the checksum is wrong, the file will not be played.
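The checksum gate described above is simple to implement. This is a minimal sketch, not any particular vendor's system: the function names and the SHA-256 choice are assumptions for illustration, with the digest recorded at ingest and re-verified before playout.

```python
import hashlib

def file_checksum(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a media file, reading in 1 MB chunks
    so that multi-gigabyte programme files do not have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def ok_to_play(path: str, digest_at_ingest: str) -> bool:
    """Clear a file for playout only if it still matches the digest
    recorded when it passed QC at ingest."""
    return file_checksum(path) == digest_at_ingest
```

If even one byte has been corrupted in transfer or storage, the digests will not match and the automation can pull a standby item instead of playing a damaged file to air.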
If we can move quality out of master control, what about accuracy? This is quite a bit harder. Both people and intelligent machines can be reasonably certain that the file is, say, an episode of Game of Thrones. But is it the right episode? Machine learning is probably not yet up to tracking the story arc of a fantasy series to know where this episode fits. And that assumes artificial intelligence could do better than humans at understanding what it is all about anyway.
Some channels will want to prepare and transmit pre- and post-watershed versions of the same programme. It could be some way into transmission before you get to the first edit, by which time it may be too late.
Live programming is most resistant to complete automation. No-one really knows when a football match will end or – probably the biggest operational challenge in broadcasting today – whether there is time for a 30-second commercial at the end of the next over in test cricket.
So what is the answer to the question? Do we, today, need master control?
In most cases, no we do not. Good asset management will ensure that the right content is lined up and file-checked ready for transmission. Automatic QC on the output will flag any output error – black or silence – first comparing it to the file's metadata to check whether it is supposed to go quiet or black at that point.
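That metadata cross-check is the key to avoiding false alarms: a fade to black at the end of a drama is intentional, while the same picture mid-commercial is a fault. A minimal sketch of the logic, assuming the output monitor reports quiet/black spans as time ranges and the playout metadata marks spans where silence or black is intended (all names here are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Segment:
    start: float                 # seconds into the schedule
    end: float
    intended_black: bool = False # metadata: this span is meant to be black/silent

def flag_output_errors(detected_spans, schedule):
    """Return only the black/silent spans that the metadata did NOT predict.

    detected_spans: list of (start, end) tuples from the output monitor.
    schedule: list of Segment records from the playout metadata.
    """
    alarms = []
    for start, end in detected_spans:
        expected = any(
            seg.intended_black and seg.start <= start and end <= seg.end
            for seg in schedule
        )
        if not expected:
            alarms.append((start, end))
    return alarms
```

Only the unexpected spans reach an operator (or an automated failover), which is what lets the routine monitoring run unattended.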
Automatic programme junction systems will work out precisely what time needs to be filled between content and around advertising, and play interstitials to fit. If necessary, they will create those interstitials on the fly, drawing on data sources ranging from the planning system (to know what is coming next) to a weather site (to generate a forecast).
Genuinely live programming may need some human intervention, to get into and out of unpredictable events. But in most cases that could be automated, with the remote director sending cues back to the playout automation to say when an event is going to end, or when it is going to a commercial break.