Over the years, the cost of good television equipment has bucked the general inflationary trend. For example, a camera and lens you can buy today is far better than its counterpart from 20 years ago, yet today's gear will cost 50% or less of what comparable equipment cost in 2003.
When using that camera back in the day, you probably wanted to view its output on a waveform monitor to get the exposure right. Today's camera is likely to have a waveform display built in that can be called up anytime to check exposure. So today's production equipment not only often costs less than ever but also does more than ever.
Of course, video networks are far more complex today, and this complexity means that test and measurement/quality control equipment today has to do much more at the network level than was needed for the simpler networks two decades ago. Much of today’s network-level test gear is evolving with new measurement and analysis techniques, as the video streaming environment gets more complicated and demanding over time.
“As delivery systems have become more complex, the demand for versatile and advanced digital monitoring and analysis solutions has increased,” said Anupama Anantharaman, vice president for product management at Interra Systems. “As a result, single-purpose monitoring gear has taken a backseat to more adaptable multifunctional tools. These tools leverage the power of new content creation and delivery standards—and computer networks—to provide comprehensive monitoring capabilities that cover various aspects of stream analysis and signal evaluation.”
The Impact of ST 2110
Signals that conform to the SMPTE ST 2110 standard have multiple components (video, audio and ancillary data), each transmitted as a separate stream. An IP-based stream monitor can analyze, evaluate and route these streams, and extract metadata as the user's workflow requires, providing full flexibility.
“Software-based tools used on an IP network can efficiently extract and analyze this metadata, allowing broadcasters to monitor and verify critical information such as format, timing and identification,” Anantharaman said. “In the context of video, waveform monitors can sometimes fail to detect problems due to human error. However, by using IP-based probes, operators can automate the process of stream verification for today’s vast amount of content.”
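The stream separation and metadata extraction described above can be illustrated with a short sketch. In ST 2110, each essence (video per ST 2110-20, audio per ST 2110-30, ancillary data per ST 2110-40) is a separate RTP stream described in an SDP offer. The SDP text and the minimal line-by-line parser below are hand-written illustrations, not output from any real device or a production probe:

```python
# Illustrative sketch: classify the separate essence streams described in a
# SMPTE ST 2110 SDP offer. The SDP text below is a hand-written example.

# Map RTP payload format names to ST 2110 essence types.
ESSENCE_BY_FORMAT = {
    "raw": "video (ST 2110-20)",
    "L24": "audio (ST 2110-30)",
    "L16": "audio (ST 2110-30)",
    "smpte291": "ancillary data (ST 2110-40)",
}

def classify_streams(sdp_text):
    """Return a list of (essence_type, dest_addr, port) per media section."""
    streams = []
    port = addr = fmt = None
    for line in sdp_text.splitlines():
        if line.startswith("m="):
            # New media section: flush the previous one.
            if fmt is not None:
                streams.append((ESSENCE_BY_FORMAT.get(fmt, "unknown"), addr, port))
            port = int(line.split()[1])
            fmt = None
        elif line.startswith("c=IN IP4 "):
            addr = line.split()[2].split("/")[0]
        elif line.startswith("a=rtpmap:"):
            fmt = line.split()[1].split("/")[0]
    if fmt is not None:
        streams.append((ESSENCE_BY_FORMAT.get(fmt, "unknown"), addr, port))
    return streams

SDP = """v=0
m=video 5004 RTP/AVP 96
c=IN IP4 239.0.0.1/64
a=rtpmap:96 raw/90000
m=audio 5006 RTP/AVP 97
c=IN IP4 239.0.0.2/64
a=rtpmap:97 L24/48000/2
m=video 5008 RTP/AVP 100
c=IN IP4 239.0.0.3/64
a=rtpmap:100 smpte291/90000
"""

for essence, addr, port in classify_streams(SDP):
    print(essence, addr, port)
```

A real IP probe would additionally join the multicast groups, validate RTP timing against PTP and check payloads; the sketch only shows how the separate streams and their identification metadata fall out of the session description.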
Human error notwithstanding, standalone testing and QC devices still have value in many operations and workflows.
“Although the underlying infrastructure of how media is transported has changed, the overall tasks that need to be accomplished by operators and engineers are still fundamentally the same,” said Don Kianian, solutions marketing manager for Telestream. “The workload has increased and the need to manage these workflows efficiently is paramount.
“Traditional, familiar waveform monitoring tools are still extremely useful to monitor and measure video and audio signals,” he added. “They may also be complemented by other software tools (such as quality control solutions) that can automatically QC content and alert users to media that does not meet spec.”
Telestream serves both OTT providers and broadcasters, and Kianian said the company is seeing a shift in how the two are evolving.
“We’re certainly seeing OTA adopting more and more OTT technologies and solutions [e.g., cloud workflows, remote collaboration, AI/ML] as part of a broadcaster/content provider’s overall strategy to reach broader audiences and go toe-to-toe with the increasing number of content providers in the market,” he said. “I expect this tug-of-war to continue well into this year and beyond.”
Upgrade to ATSC 3.0
Traditional broadcasters are in the early phases of a soft transition to the ATSC 3.0 standard, which is much more IP-friendly than the outgoing ATSC 1.0. What does this mean with respect to test and measurement equipment, and what lies ahead?
“As the transition to ATSC 3.0 heats up in the United States, we are seeing increased interest in test and measurement equipment,” said Ralph Bachofen, vice president of sales and marketing at Triveni Digital. “The transition to ATSC 3.0 is an opportunity for broadcasters to upgrade their T&M suite.
“With the latest T&M equipment, broadcasters can ensure outstanding-quality NextGen TV services, better understand the new standard, and simultaneously deliver ATSC 1.0 and ATSC 3.0 transmissions during the simulcast phase.”
Bachofen explained that single-purpose test gear is still in use, but it has an easily understood limiting factor.
“Single-purpose gear is still in use, especially in the RF realm,” he said. “The trend is to adopt multipurpose systems for service quality assurance and enterprise-wide monitoring. Single-purpose gear is much more expensive than having a solution from one vendor that can monitor multiple demarcation points at the same time.”
The sheer volume of production today means that more test, measurement and quality control gear than ever is needed. And yes, that means devices such as waveform monitors.
“The demands of operational television businesses are largely unchanged in that content has to meet in-house delivery standards as well as regulatory requirements,” said Prinyar Boon, product manager at Phabrix. “This remains the case regardless of how it’s been carried: file, streamed, broadcast, compressed, uncompressed, SDI, IP, MPEG-2 TS and so on.”
New Signal Types
Although production facilities need to measure and confirm the quality of signals, the relatively recent (and ongoing) industry transition from SDI to ST 2110 means new test gear is needed for the new signal types, according to Boon.
“To complicate the picture, many of these facilities will use both SDI- and IP-based systems,” Boon said. “We’ve been deeply involved for many years with the development of workflows for live HDR production, and measurement tools for ST 2110 systems using precision time protocol (PTP), while also retaining traditional SDI-based toolsets.
“So yes, we’d agree very much that the era of standalone gear is over, and without doubt the long-term trend is to have operational and production tools evolve into on-prem and off-prem cloud-based systems.”
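The precision time protocol mentioned above is what keeps ST 2110 essence streams aligned, and checking a device's offset from the grandmaster is a basic measurement in these systems. The sketch below shows the standard IEEE 1588 offset/delay calculation from a Sync/Delay_Req exchange; the timestamp values are invented for illustration:

```python
# Minimal sketch of the IEEE 1588 (PTP) offset/delay calculation that
# ST 2110 timing checks rely on. Timestamps are in nanoseconds:
#   t1: Sync sent by grandmaster    t2: Sync received by follower
#   t3: Delay_Req sent by follower  t4: Delay_Req received by grandmaster

def ptp_offset_and_delay(t1, t2, t3, t4):
    """Return (offset_from_master, mean_path_delay) in nanoseconds."""
    offset = ((t2 - t1) - (t4 - t3)) // 2
    delay = ((t2 - t1) + (t4 - t3)) // 2
    return offset, delay

# Example: follower clock runs 500 ns ahead; one-way path delay is 2,000 ns.
t1 = 1_000_000
t2 = t1 + 2_000 + 500   # arrival = send + path delay + follower offset
t3 = t2 + 10_000
t4 = t3 + 2_000 - 500   # arrival = send + path delay - follower offset
print(ptp_offset_and_delay(t1, t2, t3, t4))  # (500, 2000)
```

The calculation assumes a symmetric network path; asymmetry shows up directly as offset error, which is one reason PTP health monitoring matters in ST 2110 plants.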
In the days when broadcast OTA was the only game in town, broadcasters could rely on knowing that the signal was properly received on a monitor TV in the studio. If the picture and sound were good on that TV, engineers could reasonably assume viewers were receiving a quality signal.
In the OTT world, it is much more difficult for a program provider to check what the user experience is—especially when dozens of network operators (Comcast, Verizon, Cox, etc.) actually take the signal from the program provider and deliver it to the customer. Ensuring a quality experience in this environment is complicated… but not impossible.
“Because of the user-centric technology in modern video streaming, measuring the quality of the video transmission sent out doesn’t necessarily offer insight into the quality of the video received by the customer,” said Mathieu Planche, CEO of Witbe. “Furthermore, true video service performance is now defined by many more elements than just the image quality alone. User-centric video services need to be tested with user-centric monitoring technology.”
Location, Location, Location
In other words, sensors from the test system must be placed as close to the user as possible to make sure the network delivers the expected quality to the customer.
“Checking that a FAST channel is available on Comcast NOW TV cannot be verified through simple packet loss,” Planche said. “It involves complex interactions between backend systems, cloud-to-cloud integration, device operating systems, content providers, ad providers and various DRM stacks. OTT providers need user-centric monitoring technology to understand the true quality delivered to their end-users.”
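One small piece of such monitoring can be sketched offline. A probe near the user might repeatedly fetch a channel's HLS live playlist and confirm the live edge is advancing, one basic sign the channel is actually playing out. The playlist snapshots below are hand-written, and this stream-level check is a deliberate simplification of the device-level, user-centric testing Planche describes:

```python
# Simplified sketch of one stream-level check a monitoring probe might run:
# confirm an HLS live playlist's media sequence advances between polls.
# Real user-centric monitoring exercises the actual app and device; this
# only illustrates the idea with hand-written playlist snapshots.

def media_sequence(playlist_text):
    """Extract the #EXT-X-MEDIA-SEQUENCE value from an HLS media playlist."""
    for line in playlist_text.splitlines():
        if line.startswith("#EXT-X-MEDIA-SEQUENCE:"):
            return int(line.split(":", 1)[1])
    raise ValueError("no media sequence tag found")

def channel_is_live(earlier_poll, later_poll):
    """True when the live edge advanced between two playlist fetches."""
    return media_sequence(later_poll) > media_sequence(earlier_poll)

POLL_1 = "#EXTM3U\n#EXT-X-MEDIA-SEQUENCE:100\n#EXTINF:6.0,\nseg100.ts\n"
POLL_2 = "#EXTM3U\n#EXT-X-MEDIA-SEQUENCE:103\n#EXTINF:6.0,\nseg103.ts\n"

print(channel_is_live(POLL_1, POLL_2))  # True: the stream is advancing
```

A stalled playlist (same media sequence on consecutive polls) would flag the channel as potentially down, even though every packet along the way arrived intact, which is exactly the gap between packet-loss metrics and delivered experience.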
This is where artificial intelligence (AI) could play an important role in monitoring networks and responding to failures faster than a human can react. In the meantime, change and growth are constant in the program delivery process.
“As the weight shifts further to online content and live streaming, having a solid grip on the entire chain becomes more important,” said Rajesh Patel, vice president for sales & solutions EMEA at Mediaproxy. “With an ever-increasing scale of sources, using software-based tools and APIs to optimize monitoring workflows will help reduce overall cost and enhance quality-of-service.
“Teaming up with partners with solutions that straddle multiple disciplines is something to look at as old equipment becomes obsolete,” he continued. “This also includes shifting the thinking towards what the cloud can offer to handle specific pressure points.”
Patel sees a continuing need for hardware-specific test equipment, even as change comes.
“Despite the ever-growing availability of software-based tools, traditional hardware-based monitoring equipment is certainly still relevant in today’s monitoring chain,” he said. “This is especially still true for ST 2022-2 as well as for the emerging SDI replacement, ST 2110.”