The Challenges of Testing Live Streaming Broadcasts
In December, YouTube paid nearly two billion dollars for streaming rights from the National Football League (NFL). It was the largest sum ever paid to a major professional sports league for the rights to stream live content. It exemplifies the dominance of streaming and over-the-top (OTT) content, certainly, but also the investment streaming players are willing to make in live content.
As media giants and OTT platforms look to differentiate themselves beyond the video on demand (VoD) content that has been a hallmark of digital success, and as events like Coachella that were never part of traditional linear offerings find their way into streaming, live streaming has emerged as the next frontier. Platforms like Hulu, Paramount+, Peacock, and YouTube TV are ensuring that anything that airs on network or cable can also be watched live through their streaming services. It's a market that is forecast to keep growing substantially.
However, live streaming carries risks similar to those of the early days of VoD. The experience is what matters, because consumers are fickle - and streaming services have so far largely succeeded in capitalizing on the benefits of digital and transitioning viewers away from traditional television. Last August, for the first time, streaming viewership outpaced cable television.
Unlike VoD, live broadcast testing does not happen in a static environment. A failed live stream is immediately noticeable. And, unlike the heavyweight streaming platforms, most organisations cannot scale to support the number of live streams being produced today.
So, how do OTT providers take advantage of the live streaming opportunity while maintaining the quality experience viewers expect? Doing it right - and at a global scale - requires testers who represent real viewers, a wide variety of devices, and data that providers can act on to improve the experience.
Pitfalls in delivering live broadcast
Validating live event streams, and the advertisements displayed during them, means testing at massive scale. Not only must media providers cover a wide base of subscribers across many devices, OSes and carriers, but they must do so for customers in locales around the world.
One of the primary mistakes organisations make in live testing is assuming that staging will produce the same output as actual live production - that if it works in staging, it will work in production. The challenge with that assumption is that production involves so many variables that teams have difficulty reproducing them internally and testing them appropriately.
Take, for example, one of the most-watched events in the world: the Super Bowl. It may have been Kansas City facing off against Philadelphia in Glendale, Arizona, but the audience was global. That is far more complicated than streaming the game only in the U.S., because bandwidth speeds vary from country to country, so network capabilities will be very different. Any latency means video segments arrive late or incomplete, and as those delays build up, they disrupt playback on the viewer's side.
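To make that concrete, here is a minimal sketch - assuming a placeholder HLS playlist URL - of how a tester's tooling might compare how long each video segment takes to download against how long it plays. A segment that downloads slower than real time drains the player's buffer and eventually causes a stall.

```python
# Minimal sketch: measure HLS segment download time against segment duration.
# If a segment takes longer to fetch than it takes to play, the player's
# buffer drains and viewers see stalls. The playlist URL is a placeholder.
import time
import urllib.request

PLAYLIST_URL = "https://example.com/live/stream.m3u8"  # hypothetical live playlist

def check_segment_latency(playlist_url: str) -> None:
    playlist = urllib.request.urlopen(playlist_url).read().decode("utf-8")
    base = playlist_url.rsplit("/", 1)[0]
    duration = None
    for line in playlist.splitlines():
        if line.startswith("#EXTINF:"):
            # e.g. "#EXTINF:6.000," advertises 6 seconds of playback
            duration = float(line.split(":", 1)[1].split(",")[0])
        elif line and not line.startswith("#") and duration is not None:
            seg_url = line if line.startswith("http") else f"{base}/{line}"
            start = time.monotonic()
            urllib.request.urlopen(seg_url).read()
            elapsed = time.monotonic() - start
            status = "OK" if elapsed < duration else "BUFFER RISK"
            print(f"{seg_url}: {elapsed:.2f}s to fetch {duration:.1f}s of video [{status}]")
            duration = None

check_segment_latency(PLAYLIST_URL)
```

Running a check like this from testers' actual connections in each market, rather than from a data center, is what surfaces the country-to-country differences described above.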
Advertising is also a hurdle in live broadcast. Server-side and client-side ads may perform differently and, again, there are regional issues at an even finer grain than country to country. For something as big as the Super Bowl, where advertisers pay many millions for a spot, they want to ensure their ads are being targeted correctly not just from country to country but from city to city. If you don't have testers reviewing this in real time, you can't guarantee that the ads are targeted - possibly discouraging advertisers from returning.
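A sketch of such a spot check might look like the following. The ad decision endpoint, query parameters, and region-to-advertiser mapping are entirely hypothetical; in practice, the requests need to originate from testers or devices actually located in each target region.

```python
# Minimal sketch: spot-check that an ad decision server returns the
# region-appropriate creative. The endpoint, parameters, and expected
# advertiser IDs are all hypothetical placeholders.
import urllib.parse
import urllib.request

AD_SERVER = "https://ads.example.com/vast"  # hypothetical ad decision server

EXPECTED_CREATIVES = {  # hypothetical region -> expected advertiser mapping
    "us-east": "advertiser-123",
    "de": "advertiser-456",
}

def fetch_vast(region: str) -> str:
    # Request the ad payload as if from the given region.
    params = urllib.parse.urlencode({"region": region, "event": "superbowl"})
    with urllib.request.urlopen(f"{AD_SERVER}?{params}") as resp:
        return resp.read().decode("utf-8")

for region, expected in EXPECTED_CREATIVES.items():
    vast_xml = fetch_vast(region)
    result = "targeted correctly" if expected in vast_xml else "MISMATCH"
    print(f"{region}: {result}")
```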
Devices also differ from country to country. In the United States, you'll see a lot of Apple TV and Chromecast. In Asia, you're more likely producing content for an Amazon Fire TV Stick or a Xiaomi device - hardware that a testing team outside of Asia likely doesn't have access to. And even if they do, they still need to replicate the local network conditions, just as they would test in the U.S. with a 4G or 5G connection or high-speed internet.
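One way to keep those coverage gaps explicit rather than implicit is to enumerate the device-and-network combinations per region as a simple test matrix. This sketch uses illustrative device and network names, not a definitive list:

```python
# Minimal sketch: enumerate a device x network test matrix per region so
# coverage gaps are visible. Device and network names are illustrative.
from itertools import product

DEVICES_BY_REGION = {
    "US": ["Apple TV", "Chromecast"],
    "APAC": ["Fire TV Stick", "Xiaomi Mi Box"],
}
NETWORKS = ["4G", "5G", "broadband"]

for region, devices in DEVICES_BY_REGION.items():
    for device, network in product(devices, NETWORKS):
        print(f"[{region}] test live playback on {device} over {network}")
```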
There’s a lot to consider when testing for a live broadcast - and it’s too much for most in-house teams to do well.
Live broadcast needs a different testing approach
For live streaming to be successful, media organisations need teams of testers in every geography where they stream live content. If you're producing the Super Bowl, find testers who are passionate about football; if you're producing the Oscars, identify testers who live for the arts. The ultimate goal is to avoid the one failed live event stream that quickly goes viral and causes significant damage to your reputation - particularly in the age of social media, where dissatisfied viewers can spread their negative experiences to a wide audience in minutes.
Crowdtesting brings together the right audiences and empowers them to break things in a testing environment based on their specific location, device type, and the network options available to them. Once crowdtesting teams are in place, they can be engaged repeatedly.
There’s no substitute for real testers with real devices in real markets. Device farms and VPNs might catch some issues, but that will be of little solace if you can’t capture a defect actively affecting a significant portion of live stream viewers. Streaming device fragmentation makes it increasingly challenging for testing teams to evaluate the quality of the stream, playback and ads, given the many available resolutions, interactive camera views and closed-caption options, to name just a few variables. In-market testers trained to provide detailed reports and capture Charles Proxy logs help identify and resolve these defects quickly, especially when collaborating in real time.
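For instance, before verifying playback on-device, a tester's tooling might first confirm that the expected renditions and caption tracks are even advertised in the stream's master playlist. This is a minimal sketch, assuming a placeholder playlist URL; a real check would also confirm that each rendition actually plays in-market:

```python
# Minimal sketch: inspect an HLS master playlist for the video renditions
# and subtitle tracks that testers need to verify. The URL is a placeholder.
import urllib.request

MASTER_URL = "https://example.com/live/master.m3u8"  # hypothetical master playlist

manifest = urllib.request.urlopen(MASTER_URL).read().decode("utf-8")
resolutions, subtitle_langs = [], []
for line in manifest.splitlines():
    if line.startswith("#EXT-X-STREAM-INF:") and "RESOLUTION=" in line:
        resolutions.append(line.split("RESOLUTION=")[1].split(",")[0])
    elif line.startswith("#EXT-X-MEDIA:") and "TYPE=SUBTITLES" in line and 'LANGUAGE="' in line:
        subtitle_langs.append(line.split('LANGUAGE="')[1].split('"')[0])

print("Video renditions advertised:", resolutions or "none")
print("Subtitle languages advertised:", subtitle_langs or "none")
```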
Crowdtesting, which provides in-market testers to fill gaps in internal coverage, can help providers ensure they are bringing their best, high-quality experiences to this complex and competitive marketplace.
Adrian Garcia, Solutions Consultant, Applause
As a solutions consultant for Applause, Adrian Garcia works with clients around the world to understand their strategic business objectives and requirements, and align them to the appropriate digital quality and feedback solutions. Adrian is passionate about helping customers ensure high quality for their websites, mobile apps, IoT products and digital experiences via Applause's fully managed crowdtesting services.
[Editor's note: This is a contributed article from Applause. Streaming Media accepts vendor bylines based solely on their value to our readers.]