The State of MPEG-DASH 2017
DASH-IF also intends to ensure that implementations will move forward quickly and efficiently. “As regards interoperability, we need to reuse existing conformance tools and test vectors in DASH-IF for CMAF. We will also support the industry that has committed to use DASH and wants to benefit from CMAF—Digital Video Broadcasting (DVB), 3rd Generation Partnership Project (3GPP), ATSC, and HbbTV, but also proprietary communities and service providers. Finally, we need to extend CMAF to enable the advanced functionalities and experiences that are currently enabled by DASH. All of this work is started in DASH-IF and/or MPEG, but it requires detailed work on the specification and conformance tools to ensure that we can deliver a consistent DASH and CMAF story at the time when CMAF is ready.” That’s definitely a challenge and a significant amount of work, but it’s worth the effort, as the prospect of rationalized workflows is quite appealing to many companies.
It is also important to note that, as happened with ATSC, some consortia might require DASH-IF’s assistance in defining the scope of CMAF integration in their respective specifications. That’s the case for the new CTA WAVE organization, as Akamai’s Law explains: “CTA WAVE will look to DASH-IF to define the use of CMAF with DASH and will not do so itself. The DASH-IF is looking to harmonize DASH deployments across broadcast and broadband. DASH-CMAF may be the interop-point best positioned to bridge that gap.” Overall, CMAF’s traction looks strong in both the web and connected TV ecosystems, with DASH-IF in the middle acting as a catalyst to make the convergence happen and to ensure that existing DASH implementations are not jeopardized.
[Figure: Preliminary CMAF to DASH object models mapping]
DASH Core Spec v3 and DASH-IF Guidelines v4
“We expect the third edition of MPEG-DASH to be published in the first half of 2017,” says Sodagar. What will be included in this core DASH specification version 3? All the amendments and corrigenda approved after the second edition, as stated by the MPEG-116 press release, including the latest Amendment 4 on manifest daisy-chaining (for targeted dynamic pre-roll ads applied to live linear channels) and preselection of audio elements offered in different adaptation sets. Amendments 1 to 3 include High Profile and Availability Time Synchronisation, Spatial Relationship Description (SRD), generalized URL parameters, Authentication, MPD linking, Callback Event, Period Continuity, and other extensions—all kinds of updates that are required to make DASH manifests even more intelligent and dynamic and to allow new application fields such as VR with SRD.
The ongoing MPEG Core Experiments—namely, Server and Network Assisted DASH (SAND), Full Duplex DASH (FDH), SAP-Independent Segment Signaling (SISSI), URI Signing for DASH, and Content Aggregation and Playback Control (CAPCO)—continue on their respective tracks and will eventually be included in MPEG-DASH core specification v4 or a later version, depending on how fast things progress.
On DASH-IF’s side, the guidelines are currently progressing in two directions: the alignment on the MPEG-DASH second edition (which was done with IOPs v3.3), and the integration of UHD, which was almost, but not quite, complete when this article was written. “V4 includes UHD 4K and UHD/HDR with PQ based on SMPTE-2084. We align the specification with specifications provided by DVB, DECE, 3GPP, and UHD Forum. However, DASH-IF is not in the selection business; we only provide the signaling to enable the UHD and HDR technologies that members of DASH-IF have identified to be commercially relevant. We expect more HDR variants to come with the market growing and stabilizing,” says Stockhammer. It’s worth noting that this is the first time baseline UHD and UHD-with-HDR capabilities have been standardized in an ABR specification. That’s yet another sign, if one were needed, that DASH is the de facto standard for all advanced streaming applications.
DASH-IF Extends DASH’s Footprint
In the past, DASH-IF collaborated with DVB on the DVB-DASH profile and with HbbTV on the DASH test assertions for HbbTV 2.0. In 2016, DASH-IF, in collaboration with ATSC, has been defining the DASH profile for ATSC 3.0. This is an interesting new focus because the profile is designed for dual use across broadcast (where it will be multicast at a single bitrate) and broadband (where it will use a traditional unicast ABR approach), whereas DVB-DASH is focused solely on the broadband use case. While the ATSC 3.0 profile specification was expected to be ready by the end of 2016, some work remains to be done around conformance and test tools.
As for VP9, things are moving slowly but surely. After the publication of the VP Codec ISO Media File Format Binding specification by Microsoft’s Kilroy Hughes and Netflix’s David Ronca in December 2015, we are now heading toward a specific IOP, as Sodagar confirms: “DASH-IF is currently developing an Interop Point for VP9, and I expect its Implementation Guidelines to be published in the first half of 2017.” It also sounds like a good starting point for the integration of AV1, the new codec taking over from VP9 that is currently in development by the Alliance for Open Media, with the bitstream expected to be frozen in March 2017.
What’s in the works for 2017 and subsequent years? CMAF integration of course, but also VR, where DASH is well positioned, as Sodagar summarizes: “DASH-IF’s main objective has always been the adoption of DASH. With the recent advancements of the DASH-IF IOP guidelines, the consortia’s adoption of DASH (such as DVB-DASH, HbbTV specs, and ATSC 3.0), and the number of vendors working on DASH solutions, we expect significant growth of DASH solutions in 2017 and 2018. VR is adding yet another value proposition of DASH in the high-quality, effective delivery of 360° video experiences.”
DASH and VR
VR in DASH is being worked on at three different levels: basic, SRD, and SRD with Tiled HEVC. At the basic level, DASH can already be beneficial to VR with either equirectangular 360° or FOV-based 360° video (for example, with truncated pyramids). “It’s really easy, just need some metadata for signaling,” says Stockhammer. “Tiling would work as well, but is already ‘advanced.’”
At the next level, Spatial Relationship Description (SRD), a first level of data splitting is introduced, with the player synchronizing several streams. Sodagar provides an overview of the benefits: “A complete 360° Video requires significant bandwidth for delivery. DASH SRD provides a tiling technique in which the client only streams the portion of 360 Video that the end user is actually watching. Therefore, the bandwidth requirement is significantly reduced, compared to delivering the entire 360 video.” But SRD can go further, says Cyril Concolato, associate professor at Télécom ParisTech and one of the principal contributors to the GPAC project: “SRD was designed to describe spatial relationships between videos in a 2D space, like a mosaic. It can already be used for providing immersive experiences (e.g., when flattening a cylinder), but it might need adjustment for particular 3D relationships.”
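To make this concrete, here is a minimal TypeScript sketch of how a player might interpret the urn:mpeg:dash:srd:2014 descriptor value (a comma-separated list of source_id, object_x, object_y, object_width, object_height, and optionally total_width and total_height) and keep only the tiles that overlap the current viewport. The types and function names are illustrative, not part of dash.js or the MPEG specification.

```typescript
// Illustrative types; not part of dash.js or the DASH specification.
interface SrdInfo {
  sourceId: number;
  x: number;          // object_x in the grid defined by totalWidth/totalHeight
  y: number;          // object_y
  width: number;      // object_width
  height: number;     // object_height
  totalWidth?: number;
  totalHeight?: number;
}

// Parse the value attribute of a urn:mpeg:dash:srd:2014 SupplementalProperty.
function parseSrdValue(value: string): SrdInfo {
  const [sourceId, x, y, width, height, totalWidth, totalHeight] =
    value.split(",").map(v => parseInt(v.trim(), 10));
  return { sourceId, x, y, width, height, totalWidth, totalHeight };
}

interface Viewport { x: number; y: number; width: number; height: number; }

// Keep only the tiles (adaptation sets) whose SRD region overlaps the viewport,
// so that only the visible portion of the 360° video is requested.
function tilesInViewport(tiles: SrdInfo[], vp: Viewport): SrdInfo[] {
  return tiles.filter(t =>
    t.x < vp.x + vp.width && t.x + t.width > vp.x &&
    t.y < vp.y + vp.height && t.y + t.height > vp.y);
}
```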
With Tiled HEVC combined with SRD, we basically add the ability to cut a 360° image into smaller pieces (the “tiles”) and reassemble a customized field of view that will be decoded by a single HEVC decoder, thus optimizing the necessary bandwidth. But it goes further in terms of viewpoint intelligence, according to Concolato: “Streaming an entire 360 video with good quality requires a large bandwidth. Tiled HEVC with DASH SRD allows streaming different regions of the videos with different qualities: the regions of interest for the user with a high quality, the region of less interest with lower quality.”
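The quality-per-region idea Concolato describes can be sketched as a simple allocation policy: viewport tiles get the best representation a bandwidth budget allows, and peripheral tiles get the lowest one. The Tile interface and allocateBitrates function below are hypothetical, shown only to illustrate the principle; real tiled-streaming players weigh many more factors.

```typescript
// Hypothetical model of one SRD tile and its available representations.
interface Tile { id: string; inViewport: boolean; bitrates: number[]; }

// Naive allocation: minimum quality for peripheral tiles, then the highest
// affordable quality for viewport tiles within the remaining budget (in bps).
function allocateBitrates(tiles: Tile[], budgetBps: number): Map<string, number> {
  const allocation = new Map<string, number>();
  let remaining = budgetBps;

  // Peripheral tiles are served at their lowest bitrate.
  for (const t of tiles.filter(t => !t.inViewport)) {
    const min = Math.min(...t.bitrates);
    allocation.set(t.id, min);
    remaining -= min;
  }

  // Split the remaining budget across viewport tiles and pick, per tile,
  // the highest representation that fits its share.
  const viewportTiles = tiles.filter(t => t.inViewport);
  const share = remaining / Math.max(viewportTiles.length, 1);
  for (const t of viewportTiles) {
    const affordable = t.bitrates.filter(b => b <= share);
    allocation.set(t.id, affordable.length ? Math.max(...affordable) : Math.min(...t.bitrates));
  }
  return allocation;
}
```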
So what are the next milestones for generalizing the use of SRD combined with Tiled HEVC? “The main issues and questions are: how to design ABR algorithms for streaming multiple videos in parallel, what is acceptable for users in terms of quality degradation between neighboring tiles, including after projection, is it possible to trade quality with latency?” adds Concolato. From a standardization point of view, Concolato notes a few actions remain: “The HEVC File Format is finalized, including tools for tiles. MPEG is working on standard storage of VR metadata. Similarly, it is likely that a new DASH amendment will be started to enhance the MPD with improvements to SRD, and metadata mapping, both with the usual timeline for amendments.”
The Evolution of dash.js
The dash.js project’s popularity is rising. “We have 80 contributors to date—15 to 20 being currently active—with Akamai, BBC, Edgeware, Fraunhofer, and Maxdome as the largest contributors in 2016,” says Dan Sparacio, Akamai senior software engineer and dash.js lead developer. When asked about CMAF support, Sparacio says: “CMAF works out of the box with dash.js today. For CMAF low latency chunks, we need to implement the Fetch API in order to stream partial segments.”
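Sparacio’s point about the Fetch API is that, unlike XHR, fetch() exposes the response body as a stream, so a player can append each CMAF chunk to a Media Source Extensions SourceBuffer as it arrives instead of waiting for the whole segment. A rough sketch of that pattern might look like the following; it is not the eventual dash.js implementation, just an illustration using the browser APIs involved.

```typescript
// Append one chunk and wait for the SourceBuffer to finish updating,
// since appendBuffer() throws if called while an update is in progress.
function appendChunk(sb: SourceBuffer, chunk: Uint8Array): Promise<void> {
  return new Promise(resolve => {
    sb.addEventListener("updateend", () => resolve(), { once: true });
    sb.appendBuffer(chunk);
  });
}

// Progressively download a CMAF segment and feed each received chunk to MSE
// as soon as it arrives, rather than buffering the full segment first.
async function streamSegment(url: string, sourceBuffer: SourceBuffer): Promise<void> {
  const response = await fetch(url);
  if (!response.body) throw new Error("Streaming response bodies not supported");
  const reader = response.body.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done || !value) break;
    await appendChunk(sourceBuffer, value);
  }
}
```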
2016 was a major year for dash.js, first with an iterative code refactoring from v2.0 to v2.4, but mainly for the introduction of the BOLA (Buffer Occupancy based Lyapunov Algorithm) ABR rules in dash.js v2.3. “Simplified, BOLA will model the buffer and create utility scores for the bitrates to form a mathematical formula aiming to find the optimal bitrate based on the buffer occupancy,” says Sparacio. “Traditional rules involve a throughput measurement per chunk, which is then averaged over a sliding scale (based on past measurement variation) to come up with an optimal target bitrate. With traditional ABR rules, you typically need other rules to support throughput. For instance, an emergency rebuffer rule, midflight abandonment, rich buffer, and dropped-frames rules. With BOLA, however, you can do most of this in one rule. This is because the buffer occupancy state tells a lot about the player’s state and ability to maintain a certain quality of media. Those buffer observation rules are not so relevant at startup, after a seek, or if you are playing a low-latency live stream at the precise live edge segment. This is why we still need the regular ABR rules to be applied partially.”
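The decision rule Sparacio describes can be paraphrased in a few lines of TypeScript. This is a simplified sketch of the published BOLA formulation, not the actual dash.js BolaRule code: each bitrate gets a logarithmic utility, and the player picks the level that maximizes a score that depends on the current buffer level.

```typescript
// Simplified BOLA-style selection (parameter names loosely follow the BOLA
// paper; this is not the dash.js implementation).
// bitrates: available bitrates in bps, ascending.
// bufferLevelSec: current buffer occupancy in seconds.
// V: controls how large a buffer the algorithm targets.
// gp: weights the rebuffering-avoidance term.
function bolaChooseQuality(
  bitrates: number[],
  bufferLevelSec: number,
  V: number,
  gp: number
): number {
  // Logarithmic utility: higher bitrates are better, with diminishing returns.
  const utilities = bitrates.map(b => Math.log(b / bitrates[0]));
  let bestIndex = 0;
  let bestScore = -Infinity;
  for (let i = 0; i < bitrates.length; i++) {
    // Score = (V * utility + V * gp - bufferLevel) / cost, where cost is
    // proportional to segment size (approximated here by the bitrate).
    const score = (V * utilities[i] + V * gp - bufferLevelSec) / bitrates[i];
    if (score > bestScore) {
      bestScore = score;
      bestIndex = i;
    }
  }
  return bestIndex; // index of the selected representation
}
```

In this sketch, a fuller buffer penalizes the cheaper, low-bitrate options more heavily, which is what pushes the selection toward higher qualities when the buffer is healthy.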
Will Law, also a project lead on dash.js, gives some insight into dash.js’s direction in 2017: “BOLA improvements are considered ongoing; they will continue into next year. Although dash.js’s 2017 roadmap will be decided at the December face-to-face, I’d say that playback robustness, caption/subtitle support, DRM support, and ABR improvements will remain the top work items during 2017.”
The DASH Royalty Question
To wrap up, let’s look at how industry players reacted to the MPEG-LA DASH patent pool formation in November 2016. Right now, people are waiting to see how things evolve; so far, they are refraining from triggering massive migrations from DASH to HLS, as Köpsén explains: “Viaplay has to fully understand the implications, but it is very troublesome with this new information, even though it seems like temporarily loaded players are excluded and thereby will not require a move from DASH to HLS.”
Rambourg from Canal Plus goes further: “We’ll continue with our work on DASH, as it is a major technical format that we want to promote, while at the same time following its situation on the legal side. We’d be happy to deliver content to all ecosystems in DASH+CMAF, worst case in DASH/HLS+CMAF. If the future was HEVC+DASH, then MPEG-LA’s actions are clearly counterproductive.”
Following this patent pool announcement, DASH-IF published a page on its website where Cisco, Microsoft, and Qualcomm—all major contributors to the MPEG-DASH standard—reiterated their royalty-free positions. We’ll need to watch how the situation evolves in the next 12 months, but nothing can stop DASH’s onward march.
This article was published in the Spring 2017 European edition of Streaming Media magazine.