HLS vs. DASH – Comparing the Most Popular HTTP Video Streaming Protocols

HLS (HTTP Live Streaming) and MPEG-DASH (Dynamic Adaptive Streaming over HTTP) are the two most popular protocols for streaming video over the Internet today. Both are backed by years of R&D and continuous improvement, and they are usually the first choices when it comes to delivering video over HTTP.

HTTP-based streaming using HLS or MPEG-DASH is very popular because the files needed for streaming are easy to create with either commercial or open-source packagers. In addition, because delivery happens over plain HTTP, any reasonably powerful computer can be configured as a streaming server, and CDNs can conveniently cache and deliver the video segments.

In this post, we look at how HLS and DASH work, and then go into the differences between HLS and DASH. Let’s go!

Note: If you are new to video streaming, please read our articles on what is OTT and what is ABR video streaming to better understand the HTTP and ABR-based streaming protocols.

What is Adaptive Bitrate Streaming used in HLS & DASH?

ABR Streaming, or Adaptive Bitrate Streaming, is a streaming technology that allows video delivery systems to adjust the video quality during playback in response to changes in network conditions and the player's buffer.

ABR video streaming (Image credit: Daseddon, own work, CC BY-SA 3.0)

ABR technologies are enabled primarily by two innovations –

  1. Multi-rate video transcoding, where every video is compressed into several resolution-bitrate combinations (renditions/profiles).
  2. Streaming protocols such as HLS and MPEG-DASH, which chop a video into small segments that can be delivered independently of one another. Segment-based delivery allows video players and servers to dynamically choose the next segment from any rendition, adapting to fluctuations in network conditions and the player's buffer.
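The adaptation logic behind the two innovations above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not any real player's algorithm: the bitrate ladder and the 0.8 safety factor are made-up values, and a production player would also weigh buffer level, switch stability, and screen size.

```python
# Minimal sketch of ABR rendition selection.
# The ladder below is illustrative (name, bitrate in bits per second).
RENDITIONS = [
    ("480p", 900_000),
    ("720p", 1_500_000),
    ("1080p", 3_300_000),
]

def pick_rendition(measured_bps, safety=0.8):
    """Pick the highest rendition whose bitrate fits within a
    safety fraction of the measured network throughput."""
    budget = measured_bps * safety
    best = RENDITIONS[0]  # always fall back to the lowest rung
    for name, bps in RENDITIONS:
        if bps <= budget:
            best = (name, bps)
    return best

print(pick_rendition(2_000_000))  # headroom for 720p, but not 1080p
print(pick_rendition(700_000))    # congested: lowest rung, 480p
```

Each time a segment finishes downloading, a player re-measures throughput and re-runs a selection like this for the next segment, which is exactly what segment-based delivery makes possible.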

With that introduction to ABR video streaming, let’s take a look at how HLS (HTTP Live Streaming) works. If you want to learn more about ABR, please go to our ABR deep dive here.

What is HLS or HTTP Live Streaming?

HLS or HTTP Live Streaming is an HTTP-based ABR video streaming protocol introduced by Apple in 2009 that describes a set of tools and procedures for streaming video and audio over the internet. It is widely used, primarily because HLS is the only natively supported protocol on iOS devices and in the Safari browser, which makes HLS delivery a necessity for reaching audiences on Apple devices.

How does HLS work?

In HLS Video Streaming,

  • A video is split up into segments (either TS or fMP4 files), and the length of each segment is usually around 6 or 10 seconds.
  • The location and delivery order of these segments are described in a set of plain-text files called playlists (or m3u8 files).
  • The playlists and video segments are stored on a streaming server (origin/web server).
  • The video is played back using an HLS-compliant video player (HTML5 players, AVPlayer in the Apple ecosystem, etc.). Originally, HLS supported only TS segments; Apple added support for fMP4 (fragmented MP4) files at WWDC 2016.
HLS HTTP Live Streaming Architecture
HLS Architecture (credits: Apple)

Apple has extensive documentation on their website – Authoring Guidelines, Introduction to HLS, HTTP Live Streaming, and more. You can also refer to the spec for details and insights into HLS – it has a clear explanation of the various tags and a good introduction to ABR video streaming.

HLS manifests come in two types – the master playlist and the child/media playlists. If you transcode a movie at three different resolutions – 1080p, 720p, and 480p – and then package them using the HLS protocol, you get one master playlist and three child playlists.

Here is an example of a master HLS playlist –

#EXTM3U
#EXT-X-MEDIA:URI="subtitle/lang_en/subtitle_index.m3u8",TYPE=SUBTITLES,GROUP-ID="subtitles",LANGUAGE="en",NAME="English",DEFAULT=YES,AUTOSELECT=YES
#EXT-X-STREAM-INF:BANDWIDTH=893752,AVERAGE-BANDWIDTH=560289,RESOLUTION=854x480,CODECS="avc1.4D401F,mp4a.40.2",SUBTITLES="subtitles"
media-4/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1494976,AVERAGE-BANDWIDTH=891779,RESOLUTION=1280x720,CODECS="avc1.640028,mp4a.40.2",SUBTITLES="subtitles"
media-5/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3262928,AVERAGE-BANDWIDTH=1894009,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2",SUBTITLES="subtitles"
media-6/index.m3u8
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=161304,RESOLUTION=854x480,CODECS="avc1.4D401F",URI="media-4/iframes.m3u8"
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=241392,RESOLUTION=1280x720,CODECS="avc1.640028",URI="media-5/iframes.m3u8"
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=532416,RESOLUTION=1920x1080,CODECS="avc1.640028",URI="media-6/iframes.m3u8"
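To illustrate how a player reads a master playlist like the one above, the sketch below pairs each #EXT-X-STREAM-INF tag with the variant URI on the following line, extracting only the BANDWIDTH and RESOLUTION attributes. (A trimmed inline copy of the playlist is used; real players handle the full attribute grammar, not just these two fields.)

```python
import re

# Trimmed copy of the master playlist shown above.
MASTER = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=893752,RESOLUTION=854x480
media-4/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1494976,RESOLUTION=1280x720
media-5/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3262928,RESOLUTION=1920x1080
media-6/index.m3u8
"""

def parse_variants(playlist):
    """Pair each #EXT-X-STREAM-INF tag with the URI on the next line."""
    variants, pending = [], None
    for line in playlist.splitlines():
        if line.startswith("#EXT-X-STREAM-INF:"):
            attrs = line.split(":", 1)[1]
            bw = int(re.search(r"BANDWIDTH=(\d+)", attrs).group(1))
            res = re.search(r"RESOLUTION=(\d+x\d+)", attrs).group(1)
            pending = {"bandwidth": bw, "resolution": res}
        elif pending and line and not line.startswith("#"):
            pending["uri"] = line  # the URI line that follows the tag
            variants.append(pending)
            pending = None
    return variants

for v in parse_variants(MASTER):
    print(v["resolution"], v["bandwidth"], v["uri"])
```

This variant list is what the ABR logic chooses from at every segment boundary.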

And, here is an example of a child playlist –

#EXTM3U
#EXT-X-VERSION:4
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
segment-0.ts
#EXTINF:4.000000,
segment-1.ts
#EXTINF:4.000000,
segment-2.ts
#EXTINF:4.000000,
segment-3.ts
#EXT-X-ENDLIST

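Given a child playlist like the one above, a player (or a validation script) can derive the total media duration by summing the #EXTINF tags. The sketch below does exactly that, using a shortened inline copy of the playlist:

```python
# Shortened copy of the child/media playlist shown above.
CHILD = """#EXTM3U
#EXT-X-VERSION:4
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-TARGETDURATION:4
#EXTINF:4.000000,
segment-0.ts
#EXTINF:4.000000,
segment-1.ts
#EXTINF:4.000000,
segment-2.ts
#EXT-X-ENDLIST
"""

def playlist_duration(playlist):
    """Sum the per-segment durations declared by #EXTINF tags.

    Each tag has the form '#EXTINF:<duration>,<optional title>'.
    """
    total = 0.0
    for line in playlist.splitlines():
        if line.startswith("#EXTINF:"):
            value = line.split(":", 1)[1]
            total += float(value.split(",")[0])
    return total

print(playlist_duration(CHILD))  # 12.0 seconds of media
```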
How do you create HLS Streams (packaging)?

Commercial packagers: There is widespread packaging support for HLS in both the open-source and commercial spaces. Unified Streaming (USP), Wowza, and BuyDRM are a few commercial options that support HLS out of the box.

Open-source packagers: You have the Shaka Packager and FFmpeg that engineers can use for packaging in HLS. You can read OTTVerse’s HLS Packaging using FFmpeg article to learn more.


With this introduction to HLS, let us now move over to MPEG-DASH and learn how it works.

What is MPEG-DASH (Dynamic Adaptive Streaming over HTTP)?

MPEG-DASH is one of the most popular video-streaming protocols and is widely used to deliver media either via Video on Demand (VOD) or Live Streaming and to various end-user devices, including smartphones, tablets, SmartTVs, gaming consoles, and more.

To standardize ABR streaming, MPEG issued a Call for Proposals (CFP) in 2009 for an HTTP-based streaming standard. Then, with the coordination of several companies and industry organizations, the MPEG-DASH standard was created and published in April 2012. It has since been revised in 2019 as MPEG-DASH ISO/IEC 23009-1:2019.

How does MPEG-DASH Work?

Let’s take a look at how MPEG-DASH works.

  • Each video is encoded into different renditions/profiles, which are bitrate-resolution combinations.
  • These renditions are sent to an MPEG-DASH packaging service that splits each rendition into small pieces or chunks of a specified duration (e.g., 2/4/6 seconds long).
  • The packager also records how it split the videos, the order in which they are to be delivered, and other important metadata in a text file called an MPD or a manifest. Learn more about MPD files here.
  • The packaged videos and MPDs are stored on a server and served via a CDN.
  • Users can play the video back with an MPEG-DASH-compliant video player.
CDN based Live Delivery

How do you create MPEG-DASH Streams (packaging)?

If you have a single video or several renditions of the same video, you can create an MPEG-DASH-compatible stream. This process is called packaging and is performed by specialized software called packagers. Learn more about ABR packagers here.

There are many packagers that can do this; on the open-source side, FFmpeg and the Shaka Packager are popular choices, and commercial tools such as Unified Streaming and Wowza also support MPEG-DASH packaging out of the box.

Example of an MPEG-DASH Manifest or MPD

Below is a simple example of an MPEG-DASH manifest/MPD used to deliver a video to a DASH-compliant player. To learn more about MPDs, how they are structured and used, please see our guide to the structure of MPEG-DASH MPD files.

<!--  MPD file Generated with GPAC version 0.5.1-DEV-rev5379  on 2014-09-10T13:23:18Z -->
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" minBufferTime="PT1.500000S" type="static" mediaPresentationDuration="PT0H9M56.46S" profiles="urn:mpeg:dash:profile:isoff-live:2011">
<ProgramInformation moreInformationURL="http://gpac.sourceforge.net">
<Title>dashed/BigBuckBunny_2s_simple_2014_05_09.mpd generated by GPAC</Title>
</ProgramInformation>
<Period duration="PT0H9M56.46S">
<AdaptationSet segmentAlignment="true" group="1" maxWidth="480" maxHeight="360" maxFrameRate="24" par="4:3">
<SegmentTemplate timescale="96" media="bunny_$Bandwidth$bps/BigBuckBunny_2s$Number$.m4s" startNumber="1" duration="192" initialization="bunny_$Bandwidth$bps/BigBuckBunny_2s_init.mp4"/>
<Representation id="854x480 595.0kbps" mimeType="video/mp4" codecs="avc1.42c01e" width="854" height="480" frameRate="24" sar="1:1" startWithSAP="1" bandwidth="595491"/>
<Representation id="1280x720 1.5Mbps" mimeType="video/mp4" codecs="avc1.42c01f" width="1280" height="720" frameRate="24" sar="1:1" startWithSAP="1" bandwidth="1546902"/>
<Representation id="1920x1080 2.1Mbps" mimeType="video/mp4" codecs="avc1.42c032" width="1920" height="1080" frameRate="24" sar="1:1" startWithSAP="1" bandwidth="2133691"/>
</AdaptationSet>
</Period>
</MPD>
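To illustrate how a DASH player reads such an MPD, the sketch below extracts the Representation entries with Python's standard XML parser. It uses a trimmed inline copy of the manifest above; real players rely on full-featured MPD parsers that understand Periods, SegmentTemplates, timelines, and much more.

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the MPD shown above (ids shortened for readability).
MPD = """<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static">
  <Period duration="PT0H9M56.46S">
    <AdaptationSet segmentAlignment="true">
      <Representation id="480p" width="854" height="480" bandwidth="595491"/>
      <Representation id="720p" width="1280" height="720" bandwidth="1546902"/>
      <Representation id="1080p" width="1920" height="1080" bandwidth="2133691"/>
    </AdaptationSet>
  </Period>
</MPD>"""

NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}

def list_representations(mpd_xml):
    """Return (id, width, height, bandwidth) for each Representation."""
    root = ET.fromstring(mpd_xml)
    return [
        (rep.get("id"), int(rep.get("width")),
         int(rep.get("height")), int(rep.get("bandwidth")))
        for rep in root.findall(".//dash:Representation", NS)
    ]

for rep in list_representations(MPD):
    print(rep)
```

These Representation entries play the same role as the variant streams in an HLS master playlist: they are the rungs of the bitrate ladder the player adapts across.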

Differences Between HLS and DASH

Now that we have a better understanding of HLS and DASH, let’s note some of the important differences between them.

| Feature | HLS | MPEG-DASH |
| --- | --- | --- |
| Created by | Apple Inc. | A consortium of companies led by MPEG |
| Supports HTTP-based ABR streaming | Yes | Yes |
| Supported by major CDNs | Yes | Yes |
| Security and encryption | Yes, via Apple FairPlay | Yes, via Microsoft PlayReady & Google Widevine |
| Supports CENC (Common Encryption) and fMP4 | Yes | Yes |
| Native HTML5 playback | Only in Safari and Edge | Yes, via MSE |
| Supported by the Safari browser | Yes | No |
| Low-latency playback | LL-HLS | LL-DASH |
| iOS support | Yes (only HLS is supported) | No |
| Advertising support | Yes (VAST & VPAID) | Yes (VAST & VPAID) |
| Supports H.264/AVC & HEVC | Yes | Yes |

Conclusion

So which one should you choose for your streaming purposes?

Most people would prefer HLS due to its compatibility with the iOS ecosystem, but that does not mean you should ignore MPEG-DASH, which has a great feature set and is widely supported outside the Apple ecosystem. Note that if you use fragmented MP4 (fMP4) as the segment format, you do not have to store two sets of files (one for HLS and one for DASH): both protocols can reference the same common set of segments, reducing storage costs.

At the end of the day, it’s best to use a good analytics tool to understand your audience, their device preferences, etc., and then decide which ABR streaming protocol you’d like to use.

Until next time, happy streaming!
