How does HLS Video Streaming Work?


HLS, or HTTP Live Streaming, is an HTTP-based adaptive bitrate video streaming protocol introduced by Apple in 2009 that describes a set of tools and procedures for streaming video and audio over the internet. A video is split up into segments, and the location and playback sequence of these segments are described in a set of plain-text files called playlists, which use the .m3u8 file extension. Any HLS-compatible player can then be used to play back the video.

In this article, we take a deeper look into the procedure and tools to perform HLS video streaming. If you are new to video streaming, please read my previous articles on what is OTT and what is ABR video streaming to better understand the HLS streaming protocol.

When was HLS introduced?

Apple launched HLS in 2009 when it released the iPhone 3GS. It was meant as a way to improve the media-streaming experience for iPhone users and prevent interruptions when the available bandwidth fluctuates. Since then, Apple has made regular improvements, making HLS a very reliable and widely supported protocol for video streaming. They have extensive documentation on their website – Authoring Guidelines, Introduction to HLS, HTTP Live Streaming, and more. You can also refer to the spec (RFC 8216) for details and insights into HLS – it has a very clear explanation of the various tags and a good introduction to ABR video streaming.

Architecture for HLS streaming

HLS does not require fancy hardware to serve video. It's a simple and efficient protocol: you can use a regular web server to store and distribute the content. However, it requires the data to be in the proper format and client-side software that can stream HLS video content. Let's take a look at the architecture of an HLS video streaming service.

HLS Architecture (credits: Apple)
  1. Encoder: this can be any encoder that can generate content in H.264/AVC or HEVC and adhere to the standards specified in Apple’s Authoring Specifications. These guidelines are comprehensive and very specific. For example, they specify the container format for each supported codec: for H.264, you must use either fMP4 or MPEG-2 transport stream (TS). Please read through the guidelines when you are building out your own HLS streaming server.
  2. Packager: a packager is software that takes a video and chops it up into segments of short duration, such as 10 seconds. For example, a 1-hour movie chopped into 10-second segments yields 360 of them. A plain-text file called a playlist is then created that contains the names, locations, and playback sequence of these segments (along with metadata describing the codec, resolution, bitrates, etc.). The process of creating these segments is called segmenting or, more popularly, packaging.
  3. Web server: this can be any web server that is capable of serving files when requested. The files an HLS server needs to serve are the playlists (m3u8 files) and the actual audio/video content, i.e., the media segments (typically Transport Stream or fMP4).
  4. Player/Client: this is any player that understands the HLS protocol and can play back HLS streams (audio and video). Playback starts by downloading the master playlist, then using it to download the media playlists and their segments in succession and rendering them to the screen.
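The player's download step can be sketched in a few lines of Python. This is a minimal, illustrative sketch (the playlist text and URLs below are made up for the example); a real player would also handle ABR switching, buffering, and decoding:

```python
from urllib.parse import urljoin

def resolve_segment_urls(playlist_url: str, playlist_text: str) -> list[str]:
    """Resolve the (usually relative) URIs in a playlist against the
    playlist's own URL, in playback order."""
    urls = []
    for line in playlist_text.splitlines():
        line = line.strip()
        # Lines that do not start with '#' are segment (or playlist) URIs.
        if line and not line.startswith("#"):
            urls.append(urljoin(playlist_url, line))
    return urls

# A tiny media playlist, inlined here instead of being downloaded.
playlist = """#EXTM3U
#EXTINF:4.000000,
segment-0.ts
#EXTINF:4.000000,
segment-1.ts
"""
print(resolve_segment_urls("https://example.com/media-4/index.m3u8", playlist))
# ['https://example.com/media-4/segment-0.ts', 'https://example.com/media-4/segment-1.ts']
```

A real client would fetch each resolved URL in sequence (over plain HTTP) and feed the bytes to the decoder, which is what makes HLS work with any ordinary web server.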

Sample HLS playlist

HLS manifests are of two types – the master manifest and the child (or media) manifests. To understand how they are related, let’s take a simple example. Assume that you have encoded a movie at 3 different resolutions – 1080p, 720p, and 480p (also referred to as renditions). After you package it using the HLS protocol, you will end up with one master manifest and 3 child manifests.

The master manifest describes the renditions and their specifications (audio and video codecs, languages, bitrates) that are going to be streamed as part of this video. The child manifests list out all the segments (location, names, sequence) of their respective renditions. So, in this case, you will have 3 child manifests – one each for 1080p, 720p, and 480p.

I have written an article that lists sample m3u8 playlists that you can use for testing and research. Go here to find the list of m3u8 files.

Here is an example of a master manifest that lists the information about 3 different renditions that make up the encoding & streaming ladder.

#EXTM3U
#EXT-X-MEDIA:URI="subtitle/lang_en/subtitle_index.m3u8",TYPE=SUBTITLES,GROUP-ID="subtitles",LANGUAGE="en",NAME="English",DEFAULT=YES,AUTOSELECT=YES
#EXT-X-STREAM-INF:BANDWIDTH=893752,AVERAGE-BANDWIDTH=560289,RESOLUTION=854x480,CODECS="avc1.4D401F,mp4a.40.2",SUBTITLES="subtitles"
media-4/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1494976,AVERAGE-BANDWIDTH=891779,RESOLUTION=1280x720,CODECS="avc1.640028,mp4a.40.2",SUBTITLES="subtitles"
media-5/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3262928,AVERAGE-BANDWIDTH=1894009,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2",SUBTITLES="subtitles"
media-6/index.m3u8
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=161304,RESOLUTION=854x480,CODECS="avc1.4D401F",URI="media-4/iframes.m3u8"
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=241392,RESOLUTION=1280x720,CODECS="avc1.640028",URI="media-5/iframes.m3u8"
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=532416,RESOLUTION=1920x1080,CODECS="avc1.640028",URI="media-6/iframes.m3u8"

Here is a brief explanation of the tags used in the master manifest.

  • EXT-X-STREAM-INF: this introduces one of the alternative renditions (variant streams) of the movie; the URI on the following line points to that rendition’s media playlist.
  • EXT-X-I-FRAME-STREAM-INF: this is an I-frame-only rendition that is used for fast seeking/trick modes and for showing thumbnails while scrubbing (at least, these are two of the popular use cases).
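To make the master manifest's structure concrete, here is a small, illustrative Python sketch that pulls the renditions out of a master manifest. The parsing is deliberately simplified and covers only the attributes shown above; a production player should use a full m3u8 parser:

```python
import re

def parse_master(manifest: str) -> list[dict]:
    """Extract the renditions advertised by EXT-X-STREAM-INF tags.
    Each tag line is followed by the URI of that rendition's media playlist."""
    renditions = []
    lines = manifest.splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF:"):
            # Split the attribute list, keeping quoted values (e.g. CODECS) intact.
            attrs = dict(re.findall(r'([A-Z-]+)=("[^"]*"|[^,]*)', line.split(":", 1)[1]))
            renditions.append({
                "bandwidth": int(attrs["BANDWIDTH"]),
                "resolution": attrs.get("RESOLUTION"),
                "uri": lines[i + 1].strip(),  # next line holds the media playlist URI
            })
    return renditions

master = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=893752,RESOLUTION=854x480,CODECS="avc1.4D401F,mp4a.40.2"
media-4/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1494976,RESOLUTION=1280x720,CODECS="avc1.640028,mp4a.40.2"
media-5/index.m3u8"""
for r in parse_master(master):
    print(r["resolution"], r["bandwidth"], r["uri"])
# 854x480 893752 media-4/index.m3u8
# 1280x720 1494976 media-5/index.m3u8
```

This is essentially what a player does first: read the renditions, pick one based on the measured bandwidth, and then fetch that rendition's child playlist.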

Here is a small snippet of the child playlist belonging to one of the renditions that the master playlist points to.

#EXTM3U
#EXT-X-VERSION:4
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
segment-0.ts
#EXTINF:4.000000,
segment-1.ts
#EXTINF:4.000000,
segment-2.ts
#EXTINF:4.000000,
segment-3.ts
#EXTINF:4.000000,
segment-4.ts

Here are explanations of some of the common tags that you will see in m3u8 files.

  • EXTM3U: this indicates that the file is an extended m3u file. Every HLS playlist must start with this tag.
  • EXT-X-PLAYLIST-TYPE: this tag can take one of two values – VOD or EVENT. If it is a VOD playlist, the server cannot change any part of the playlist. If it is an EVENT playlist, the server cannot change or delete any part of the playlist, but it may append to it.
  • EXT-X-INDEPENDENT-SEGMENTS tag indicates that every media sample in every segment can be decoded without information from other segments. This is applied to all the segments in the playlist.
  • EXT-X-TARGETDURATION: this specifies the maximum media segment duration in seconds.
  • EXTINF: this tag specifies the duration of a media segment. It must be followed by the URI of the associated media segment – this is mandatory. The EXTINF value should accurately reflect the actual duration of the segment it refers to and, when rounded to the nearest integer, must be less than or equal to the EXT-X-TARGETDURATION value.
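As a quick illustration of these tags, the following Python sketch pairs each EXTINF duration with its segment name. It is simplified: it assumes each EXTINF line is immediately followed by its segment URI, as in the snippet above:

```python
def segment_durations(playlist: str) -> list[tuple[float, str]]:
    """Pair each EXTINF duration with the segment URI on the following line."""
    lines = [l.strip() for l in playlist.splitlines()]
    out = []
    for i, line in enumerate(lines):
        if line.startswith("#EXTINF:"):
            # "#EXTINF:4.000000,<optional title>" -> 4.0
            duration = float(line[len("#EXTINF:"):].split(",")[0])
            out.append((duration, lines[i + 1]))
    return out

child = """#EXTM3U
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-TARGETDURATION:4
#EXTINF:4.000000,
segment-0.ts
#EXTINF:4.000000,
segment-1.ts"""
segs = segment_durations(child)
print(segs)                     # [(4.0, 'segment-0.ts'), (4.0, 'segment-1.ts')]
print(sum(d for d, _ in segs))  # 8.0
```

Summing the EXTINF values like this gives the total duration of the rendition, which is also a handy sanity check when debugging a packager's output.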

What is the minimum segment duration for HLS?

Earlier, Apple recommended packaging your videos with a 10-second segment duration (the EXTINF value), but this is seen less and less today. Content providers are increasingly reducing the segment duration to 4 or 6 seconds, and there are primarily two reasons for this:

  • startup delay / join-time / latency is reduced: Apple has a requirement on the player side that it must buffer up to 3 segments before playback can begin. What does this mean practically? If you have encoded at 5 Mbps, then each second of video “costs” 5 megabits. If you have packaged with a segment duration of 10 seconds and you need to buffer 3 segments, then you are looking at downloading 150 megabits (5 Mbps * 10 seconds * 3 segments), or roughly 18.75 megabytes, of video even before playback can begin.
  • less susceptible to re-buffering: if it takes 10 seconds to download a segment, then a lot can happen to the network conditions in those 10 seconds. The network speed can drop, and the player’s buffer will get depleted while it is still downloading the segment, causing re-buffering. It makes more sense to use shorter HLS segments: they download faster and allow the player to react more quickly to changing network conditions.
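The startup-buffer arithmetic is easy to check: 5 Mbps × 10 s × 3 segments is 150 megabits, which is 18.75 megabytes (8 bits per byte). A tiny Python sketch makes the comparison between segment durations concrete:

```python
def startup_bytes(bitrate_bps: float, segment_seconds: float,
                  segments_buffered: int = 3) -> float:
    """Bytes the player must download before playback can start,
    assuming it buffers `segments_buffered` full segments."""
    bits = bitrate_bps * segment_seconds * segments_buffered
    return bits / 8  # 8 bits per byte

# 5 Mbps stream, 10-second segments: 150 megabits, i.e., 18.75 MB up front.
print(startup_bytes(5_000_000, 10) / 1_000_000)  # 18.75
# The same stream with 4-second segments needs only 7.5 MB.
print(startup_bytes(5_000_000, 4) / 1_000_000)   # 7.5
```

Cutting the segment duration from 10 to 4 seconds cuts the pre-playback download by the same factor, which is exactly why shorter segments reduce join time.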

How to package a TS video into HLS

Commercial packagers: there is widespread packaging support for HLS in both the open-source and commercial spaces. Unified Streaming (USP) and Wowza are two commercial packagers that come to mind that support HLS out of the box.

On the open-source side, you have the Shaka packager and FFmpeg that can be used for packaging in HLS.
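As an illustration of the FFmpeg route, here is a small Python helper that builds a typical FFmpeg command line for HLS packaging. The file names and output directory are placeholders; -hls_time, -hls_playlist_type, and -hls_segment_filename are standard options of FFmpeg's HLS muxer:

```python
def ffmpeg_hls_command(input_file: str, out_dir: str,
                       segment_seconds: int = 4) -> list[str]:
    """Build (but do not run) an FFmpeg command line that repackages an
    already-encoded file into HLS TS segments plus an index.m3u8 playlist."""
    return [
        "ffmpeg", "-i", input_file,
        "-c", "copy",                       # repackage only; no re-encoding
        "-f", "hls",
        "-hls_time", str(segment_seconds),  # target segment duration (EXTINF)
        "-hls_playlist_type", "vod",        # emits #EXT-X-PLAYLIST-TYPE:VOD
        "-hls_segment_filename", f"{out_dir}/segment-%d.ts",
        f"{out_dir}/index.m3u8",
    ]

print(" ".join(ffmpeg_hls_command("movie.ts", "media-4")))
```

Once you are happy with the command, you can execute it with subprocess.run(cmd, check=True), provided FFmpeg is installed on your machine.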

Playback support for HLS

Player support: HLS, being a very popular format, is supported by most major player vendors and browsers by default. Some of the player companies that come to mind that support HLS streaming are CastLabs, Bitmovin, THEOplayer, NexPlayer, Kaltura, JW Player, and more. There are also open-source players like HLS.js and Video.js (with the HLS.js plugin).

The above-mentioned companies also provide players for iOS/tvOS, Android, etc., to support HLS playback. In addition, Google’s ExoPlayer natively handles HLS playback.

Browser support: HLS playback is also natively supported in Safari (which means that you can take an HLS playlist URL, paste it into the browser’s address bar, hit Enter, and it will play back the video without the need for an external player). However, in general, companies use open-source and commercial players (like those listed above) to play back HLS video.

Testing your HLS playlist’s playback

To test your HLS streams, you can use the reference HLS.js player ( https://hls-js.netlify.com/demo/ ). You can paste your own URL and check whether it plays back in the reference player. Make sure you stream from an https link and serve the correct CORS headers, or else the player will refuse to play it. Alternatively, for testing only, you can disable browser security or use a CORS browser extension to force playback.

You can also use the tools at the bottom of the demo page to analyze the video streaming performance.



Well, folks, that’s it for now. I hope you now understand how the HLS streaming protocol works and what it takes to stream using HLS.

In the following articles, I will show you how to create HLS streams using free, open-source tools and stream them to the world!

Until then, take care and keep streaming!

About The Author

I’m Dr. Krishna Rao Vijayanagar, and I am the Founder and Editor of OTTVerse.com. I've spent several years working hands-on with Video Codecs (AVC, HEVC, MultiView Plus Depth), ABR streaming, and Video Analytics (QoE, Content & Audience, and Ad). I hope to use my experience and love for video streaming to bring you information and insights into the OTT universe. Please use the Contact Page to get in touch with me.
