Highly Scalable and Redundant Live Streaming Over OTT

ZEE is back in live sports broadcasting with the International League T20, and at Zee5, this was yet another first for us: delivering live sport to end users.

It’s been a hell of a journey to get the data flow right on every match day – from receiving the feed from the UAE, to adding SCTE-35 markers and secondary ads at playout, to encoding with presets that deliver a best-in-class AV experience for live sports, to finally delivering the content through a CDN architecture that can handle millions of simultaneous requests. It has been excruciatingly enjoyable and sometimes maddeningly frustrating!

This article has been authored by Suneel Khare, Senior Vice President and Head of Video Engineering, and Rahul Banerjee, Principal Architect, and supported by Falak Sangani, Gautam Kumar, and Rakesh Rajan.

Suneel Khare

Suneel Khare is currently working as a Senior Vice President at Zee5. Suneel has more than two decades of experience leading product development and operations in the media industry and has played various leadership roles in large M&E enterprises.

Rahul Banerjee

Rahul Banerjee is currently working as a Principal Architect at Zee5. Rahul has more than 18 years of experience designing and developing solutions for the OTT space and embedded systems.

Murphy’s law kept coming back to haunt us, but our video engineering pipeline ensured that we always had an answer to every challenge and that there was no interruption to the end-user AV experience.

Each match was streamed in three different languages

  • English
  • Hindi
  • Tamil

On the streams pertaining to a given match, the video was the same across the three channels, while the audio commentary and graphics (scoreboard, etc.) pertained to one of the aforementioned languages. The commentary and relevant graphics were added as part of production right at the stadium. Post-production, all these streams were carried over an ISP network to India (more on this in later sections!)

Also, we had several double headers, so on double-header days it was basically like running six different channels simultaneously!

The challenge for the Zee5 Video Engineering team was to provide a best-in-class, seamless sports experience to every end user. That mandated building a fully redundant and scalable solution catering to a very large number of simultaneous requests.

The following sections delve into the details of the requirements and the various aspects of the live sports architecture.

Requirements

Requirements can be broken down largely into three parts as follows –

  • Requirements related to stream characteristics (i.e., the content that the solution would consume)
  • Delivery of the streams
  • Streaming content to various types of consumers (i.e., end users, B2B customers, et al.), covering the various features required from the solution

Requirements Related to Stream Characteristics

  • There will be a redundant stream for each match and language, so for a single match, six streams (3 primary + 3 backup) would be made available to the system.
  • Each stream would include one video (AVC-encoded) and one audio (AAC-encoded) stream.
  • Each of the aforementioned six streams would be a 25 Mbps SRT stream.
  • Each match would also have two secondary SRT feeds, each with a 15 Mbps bit rate. One feed would carry English commentary and graphics, and the other would carry Hindi commentary and graphics.
  • For double headers, parts of the first and second matches would coincide, meaning the capacity requirement doubles for that phase (see the sketch after this list).
  • SRT was chosen because it is fast and inherently error-resilient.
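
To make the stream inventory and the double-header capacity requirement concrete, here is a minimal Python sketch; the feed counts and bit rates come from the list above, while the code itself is purely illustrative:

```python
# Per-match SRT feed inventory, per the requirements above.
LANGUAGES = ("English", "Hindi", "Tamil")

MAIN_FEEDS = [(lang, role, 25) for lang in LANGUAGES for role in ("primary", "backup")]
SECONDARY_FEEDS = [("English", "internet", 15), ("Hindi", "internet", 15)]

single_match_mbps = sum(mbps for _, _, mbps in MAIN_FEEDS + SECONDARY_FEEDS)
print(f"feeds per match: {len(MAIN_FEEDS) + len(SECONDARY_FEEDS)}")  # 8
print(f"ingest per match: {single_match_mbps} Mbps")                 # 6*25 + 2*15 = 180
print(f"double-header overlap: {2 * single_match_mbps} Mbps")        # 360 Mbps
```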

Delivery of streams

  • As mentioned above, there will be 3 + 3 SRT streams for each match. The post-production streams were in JPEG 2000 (j2k) format (almost uncompressed) and were passed as input to the Zee satellite broadcast chain.
  • These j2k streams were encoded into 25 Mbps SRT streams at the ISP POP and then pushed over an MPLS link to the AWS data centre at Rabale, Mumbai.
  • For a given language, as described above, there was 1 + 1 redundancy. The primary stream was “hitless” (i.e., the ISP ensured that delivery of the primary stream was inherently redundant: the stream was split into two paths and merged just before reaching the endpoint), whereas the backup/secondary stream was not.
  • These streams were received in the AWS environment through AWS Direct Connect (a private virtual interface was used here for security reasons).
  • Also, from the same j2k stream, 15 Mbps streams were generated and pushed over the internet through a completely different path.
  • These internet streams were received in the AWS environment using AWS Elemental MediaConnect.
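
As a hedged illustration of provisioning the receive side, the sketch below creates a pair of MediaConnect flows with SRT listener sources pinned to different Availability Zones using boto3; the flow names, port, bit-rate ceiling, and CIDR are placeholders, not our production values:

```python
import boto3

mediaconnect = boto3.client("mediaconnect", region_name="ap-south-1")

# One flow per feed; the primary and backup flows sit in different AZs.
for role, az in (("primary", "ap-south-1a"), ("backup", "ap-south-1b")):
    flow = mediaconnect.create_flow(
        Name=f"ilt20-english-{role}",           # placeholder name
        AvailabilityZone=az,
        Source={
            "Name": f"srt-{role}",
            "Protocol": "srt-listener",         # the contribution side pushes SRT to us
            "IngestPort": 5000,                 # placeholder port
            "MaxBitrate": 30_000_000,           # headroom above the 25 Mbps feed
            "WhitelistCidr": "203.0.113.0/24",  # placeholder contribution range
        },
    )
    print(flow["Flow"]["FlowArn"])
```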

Streaming Content to Various Types of Consumers

  • The first and foremost requirement is to provide a best-in-class sports AV experience to every end user.
  • The solution must be redundant from start to finish; there shall be no downtime for end users.
  • Support viewing of content on the Zee5 platform.
  • Allow monetization by inserting SCTE-35 markers into the general entertainment channels broadcast on Zee5, enabling server-side ad insertion (these channels do not have SSAI enabled by default because their streams do not carry SCTE-35 markers).
  • Stream SRT to B2B customers with the right format and logo.
  • Provide an SRT stream to our partner for near-live highlight generation.
  • The stream served to end customers was to be HLS TS and in the clear (not DRM-encrypted).
  • Every device consuming the stream would receive 1080p (1920×1080) as the highest resolution (an illustrative rendition ladder follows this list)!
  • Server-Side Ad Insertion (SSAI) shall be possible on the stream.
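
Since every client tops out at 1080p, the rendition ladder can be expressed as simple data. The rungs and bit rates below are illustrative placeholders, not our production values (the real ladder came out of the MediaLive tuning exercise described in the Solution section):

```python
# Illustrative HLS/TS ABR ladder capped at 1080p (clear, no DRM).
LADDER = [
    {"name": "1080p", "width": 1920, "height": 1080, "video_kbps": 5800},
    {"name": "720p",  "width": 1280, "height": 720,  "video_kbps": 3500},
    {"name": "540p",  "width": 960,  "height": 540,  "video_kbps": 2000},
    {"name": "360p",  "width": 640,  "height": 360,  "video_kbps": 900},
]

assert max(rung["height"] for rung in LADDER) == 1080  # 1080p is the ceiling for every device
```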

The following sections depict the architectural details of how we ended up achieving all the requirements and more!

Solution

This section explains how we achieved the aforementioned high-level requirements one by one, culminating in the full and final architecture diagram (meeting all requirements) at the end of the section.

Redundant Streaming Architecture Enabling Immersive AV Experience

The following diagram depicts the redundant data path used to generate a single channel (i.e., the stream pertaining to a particular language; English in this diagram), providing an immersive sports experience for end users.

[Diagram: redundant data path for a single language channel]
  • As explained in the Requirements section, AWS Direct Connect is the gateway into the AWS environment for the streams.
  • The SRT streams (primary and backup) pertaining to a particular language are received on two MediaConnect flows stationed in different Availability Zones.
  • The stream arriving over the internet is received on a separate MediaConnect flow.
  • The next hop for the streams is playout. The playouts were responsible for the following:
    • Adding filler programs
    • Adding SCTE-35 markers based on the director’s assistant track.
    • Adding secondary ads (Astons and L-bands).
  • The playout solution was provided by a partner and it was running on AWS EC2.
  • As can be seen in the diagram above, each language stream used two playout instances spread across Availability Zones.
  • The main playout would receive three inputs:
    • The feed received on the primary MediaConnect.
    • The feed received on the secondary MediaConnect.
    • The feed received on the MediaConnect listening over the internet.
  • The backup playout would receive two inputs:
    • The feed received on the backup MediaConnect.
    • The feed received on the MediaConnect listening over the internet.
  • The stream received through SRT was 1920×1080 (50 fields per second, interlaced); the playout was programmed to generate 1920×1080 at 25 frames per second, progressive.
  • The playout’s output stream went to the encoder; we used an AWS MediaLive standard channel to generate the renditions served to users.
  • A MediaLive standard channel comes with two NTP (Network Time Protocol)-synced MediaLive encoder pipelines stationed in different Availability Zones. Being NTP-synced ensures that downstream consumers can reliably switch from the output of one encoder to the other if the first encoder has an issue.
  • MediaLive also allows switching from the primary input to another input stream if it finds an anomaly on the input. For that, however, the input stream has to be RTP, so we used the playouts to convert the SRT streams to RTP before feeding MediaLive (see the sketch after this list).
  • The MediaLive encoder comes with a plethora of encoding options to control the encoding quality and bandwidth of the generated stream. We undertook an exercise to define the renditions and the best quality achievable for each, fine-tuning the relevant MediaLive levers while keeping a keen eye on various QoS metrics.
  • The renditions generated by MediaLive were pushed to the origin, which for us was Akamai MSL (Media Services Live). Akamai MSL is widely used across the M&E industry to serve sports events with high concurrent viewership, and it enables high redundancy by being multi-region.
  • Finally, the manifests and segments were served to end users through the Akamai CDN (AMD, Adaptive Media Delivery).
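
To make this concrete, here is a hedged sketch of the relevant fragments of a MediaLive channel definition, expressed as the Python structures one would pass to the MediaLive create_channel API. The input IDs and names are placeholders, and the rest of the definition (destinations, encoder settings, the rendition ladder) is omitted:

```python
# Fragment of a MediaLive channel definition; IDs/names are placeholders.
CHANNEL_CLASS = "STANDARD"  # two NTP-synced encoder pipelines in different AZs

INPUT_ATTACHMENTS = [
    {
        "InputId": "1111111",                    # primary RTP input (from main playout)
        "InputAttachmentName": "primary-rtp",
        "AutomaticInputFailoverSettings": {
            "SecondaryInputId": "2222222",       # backup RTP input (from backup playout)
            "InputPreference": "PRIMARY_INPUT_PREFERRED",
            "ErrorClearTimeMsec": 1000,          # input must stay clean this long before failback
        },
    },
]
```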

Each brick in this solution was chosen after carefully analyzing performance, fitment, and redundancy.

How did all this talk about redundant systems transpire in reality?

The proof of the pudding is in the eating! Throughout the month the channels were live, we had a significant number of interruptions on the source side of the stream, namely:

  • The main feed would suddenly and regularly go down; we switched to using the backup feed as the primary source at the playout input to circumvent this issue.
  • Multiple times, both the main and backup feeds coming through AWS Direct Connect went down during a match. At that point, we used the feeds coming over the internet as the primary input on the playout to circumvent the issue.
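
During these incidents, the source switch was made at the playout input. Purely as an illustration of the same idea one level downstream, here is a hedged boto3 sketch of an immediate input switch on a MediaLive channel (the channel ID and input attachment name are placeholders):

```python
import boto3

medialive = boto3.client("medialive", region_name="ap-south-1")

# Immediately switch the running channel to the input attachment that carries
# the internet feed. ChannelId and attachment name are placeholders.
medialive.batch_update_schedule(
    ChannelId="1234567",
    Creates={
        "ScheduleActions": [{
            "ActionName": "switch-to-internet-feed",
            "ScheduleActionStartSettings": {
                "ImmediateModeScheduleActionStartSettings": {}
            },
            "ScheduleActionSettings": {
                "InputSwitchSettings": {"InputAttachmentNameReference": "internet-rtp"}
            },
        }]
    },
)
```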

We are extremely proud that despite repeated issues of the aforementioned nature on the input stream side during almost every match, the immersive AV experience of the end customer was never disturbed!

Monetising Sports Streaming on Entertainment Channels through SSAI

IL T20 matches were also streamed on around ten entertainment channels (i.e., Zee TV et al.). Usually, the entertainment channel streams do not include SCTE-35 markers, so SSAI is not performed on them. However, there was a requirement from the business team to monetize the streaming of IL T20 matches on these channels through the insertion of SCTE-35 markers.

This requirement came at the last minute and thus forced us to be really innovative in coming up with a solution!

The following diagram depicts the solution –

[Diagram: SSAI enablement for entertainment channels via multicast fan-out]

The regular streaming path on Zee5 is shown in a simplified version (the redundant connections are removed for the sake of simplicity), and the data path can be followed through links 1 and 2.

The point of interest for this section really starts at link 3. At this point, a 25 Mbps SRT stream with SCTE-35 markers and secondary ads is available. Let’s follow the data path.

  • For each language stream, two MediaConnect flows (for redundancy, in different Availability Zones) were provisioned to receive the output from the playouts (remember that each language stream has two playouts in different AZs).
  • So we have one MediaConnect flow per playout, sitting downstream of the playout and receiving its content (please see link 3). For the sake of simplicity, the diagram above shows only the primary connection.
  • We already had a Direct Connect provisioned between the Delhi NCR data centre and the AWS Mumbai region over a 650 Mbps MPLS link, used to deliver the regular in-house channels to AWS Elemental MediaPackage for packaging. When we started this work, almost 80% of the link capacity (520–530 Mbps) was already in use.
  • However, if we had to send a 25 Mbps stream per channel, we would be well over capacity, and increasing the link capacity was not an option, as it takes time!
  • This is where the team came up with a very innovative solution, as described below:
    • We sent only three streams (one each with the audio commentary and graphics for English, Hindi, and Tamil) through Direct Connect to the Delhi NCR data centre, where they were received on Elemental encoders through link 5.
    • We also ensured redundancy here by sending the same three streams over another link to the same on-prem encoder(s) (not shown in the diagram). So the on-prem encoders were receiving three streams with the appropriate 1 + 1 redundancy.
    • Each on-prem Elemental encoder (say, the one receiving the English stream) then acted as a UDP multicast server, to which any number of Elemental encoders responsible for streaming the match with English commentary and graphics could connect over multicast.
  • With the above mechanism, using just 75 Mbps of MPLS bandwidth, we were able to serve around ten channels (of these ten, some needed English commentary, some Hindi, and some Tamil); the bandwidth arithmetic is sketched after this list.
  • Now, let’s take the example of Zee TV and see how it worked.
    • The Elemental encoder responsible for generating renditions for Zee TV (on Zee5) now has two inputs: one from the regular on-prem playout (not shown in the diagram), and a second UDP multicast input, depicted by link 6 in the diagram above.
    • Data on this second input is only available while the match telecast is on.
    • As part of the agreed run order, at a specific time on each match day, we switched the Zee TV Elemental encoder to start consuming data from the second input (the UDP multicast containing SCTE-35 markers). That ensured SSAI could now be done on Zee TV, as the stream contained SCTE-35 markers!
    • Once generated, the renditions were pushed to MediaPackage through the predefined, existing Zee5 live path.
    • At the end of the match, again as per the agreed run order and at a specific time, we switched the Zee TV Elemental encoder back to the regular playout input.
  • This switching between inputs worked in an absolutely seamless and frame-accurate manner. Adhering to a pre-defined run order ensured that the switches never caused any abrupt experience issues for end users.
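
The bandwidth arithmetic behind this multicast fan-out, as promised above (the numbers come from the list; the sketch itself is illustrative):

```python
LINK_CAPACITY_MBPS = 650
ALREADY_USED_MBPS = 530      # ~80% utilisation before this project
STREAM_MBPS = 25
CHANNELS = 10                # entertainment channels carrying the match
LANGUAGES = 3                # English, Hindi, Tamil

naive = CHANNELS * STREAM_MBPS      # one stream per channel over the MPLS link
fanout = LANGUAGES * STREAM_MBPS    # one stream per language, multicast on-prem

print(f"naive: {ALREADY_USED_MBPS + naive} Mbps needed")     # 780 > 650, link blown
print(f"fan-out: {ALREADY_USED_MBPS + fanout} Mbps needed")  # 605 <= 650, fits
```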

The above mechanism achieved the business requirement of monetizing sports streaming on entertainment channels through SSAI, and the solution ensured we reused the existing infrastructure to do it.

The above solution worked throughout all 34 matches in an absolutely seamless manner!

One more level of redundancy

While the design and solutioning were quite bulletproof, we wanted to have one more disaster recovery (DR) mechanism. The idea was that this mechanism would reuse as few of the components shown in the previous diagrams as possible.

So, this is the streaming chain we built for absolute disaster recovery. This chain was built to be used only for the English language, and it was not meant to be subjected to SSAI.

[Diagram: disaster recovery streaming chain]

The data flow is as follows –

  • The source of this data flow is the English SRT stream arriving at AWS over the internet. This removes the dependency on AWS Direct Connect in Mumbai as the single gateway for data into the AWS environment.
  • From MediaConnect (with 1 + 1 redundancy), data is passed on to an on-prem Elemental encoder over an Airtel link (not using the usual Direct Connect between the Delhi NCR data centre and the AWS Mumbai region, as was done for the earlier use cases).
  • The rest of the flow is the same as the regular live flow, barring the CDN usage.
  • Usually, for the live flow, AMD is the client-facing CDN. However, to avoid having AMD as a single point of failure (even though it comes with a 100% SLA), the DR flow uses Amazon CloudFront as the client-facing CDN.
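
The decision to enable the DR flow was manual. As a purely hypothetical illustration of how that decision could be fed, a small watchdog could poll the primary playback endpoint; the URL and thresholds below are made up:

```python
import time
import urllib.request

PRIMARY_PLAYLIST = "https://live.example.com/ilt20/master.m3u8"  # placeholder URL

def playlist_ok(url: str, timeout: float = 3.0) -> bool:
    """True if the HLS playlist endpoint answers with a 2xx within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except Exception:
        return False

consecutive_failures = 0
while True:
    consecutive_failures = 0 if playlist_ok(PRIMARY_PLAYLIST) else consecutive_failures + 1
    if consecutive_failures >= 3:   # made-up threshold
        print("primary playback path unhealthy - consider enabling the DR flow")
        consecutive_failures = 0
    time.sleep(5)
```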

By default, this flow remained switched off and was only meant to be enabled if the main flow failed. Thankfully, it never needed to be switched on!

Noida, a city in Delhi NCR, is where the Zee data centre is located.

And finally …

So far, we have seen the various aspects of the solutioning. How does it look when everything is combined into a single architecture diagram?

[Diagram: the combined end-to-end live streaming architecture]

It looks really complex, and to be truthful, it was!

On a lighter note, our design had almost 100% branch coverage in terms of being tested, as we had several glitches (some as prolonged as a few minutes) on the incoming SRT streams to the AWS environment.

Experience throughout the tournament

For most of us in the video engineering team, this was our first experience running live sports streaming of this magnitude. There were a lot of learnings for all of us. The experience was excruciating at times, but in the end, it was all worth it!

While we had meticulously prepared the various use cases, created the designs, and tested the scenarios under our control, the actual streaming path (i.e., setting up the MPLS, setting up the hosted connection, and getting stable SRT through the ISP network over Direct Connect into the AWS environment) only came together late in the evening of 11th January IST (the tournament started midday on 13th January IST)!

The wait to get the whole system working was sometimes nerve-wracking, but in the end, all the accolades the team received for the fantastic AV quality and flawless delivery of content to each client left every team member happy, eager, and motivated for the next challenge (and more are surely on the way)!

Last but not least, the end-to-end latency of live sports streamed over OTT is always the elephant in the room.

We measured the latency from the time a given frame arrived in the AWS environment until its encoded form was shown on various clients. On Apple iPhone, this latency was around 10 seconds; for most other clients, it was around 15 seconds!

This is again best in class by far!
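
Our measurement compared frame arrival time in AWS against on-screen display time. As a rough, purely illustrative cross-check, one can also compare the newest #EXT-X-PROGRAM-DATE-TIME tag in a media playlist against the wall clock (this assumes the packager emits the tag, and it only captures the playlist edge, not player buffering):

```python
import re
import urllib.request
from datetime import datetime, timezone

URL = "https://live.example.com/ilt20/1080p/index.m3u8"  # placeholder media playlist

body = urllib.request.urlopen(URL, timeout=5).read().decode()
# The last EXT-X-PROGRAM-DATE-TIME tag approximates the wall-clock time of the
# newest advertised segment.
stamps = re.findall(r"#EXT-X-PROGRAM-DATE-TIME:(\S+)", body)
latest = datetime.fromisoformat(stamps[-1].replace("Z", "+00:00"))
lag = datetime.now(timezone.utc) - latest
print(f"playlist edge is ~{lag.total_seconds():.1f}s behind wall clock")
```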

Suneel Khare
Senior Vice President at Zee5

Suneel Khare is Senior Vice President at Zee5. Suneel has more than 22 years of experience building and driving next-generation products in the video technology space. Suneel has played key roles in various large media companies like Cisco, Huawei, Samsung, HCL, and PacketVideo Corporation. In his current role, Suneel heads Video Engineering, CDN, and Content Technologies at ZEE5.

Rahul Banerjee
Principal Engineer at Zee5

Rahul Banerjee is a Principal Engineer at Zee5. In his current role, he is spearheading continuous AV experience improvement for end users. Prior to working at Zee5, Rahul filed a patent on optimizing I-frame delivery in ABR content and is the author of multiple published papers.

Previously, Rahul worked at Synamedia/CISCO/NDS, Marvell Semiconductor, and LSI Logic.
