
How Low Latency Improves Live Streaming

Published On August 30th, 2023 | Tech Talk

Latency is the time between the capture of a video frame and its playback. In a live streaming context, it is measured as the delay between the moment a visual is recorded in real time and the moment it appears on the viewer's screen. Latency is an important concern when choosing a streaming technology, because the decision ultimately comes down to a tradeoff between scalability and low latency.

It is especially important in interactive live video streaming platforms, as high latency can have a significant impact on user experience.

A live streaming workflow consists of a number of components that contribute to latency. Here are the top 5 components of the streaming workflow:

Encoding
Encoding latency is sensitive to the encoder configuration and the quality of the desired output signal. It is also affected by the streaming protocol, since segment-based protocols can only output a piece of media once it has been fully ingested, as the sketch below illustrates.
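
As a rough illustration of that last point, here is a back-of-the-envelope sketch (TypeScript, with purely hypothetical example values) of the latency floor that segment duration alone imposes on segment-based protocols such as HLS or DASH:

```typescript
// Rough illustration only: the minimum "glass-to-glass" latency floor imposed
// by a segment-based protocol. Values below are examples, not measurements.
function segmentLatencyFloorSeconds(
  segmentDurationSec: number,       // the encoder must finish a whole segment before it can be published
  segmentsBufferedByPlayer: number  // players typically hold several segments before starting playback
): number {
  return segmentDurationSec * (1 + segmentsBufferedByPlayer);
}

// e.g. 6-second segments and a 3-segment player buffer already imply ~24 seconds,
// before CDN propagation and network delays are even counted.
console.log(segmentLatencyFloorSeconds(6, 3)); // 24
```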

Transmission
The latency for an encoded video to travel over the internet is affected by the bitrate, the available bandwidth, and the viewer's proximity to the delivery server. A lower bitrate usually means lower latency.
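
To make the bitrate-versus-bandwidth relationship concrete, a minimal sketch with illustrative numbers (not measurements from any real deployment):

```typescript
// Rough estimate of the time needed to transmit one encoded segment over the network.
function transmissionDelaySeconds(
  bitrateMbps: number,        // encoded video bitrate
  segmentDurationSec: number, // duration of one media segment
  bandwidthMbps: number       // available link bandwidth
): number {
  const segmentSizeMbit = bitrateMbps * segmentDurationSec;
  return segmentSizeMbit / bandwidthMbps;
}

// A 4 Mbps stream cut into 6-second segments, sent over a 20 Mbps link:
console.log(transmissionDelaySeconds(4, 6, 20)); // 1.2 seconds per segment
```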

CDN
Content delivery networks (CDNs) are the most popular option for scaling up the delivery of your content. Content is propagated to multiple edge caches, which adds to the latency.

Network speed
Whether a user accesses content over a wired network, a WiFi hotspot, or a mobile connection, their internet connection has a significant impact on latency. Their geographical location and proximity to CDN edge servers can add to it as well.

Player buffer
Video players buffer media for smooth playback. Buffer length is defined by the media specifications, with some room for flexibility, so tuning the buffer configuration can significantly affect latency, as in the sketch below.
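
For example, a minimal player-side sketch assuming the open-source hls.js player; the stream URL is a placeholder, and the option values are a starting point to verify against your player version, not a recommendation:

```typescript
import Hls from "hls.js";

if (Hls.isSupported()) {
  const hls = new Hls({
    lowLatencyMode: true,     // opt in to Low-Latency HLS playback when the stream supports it
    liveSyncDurationCount: 2, // stay ~2 segments behind the live edge instead of the default 3
    maxBufferLength: 10,      // keep at most ~10 seconds buffered ahead of the playhead
    backBufferLength: 30,     // limit how much already-played media is retained
  });

  const video = document.querySelector("video") as HTMLVideoElement;
  hls.loadSource("https://example.com/live/stream.m3u8"); // placeholder URL
  hls.attachMedia(video);
}
```

Smaller buffers bring playback closer to the live edge, at the cost of more frequent stalls on unstable connections.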

The Importance Of Latency 

  1. Ensures scalability in the platform – Scalability can be limited by the streaming protocol: RTMP needs complex server setups to reach a large audience, which can lead to scaling issues or even systems collapsing under heavy user load. Hence HTTP-based protocols have become popular.
  2. Better quality of playback – Higher-quality playback requires higher bandwidth because of factors like higher resolutions, higher frame rates, and transcoding. Player buffers can be reduced to cut latency, but doing so carelessly can significantly impact the user experience.
  3. Improves viewing experience – A low latency live stream goes a long way in delivering an impressive user experience. Despite using top-end technologies, most online video services struggle with the “last mile” of delivery to viewers because of latency caused by the factors listed above.
  4. Enables interactivity – Low latency streams let users interact in real time, which is especially useful for interactive experiences like sports, auctions, medical interventions, etc.
  5. Synchronized video feed – Keeping the video feed and chat in sync is necessary to deliver an interactive experience while streaming live. Low latency keeps the two aligned for a seamless live stream.
  6. Supports quick reactions in medical procedures – Medical professionals around the world choose to integrate low latency video communications for clinical procedures, communication and collaboration. Live surgeries can even be streamed for training purposes.

When does latency matter?

Latency-sensitive applications are those that focus on real-time interactivity. Be it communication, sports, or any other live or VOD streaming application, latency matters most for customer satisfaction with the last-mile delivery of the service.

Real-time communication

For real-time communication, user experience starts to degrade beyond about 200 ms of latency. As latency increases, challenges like noise and echo cancellation also become significantly more complex. In the real-time communication space, protocols and applications are generally designed to compromise on visual quality in order to keep audio quality and latency consistently within acceptable bounds.
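
As a practical illustration, here is a sketch of how a WebRTC application might check whether it is staying under that roughly 200 ms threshold, using the standard getStats() API; the peer connection is assumed to already exist in your application, and browser support for individual stats fields can vary:

```typescript
// Returns the current round-trip time of an established RTCPeerConnection in
// milliseconds, or null if no nominated candidate pair reports an RTT yet.
async function measureRoundTripTimeMs(pc: RTCPeerConnection): Promise<number | null> {
  const stats = await pc.getStats();
  for (const report of stats.values()) {
    // The nominated ICE candidate pair carries the active connection's RTT (in seconds).
    if (report.type === "candidate-pair" && report.nominated && report.currentRoundTripTime !== undefined) {
      return report.currentRoundTripTime * 1000;
    }
  }
  return null;
}

// Example usage: warn when the connection drifts past the comfort threshold.
// measureRoundTripTimeMs(pc).then((rtt) => {
//   if (rtt !== null && rtt > 200) console.warn(`High RTT: ${rtt} ms`);
// });
```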

To build a real-time application, ensuring predictable, low-latency processing is key. Achieving zero latency is not possible, so the goal is to deliver information in the shortest time possible. Factors like network and disk I/O contribute to that time and need to be managed. For network I/O, the closer your client is to the server, the lower the network latency.

Real-time applications are also data-intensive and typically require a database to service each request, which adds latency to live communication; this is where disk I/O comes into play.

Sports & eSports

Most real-time sporting events are simulcast online alongside more traditional content delivery chains like cable, satellite or terrestrial broadcasts. The target is to match roughly the latency users experience over a traditional broadcast chain. However, it is common to introduce an artificial delay of a few seconds so that content can be censored before the actual broadcast.

This is especially helpful in avoiding potential fines from regulatory authorities, which can run into steep amounts. eSports are less sensitive to latency than many live sports because there is no comparable linear broadcast experience to race against. Still, with the growing number of social media and news services, high latency could mean your viewers learn the score their favorite team just posted before the play even reaches them on your platform.

Auctions and gambling

Large traditional auction houses such as Sotheby’s have started integrating live streaming experiences. Online-only live streaming auctions are also an emerging trend, opening the auction to bidders around the world for a real-time bidding experience.

Live streaming experiences are also being built specifically for gambling. Many websites offer casino blackjack, roulette, or poker with real-life dealers. These streams depend on interactivity, letting spectators watch their favorite players and interact with the game in real time.

Conclusion

There are several ways to minimize video latency while live streaming without sacrificing HD quality. Whether you choose a hardware encoder and decoder combo or rely on a modern video transport protocol, delivering high-quality video at low latency over the internet has become a no-brainer as live streaming gains importance across niches. Technologies like WebRTC, HLS and SRT have evolved to keep latency to a minimum.

Building a live streaming platform optimized for both a scalable viewing experience and low latency requires several integrations, which can be time-consuming to build from scratch. However, with a SaaP provider like Vplayed, creating a unique platform with the latest technological integrations like transcoding and CDN will help you not just live stream, but also build your own brand.


Sundar Krishnamorthy

Hi, I'm Sundar. I'm interested in digging deep into business management tools and IPTV media technologies. I love to blog, discuss and share views on business management tips and tricks.
