All You Need to Know About Low Latency in Video Streaming


Suppose you are watching a soccer game through your over-the-top (OTT) streaming service. Meanwhile, your neighbor next door is watching the same game on broadcast television, loudly cheering goals and groaning over penalties that you will have to wait another 30 seconds to see.

Or perhaps you are watching a live match, eager to see who wins, when your Facebook or Twitter feed, driven largely by TV audiences, spoils the result 15 seconds before you see it.

In the most general terms, latency is the time interval between a stimulus to a system and the system's response. In the video world, latency is the amount of time between the moment a frame is captured and the moment that frame is displayed. With time-sensitive content such as live sports, news, or interactive OTT formats like game shows and e-sports, viewers expect to see events the instant they unfold.
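
To make that definition concrete, the sketch below shows the idea behind a "glass-to-glass" measurement: stamp each frame with its capture time and compare it with the display time at the player. The function names and the 120 ms simulated delay are illustrative assumptions, and a real measurement would also require the capture and playback clocks to be synchronized.

```python
import time

def on_frame_captured(frame_metadata: dict) -> None:
    # Stamp the frame with its capture time at the source.
    frame_metadata["capture_ts"] = time.time()

def on_frame_displayed(frame_metadata: dict) -> None:
    # At the player, compare the display time with the capture time.
    latency_ms = (time.time() - frame_metadata["capture_ts"]) * 1000.0
    print(f"Glass-to-glass latency: {latency_ms:.0f} ms")

# Simulated pipeline: capture a frame, "transmit" it for 120 ms, display it.
meta: dict = {}
on_frame_captured(meta)
time.sleep(0.120)  # stands in for encode, network, and decode delay
on_frame_displayed(meta)
```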

What Causes Video Latency?

Numerous factors contribute to the delay, or 'latency,' that you experience while watching a program live over the internet. For many IP-based applications, the main contributor is the path between sender and receiver: caching at individual nodes, the time spent in ingest and packaging operations, buffer settings, and congestion on intermediate forwarding nodes.

Factors Affecting Video Latency

– Ingest and packaging operations

– Segment length

– LAN switch and router buffers

– Content delivery network (CDN) propagation

– Video encoding pipeline duration

– Input device latency (e.g., cameras and microphones)
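
To see how these factors add up, here is a rough, illustrative latency budget for a conventional HLS/DASH live pipeline. Every number is an assumption chosen for the example, not a measurement:

```python
# Illustrative latency budget for a conventional HLS/DASH live pipeline.
# All values are assumptions for the sake of example, not measurements.
LATENCY_BUDGET_MS = {
    "capture_and_encode": 500,     # video encoding pipeline duration
    "ingest_and_packaging": 1000,  # segmenting and manifest updates
    "cdn_delivery": 500,           # edge cache fill and network transit
    "player_buffer": 3 * 6000,     # player holds ~3 segments of 6 s each
}

total_ms = sum(LATENCY_BUDGET_MS.values())
print(f"End-to-end latency: ~{total_ms / 1000:.0f} s")  # ~20 s
```

In this hypothetical breakdown the player's segment buffer dominates everything else, which is why segment duration receives so much attention in low latency work.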

Achieving Low Latency Video in Live Streaming Applications

There is no absolute value that defines 'low latency.' When humans interact through live video, as in a video conference or an online game, latency below roughly 100 ms is considered low, since most people do not notice a delay that small. But when a system, rather than a person, reacts to the video, as is common in many medical and industrial systems, the requirements can be much stricter: 30 ms, 10 ms, or at times a single millisecond, depending on the system.

Video buffering is generally required whenever processing must be held until a specific amount of data is available. The amount of buffering needed can vary from a few pixels, to a few video lines, to a number of whole frames.
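
Assuming a fixed frame rate, the latency a buffer adds is simply the amount of buffered video divided by the rate at which it is consumed, as this small sketch illustrates:

```python
def buffer_latency_ms(buffered_frames: float, fps: float) -> float:
    """Latency added by holding `buffered_frames` frames of video at `fps`."""
    return buffered_frames / fps * 1000.0

# Two whole frames of 30 fps video add about 67 ms:
print(buffer_latency_ms(2, 30))         # ~66.7
# A single line of a 1080-line frame adds only ~0.03 ms:
print(buffer_latency_ms(1 / 1080, 30))  # ~0.031
```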

Achieving lower latency is ultimately about delivering content to the client faster. There are multiple techniques for moving your video streaming solution into low latency territory, but most of them come with their own drawbacks that need to be weighed beforehand:

– Constantly monitor and measure video latency

– Choose the right segment duration, since it directly affects video latency (see the sketch after this list)

– Optimize your video player(s)

– Design the right architecture
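
On the segment duration point, a widely used rule of thumb is that HLS/DASH players start playback a few segments behind the live edge; three segments is a common default, though not a universal constant. Under that assumption, latency scales directly with segment duration:

```python
def live_edge_latency_s(segment_duration_s: float, segments_buffered: int = 3) -> float:
    """Approximate distance behind the live edge for a player that buffers
    a fixed number of segments before starting playback."""
    return segment_duration_s * segments_buffered

for seg_s in (6.0, 2.0, 1.0):
    print(f"{seg_s:.0f} s segments -> ~{live_edge_latency_s(seg_s):.0f} s behind live")
```

Shorter segments reduce latency proportionally, but at the cost of more requests, more frequent manifest refreshes, and lower encoding efficiency, exactly the kind of trade-off that needs to be weighed beforehand.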

Summary

There are a few important considerations in a low latency video streaming process. The most obvious is to offer consistently low latency to online viewers, so that they can stream the action in near real time from any corner of the world, at any time. And while you do this, you need to understand that low latency comes with a hefty cost, and make your implementation choices accordingly.
