How to Manage Video Latency on Low Bandwidth Networks
by Pavan D (Manager, Field Applications – System Solutions)
If delivering true real-time interactive experiences is high on your agenda, then you need an effective strategy for minimizing video streaming latency. This is especially important when a high-precision or mission-critical task is at stake. However, low latency streaming on low bandwidth networks is a challenge. For instance, consider a low-bandwidth mobile network being used to remotely fly a drone. Such a drone captures video from an on-board camera and streams it to a ground station. In a scenario like this, the latency of transmission to the ground station needs to be low. At the same time, the video being streamed needs to be clearly intelligible for accurate navigation. Meeting both of these requirements is not straightforward when the channel has limited capacity.
Challenges in Low Latency Streaming on Low Bandwidth Networks
Here are the key challenges in low latency streaming on low bandwidth networks:
Avoiding burst transmission, so that the stream does not momentarily exceed the channel capacity
Handling data loss, since low bandwidth networks are often loss prone
Managing video quality, so that the video remains clearly intelligible at a constrained bitrate
The real challenge is to meet all three of these goals while maintaining low streaming latency. (For more insights, read our blog that discusses the trade-offs between latency, bandwidth and video quality.)
Let us consider how you can address your latency challenges.
Avoiding Burst Transmission
The natural solution for avoiding bursts in consumed bandwidth is Constant Bitrate (CBR) streaming. But CBR streaming can result in higher latency than Variable Bitrate (VBR) streaming (which is inherently bursty), because of the additional buffering it needs for transmission rate control. The right alternative for ensuring low latency is therefore Constrained VBR (CVBR) streaming.
In CVBR streaming, the peak encoded bit rate for any portion of the video is constrained within an upper bound, keeping bursts on the network in check. At the same time, the encoder is free to vary its bitrate to accommodate changes in video complexity, without the additional buffering that CBR requires.
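The behavior described above can be sketched with a toy rate-control function. This is a minimal illustration, not any particular encoder's algorithm: the function name, the complexity scale, and the bitrate figures are all hypothetical.

```python
# Toy sketch of constrained-VBR (CVBR) frame budgeting.
# The per-frame budget follows scene complexity (like VBR), but is
# hard-capped by the per-frame share of the peak bitrate, so bursts
# on the network stay bounded. All names/numbers are illustrative.

def cvbr_frame_budget(complexity, avg_bps, peak_bps, fps):
    """Return the bits allowed for one frame."""
    avg_frame_bits = avg_bps / fps
    peak_frame_bits = peak_bps / fps        # upper bound per frame
    wanted = avg_frame_bits * complexity    # VBR: scale with complexity
    return min(wanted, peak_frame_bits)     # CVBR: cap the burst

# A complex scene asks for 3x the average bits, but is capped at
# the peak rate; a simple scene gets to spend less than the average.
busy_frame = cvbr_frame_budget(complexity=3.0, avg_bps=2_000_000,
                               peak_bps=4_000_000, fps=30)
calm_frame = cvbr_frame_budget(complexity=0.5, avg_bps=2_000_000,
                               peak_bps=4_000_000, fps=30)
```

Real encoders implement this with a leaky-bucket (HRD) model over a window of frames rather than per frame, but the principle is the same: variability within a hard ceiling.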
Handling Data Loss
If the network is loss prone, you can minimize the impairments seen at the decoder by leveraging sub-frame encoding (slice encoding), an effective technique that offers error resilience as well as reduced latency. Each slice is independently decodable, so a lost packet corrupts only its own slice rather than the whole frame. And since a slice takes less time to encode, transmit and decode than a full frame, the overall latency drops too.
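The latency benefit of slicing comes from pipelining: each stage can start working on a slice as soon as the previous stage finishes it, instead of waiting for the whole frame. The following toy model illustrates this; the stage count and timings are illustrative assumptions, not measurements of any real system.

```python
# Toy pipeline model for slice (sub-frame) encoding latency.
# Assumes three equal-cost stages (encode, transmit, decode), each
# able to start on a slice as soon as the prior stage emits it.
# All figures are illustrative.

def pipeline_latency_ms(frame_ms, num_slices, stages=3):
    """Capture-to-display latency for one frame.

    With num_slices=1 every stage waits for the entire frame;
    with more slices the stages overlap, and only the first slice
    pays the full pipeline depth while the rest drain behind it.
    """
    slice_ms = frame_ms / num_slices
    return stages * slice_ms + (num_slices - 1) * slice_ms

full_frame = pipeline_latency_ms(frame_ms=33.3, num_slices=1)  # ~100 ms
sliced     = pipeline_latency_ms(frame_ms=33.3, num_slices=4)  # ~50 ms
```

In this simplified model, four slices roughly halve the one-frame pipeline latency; returns diminish as slice overhead (headers, reduced compression efficiency across slice boundaries) starts to bite.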
Managing Video Quality
The third concern is: how do you maintain consistent video quality? Here is how:
Decide the best resolution for the scenario; once you have determined it, you can also resize the frame or crop it to the relevant region.
Tune the quantization parameters of the encoder according to the video input.
Select an appropriate video encoding format (e.g., HEVC instead of H.264) to reduce the bitrate consumed for similar quality video.
Further, by smartly deciding the operating frame rate for the target application, you can also reduce latency: the higher the frame rate, the lower the per-frame latency, and vice versa.
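The frame-rate point above follows directly from the frame interval: each frame can spend up to one interval waiting to be captured and encoded, so that interval is a floor on glass-to-glass latency. A one-line sketch (figures illustrative):

```python
# The frame interval bounds the capture-side contribution to latency:
# at a given fps, a frame arrives only every 1000/fps milliseconds.

def frame_interval_ms(fps):
    return 1000.0 / fps

interval_15fps = frame_interval_ms(15)  # ~66.7 ms between frames
interval_60fps = frame_interval_ms(60)  # ~16.7 ms between frames
```

The trade-off is that a higher frame rate also consumes more bitrate at a given quality, so on a low bandwidth channel the operating frame rate has to be balanced against the quality measures listed above.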
Though it looks challenging to balance these trade-offs, Ittiam’s adroitLive SDK does just that and supports ultra-low latency Full-HD streaming – achieving as low as 50 milliseconds glass-to-glass. This solution is available on different embedded platforms, including Android/iOS players for hand-held devices.