What is Video Latency?: The Ultimate Guide for 2023
Ensuring that your streaming setup provides the optimal latency for your use case is key in online video streaming, because latency plays a crucial role in shaping your viewers' live streaming experience.
In this article, we’ll cover the ins and outs of video latency. We’ll talk about why video latency is important and what the optimal latency is in different use cases. From there, we will discuss how to measure the latency of your streaming setup and how to reduce latency if necessary.
Table of Contents
- Video Latency: The Basics
- How to Conduct a Video Latency Test
- How to Reduce Latency
- How Throughput and Bandwidth Relate to Latency
- Stream on Maestro
What is Video Latency?
Video latency is the lag between the time a live stream is captured and when it reaches the audience. This is also called the “glass-to-glass delay,” referring to the glass of the camera that captures the stream and the viewer’s screen that plays the video.
There are different levels of latency that are important for streaming, ranging from standard broadcast latency to real-time latency, but we will dive into that a little later.
Why Does Latency Matter in Streaming?
Latency matters in streaming because it can affect the viewers’ experience. This is especially true in instances where the audience actively participates or interacts amongst themselves.
For example, streams that include Q&As or panels require as little latency as possible, since there is a back-and-forth interaction between the audience and the presenters on the stream.
Optimal Latency for Different Use Cases
| Level of Latency | Use Cases |
| --- | --- |
| Under one second | Education, Q&A sessions, panels, auctions, betting |
| Five seconds or fewer | Most virtual events (concerts, conferences, etc.) |
| Around 30 seconds | 4K streaming and events with built-in error repair (presidential addresses, events with visually exciting production elements, etc.) |
Video Latency vs. Built-In Delay
It is important to differentiate between latency and built-in delay. Latency is heavily connected to the capabilities of your streaming setup, but a built-in delay is a time gap that is intentionally set up in a stream.
By adding a built-in delay, broadcasters can avoid situations like the 2004 Super Bowl wardrobe malfunction that reached an audience of 150 million. This time buffer gives broadcasters the leeway to quickly change the stream sources or black out the screen to avoid this sort of spectacle reaching the viewers at home.
How to Conduct a Video Latency Test
Conducting a video latency test is pretty straightforward. All you need to do is run a test stream with a timestamp overlay and view the stream on a second device.
Put the screen with your stream preview and the screen with the user-facing video player side by side and take a photo. Subtract the timestamp shown on the user-facing screen from the one on your preview; the difference is your latency.
If you don't have a way to incorporate a timestamp as an overlay, you can simply use the latency test countdown timer on hamtv.com.
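The arithmetic behind this test is simple subtraction. Here is a minimal sketch; the function name and the timestamp format are illustrative assumptions, not part of any particular tool:

```python
from datetime import datetime

def measure_latency(preview_time: str, viewer_time: str) -> float:
    """Glass-to-glass latency in seconds, given the timestamp visible on
    the stream preview and the one on the viewer-facing player, both read
    from the same side-by-side photo (format HH:MM:SS.mmm)."""
    fmt = "%H:%M:%S.%f"
    preview = datetime.strptime(preview_time, fmt)
    viewer = datetime.strptime(viewer_time, fmt)
    return (preview - viewer).total_seconds()

# The preview shows 14:30:12.500 while the player still shows 14:30:08.250:
print(measure_latency("14:30:12.500", "14:30:08.250"))  # → 4.25
```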
How to Reduce Latency
Since lower latency is valuable in some use cases, it is important for some broadcasters and production directors to understand how to reduce latency for their streams in case the need arises.
In order to understand how to reduce latency, you must first understand the three factors that contribute to latency: encoder settings, internet speed, and streaming protocols.
1. Optimize Encoder Settings
One of the quickest ways to reduce your video latency is by optimizing your encoder settings, starting with the bitrate.
By lowering the bitrate of your stream, you can lower your latency. However, a lower bitrate means less data per frame, so you'll either see reduced picture quality or need to drop to a lower resolution to compensate.
In some situations, it is worth sacrificing a little bit of the quality to reduce your latency, and in other situations, maintaining the quality of the stream is worth having higher latency.
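One way to see why bitrate matters: if each encoded segment takes longer to upload than its own playback duration, the stream falls progressively further behind real time. A rough back-of-the-envelope sketch (the function and figures are illustrative assumptions, not Maestro-specific settings):

```python
def segment_upload_time(segment_seconds: float, bitrate_kbps: float,
                        uplink_kbps: float) -> float:
    """Estimate how long one encoded segment takes to push upstream.
    If the result exceeds the segment's own duration, latency grows
    over the course of the broadcast."""
    segment_kbits = segment_seconds * bitrate_kbps
    return segment_kbits / uplink_kbps

# A 2-second segment encoded at 6,000 kbps over a 10,000 kbps uplink:
print(segment_upload_time(2, 6000, 10000))   # → 1.2 (keeps up with real time)
# The same segment encoded at 12,000 kbps over the same uplink:
print(segment_upload_time(2, 12000, 10000))  # → 2.4 (falls behind)
```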
Please check out our recommended encoder settings guide for specific details for streaming on Maestro.
2. Get Faster Internet
Improving your internet speed is another way to reduce your video latency. Many providers offer extenders, enhanced modems, and other equipment designed to strengthen internet connections.
If your provider doesn’t offer tools to upgrade your current speed, you may want to consider switching to a provider that can guarantee faster speeds.
3. Choose the Right Streaming Protocols
The last technique you can use to reduce video latency is to choose the most appropriate streaming protocols. If you’re not familiar, protocols are the technology that carries video signals throughout the streaming process.
Different protocols can be used to carry the signal from the source to the encoder, from the encoder to the online video platform, and from the online video platform to the viewer. Your encoder and OVP determine which streaming protocols are used, so if your setup is yielding latency that is too high, you may consider opting for other technology for a more optimal setup.
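To make the trade-off concrete, here is a sketch that filters protocols by a latency budget. The latency ranges are approximate, commonly cited ballpark figures, not measurements from this article, and real-world numbers depend heavily on configuration, CDN, and player buffering:

```python
# Approximate, commonly cited latency ranges in seconds (illustrative only).
PROTOCOL_LATENCY_SECONDS = {
    "WebRTC": (0.1, 1),
    "SRT": (0.5, 3),
    "RTMP": (2, 5),
    "Low-Latency HLS": (2, 8),
    "Standard HLS/DASH": (6, 30),
}

def protocols_for(max_latency_s: float) -> list[str]:
    """Return protocols whose typical upper bound fits the latency budget."""
    return [name for name, (_, high) in PROTOCOL_LATENCY_SECONDS.items()
            if high <= max_latency_s]

# For a virtual event targeting five seconds or fewer:
print(protocols_for(5))  # → ['WebRTC', 'SRT', 'RTMP']
```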
How Throughput and Bandwidth Relate to Latency
While we’re on the topic of video latency, it is worth mentioning throughput and bandwidth, since both metrics are closely related to latency.
Bandwidth is the maximum amount of data your connection can transmit, while throughput is the amount of data actually delivered over a given time. As we’ve covered, latency is the amount of time this transmission takes.
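The relationship can be summed up in one line: total delivery time is the fixed network latency plus the transmission time set by throughput (which can never exceed bandwidth). A minimal sketch, with purely illustrative numbers:

```python
def delivery_time(chunk_megabits: float, throughput_mbps: float,
                  latency_s: float) -> float:
    """Total time for a video chunk to arrive: fixed network latency
    plus the transmission time determined by achieved throughput."""
    return latency_s + chunk_megabits / throughput_mbps

# A 10-megabit chunk over a link achieving 5 Mbps, with 0.5 s of latency:
print(delivery_time(10, 5, 0.5))  # → 2.5
```

Note that raising throughput shrinks only the transmission term; the latency term is a floor that faster pipes alone cannot remove.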
Stream on Maestro
Are you looking for a powerful solution for live and on-demand video streaming? Look no further because Maestro’s got you covered.
Sign up for Maestro to start streaming in no time at all!
Join our Discord server to learn more tips and tricks for streaming on Maestro.