
Is there a solution to latency in live broadcast?


Anyone who's watched a news report with a live insert from a neighbouring continent will have noticed that there tends to be a long delay...

...between a question being asked and answered. Sometimes that's been called satellite delay, but the situation for live streamers taking questions from viewers is often even worse, with tens of seconds between the streamer speaking and the audience hearing the words. What's going on?

There are many sources of delay and latency in modern video setups, and they exist on different scales. Anyone working a live job might complain about delays of one to a few frames between the camera and, say, a big LED video wall. Often, the only people with the ability to influence that are the firmware engineers who work on the equipment; it's too easy for people in those positions to buffer a frame here and there, ending up with a visibly laggy system. Pressure from users can prompt change, and companies have released updates to cameras that improve their responsiveness by a few frames, which is a significant upgrade in the eyes of a seasoned camera operator.

But the delays involved in long-distance broadcasts are often much, much longer.

Perhaps the first thing to realise is that, as we've hinted, satellite delay isn't really caused by satellites. Any satellite uplink whose dish doesn't have to be motorised to track the satellite across the sky is using a satellite in geostationary orbit. That's an orbit that takes exactly one day to circle the earth, so the satellite appears to hang in a fixed position in the sky. It also means the satellite has to be at a very specific distance: at that altitude, its orbital speed exactly matches the Earth's rotation, while any higher it would drift outward, and any lower it would fall out of orbit if it tried to keep that same one-day period. Geostationary satellites also have to sit directly above the equator.

Since the satellite won't be directly overhead unless you're standing on the equator at exactly the right longitude, the range varies, but the minimum time for a radio signal to reach a geostationary satellite is only a bit more than a tenth of a second - say a quarter of a second for the round trip up to the satellite and back down. It's not much more even a long way from the equator, given the radius of the planet is only about a sixth of the distance to the satellite. It's barely noticeable. Satellites like those in SpaceX's Starlink system, which pass overhead quickly and hand off service provision automatically as they do so, orbit much lower and can achieve far smaller delays.
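The arithmetic is quick to sketch. The following is a back-of-envelope calculation, assuming a straight-line path at the speed of light and a typical Starlink shell altitude (real paths are longer and processing adds a little):

```python
C = 299_792_458            # speed of light in vacuum, m/s
GEO_ALTITUDE = 35_786_000  # geostationary altitude above the equator, m
LEO_ALTITUDE = 550_000     # assumed typical Starlink shell altitude, m

def one_way_latency_s(distance_m: float) -> float:
    """Time for a radio signal to cover distance_m at light speed."""
    return distance_m / C

# One hop means ground -> satellite -> ground.
geo_hop = 2 * one_way_latency_s(GEO_ALTITUDE)
leo_hop = 2 * one_way_latency_s(LEO_ALTITUDE)

print(f"GEO hop: {geo_hop * 1000:.0f} ms")  # roughly 240 ms
print(f"LEO hop: {leo_hop * 1000:.1f} ms")  # a few milliseconds
```

That quarter-second geostationary hop is irreducible physics; no amount of clever engineering shortens it.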

Low orbit satellites like Starlink could help reduce latency. Image: Shutterstock.

Codecs, compression and latency

The most noticeable delay for satellites, then, is compression. Satellite uplinks have relied on compression - MPEG-2 at the least - for decades, and whatever the latest mathematics, most codecs used for low-bandwidth applications exploit the similarities between frames to achieve better compression. To do that, they need to look at more than one frame; frames are often handled in groups of pictures - GOPs - of anywhere from six to fifteen. The codec can't calculate the differences within a group of fifteen frames until it has all fifteen to consider, so that alone adds half a second of delay at 30fps.

Yes, that's why some video monitoring links that use H.264 compression, like the Hollyland Mars 400S Pro, have lower latency when they're transmitting higher frame rate video; five frames at 60fps is less wall-clock time than five frames at 24fps (read the review of the Teradek Bolt 4K 750 for more on joint source-channel coding, which achieves near-zero latency - for a price). GOP length is also something we can sometimes control when setting up streaming software such as OBS. Inevitably, shorter GOPs make the codec less efficient, so we must trade off some image quality for lower latency, or allow more bandwidth.
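The frame-rate effect is simple arithmetic - the minimum encode-side wait is just the GOP length divided by the frame rate. A quick sketch, using the frame counts mentioned above:

```python
def gop_delay_s(gop_frames: int, fps: float) -> float:
    """Minimum delay from waiting for a full group of pictures to arrive."""
    return gop_frames / fps

# A fifteen-frame GOP at 30fps holds back half a second of video...
print(gop_delay_s(15, 30))  # 0.5
# ...while a shorter GOP fills faster at higher frame rates:
print(gop_delay_s(5, 60))   # ~0.083 s
print(gop_delay_s(5, 24))   # ~0.208 s
```

This is why halving the GOP length, or doubling the frame rate, buys latency directly - at the cost of compression efficiency in the first case and bandwidth in the second.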

For most one-to-many broadcasting, though, these sub-second delays still pale into insignificance against the time it takes for the content distribution network to farm out the data. Sending a message to someone doing a livestream on YouTube is an exercise in patience for a few reasons. Even if we assume the message reaches the streamer quickly, which it generally will, that person's webcam immediately runs into the group-of-pictures problem. The webcam might produce a high-bandwidth stream that the computer then re-encodes to send over the internet, incurring another set of delays.

Sending that signal over the network takes time, although on modern networks not much; round-trip times between someone's home computer and the server farms used by the likes of YouTube are likely to be under 100ms. Things get more complicated once the stream reaches the distributor: we're used to viewing streams on a variety of devices, and if the stream is offered at several resolutions, in standard and high dynamic range, or at different frame rates, it will inevitably be re-encoded again, probably using a very long GOP to maximise efficiency and minimise the distributor's bandwidth costs.
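To get a feel for how these stages stack up, here's a toy latency budget for a webcam-to-viewer stream. Every figure below is an illustrative assumption for the sake of the arithmetic, not a measurement:

```python
# Assumed, illustrative per-stage delays for a livestream, in seconds.
stages_s = {
    "webcam capture and first encode (15-frame GOP at 30fps)": 15 / 30,
    "upload to the ingest server (network transit)": 0.05,
    "distributor re-encode (assumed 60-frame GOP at 30fps)": 60 / 30,
    "player-side buffer": 10.0,
}

total = sum(stages_s.values())
for stage, t in stages_s.items():
    print(f"{stage}: {t:.2f} s")
print(f"total: {total:.2f} s")
```

Even with generous assumptions, the buffering dwarfs everything else - which is the point of the next section.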


Perhaps the biggest delay, though, is down to reliability. At every stage, streaming media devices and services buffer video. The more buffering, the more resilience the system has against momentary dropouts: buffer ten seconds, and only the average bitrate over those ten seconds has to be high enough to keep the stream moving; buffer half a second, and any problem lasting half a second becomes a visible glitch. That, overwhelmingly, is the reason streams lag. With much of the audience watching on a cellphone sharing a tower with dozens of others, reliability is low, and latency must be high.
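A toy model makes the trade-off concrete. Here we assume the network delivers nothing at all during a dropout while the player drains its buffer in real time - a simplification, but it captures why deep buffers survive and shallow ones stall:

```python
def stalls(buffer_depth_s: float, dropout_s: float, step_s: float = 0.05) -> bool:
    """Toy playback model: during a total dropout, the player drains its
    buffer at real-time rate and stalls if the buffer runs dry."""
    level = buffer_depth_s        # start with a full buffer
    t = 0.0
    while t < dropout_s:          # nothing arrives during the dropout
        level -= step_s           # playback keeps draining the buffer
        if level <= 0:
            return True           # buffer ran dry: visible glitch
        t += step_s
    return False                  # dropout ended before the buffer emptied

print(stalls(10.0, 3.0))  # False - a ten-second buffer rides out a 3 s gap
print(stalls(0.5, 3.0))   # True  - a half-second buffer stalls almost at once
```

The lesson is that the buffer depth must exceed the longest dropout you want to survive - and every second of buffer is a second of latency.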

Clearly, the internet can stream video with low latency; every Zoom and Skype call does it, but on those calls any network problem shows up as a glitch almost immediately. Either way, the lion's share of the delay, in buffering and re-encoding, is under nobody's control but Twitch's or YouTube's, so in the end there's not much the average mortal can do about it.


Tags: Studio & Broadcast Live