Latency is the time it takes for data to travel from its source to its destination. In the context of live streaming video, latency measures the delay in transferring a single frame from the camera to end users, sometimes called glass-to-glass latency.
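As a rough illustration, glass-to-glass latency can be modeled as the sum of the delays at each stage of the pipeline. The stage names and millisecond values in this sketch are hypothetical examples, not measurements of any particular system:

```python
# Hypothetical glass-to-glass latency budget, in milliseconds.
# Stage names and values are illustrative assumptions, not measurements.
pipeline_delays_ms = {
    "capture": 33,           # one frame at 30 fps
    "encode": 150,           # encoder buffering
    "first_mile_upload": 100,
    "cloud_transcode": 500,
    "cdn_delivery": 200,
    "player_buffer": 4000,   # players often buffer several segments
}

total_ms = sum(pipeline_delays_ms.values())
print(f"Estimated glass-to-glass latency: {total_ms / 1000:.1f} s")
```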
There are numerous steps along the way where delays can be introduced, from encoding and uploading the video content to an over-the-top (OTT) video platform to cloud transcoding and video delivery. During the first stage of a live stream, the video content is usually compressed to reduce the amount of data that must be transmitted to an online video platform or other video delivery software. This encoding process, often called the first mile, can contribute to latency depending on the type of encoder used. In general, hardware encoders encode faster and introduce less latency than software encoders because they are purpose-built devices with dedicated encoding hardware.
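To make the encoder's role concrete, here is a minimal sketch of a common low-latency software-encoding invocation using FFmpeg's x264 encoder. It assumes FFmpeg is installed; the input file and ingest URL are placeholders. The flags shown are standard x264 options that trade compression efficiency for speed:

```python
import subprocess

# Low-latency software encode with FFmpeg/x264. Input and output are
# placeholders; a hardware encoder would replace this step entirely.
cmd = [
    "ffmpeg",
    "-i", "input_feed.mp4",           # placeholder source
    "-c:v", "libx264",
    "-preset", "ultrafast",           # fastest x264 preset: minimal lookahead
    "-tune", "zerolatency",           # disable encoder-side frame buffering
    "-c:a", "aac",
    "-f", "flv",
    "rtmp://example.com/live/stream"  # placeholder ingest URL
]
subprocess.run(cmd, check=True)
```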
The streaming protocol used can also have an enormous impact on latency. HTTP Live Streaming (HLS) and Dynamic Adaptive Streaming over HTTP (MPEG-DASH) are the two main protocols used for live streaming because they deliver video content over HTTP and are compatible with most devices. More recently, both protocols have gained low-latency variants (Low-Latency HLS and Low-Latency DASH) built on the MPEG Common Media Application Format (CMAF).
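Segment duration dominates latency in HTTP-based streaming, because players commonly buffer about three segments before starting playback. The buffer count and durations in this back-of-the-envelope sketch are illustrative assumptions:

```python
# Rough HLS latency estimate: player startup buffer of ~3 segments means
# shorter segments (or CMAF-style chunks) directly reduce latency.

def estimated_hls_latency(segment_duration_s: float, buffered_segments: int = 3) -> float:
    return segment_duration_s * buffered_segments

for seg in (6.0, 2.0, 0.5):  # traditional HLS, short segments, CMAF-style chunks
    print(f"{seg:>4}s segments -> ~{estimated_hls_latency(seg):.1f}s of player buffer latency")
```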
In addition, content delivery networks (CDNs) can help reduce latency by leveraging thousands of servers distributed around the world. A CDN shortens last-mile delivery by routing video content through its network along the fastest available path, which can significantly reduce latency, buffering, and other playback issues. CDNs also distribute the load across a larger number of streaming servers for large-scale broadcasts.
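Real CDNs steer clients to a nearby edge automatically via DNS or anycast, but the underlying idea can be sketched as picking the endpoint with the lowest round-trip connect time. The hostnames below are hypothetical placeholders:

```python
import socket
import time

# Minimal sketch of edge selection by connect time. Hostnames are
# hypothetical; production CDNs do this routing transparently.
EDGES = ["edge-us-east.example.com", "edge-eu-west.example.com", "edge-ap-south.example.com"]

def connect_time_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.perf_counter() - start) * 1000
    except OSError:
        return float("inf")  # unreachable edges sort last

best = min(EDGES, key=connect_time_ms)
print(f"Fastest edge: {best}")
```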
Finally, latency can be introduced during playback, because the viewer's internet bandwidth and device processing power determine how quickly the stream can be downloaded and decoded. That's why many broadcasters offer streams at multiple bitrates and use delivery protocols that support adaptive bitrate streaming (ABS). This approach lets the video player choose a bitrate that maximizes quality without causing buffering or added latency.
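A minimal sketch of that player-side decision: pick the highest rendition whose bitrate fits within a safety margin of the measured throughput. The rendition ladder and margin here are illustrative assumptions:

```python
# Adaptive bitrate selection sketch: highest rendition that fits the budget.
RENDITIONS_KBPS = [6000, 3000, 1500, 800, 400]  # e.g. 1080p down to 240p

def pick_bitrate(measured_throughput_kbps: float, safety_margin: float = 0.8) -> int:
    budget = measured_throughput_kbps * safety_margin
    for bitrate in RENDITIONS_KBPS:   # highest first
        if bitrate <= budget:
            return bitrate
    return RENDITIONS_KBPS[-1]        # fall back to the lowest rung

print(pick_bitrate(4200))  # -> 3000: leaves headroom to avoid rebuffering
```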
Resi uses a combination of robust hardware encoders and scalable cloud infrastructure to optimize low-latency video processing. While ultra-low-latency streams matter in mission-critical situations where near-real-time delivery is necessary, Resi's Resilient Streaming Protocol (RSP) takes a different approach: by introducing a short delay, it can deliver complete, error-free audio and video over unpredictable networks, or even wireless cellular hotspots.
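RSP is proprietary, so the sketch below is not Resi's implementation; it only illustrates the general principle the paragraph describes: holding a playout buffer long enough that lost packets can be retransmitted before their deadline. The loss rate and retry budget are illustrative assumptions:

```python
import random

# Toy model of trading delay for reliability: a buffer of a few round trips
# allows several retransmission attempts per packet before playout.
LOSS_RATE = 0.1       # assume 10% of packets drop on any single attempt
RETRIES_ALLOWED = 3   # retries the delay buffer gives us time for

def deliver(packet_id: int) -> bool:
    """Attempt delivery with retransmissions inside the delay budget."""
    for _ in range(1 + RETRIES_ALLOWED):
        if random.random() > LOSS_RATE:  # simulated lossy network
            return True
    return False

random.seed(0)
delivered = sum(deliver(i) for i in range(10_000))
print(f"Delivered {delivered / 100:.2f}% of packets error-free")
```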
Resi's Live Stream Platform then streams video to any device using low-latency protocols like HLS and MPEG-DASH. The cloud-based streaming platform also includes modern capabilities like cloud transcoding and ABS for a complete end-to-end solution. With Resi, broadcasters have easy-to-use tools for quickly delivering ultra-reliable live streams at scale.