Understanding Latency

November 24, 2020

What is latency?

In the digital world, latency does not describe a skill or desire that has not yet been actualized nor does it describe an inactive pathogen. Instead, it refers to the time it takes for data to transfer. To get more specific, let’s look at the different types of latency.

Network latency

Network latency is the delay between your action and the network's response to it. For example, if you click a web page and it takes a while to load, that load time is the latency (also known as lag).

To get a bit more technical, network latency is the time it takes for a data packet to travel round trip—from the sender to the receiver and back to the sender. This is also called round-trip time (RTT). It is usually measured in milliseconds, but those milliseconds can add up to create a seemingly slow experience for the end-user.
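One rough way to get a feel for RTT is to time how long a TCP handshake takes, since the handshake is itself a round trip. A minimal Python sketch (the function name and the choice of port 443 are illustrative, not a standard API):

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 5.0) -> float:
    """Time a TCP handshake as a rough proxy for one network round trip."""
    start = time.perf_counter()
    # create_connection returns once the handshake completes,
    # i.e., after the SYN has gone out and the SYN-ACK has come back.
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0
```

Calling `tcp_rtt_ms("example.com")` from different locations makes the distance effect discussed later very visible.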

(What’s a data packet? Glad you asked. When data is transmitted, it’s broken down into small chunks, called packets, to make it easier to send. The data is then reassembled once all of the packets have arrived at the destination.)
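The split-and-reassemble idea can be sketched in a few lines. This is a simplified illustration (real packets also carry headers, sequence numbers, and checksums); the 1,500-byte default mirrors a typical Ethernet MTU:

```python
def packetize(data: bytes, mtu: int = 1500) -> list[bytes]:
    """Split a payload into fixed-size chunks, like packets on the wire."""
    return [data[i:i + mtu] for i in range(0, len(data), mtu)]

def reassemble(packets: list[bytes]) -> bytes:
    """Put the payload back together at the destination."""
    return b"".join(packets)
```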

Latency and IoT

In the context of the Internet of Things (IoT), network latency affects the performance of internet-connected devices, and the impact depends on the use case. For example, latency could refer to the time it takes for one device to make a request and for a second device to respond. Take a smart lock on your home: you use your phone (first device) to make a request to the lock (second device) to open your door. If there is a long lag between the request and the response, it can have negative consequences, like not being able to get into your home or taking too long to lock it.

Latency and VoIP

With Voice over Internet Protocol (VoIP), latency has an extra layer of delay due to the speed of sound and the time it takes for your voice to reach your phone or computer. Once the sound reaches your phone or computer, it is affected by network latency—the voice packet is transmitted, processed by the receiving server, and then sent on to its destination. Anything under 150 milliseconds is considered good quality because the receiver won’t notice the lapse in time. Once you hit 300 milliseconds of latency, you’ll notice a lag in speech and a drop in audio quality.
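Those thresholds are easy to encode. A small sketch (the function name and band labels are illustrative, based on the 150 ms and 300 ms figures above):

```python
def voip_quality(latency_ms: float) -> str:
    """Rough call-quality bands from one-way latency in milliseconds."""
    if latency_ms < 150:
        return "good"        # delay is generally unnoticeable
    elif latency_ms < 300:
        return "noticeable"  # callers may start to talk over each other
    else:
        return "poor"        # clear lag in speech and audio
```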

Fun fact: The speed of sound (343 meters per second) is far slower than network transmission. If you screamed in San Francisco and the sound somehow carried the hundreds of miles, it would take almost 30 minutes for someone in Los Angeles to hear you.
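The arithmetic behind the fun fact, assuming a straight-line distance of roughly 560 km between the two cities:

```python
SPEED_OF_SOUND_M_S = 343   # speed of sound in air at ~20 °C
SF_TO_LA_M = 560_000       # approximate San Francisco–Los Angeles distance

seconds = SF_TO_LA_M / SPEED_OF_SOUND_M_S
minutes = seconds / 60     # roughly 27 minutes
```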

What is low latency?

Low latency is what all businesses and users ideally want. Low latency means that there is very little time between your action and the response to your action—short enough that you don’t notice the delay!

In particular, low latency is important for industries like:

  • Video gaming: Having low latency with video gaming means a faster reaction time, which can be a substantial advantage against other players.
  • Streaming: Whether you’re the platform providing streaming for TV, music, or live events, or the end-user watching, low latency is important for a smooth watching experience.
  • Financial trading: With market updates and stock prices changing each second, low latency is incredibly important to ensure traders have the exact numbers to inform their decisions.
  • Telecommunications: Phone and video calls need low latency so that there isn’t a lag between speech or visuals.
  • Security: From monitoring security footage to locking your home, this IoT-enhanced industry needs low latency to ensure that businesses and homes are secure.

While there are some industries that need lower latency than others, low latency is important across all industries in order to provide an enjoyable user experience. A slow website or an app that doesn’t load are frustrating experiences no matter the use case.

Latency vs bandwidth vs throughput

Latency, bandwidth, and throughput all affect the quality of the experience or communication, so it’s important to understand the nuances between the three.

Bandwidth measures the maximum amount of data that can be sent through the network at any given time.

Think about it in terms of a straw. If you drink a soda with a small and narrow stir straw, it will take you a lot longer to finish your glass than if you were drinking with a large, wide straw. The wider the straw (bandwidth), the more soda (data) can be passed through at any time.

Latency measures how quickly the data can reach the client—or, in straw terms, the time it takes for the liquid at the bottom of the straw to reach your mouth.

Throughput is the average amount of data that is transmitted over a specific period of time or, in straw terms, the average amount of soda you can drink with the size straw you have. Unlike bandwidth, throughput factors in latency (the time it takes for data to reach the client).
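The interplay between bandwidth and latency is why throughput can fall short of the pipe's capacity: a sender can only keep so much unacknowledged data "in flight" per round trip. A sketch of that bound (the function name and window-based model are a simplification of how TCP-style transfers behave):

```python
def max_throughput_bps(bandwidth_bps: float, window_bytes: int, rtt_s: float) -> float:
    """Effective throughput is capped by both the pipe width (bandwidth)
    and how much data can be in flight per round trip (window / RTT)."""
    window_limited = (window_bytes * 8) / rtt_s  # bits per second
    return min(bandwidth_bps, window_limited)
```

For example, a 100 Mbps link with a 64 KB window and 100 ms of RTT tops out near 5 Mbps—latency, not bandwidth, is the bottleneck.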

What causes latency?

Now that we know what latency is, let’s talk about what causes the delay.

  1. Distance: The distance between the client making a request and the server responding to that request can have a big impact on latency. The longer the propagation delay (the time it takes a signal to physically travel from sender to receiver—at best, a large fraction of the speed of light), the greater its effect on latency. If you’re located in Denver and you click on a website whose server is hosted in Denver, the server should respond very quickly to your request. If that server were instead hosted in Frankfurt, Germany, it would not respond nearly as quickly.
  2. Website construction: If the website is heavy with content, like high-resolution images, or pulling a lot of information from third parties, you could also experience a fair amount of latency.
  3. Transmission medium: This refers to the physical components that make up the journey from client to server and back to the client. For example, copper cables are a lot less efficient than fiber optics, and ethernet cables tend to be more reliable than WiFi.
  4. Packet size: Data packets often cross multiple networks and Internet Exchange Points (IXPs). At each hop, routers process and analyze the data packet and may fragment it into smaller chunks. The more the routers need to break down the packet, the more time is added to the latency.
  5. Data access: Depending on the physical medium the data is stored in, latency can be added by less efficient storage types like traditional hard drives vs. solid state drives.
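The distance factor can be put on a back of an envelope. Light in fiber travels at roughly two-thirds of its vacuum speed, about 200,000 km/s, and Denver to Frankfurt is roughly 8,100 km (both figures are approximations for illustration):

```python
FIBER_SPEED_KM_S = 200_000  # light in fiber: ~2/3 the speed of light in vacuum

def propagation_rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds: there and back."""
    return 2 * distance_km / FIBER_SPEED_KM_S * 1000
```

Even over a perfectly straight fiber with no routing hops, the Denver–Frankfurt round trip costs about 81 ms before any processing or queuing delay is added.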

How to reduce latency

Because signals are limited by the speed of light (or, for audio, the speed of sound), any physical distance introduces some latency. Unfortunately, that means latency will always exist to some degree, but there are ways to substantially reduce it.

  1. Reduce content on web pages: Avoid using huge images or content from multiple third-party sites to help improve your load time. If you’re pulling information from another server, this external request will increase latency depending on the quality of the third-party server.
  2. Use a Content Delivery Network (CDN): CDNs decrease latency by caching resources close to the user and delivering them over private networks. This creates more efficient paths for data packets to travel. When using Twilio services, such as our voice or messaging APIs, Twilio Interconnect provides secure connections between your network and Twilio’s. These offer a much more consistent network experience than the open Internet.
  3. Understand your level of latency: It’s helpful to understand the degree of latency you’re experiencing. Twilio offers Voice Insights to help you monitor your IP communications with Twilio. The insights tool takes a look at latency as well as jitter (the difference in latency between data packets) and packet loss.
  4. Choose low-latency products: For example, Twilio’s Voice Conference offering provides low latency by routing each call through the closest of 7 geographical regions. In addition, Twilio’s Programmable Video product uses 9 media servers worldwide to offer global low latency.
  5. Address end-user latency: Sometimes it’s not the network but the user’s setup that is causing latency, due to low bandwidth or a poor internet connection. If you as the user frequently experience high latency, it may be worthwhile to increase your bandwidth or use an ethernet connection instead of WiFi.
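Jitter, mentioned in step 3 as the difference in latency between data packets, can be estimated from a series of latency samples. A minimal sketch (the function name is illustrative; real tools like Voice Insights use more elaborate formulas):

```python
def mean_jitter_ms(latencies_ms: list[float]) -> float:
    """Average absolute difference between consecutive latency samples."""
    if len(latencies_ms) < 2:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:])]
    return sum(diffs) / len(diffs)
```

A connection with samples of 20, 25, 22, and 30 ms has modest average latency but over 5 ms of jitter, which is what a listener hears as choppy audio.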

TL;DR: your latency recap

To review, latency is the time it takes for data to travel round trip from the sender to the receiver and back.

Latency is something all businesses and users experience. A small amount of latency is expected, but high latency needs to be addressed because it causes lagging, delays in loading, or, for VoIP, deterioration in sound quality. All of the negative consequences of high latency can cause friction with how your customers experience your service and perceive your brand.

While you can’t completely get rid of latency, you can set yourself up for success by reducing latency as much as possible. Look at the construction of your website to see if you can minimize load times by limiting the external content on the site. And, consider low latency solutions for your voice and video calls, like Twilio’s Voice Conference or Programmable Video.

For more information on latency, take a look at the following resources.