What Is Network Latency and How Can We Reduce It?

Introduction

In general terms, latency is the time delay between a user’s action and the response to the action. In an online setting, latency would be the time delay between an action from a user who is interacting with a website (e.g. clicking on a link) or a software application (e.g. tapping send on WhatsApp) and the response (complete loading of web page or double-tick delivery receipt on WhatsApp).

Although data on the Internet can travel very quickly (at close to the speed of light in fiber optic cables), latency cannot be eliminated entirely because of factors such as distance and network routing. In the worst case, latency can grow so high that packets time out and are lost, so keeping Internet latency to a minimum is important. Lower network latency improves website performance and user satisfaction, and it can even affect search engine optimization (SEO) performance.

Network latency, throughput, and bandwidth

Network latency, throughput, and bandwidth can all affect the performance of your website or web application. Although they are closely related, they are distinct terms. Bandwidth is the maximum amount of data that can pass through a connection in a given amount of time. For example, a 100 Mbps connection allows a theoretical maximum of 12.5 megabytes of data to pass through every second (100 / 8 = 12.5). Throughput is the average amount of data that actually passes through. Throughput often falls short of the theoretical bandwidth because of latency: the higher the latency, the more time is wasted waiting for responses instead of transmitting data. Maximum throughput is therefore a function of bandwidth (the higher the better) and latency (the lower the better).
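
To make the relationship concrete, here is a minimal sketch (in TypeScript, using a simplified fixed-window model rather than a full TCP calculation) of how latency can cap effective throughput well below what the bandwidth alone would allow:

```typescript
// Simplified model: each "window" of data must wait one full round trip
// before the next window can be sent, so latency caps throughput even
// when plenty of bandwidth is available.

function maxBytesPerSecond(bandwidthMbps: number): number {
  // 100 Mbps = 100,000,000 bits/s = 12.5 MB/s (100 / 8)
  return (bandwidthMbps * 1_000_000) / 8;
}

function effectiveThroughput(
  bandwidthMbps: number,
  roundTripMs: number,
  windowBytes: number, // bytes in flight per round trip
): number {
  const linkLimit = maxBytesPerSecond(bandwidthMbps);
  const latencyLimit = windowBytes / (roundTripMs / 1000);
  // Whichever limit is lower determines the throughput you actually get.
  return Math.min(linkLimit, latencyLimit);
}

// 100 Mbps link, 50 ms round trip, 64 KiB window:
// the latency limit (~1.3 MB/s) is far below the 12.5 MB/s link limit.
console.log(effectiveThroughput(100, 50, 64 * 1024));
```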

Factors affecting latency

Several different factors can contribute to increased latency in website and application performance.

  1. Physical transmission medium: Fiber optic cables are the most responsive, carrying data at close to the speed of light. Copper wires are slower, and wireless transmissions exhibit higher latency than wired transmissions.
  2. Geographical distance: Even though fiber optic cables carry data at close to the speed of light, distance still matters: the transmission delay between two machines a hundred miles apart is many times smaller than the delay between two machines halfway around the globe (see the sketch after this list).
  3. Routing: The internet is not a single network. It is made up of many smaller networks that are connected to one another. Data packets going across the internet often have to cross different networks to get to their eventual destination. When data packets have to cross networks, routers have to process these data packets to send them across. The processing time adds to the overall latency.
  4. Web application design: The way a website or web application is designed can also contribute to the latency experienced by the end user. For example, content-heavy websites built on a database-driven content management system add latency because, to load a single page, the content management system has to perform many operations to retrieve the content and render it into a web page for the end user.
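
As a rough illustration of the distance factor above, the sketch below estimates one-way propagation delay in fiber, assuming a signal speed of about 200,000 km/s (roughly two thirds of the speed of light in a vacuum) and ignoring routing and processing delays:

```typescript
// Back-of-the-envelope estimate of propagation delay in fiber.
// Assumes a signal speed of ~200,000 km/s and ignores routing and
// processing delays, so real-world latency will always be higher.

const FIBER_SPEED_KM_PER_S = 200_000;

function oneWayDelayMs(distanceKm: number): number {
  return (distanceKm / FIBER_SPEED_KM_PER_S) * 1000;
}

console.log(oneWayDelayMs(160).toFixed(2));    // ~160 km (about 100 miles): ~0.80 ms
console.log(oneWayDelayMs(20_000).toFixed(2)); // halfway around the globe: ~100.00 ms
```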

How to reduce internet latency

Internet latency can be reduced at two ends: the user end and the web server end. On the user’s end, switching to a fiber optic connection and using wired Ethernet instead of wireless connections will significantly reduce Internet latency.

On the web server’s end, content delivery networks (CDNs) can mitigate the latency caused by geographical distance and routing. CDNs typically operate servers on multiple continents, which greatly reduces both the physical distance content has to travel to reach the end user and the number of different networks that data packets have to be routed through.
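
One simple way to observe these effects is to time a TCP handshake, which takes roughly one round trip. The Node.js sketch below does exactly that; the hostnames are placeholders rather than real endpoints, and dedicated tools such as ping or mtr will give more reliable numbers:

```typescript
// Approximates round-trip time by timing a TCP handshake (Node.js).
// A single sample is noisy; run several and compare the medians.

import { connect } from "node:net";
import { performance } from "node:perf_hooks";

function measureConnectMs(host: string, port = 443): Promise<number> {
  return new Promise((resolve, reject) => {
    const start = performance.now();
    const socket = connect({ host, port }, () => {
      socket.end();
      resolve(performance.now() - start);
    });
    socket.on("error", reject);
  });
}

// Compare a nearby CDN edge with a distant origin server (placeholder names).
for (const host of ["cdn-edge.example.com", "origin.example.net"]) {
  measureConnectMs(host)
    .then((ms) => console.log(`${host}: ~${ms.toFixed(1)} ms`))
    .catch((err) => console.error(`${host}: ${err.message}`));
}
```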

Web designers can also reduce the effects of latency by using several techniques. They can optimize images to make them smaller, eliminate render-blocking resources by moving <script> or <link> tags to the bottom of a page’s HTML code, or minify JavaScript and CSS files.
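
To decide which resources are worth optimizing first, the browser’s Resource Timing API can list the slowest ones on a page. A small sketch:

```typescript
// Lists the five slowest resources loaded by the current page using the
// Resource Timing API, which helps identify candidates for image
// optimization, minification, or deferred loading. Run it in the
// browser console or in a script on the page you want to inspect.

const resources = performance.getEntriesByType(
  "resource",
) as PerformanceResourceTiming[];

const slowest = [...resources]
  .sort((a, b) => b.duration - a.duration)
  .slice(0, 5);

for (const entry of slowest) {
  // initiatorType distinguishes images ("img"), scripts, stylesheets ("link"), etc.
  console.log(`${entry.initiatorType}: ${entry.name} took ${entry.duration.toFixed(0)} ms`);
}
```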

Glossary

Latency

A measure of the time between a request being made and fulfilled. Latency is usually measured in milliseconds.

Packet Loss

The percentage of data packets that are lost in transit between a source and a destination.
