What is latency?

Latency is the delay between a user’s action and an application’s response to it, i.e., the time it takes for a data packet to travel to its destination and back. It occurs in all data communication between two points, whether you are logging into a website, sending an email, or uploading a file to the cloud, to name a few examples.

How is it measured, and what causes it?

Latency is measured in units of time, most commonly milliseconds (ms). As for what causes it, latency is an unavoidable characteristic of networks: information takes time to travel between the devices that are communicating. What can be done is to reduce and mitigate it through various strategies, but doing so requires understanding the factors that affect latency.
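
To make this concrete, here is a minimal Python sketch (not a production tool) that estimates round-trip latency by timing how long a TCP connection takes to complete. The host name is only a placeholder, and utilities such as ping or dedicated monitoring platforms give far more rigorous measurements.

import socket
import time

def measure_latency(host, port=443, samples=5):
    """Estimate round-trip latency in milliseconds by timing TCP handshakes."""
    results = []
    for _ in range(samples):
        start = time.perf_counter()
        # The connection completing means a packet went out and a reply came back.
        with socket.create_connection((host, port), timeout=2):
            pass
        results.append((time.perf_counter() - start) * 1000)  # seconds -> ms
    return results

rtts = measure_latency("example.com")  # placeholder host, not a real target
print(f"average latency: {sum(rtts) / len(rtts):.1f} ms")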

4 main aspects that impact network latency

Form of transport

The physical path between the start and end points and the transmission medium can influence latency. For example, older networks based on copper wires have higher latency because they carry data as electrical signals. In contrast, fiber optics, which transmit data as pulses of light, have much lower latency and provide additional security benefits.

Propagation and connections

The farther apart the two connection points are, the higher the latency, since latency depends on the distance between the points that communicate with each other. One solution, therefore, is to provide more connection points (PoPs, or points of presence) so that packets have a shorter round trip to make.
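
As a rough back-of-the-envelope illustration, the Python sketch below estimates the theoretical round-trip propagation delay for a given distance, assuming light travels through fiber at roughly 200,000 km/s (about two thirds of its speed in a vacuum). Real links add routing and processing overhead on top of this, and the distance used is purely illustrative.

# Light in fiber covers roughly 200 km per millisecond.
SPEED_IN_FIBER_KM_PER_MS = 200.0

def round_trip_propagation_ms(distance_km):
    """One-way distance in km -> theoretical round-trip propagation delay in ms."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# Example: a 1,500 km path (an illustrative figure) gives ~15 ms at best.
print(f"{round_trip_propagation_ms(1500):.1f} ms")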

Routers

The router is responsible for delivering the internet connection to devices (computers, tablets, phones, etc.). Every router adds latency because it must process the data it receives, so the efficiency of the router model and type can help reduce connection latency.

Choosing a router model that processes data efficiently is as essential as designing the network to reduce hops between routers as much as possible, since each piece of equipment along the path must process the data and therefore adds latency.
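
If you are curious how many routers sit on a given path, one quick way is to count the hops reported by the operating system’s traceroute utility (tracert on Windows). The sketch below assumes a Unix-like system with traceroute installed and uses a placeholder destination.

import subprocess

# Each hop listed by traceroute is a router that must process the packet.
output = subprocess.run(
    ["traceroute", "-n", "example.com"],  # placeholder destination
    capture_output=True, text=True, check=True,
).stdout
hop_lines = [line for line in output.splitlines()[1:] if line.strip()]
print(f"{len(hop_lines)} hops on the path")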

Server-side processing

Accessing information stored on servers, especially when complex databases and searches within them are involved, generates latency, since the data packet cannot be sent back until the server obtains the requested information. Reducing this latency requires solid data transport infrastructure, efficient data centers, and optimized, properly structured databases.
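
As a simplified illustration of how database structure affects server-side latency, the sketch below times the same query against an in-memory SQLite table before and after adding an index. The table, columns, and row count are invented for the example; real workloads will behave differently.

import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    ((i, i % 10_000, float(i)) for i in range(500_000)),
)

def query_time_ms():
    start = time.perf_counter()
    conn.execute("SELECT SUM(total) FROM orders WHERE customer_id = ?", (42,)).fetchone()
    return (time.perf_counter() - start) * 1000

print(f"without index: {query_time_ms():.2f} ms")  # full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(f"with index:    {query_time_ms():.2f} ms")  # indexed lookup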

Latency, throughput, and network bandwidth

Optimal network performance depends on latency, bandwidth, throughput, and jitter, and it’s important not to confuse them.

Bandwidth is the maximum amount of data that can pass through the network at any given time. Throughput, by contrast, is the average amount of data that actually passes through the network in a given period.

Jitter is the variation in the latency of data packets traveling between a given origin and destination on the network. Like latency, it is expressed in milliseconds.

Together with latency, these metrics describe what network performance means, with throughput being a measurement that combines the information provided by bandwidth (amount of data) with that provided by latency (time).
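
To see how these metrics fit together, here is a small Python sketch that derives average latency, jitter, and throughput from sample measurements. All of the numbers are made up for illustration, and jitter is expressed as the standard deviation of the latency samples, which is one common convention.

from statistics import mean, stdev

latencies_ms = [21.3, 19.8, 24.1, 20.5, 22.9]  # round-trip samples (invented)
bytes_transferred = 50_000_000                 # data moved during a test
elapsed_s = 12.5                               # duration of that test

throughput_mbps = (bytes_transferred * 8) / elapsed_s / 1_000_000
jitter_ms = stdev(latencies_ms)

print(f"average latency: {mean(latencies_ms):.1f} ms")
print(f"jitter:          {jitter_ms:.1f} ms")
print(f"throughput:      {throughput_mbps:.1f} Mbps")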

How can we minimize latency?

In addition to addressing the four factors described above, you can use dedicated networks that speed up the transit of your packets and provide direct communication between the connection points. This lets you create a private network and avoid the congested public internet, obtaining a more efficient route with lower latency and giving your packets more security as they travel through the network.

At Flō Networks, we operate thousands of kilometers of highly interconnected fiber optic network across the continent, giving you the speed, reliability, and security you need to connect to the cloud and to the critical services that depend on the network, all with one of the lowest latencies in the market.

Learn more about our network solutions for your business.
