What is Latency?

Intermediate – Computing

Reading Time – 1 minute, 35 seconds

What is latency? It’s the time delay between a stimulus and the response it triggers, and its exact definition varies with context. In computing, it most often refers to network latency: the time taken for a request or data packet to travel from the sender to the receiver across the network, plus the time for the receiver to process the request and send a response back to the sender.

In the computer networking context, the sender is typically a browser and the receiver a server, and latency is the round-trip time (RTT) from the browser to the server and back again. Latency is a measure of delay, and ideally it would be zero, but several factors keep it above that. Latency has a significant impact on network performance, and consequently on the overall performance of any computer system that uses the network. Hence, low latency makes for a positive user experience, while high latency makes for a poor one.
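As a rough sketch of how round-trip time can be measured in practice, the snippet below times one small request against a throwaway local echo server (the server, port choice, and `measure_rtt` helper are all illustrative, not any standard tool):

```python
import socket
import threading
import time

def echo_server(sock):
    # Accept one connection and echo its message back.
    conn, _ = sock.accept()
    conn.sendall(conn.recv(1024))
    conn.close()

def measure_rtt(host, port):
    """Return the round-trip time, in seconds, for one small request."""
    start = time.perf_counter()
    with socket.create_connection((host, port)) as conn:
        conn.sendall(b"ping")
        conn.recv(1024)  # block until the response arrives
    return time.perf_counter() - start

# Start a throwaway echo server on an ephemeral local port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

rtt = measure_rtt("127.0.0.1", port)
print(f"round-trip time: {rtt * 1000:.3f} ms")
```

Against a loopback server the RTT is tiny; against a real remote server the same measurement would include network propagation and server processing time, which is exactly the delay the article describes.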

Disk latency is another type encountered in computing. It is the delay between the time data is requested from a storage disk and the time the requested data begins to arrive. The main factors affecting disk performance are rotational latency and seek time. In a traditional hard disk drive (HDD), the platters spin around a central axis while the drive head moves across them to reach the sectors being read or written. The platter rotation adds rotational latency, and the head movement adds seek time; together they make reading or writing many scattered files noticeably slower than reading or writing a single contiguous file.
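The contiguous-versus-scattered effect above can be sketched by timing block reads of one file in sequential order and then in random order. This is only an illustration: on an HDD the gap is large, while on an SSD or with the file already in the page cache it can shrink to almost nothing.

```python
import os
import random
import tempfile
import time

BLOCK = 4096    # read in 4 KiB blocks
BLOCKS = 2048   # 8 MiB scratch file

# Create a temporary scratch file to read back.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(BLOCK * BLOCKS))
    path = f.name

def timed_reads(offsets):
    """Time reading one block at each offset; return elapsed seconds."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
    return time.perf_counter() - start

sequential = [i * BLOCK for i in range(BLOCKS)]
scattered = random.sample(sequential, len(sequential))

t_seq = timed_reads(sequential)
t_rand = timed_reads(scattered)
print(f"sequential: {t_seq:.4f}s  random: {t_rand:.4f}s")
os.remove(path)
```

On spinning disks the random pass pays a seek plus rotational delay on nearly every block, which is why defragmented, contiguous files read faster.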

Random-access memory (RAM), central processing unit (CPU), audio, video, and optical fiber latency are a few other types in the computing domain. The common ground is that they all refer to a delay caused by a bottleneck somewhere in the respective subsystem.

Get Started Today With V2 Cloud!