Developer Dictionary

Latency

Definition

The delay before a transfer of data begins following an instruction for its transfer.

Deep Dive

In networking, latency is the time it takes for a data packet to travel from its source to its destination; more narrowly, it is the delay before a data transfer begins after the instruction to start it. Latency is a critical network performance metric, usually measured in milliseconds (ms), and it directly affects the responsiveness and perceived speed of applications and services. High latency leads to a frustrating user experience, especially in real-time applications such as gaming and video calls.
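As a minimal sketch of how latency can be measured in code, the snippet below times an operation several times with Python's `time.perf_counter` and reports the samples in milliseconds; the `measure_latency_ms` helper and the `time.sleep` stand-in for a network call are illustrative, not a specific library API:

```python
import time
import statistics

def measure_latency_ms(operation, runs=5):
    """Time an operation several times; return per-run latency in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        # Convert seconds to milliseconds, the usual unit for latency.
        samples.append((time.perf_counter() - start) * 1000.0)
    return samples

# Example: time a stand-in for a slow network call (a 10 ms sleep).
samples = measure_latency_ms(lambda: time.sleep(0.01), runs=3)
print(f"median latency: {statistics.median(samples):.1f} ms")
```

Reporting the median (or a high percentile such as p95) rather than the mean is common practice, since occasional slow outliers would otherwise dominate the average.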

Examples & Use Cases

  • Experiencing "lag" during an online multiplayer video game due to the time delay between your actions and the server's response.
  • A website taking several seconds to load its initial content, despite a high-speed internet connection.
  • A noticeable delay in voice transmission during an international video conference call.

Related Terms

Bandwidth, Throughput, Ping
