Bandwidth vs Latency: What Is the Difference?
Bandwidth is the maximum rate at which data can be transferred over a connection, while latency is the delay before data begins to arrive. Both affect the speed and quality of internet connections.
Bandwidth vs Latency
Bandwidth and latency both affect your internet experience, but in very different ways. Bandwidth is how much data can flow through a connection at once; latency is how long it takes for that data to start moving. Understanding both helps you diagnose slow connections and choose the right solution.
What Is Bandwidth
Bandwidth is the maximum amount of data that can be transferred over a network connection per second. A common analogy is the width of a pipe. A wider pipe allows more water to flow through at the same time, and a higher bandwidth connection allows more data to move simultaneously.
Bandwidth is measured in megabits per second (Mbps) or gigabits per second (Gbps). When your internet provider advertises a 100 Mbps plan, that is the theoretical maximum download speed available to you. In practice, actual speeds are usually somewhat lower due to network overhead, congestion, and hardware limitations.
- A 100 Mbps connection can theoretically download 12.5 megabytes of data per second
- Bandwidth is shared across all devices and applications on your network simultaneously
- Upload bandwidth and download bandwidth are often different, especially on consumer internet plans
- Streaming a 4K video, running a backup, and making a video call at the same time all compete for the same bandwidth pool
Bandwidth matters most for tasks that involve moving large amounts of data, such as downloading files, streaming high-definition video, or uploading content to a cloud service.
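The arithmetic behind the "12.5 megabytes per second" figure above is simple to sketch. The helper below is illustrative (the function name is ours, not a standard API): it converts an advertised megabits-per-second rate into a best-case download time by dividing by 8, since one byte is eight bits.

```python
def download_time_seconds(file_size_mb: float, bandwidth_mbps: float) -> float:
    """Best-case download time: convert megabits/s to megabytes/s (divide by 8),
    then divide the file size by that rate. Real transfers are slower due to
    overhead, congestion, and shared links."""
    megabytes_per_second = bandwidth_mbps / 8
    return file_size_mb / megabytes_per_second

# A 100 Mbps link moves at most 12.5 MB/s, so a 500 MB file takes at least 40 s.
print(download_time_seconds(500, 100))  # 40.0
```

Treat the result as a floor, not a promise: throughput on a real connection is almost always below the advertised bandwidth.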
What Is Latency
Latency is the time it takes for a single packet of data to travel from your device to a server and back again. It is measured in milliseconds (ms). Where bandwidth describes how much data can move, latency describes how quickly the connection responds to a request.
Latency is also referred to as ping or round-trip time (RTT). If you open a terminal and run ping google.com, the number reported back is your latency to Google's servers. A result of 20ms means your device receives a response within 20 milliseconds of sending a request.
- Latency under 20ms is considered excellent and typical for wired connections to nearby servers
- Latency between 20ms and 100ms is good for most applications including video calls
- Latency between 100ms and 200ms becomes noticeable in real-time applications
- Latency above 200ms causes visible lag in gaming and choppy audio or video in calls
Latency matters most for applications where real-time responsiveness is important, such as online gaming, live video calls, financial trading platforms, and remote desktop sessions.
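If you want to estimate latency from code rather than the ping command, one common approximation is to time a TCP handshake, since ICMP (which ping uses) often requires elevated privileges. This is a rough sketch, not a precise RTT measurement: the handshake adds a little overhead on top of pure network round-trip time.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443) -> float:
    """Rough latency estimate: time how long a TCP connection
    to the host takes to establish, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection established; close it immediately
    return (time.perf_counter() - start) * 1000

# Example usage (requires network access):
# print(f"{tcp_rtt_ms('google.com'):.1f} ms")
```

Run it a few times and take the median, since individual samples vary with congestion and scheduling noise.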
Bandwidth vs Latency: Key Differences
| Feature | Bandwidth | Latency |
|---|---|---|
| Definition | Data transfer capacity per second | Delay before data starts flowing |
| Units | Mbps, Gbps | Milliseconds (ms) |
| Analogy | Width of a pipe | Length of the pipe |
| High value means | Faster bulk transfers | More delay (lower is better) |
| Most important for | Streaming, downloads, uploads | Gaming, video calls, real-time apps |
| How to measure | Speed test download and upload result | Ping command or speed test ping result |
| How to improve | Upgrade your ISP plan | Use closer servers, a CDN, or a wired connection |
Typical Values for Common Activities
Different online activities have very different requirements for bandwidth and latency. A task that needs high bandwidth may tolerate high latency, and vice versa. Buffered video streaming, for example, downloads data ahead of time, so a slight delay before playback starts is acceptable. Online gaming sends tiny packets constantly and needs a near-instant response, but uses very little bandwidth overall.
| Activity | Bandwidth Needed | Acceptable Latency |
|---|---|---|
| HD Video Streaming | 5+ Mbps | Up to 500ms (buffered) |
| 4K Video Streaming | 25+ Mbps | Up to 500ms (buffered) |
| Video Calls (HD) | 3 to 5 Mbps | Under 150ms |
| Online Gaming | 3 to 5 Mbps | Under 50ms (ideal) |
| Basic Web Browsing | 1+ Mbps | Under 200ms |
| Large File Downloads | As high as possible | Not critical |
| Remote Desktop | 5 to 10 Mbps | Under 100ms |
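The table above can be turned into a simple connection check. The thresholds below are taken from the table; the dictionary keys and function name are illustrative, and the pass/fail logic is deliberately naive.

```python
# (min bandwidth in Mbps, max acceptable latency in ms), per the table above
REQUIREMENTS = {
    "4k_streaming": (25, 500),
    "video_call": (5, 150),
    "gaming": (5, 50),
    "remote_desktop": (10, 100),
}

def connection_ok(activity: str, bandwidth_mbps: float, latency_ms: float) -> bool:
    """True if a measured connection meets both thresholds for the activity."""
    min_bw, max_lat = REQUIREMENTS[activity]
    return bandwidth_mbps >= min_bw and latency_ms <= max_lat

# A gigabit connection with 200 ms ping still fails for gaming:
print(connection_ok("gaming", 1000, 200))  # False
print(connection_ok("4k_streaming", 50, 300))  # True
```

The gaming example makes the section's point concrete: no amount of extra bandwidth fixes a latency problem.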
What Causes High Latency
Several factors contribute to high latency, and not all of them are within your control. Understanding the causes helps you decide where to focus when trying to reduce lag.
- Physical distance: Data travels through fibre-optic cables at roughly two-thirds the speed of light in a vacuum, and even at that speed, longer distances add meaningful delay. Connecting to a server on the other side of the world adds tens of milliseconds compared to a local server.
- Network hops: Each router a packet passes through on its journey adds a small amount of processing delay. More hops between you and the destination means higher latency.
- Network congestion: When a network link is heavily loaded, packets queue up and wait their turn. This queuing delay can significantly increase latency during peak usage hours.
- Satellite internet: Geostationary satellites orbit around 35,000 kilometres above Earth. A signal must travel up to the satellite and back down for both the request and the response, which accounts for roughly 500ms of round-trip delay from signal travel time alone; total latency in practice is often 600ms or more. Low Earth orbit satellite services like Starlink reduce this significantly but still have higher latency than ground-based connections.
- Wi-Fi interference: Wireless connections introduce their own latency compared to wired Ethernet. Interference from other devices, walls, and distance from the router all add milliseconds.
- Server processing time: Latency includes not just travel time but also the time the server takes to process your request and prepare a response. A slow or overloaded server increases the total round-trip time.
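The distance and satellite figures above follow directly from propagation speed, and a short calculation makes them concrete. This sketch assumes roughly 200,000 km/s for light in fibre (about two-thirds of its vacuum speed) and straight-line paths, so real routes will be somewhat slower.

```python
SPEED_OF_LIGHT_KM_S = 299_792  # vacuum; light in fibre travels at roughly 2/3 of this

def propagation_rtt_ms(one_way_km: float, speed_km_s: float = 200_000) -> float:
    """Round-trip propagation delay for a given one-way distance, in ms.
    Default speed approximates light in fibre-optic cable."""
    return one_way_km * 2 / speed_km_s * 1000

# A route of ~17,000 km one way (roughly London to Sydney) through fibre:
print(round(propagation_rtt_ms(17_000)))  # 170 ms from distance alone

# Geostationary satellite: four legs of ~35,786 km at vacuum light speed:
print(round(propagation_rtt_ms(35_786 * 2, SPEED_OF_LIGHT_KM_S)))  # 477 ms minimum
```

These are lower bounds: routing detours, queuing, and server processing all add to what the physics requires.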
What Causes Low Bandwidth
Bandwidth problems are usually easier to identify than latency problems, and the solutions are more straightforward.
- ISP plan limits: Your subscribed plan sets the ceiling for available bandwidth. If your plan offers 50 Mbps and you need 200 Mbps, upgrading is the only real solution.
- Too many simultaneous users: Every device on your network shares the same bandwidth. Multiple streams, downloads, and video calls happening at once will each get a smaller share.
- Throttling: Some ISPs reduce speeds for certain types of traffic, such as video streaming or torrents, after a usage threshold is reached.
- Old hardware: An outdated router or network card may be incapable of sustaining the full speeds your ISP provides.
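The "too many simultaneous users" point can be approximated with an even-split model. This is a deliberate simplification: in reality TCP congestion control divides capacity unevenly and dynamically, but an equal share is a reasonable back-of-the-envelope estimate.

```python
def per_stream_mbps(plan_mbps: float, active_streams: int) -> float:
    """Naive even split: each concurrent stream gets an equal share of the plan.
    Real sharing depends on TCP congestion control and is rarely this tidy."""
    return plan_mbps / active_streams

# Three simultaneous 4K streams on a 50 Mbps plan get ~16.7 Mbps each,
# below the ~25 Mbps each one needs:
print(round(per_stream_mbps(50, 3), 1))  # 16.7
```

This is why a plan that feels fast for one person can stutter the moment the whole household is online.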
Frequently Asked Questions
- Can high bandwidth compensate for high latency?
No. Bandwidth and latency are independent dimensions of network performance. High bandwidth allows more data to transfer per second, but it does not reduce the initial delay before data starts arriving. A connection with high bandwidth and high latency will download large files quickly but will still feel sluggish for real-time applications like gaming or video calls.
- Why does my fast internet still feel slow for gaming?
Gaming is almost entirely sensitive to latency rather than bandwidth. Most online games send and receive only a few kilobytes of data per second, so bandwidth is rarely the bottleneck. What matters is how quickly the server responds to your inputs. A 10 Mbps connection with 20ms ping will play significantly better than a 1 Gbps connection with 200ms ping.
- What is throughput and how is it different from bandwidth?
Bandwidth is the theoretical maximum capacity of a connection. Throughput is the actual amount of data successfully transferred per second under real conditions. Throughput is almost always lower than the advertised bandwidth because of network overhead, packet loss, congestion, and protocol inefficiencies. When you run a speed test, the result you see is closer to throughput than to raw bandwidth.
- Does a wired connection reduce latency compared to Wi-Fi?
Yes. A wired Ethernet connection typically has lower and more stable latency than Wi-Fi. Wireless signals are subject to interference, signal degradation, and the overhead of the wireless protocol itself. For latency-sensitive applications like competitive gaming or video calls, plugging directly into your router with an Ethernet cable is one of the simplest improvements you can make.
- How do CDNs help with latency?
A Content Delivery Network (CDN) stores copies of content on servers distributed around the world. When you request a resource, the CDN serves it from the location closest to you rather than from a central origin server. This reduces the physical distance the data must travel and lowers latency. Major websites and streaming platforms use CDNs to deliver fast, consistent performance to users globally.
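The throughput answer above amounts to a single formula: bytes actually moved, divided by wall-clock time, converted to megabits. A minimal sketch (the function name is ours):

```python
def throughput_mbps(bytes_transferred: int, seconds: float) -> float:
    """Achieved throughput in megabits per second: actual bytes moved
    over wall-clock time. Compare against advertised bandwidth."""
    return bytes_transferred * 8 / seconds / 1_000_000

# 60 MB downloaded in 6 seconds on an advertised 100 Mbps plan:
print(throughput_mbps(60_000_000, 6))  # 80.0 Mbps achieved
```

Measuring a real transfer this way, rather than trusting the plan's headline number, tells you what your connection actually delivers.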
Conclusion
Bandwidth and latency are two separate dimensions of network performance that affect your experience in different ways. Bandwidth determines how much data can flow through a connection at once, which matters most for downloads, uploads, and streaming. Latency determines how quickly a connection responds to a request, which matters most for gaming, video calls, and any real-time interaction. Most connectivity problems are caused by one or the other, and identifying which one is the issue leads you to the right solution faster. To go further, learn how CDNs reduce latency by bringing content geographically closer, and explore how routing affects the number of hops your data travels.
