Throughput

Definition
Throughput is the actual rate at which data is successfully transmitted over a network or processed by a system over a given period of time. It is usually measured in bits per second (bps) or in multiples such as megabits per second (Mbps) and gigabits per second (Gbps). Unlike bandwidth, which is the theoretical maximum capacity of a link, throughput represents the real-world performance users actually experience.
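The definition above reduces to a simple ratio: bits successfully delivered divided by elapsed time. A minimal sketch (the function name is illustrative, not a standard API):

```python
def throughput_bps(bytes_transferred: int, seconds: float) -> float:
    """Throughput in bits per second: bits delivered / elapsed time."""
    return bytes_transferred * 8 / seconds

# 500 MB transferred in 40 s
print(throughput_bps(500_000_000, 40.0) / 1e6, "Mbps")  # 100.0 Mbps
```

Note the unit conversion: network rates are quoted in bits per second, while file sizes are usually reported in bytes, hence the factor of 8.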
Throughput can be reduced by latency, packet loss, congestion, and inefficiencies in the sending and receiving systems. High throughput is essential for applications that handle large volumes of data or require consistent performance.
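The combined effect of these factors can be illustrated with a toy model in which packet loss and protocol overhead each shave a fraction off the nominal link rate. This is a deliberately simplified sketch for intuition; the function name and the flat 5% overhead figure are assumptions, not a real TCP throughput formula:

```python
def effective_throughput(link_mbps: float, loss_rate: float,
                         protocol_overhead: float = 0.05) -> float:
    """Toy model: usable rate shrinks with packet loss and header overhead.

    loss_rate and protocol_overhead are fractions in [0, 1); the 0.05
    default overhead is an illustrative assumption, not a measured value.
    """
    return link_mbps * (1 - loss_rate) * (1 - protocol_overhead)

# A nominal 100 Mbps link with 2% packet loss delivers roughly 93 Mbps.
print(effective_throughput(100, 0.02))
```

Real-world behaviour is worse than this multiplicative model suggests, because loss also triggers retransmissions and congestion-control backoff, which is why the gap between bandwidth and throughput widens on lossy links.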
Advanced
At an advanced level, throughput is measured with tools that time how much data is successfully delivered end-to-end, counting completed transmissions and excluding failed or dropped packets. Techniques such as load balancing, traffic shaping, and error correction are used to optimise throughput.
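The measurement approach described above can be sketched as a timing harness: clock a transfer and count only the bytes reported as successfully delivered. The function and the stand-in transfer below are illustrative assumptions; a real probe would push the payload over a socket and count acknowledged bytes:

```python
import time

def measure_throughput_mbps(transfer, payload: bytes) -> float:
    """Time a transfer callable and report achieved throughput in Mbps.

    `transfer` must return the number of bytes actually delivered, so
    dropped or failed data is excluded from the result.
    """
    start = time.perf_counter()
    delivered_bytes = transfer(payload)
    elapsed = max(time.perf_counter() - start, 1e-9)  # guard tiny intervals
    return delivered_bytes * 8 / elapsed / 1e6

# Stand-in transfer that copies the buffer in memory.
rate = measure_throughput_mbps(lambda data: len(bytes(data)), b"x" * 1_000_000)
```

The key detail is that the numerator is *delivered* bytes, not *sent* bytes; that distinction is what separates throughput from raw transmission rate.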
In distributed systems and cloud environments, throughput is a critical factor for scaling services, managing APIs, and handling workloads like streaming, backups, or analytics.
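For API workloads, throughput is typically tracked as requests per second over a sliding window rather than bits per second. One possible sketch of such a counter (the class name and interface are assumptions for illustration, not a standard library API):

```python
import collections
import time

class ThroughputMeter:
    """Estimates request throughput (req/s) over a sliding time window."""

    def __init__(self, window_seconds: float = 1.0):
        self.window = window_seconds
        self.events = collections.deque()

    def record(self, now=None):
        # Log one completed request; `now` is injectable for testing.
        self.events.append(time.monotonic() if now is None else now)

    def rate(self, now=None):
        # Drop events older than the window, then average over it.
        now = time.monotonic() if now is None else now
        while self.events and self.events[0] < now - self.window:
            self.events.popleft()
        return len(self.events) / self.window
```

A scaling policy can then compare this observed rate against a target, adding capacity when sustained throughput approaches the limit a single instance can handle.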
Why it matters
Use cases
Metrics
Issues
Example
A streaming platform improves throughput by deploying a CDN and optimising compression. The result is faster video delivery with fewer interruptions, leading to higher user satisfaction.