Latency is the amount of time a message or data packet takes
to traverse a system. In computer networking, latency expresses how long it takes
a data packet to travel from a source system to a destination system. In practice
it is usually measured as the round-trip time: the time required for a packet to
reach the destination and be returned to its sender over the network.
If latency is low, network performance is good; if latency is high, there is
likely a problem somewhere along the communication path. Latency depends on the
speed of the transmission medium (e.g., copper wire, optical fiber or radio waves) and on the
delays introduced by devices along the way (e.g., routers and modems).
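As a quick illustration, here is a minimal Python sketch that estimates round-trip latency by timing a TCP handshake to a host. The host name example.com is just a placeholder, and timing a TCP connect is only one rough way to approximate latency (tools like ping use ICMP instead):

```python
import socket
import time

def tcp_connect_latency(host: str, port: int = 80) -> float:
    """Return the round-trip time (in milliseconds) of a single TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # the handshake itself is the measurement; close immediately
    return (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    # example.com is a placeholder target; replace with any reachable host
    for _ in range(3):
        print(f"latency: {tcp_connect_latency('example.com'):.1f} ms")
```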
Latency and throughput are
two important terms that are easy to confuse in data communication over a
network. Both are fundamental measures of network performance, but they describe
slightly different things. Latency measures the amount of time between the
start of an action and its completion; throughput is the total number of such
actions that occur in a given amount of time. Latency is measured in units of time (e.g.
seconds, milliseconds), whereas throughput is measured in volume of data per unit of
time (e.g. GB/hr).
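To make the distinction concrete, here is a small Python sketch that records per-request latency and overall throughput for a few HTTP downloads. The URL is a placeholder, and this is only a rough measurement under the assumption that the requests run one after another:

```python
import time
import urllib.request

URL = "https://example.com/"  # placeholder endpoint; swap in any URL you can reach

def measure(url: str, requests: int = 5) -> None:
    """Report per-request latency (seconds) and overall throughput (bytes/second)."""
    latencies, total_bytes = [], 0
    wall_start = time.perf_counter()
    for _ in range(requests):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as resp:
            total_bytes += len(resp.read())
        latencies.append(time.perf_counter() - start)
    wall = time.perf_counter() - wall_start
    print(f"average latency : {sum(latencies) / len(latencies):.3f} s per request")
    print(f"throughput      : {total_bytes / wall:.0f} bytes/s over {requests} requests")

if __name__ == "__main__":
    measure(URL)
```

Note how the two numbers can move independently: each request might complete quickly (low latency) while the total data moved per second stays small, or individual requests might be slow while many of them in parallel still deliver a high throughput.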