A delay is the amount of time that elapses between two events. This may refer to the start and end of a single event (e.g., data-processing delay) or to the occurrence of two separate, independent events.
A delay may be introduced deliberately by the programmer using the sleep or pause functions common in many programming languages.
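As a minimal sketch of a deliberate delay, the snippet below uses Python's standard `time.sleep` to pause execution and then measures how long the pause actually lasted (the wait_then_report helper name is illustrative, not from any particular library):

```python
import time

def wait_then_report(delay_seconds: float) -> float:
    """Block for delay_seconds using time.sleep, then return the measured elapsed time."""
    start = time.monotonic()
    time.sleep(delay_seconds)  # programmer-introduced delay
    return time.monotonic() - start

elapsed = wait_then_report(0.1)
print(f"requested 0.1 s, slept for {elapsed:.3f} s")
```

Note that `sleep` guarantees only a minimum delay; the operating system's scheduler may wake the process slightly later than requested.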
In computer networks, the delay between the sending and receiving of a data packet is referred to synonymously as latency.
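To illustrate how latency is measured as the time between a send and the corresponding receive, the sketch below times a round trip over a local socket pair; this is an assumed, loopback-only analogue of measuring network latency between two hosts:

```python
import socket
import time

# Create two connected sockets on the local machine (stand-ins for two hosts).
a, b = socket.socketpair()

start = time.monotonic()
a.sendall(b"ping")        # "send" event
b.sendall(b.recv(4))      # peer echoes the packet back
a.recv(4)                 # "receive" event
latency = time.monotonic() - start

print(f"round-trip latency: {latency * 1e6:.1f} microseconds")
a.close()
b.close()
```

Real network measurements work the same way in principle, but the round-trip time is dominated by propagation and queuing delays rather than local scheduling overhead.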