I'm implementing a client/server application and I'm trying to measure the elapsed time on the client side, but it is sometimes less than the elapsed time on the server side. As shown in the code snippets below, the client (the delay object creates a socket connection) sends a request to the server side, which always spends 500ms executing and then responds to the client.
On the client side, the code below sends 10 requests:
public class ThreadTest {
    public static void main(String[] args) {
        // Hypothetical declaration added so the snippet compiles;
        // DelayClient is the socket stub sketched after this snippet.
        DelayClient delay = new DelayClient("server-host", 9000);
        long b = 0;
        long e = 0;
        for (int i = 0; i < 10; i++) {
            b = System.currentTimeMillis();
            delay.delay(); // opens a socket, sends a request, blocks for the reply
            e = System.currentTimeMillis();
            System.out.println("Elapsed time " + (e - b) + "ms");
        }
    }
}
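The class that actually performs the socket call is not shown in the original. For context, here is a minimal sketch of what such a client-side stub might look like, assuming a plain blocking TCP round-trip; the class name DelayClient, the host, and the port are placeholders, not from the original code:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

// Hypothetical client-side stub: opens a TCP connection to the server,
// sends one request line, and blocks until the server's reply arrives.
public class DelayClient {
    private final String host;
    private final int port;

    public DelayClient(String host, int port) {
        this.host = host;
        this.port = port;
    }

    public void delay() {
        try (Socket socket = new Socket(host, port);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            out.println("delay"); // send the request
            in.readLine();        // block until the server responds
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}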
On the server side, this is the class the server socket application executes:
public class Delay {
    public void delay() {
        long ini = System.currentTimeMillis();
        long end = System.currentTimeMillis();
        // busy-wait until at least 500ms have elapsed
        while ((end - ini) < 500)
            end = System.currentTimeMillis();
        System.out.println("Elapsed time on server side " + (end - ini) + "ms.");
    }
}
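The server application that accepts connections and invokes Delay.delay() is also not shown. Here is a minimal sketch assuming a single-threaded ServerSocket loop; the class name DelayServer and the port are placeholders:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Hypothetical single-threaded server loop: accepts one connection at a
// time, runs the 500ms Delay, then replies to the client.
public class DelayServer {
    public static void main(String[] args) throws Exception {
        Delay delay = new Delay();
        try (ServerSocket server = new ServerSocket(9000)) {
            while (true) {
                try (Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    in.readLine();  // read the request
                    delay.delay();  // busy-wait ~500ms
                    out.println("done"); // respond to the client
                }
            }
        }
    }
}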
Both client and server are running on different KVM virtual machines.
The result on screen:
Client side:
root@camid00:~/camid# Elapsed time 744ms
Elapsed time 572ms
**Elapsed time 452ms**
Elapsed time 701ms
Elapsed time 580ms
Elapsed time 592ms
**Elapsed time 468ms**
**Elapsed time 424ms**
Elapsed time 588ms
Elapsed time 632ms
**Elapsed time 380ms**
Server side:
Elapsed time on server side 500ms.
Elapsed time on server side 504ms.
Elapsed time on server side 500ms.
Elapsed time on server side 500ms.
Elapsed time on server side 500ms.
Elapsed time on server side 500ms.
Elapsed time on server side 500ms.
Elapsed time on server side 500ms.
Elapsed time on server side 500ms.
Elapsed time on server side 501ms.