I want to see how an HTTP client reacts to connection timeouts, i.e., where there's a server listening on a port, but the process of setting up the connection is so slow that the client gives up and returns a connection timeout. The connection shouldn't be refused, and it shouldn't be accepted and then followed by a socket timeout.
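To be concrete about the behaviour I'm after, here's a minimal client-side sketch (host, port and timeout values are just placeholders): the connect() call itself should fail with a SocketTimeoutException while the connection is still being set up.

import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class ConnectTimeoutDemo {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket()) {
            // This should time out during connection setup -- not get
            // "connection refused", and not connect and then time out on a read.
            socket.connect(new InetSocketAddress("localhost", 8080), 2000);
        } catch (SocketTimeoutException e) {
            System.out.println("connect timed out: " + e.getMessage());
        }
    }
}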
So far, I've attempted to introduce network delay by overriding ServerSocket, imagining that I would be able to write something like this...
public class SlowServerSocket extends ServerSocket {
    private long delay; // set to something longer than the client's connection timeout

    // (This method doesn't actually exist.)
    @Override
    public void processBytesPassedOnByOperatingSystem(byte[] bytes) throws InterruptedException {
        Thread.sleep(delay);
        // Client has already returned a connection timeout.
        super.processBytesPassedOnByOperatingSystem(bytes);
    }
}
...but I run into a dead end when I get to that level of abstraction (it seems to be hidden in a native method). My fake server accepts connections and then does nothing, causing a socket timeout.
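In other words, my current fake server boils down to something like this sketch (port and names are placeholders), which produces exactly the behaviour I don't want:

import java.net.ServerSocket;
import java.net.Socket;

public class DoNothingServer {
    public static void main(String[] args) throws Exception {
        try (ServerSocket serverSocket = new ServerSocket(8080)) {
            // The operating system completes the TCP handshake as soon as the
            // client connects (accept() just hands the connection over), so the
            // client's connect() succeeds straight away.
            Socket client = serverSocket.accept();
            // Never reading or writing anything then produces a *socket*
            // timeout on the client, not the connection timeout I'm after.
            Thread.sleep(Long.MAX_VALUE);
        }
    }
}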
I've also looked for solutions online, and I came across SlowSocket from the JMeter library, but it seems to be used on the client side (and I don't think I'll be able to override the client so that it uses SlowSocket).
Any suggestions?