Background
I have an implementation of an HTTP server in C#. Benchmarking it with ab (ApacheBench) I discovered a weird performance issue: each request took 5 ms with keep-alive off, but 40 ms with keep-alive on!
The test page is generated into a single byte[], which is sent as the reply using a single socket.Send call.
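In sketch form, the reply path looks roughly like this (the exact headers and the SendResponse wrapper are illustrative, not my real code):

    using System.Net.Sockets;
    using System.Text;

    static void SendResponse(Socket socket, string body)
    {
        // The whole reply (status line, headers and body) ends up in one
        // byte[] and goes out in a single Send call.
        string headers = "HTTP/1.1 200 OK\r\n"
                       + "Content-Type: text/html; charset=utf-8\r\n"
                       + "Content-Length: " + Encoding.UTF8.GetByteCount(body) + "\r\n"
                       + "Connection: keep-alive\r\n\r\n";
        byte[] reply = Encoding.UTF8.GetBytes(headers + body);
        socket.Send(reply);
    }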
As far as I can tell, the cause is Nagle's algorithm in the TCP stack: it holds back small segments until earlier data has been acknowledged, and on a persistent connection that (presumably together with the peer's delayed ACKs) can add tens of milliseconds per request.
TCP Flush?
So far I am toggling the NoDelay property at the end of every HTTP request served:
    socket.NoDelay = true;   // momentarily disable Nagle's algorithm
    socket.NoDelay = false;  // ...then restore the default
This does solve the problem for now, but I have no documentation to back up my discovery.
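For context, here is roughly how I wrap the toggle (the extension method and its name are my own, just for illustration):

    using System.Net.Sockets;

    static class SocketExtensions
    {
        // Undocumented as far as I can tell: flipping NoDelay on appears to
        // push anything Nagle's algorithm is holding back onto the wire;
        // flipping it off again restores the default behaviour for the rest
        // of the keep-alive connection.
        public static void FlushNagle(this Socket socket)
        {
            socket.NoDelay = true;
            socket.NoDelay = false;
        }
    }

It gets called right after the final Send for each request.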
This was tested on a Linux/Mono system.
Is there a standard way of flushing the TCP connection?
Related
This answer addresses the same issue. The difference here is that I am looking to disable the algorithm only temporarily.
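That is, the linked answer presumably sets NoDelay once and leaves it on:

    socket.NoDelay = true;  // Nagle stays disabled for the connection's lifetime

whereas I want it off only for the instant it takes to flush the final Send.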