I've noticed, on some websites, recurrent HTTP GET requests that each take a long time to complete yet return only a very small amount of data (around five lines of text).
Like a heartbeat, these requests are chained so that there is always one pending in the background: as soon as one returns, the next is issued.
The pattern appears on multiple well-known websites; Gmail and Facebook, for example, both use this technique for their heartbeats.
How can one reproduce this behaviour?
And why would someone use this technique on their website?
Edit:
My hypothesis is that this lets them control the refresh interval of all clients by adjusting a single value in the server application.
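For context, the behaviour I'm describing is what I understand to be "long polling": the server holds each request open until it has data (or a timeout expires), and the client immediately re-requests. Below is a minimal sketch of how I imagine it could be reproduced; the `/poll` endpoint name, the timeout value, and the helper names are all my own assumptions, not anything taken from Gmail or Facebook.

```python
# Minimal long-polling sketch (endpoint and names are hypothetical).
# The server holds each GET /poll open until a message is published or a
# timeout elapses, which produces the "slow request, tiny response" pattern.
import json
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

MESSAGES = []                 # pending events to deliver to the client
EVENT = threading.Event()     # signalled when a new message is queued
POLL_TIMEOUT = 2.0            # server-side hold time; a single tunable value

class PollHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/poll":
            self.send_error(404)
            return
        # Block until data exists or the timeout expires (the long wait).
        EVENT.wait(timeout=POLL_TIMEOUT)
        body = json.dumps({"messages": MESSAGES[:]}).encode()
        MESSAGES.clear()
        EVENT.clear()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def publish(msg):
    """Queue a message and wake any handler currently waiting on it."""
    MESSAGES.append(msg)
    EVENT.set()

def client_poll_once(base_url):
    # A real client would loop: the moment one request returns, it issues
    # the next, so there is always something "in flight" in the background.
    with urllib.request.urlopen(base_url + "/poll") as resp:
        return json.loads(resp.read())["messages"]

if __name__ == "__main__":
    server = ThreadingHTTPServer(("127.0.0.1", 0), PollHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    base = "http://127.0.0.1:%d" % server.server_address[1]
    threading.Timer(0.5, publish, args=("hello",)).start()
    got = client_poll_once(base)   # this call is held until publish() fires
    print(got)
    server.shutdown()
```

Note how `POLL_TIMEOUT` lives only on the server, which would explain the hypothesis above: changing that one value changes the effective refresh cadence of every connected client without touching any client code.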