
I've noticed, on some websites, recurrent HTTP GET requests that each take an equally long time to download a very small amount of data (around 5 lines of text).

Like a heartbeat, these requests are chained so that there is always one pending in the background.

Consistently long content download times

This is present on multiple well-known websites. Gmail and Facebook, for example, both use this technique for their heartbeats:


How can one reproduce this behaviour?

And why would someone use this technique on their website?

Edit:

My hypothesis is that the site can control the refresh timing of all clients by adjusting a single value in the server application.

icosamuel

1 Answer


Most likely this is an implementation of long polling. It's arguably a hack to simulate push updates to the browser, enabling real-time updates of the page as soon as something of importance happens on the server.
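A minimal sketch of the server side of long polling, in Python for self-containedness (real sites do this in their own stacks, and the endpoint, payload, and 30-second timeout here are illustrative assumptions): the server simply holds the GET request open until an event fires, which is exactly what produces the "consistently long content download times" for tiny responses seen in the network tab.

```python
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Shared event the server waits on; in a real app this would be
# "new mail arrived", "new chat message", etc.
new_data = threading.Event()
payload = b"hello from the server"

class LongPollHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hold the request open until something happens (or we time out).
        # This blocking wait is the essence of long polling.
        happened = new_data.wait(timeout=30)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(payload if happened else b"timeout")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), LongPollHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate a server-side event firing two seconds from now.
threading.Thread(target=lambda: (time.sleep(2), new_data.set()),
                 daemon=True).start()

# The client: one blocking GET that returns as soon as the event fires.
# A real client would immediately issue the next request in a loop,
# which is why "there is always something going on in the background".
start = time.time()
body = urlopen(f"http://127.0.0.1:{server.server_port}/").read()
elapsed = time.time() - start
print(body.decode(), round(elapsed, 1))
server.shutdown()
```

Note that the response is tiny but the request still takes ~2 seconds: the download time is dominated by the server deliberately waiting, not by transfer.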

deceze
  • Thanks! It seems so obvious now. A wise use of web technologies. Quickly found myself reading this one: http://stackoverflow.com/questions/333664/how-to-implement-basic-long-polling – icosamuel Oct 20 '15 at 14:15