I have a site that loads many individual pieces of data from a remote server - that is, a public API server separate from my webserver. The site makes 17 Ajax calls on load, which is well beyond the browser's limit on concurrent connections to a single host (typically around six), so firing them all directly from the browser would leave most requests queued behind the slow ones and make performance even worse.
Some of these Ajax calls take a long time: the three slowest take roughly 28, 20, and 19 seconds; most take 4-8 seconds; and a few finish in under a second.
To work around this, the site currently batches many of these requests through a PHP script on my webserver, which acts as a middleman: the browser sends it a list of URLs, the webserver (which has no such concurrent-connection limit) makes the API requests on the client's behalf, and once they have all completed it forwards the API server's responses back to the browser.
//sent by browser to webserver
{
"events_data": {
"url": api_url + "/events/properties/events_data",
"requestType": "GET",
"headers": {},
"parameters": {}
},
"transactions": {
//...
},
//...
}
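For reference, the browser fires this whole batch off in a single request, roughly like the sketch below (the /batch.php endpoint name and the fetch usage are illustrative, not my exact code):

//illustrative sketch - endpoint name and response handler are placeholders
var batch = {
    "events_data": {
        "url": api_url + "/events/properties/events_data",
        "requestType": "GET",
        "headers": {},
        "parameters": {}
    },
    "transactions": { /* ... */ }
    //...
};

fetch("/batch.php", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(batch)
})
    .then(function (res) { return res.json(); })
    .then(handleBatchResponse); //nothing arrives until every API call has resolved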
//returned by webserver to browser after all requests resolve
{
"events_data": {
"response": [
{ /* ... */ },
{ /* ... */ },
// ...
],
"status": {
"http_code": 200,
"content_type": "application/json; version=0.1",
//...
},
"responseHeaders": {
"Content-Length": 12345,
"Connection": "keep-alive",
//...
}
},
"transactions": {
//...
},
//...
}
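And on the client, consuming that batched response currently looks something like this (the render functions are placeholders for my real per-dataset handlers):

//placeholders for the real per-dataset handlers
function handleBatchResponse(batch) {
    renderEvents(batch.events_data.response);
    renderTransactions(batch.transactions.response);
    //...one call per key; none of these can run until the slowest
    //API request in the batch has finished on the webserver
}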
One big issue I have with this workaround is that the client has to wait for the webserver to collect every HTTP response in the batch before it can see any of the data, so even the fastest results are delayed by the slowest call (the 28-second one). I know I could fix this by pushing results to the browser as they arrive with something like Socket.io, but that seems like overkill.
If this is a common issue, what solutions exist to deal with it?
If it's not, how do sites that rely heavily on data from APIs avoid or mitigate this issue?