
I am not really sure it is possible in JavaScript, so I thought I'd ask. :)

Say we have 100 requests to be done and want to speed things up.

What I was thinking of doing is:

  • Create a loop that will launch the first 5 ajax calls
  • Wait until they all return (on success, call a function to update the DOM; on error, handle it) - not sure how, maybe with a global counter?
  • Repeat until all requests are done.

Considering that browser JavaScript does not support threads, can we "exploit" the async functionality to do that? Do you think it would work, or are there inherent problems with doing that in JavaScript?
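For illustration, the batching plan described above could be sketched roughly like this (a minimal sketch, not a definitive implementation; `runInBatches` and the synchronous stand-in jobs are hypothetical, and in a browser a real ajax call would take each job's place):

```javascript
// Run jobs in groups of `batchSize`, using a per-batch counter (the
// "global counter" idea) to detect when the whole batch has returned.
function runInBatches(jobs, batchSize, done) {
  let index = 0;
  const results = [];

  function runNextBatch() {
    if (index >= jobs.length) { done(results); return; }
    const batch = jobs.slice(index, index + batchSize);
    index += batch.length;
    let remaining = batch.length; // counter for this batch

    batch.forEach((job) => {
      job((result) => {
        results.push(result);
        remaining -= 1;
        // only start the next batch once all jobs in this one returned
        if (remaining === 0) runNextBatch();
      });
    });
  }

  runNextBatch();
}

// Demo with synchronous stand-in "requests" (each would be an ajax call):
const jobs = [];
for (let n = 1; n <= 12; n++) {
  jobs.push((cb) => cb(n * 2));
}
let finished = null;
runInBatches(jobs, 5, (res) => { finished = res; });
```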

Frunsi
johnjohn
  • Added php tag, since it might be relevant for a good solution. – BGerrissen Nov 21 '10 at 23:17
  • BGerrissen: OK, PHP is widely used for server-side web scripting and such, but this question is absolutely not related to PHP! – Frunsi Nov 21 '10 at 23:25
  • @frunsi For Java, DWR speeds up concurrent Ajax requests quite nicely; there must be a PHP equivalent. So PHP can most definitely be relevant. The question is mostly "want to speed things up", and there might be other PHP developers that can help this PHP developer. – BGerrissen Nov 21 '10 at 23:34
  • Wait, I'm confused. Should every client side Ajax question be tagged with PHP then? – JJJ Nov 21 '10 at 23:40
  • @Juhana, nah, in the comment thread of my answer, PHP on the backend was mentioned ;) – BGerrissen Nov 21 '10 at 23:42

3 Answers


Yes, I have done something similar to this before. The basic process is:

  1. Create a stack to store your jobs (requests, in this case).
  2. Start out by executing 3 or 4 of the requests.
  3. In the callback of the request, pop the next job out of the stack and execute it (giving it the same callback).
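The process above can be sketched as follows (a hypothetical sketch; `processStack` and the stand-in jobs are made up for illustration, and each job would wrap a real ajax request in practice):

```javascript
// Keep a stack of jobs; each completing request pops and runs the next
// job, reusing the same callback, so a fixed number stay in flight.
function processStack(stack, concurrency, onAllDone) {
  let pending = stack.length;
  const results = [];

  function next() {
    if (stack.length === 0) return;
    const job = stack.pop();
    job(function onDone(result) {
      results.push(result);
      pending -= 1;
      if (pending === 0) onAllDone(results);
      else next(); // the finished request starts the next job
    });
  }

  // start out by executing the first few requests
  for (let i = 0; i < concurrency; i++) next();
}

// Demo with synchronous stand-in jobs:
const jobsStack = [];
for (let n = 1; n <= 10; n++) jobsStack.push((cb) => cb(n));
let collected = null;
processStack(jobsStack, 3, (res) => { collected = res; });
```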
Chris Laplante
  • This page is pertinent: http://stackoverflow.com/questions/561046/how-many-concurrent-ajax-xmlhttprequest-requests-are-allowed-in-popular-browser It discusses the limits each browser places on the number of concurrent Ajax requests. – Dancrumb Nov 21 '10 at 22:51
  • Interesting link, thank you. I would assume that the browser temporarily suspends Ajax requests that occur over the limit. Something else to consider. What I don't know, however, is whether the browser suspends them synchronously or asynchronously (i.e., does it lock the UI while it is waiting for the other Ajax requests to finish?) – Chris Laplante Nov 21 '10 at 22:55
  • The request is queued (i.e. it doesn't start until another request completes). For a sync request, the browser will be locked for the duration of the queue and the request. There is no locking during either for async. – Matt Nov 21 '10 at 22:58

I'd say the comment from Dancrumb is the "answer" to this question, but anyway...

Current browsers do limit concurrent HTTP requests, so you can easily just start all 100 requests immediately, and the browser will take care of sending them as fast as possible, limited to a reasonable number of parallel requests.

So, just start them all immediately and trust the browser.

However, this may change in the future (the number of parallel requests a browser sends tends to increase as end-user internet bandwidth increases and technology advances).

EDIT: You should also think and read about the meaning of "asynchronous" in a JavaScript context. Asynchronous here just means that you give up control of something to another part of the system. So "sending" an async request just means that you tell the browser to do so! You do not control the browser; you just tell it to send that request and to notify you about the outcome.
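The "start them all and count completions" approach could look like this (a minimal sketch; `fireAll` and `startRequest` are hypothetical names, with `startRequest` standing in for the real ajax call the browser will queue and throttle on its own):

```javascript
// Fire every request immediately; a single counter tells us when the
// last one has returned, and an indexed array keeps results in order.
function fireAll(urls, startRequest, onComplete) {
  let remaining = urls.length;
  const responses = new Array(urls.length);

  urls.forEach((url, i) => {
    startRequest(url, (response) => {
      responses[i] = response;     // store by index, not arrival order
      remaining -= 1;
      if (remaining === 0) onComplete(responses);
    });
  });
}

// Demo with a synchronous stand-in for the real ajax call:
const startedOrder = [];
function fakeStart(url, cb) { startedOrder.push(url); cb(url.toUpperCase()); }
let gathered = null;
fireAll(['/a', '/b', '/c'], fakeStart, (res) => { gathered = res; });
```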

Frunsi
  • So, a simple loop to execute them all and let the browser handle the limiting/queuing itself. Sounds nice. :) Thank you. – johnjohn Nov 21 '10 at 23:13
  • @john: Exactly :) At least try this first and check whether it works fine for your application (whether the server can handle it and whether the browsers' parallel request limits are satisfactory for your usage scenario) – Frunsi Nov 21 '10 at 23:25

It's actually slower to break 100 requests into batches of 5 and wait for each batch to complete before sending the next. You might be better off simply sending all 100 requests; remember, JavaScript is single-threaded, so it can only resolve one response at a time anyway.

A better way is to set up a batch request service that accepts something like:

/ajax_batch?req1=/some/request.json&req2=/other/request.json

And so on. Basically, you send multiple requests in a single HTTP request. The response to such a request would look like:

[
   {"reqName":"req1","data":{}},
   {"reqName":"req2","data":{}}
]

Your ajax_batch service would resolve each request and send back the results in the proper order. Client side, you keep track of what you sent and what you expect, so you can match the results to the correct requests. The downside is that it takes quite some coding.

The speed gain would come entirely from a massive reduction in HTTP requests. There's a limit on how many requests you can send this way, because URL length is limited, IIRC.

DWR does exactly that, AFAIK.
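Client side, the batching described above might be sketched like this (a hypothetical sketch: the `/ajax_batch` endpoint, the `reqN` parameter names, and the response shape are taken from this answer's example, not from a real API, and `buildBatchUrl` / `matchResponses` are made-up helper names):

```javascript
// Build one URL carrying all sub-requests as numbered parameters.
function buildBatchUrl(requests) {
  const params = requests
    .map((path, i) => 'req' + (i + 1) + '=' + encodeURIComponent(path))
    .join('&');
  return '/ajax_batch?' + params;
}

// Match each entry of the batch response, e.g.
// [{"reqName":"req1","data":{}}, ...], back to its original request.
function matchResponses(requests, responseJson) {
  const byName = {};
  responseJson.forEach((entry) => { byName[entry.reqName] = entry.data; });
  return requests.map((path, i) => ({ path, data: byName['req' + (i + 1)] }));
}
```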

BGerrissen
  • Thank you. It is interesting and would be a solution if my back-end were Java. But since it is PHP and there are no threads there, I would be looking for some pseudo-threads implementation in PHP. I hoped to avoid that. – johnjohn Nov 21 '10 at 23:09
  • No need to implement pseudo-threads; you can just batch-handle the requests one by one in a PHP service and output the results as JSON/XML/whatever. Again, the speed gain mostly comes from reducing HTTP requests. – BGerrissen Nov 21 '10 at 23:13
  • I am not sure I understand. If the backend script finishes one task in 5 seconds, what good is making an /ajax_batch script that will call the backend sequentially? Won't it be 5*5 = 25? – johnjohn Nov 21 '10 at 23:17
  • 5 seconds sounds like an awful lot for a backend script... In any case, the server round trip for each request typically takes longer than the backend processing. By reducing the round trips (even with async overlaps) you can get a speed gain of 50+% at the very least. By sending requests 5 at a time, you lose speed since you give up any beneficial parallel requests and basically put a pause in between batches of requests as well. – BGerrissen Nov 21 '10 at 23:24
  • Let me put it this way: if a browser runs at most 10 parallel requests at any given moment, you might think that running 10 requests at a time is a solution, but you'd forget that when a request finishes, the browser pops the next one from the queue. That queue popping is what you lose when you send 5 requests each time and only send the next 5 when the first 5 are done. Ergo, it will actually take longer to resolve all the requests. By concatenating all requests into a single request, you circumvent the queue entirely since you only have 1 HTTP request. – BGerrissen Nov 21 '10 at 23:29
  • Oh, and if your backend script takes, say, 5 seconds per request, then it's safe to say that time spent is constant however you implement your ajax requests. It's altogether another bottleneck. – BGerrissen Nov 21 '10 at 23:30
  • Ah, yes, now I see what you mean. :-) Unfortunately, in my case the backend script does indeed take a few seconds because it fetches and performs analysis of web pages. So, if a third-party server is slow, the script gets slower. The idea was to speed things up with parallel requests (so, in effect, Apache handles the "threading") from the JavaScript front-end (that would eventually put more strain on the web server, but it is acceptable). – johnjohn Nov 21 '10 at 23:32
  • Hmm, I see. Then I suggest you rephrase your question to include possible existing PHP solutions and add the PHP tag (yourself...). What you could use most is a PHP version of DWR. – BGerrissen Nov 21 '10 at 23:38
  • Thank you very much for your interest in this. I will eventually look around for a PHP-based solution since it is probably more robust that way. For the time being, I will go with frunsi's suggestion. – johnjohn Nov 21 '10 at 23:47