0

I'll get straight to the point!

When the user submits a search, my JavaScript sends about 20 AJAX requests to my PHP script, which answers each one via an external web API. The results are stored in an array in the session.

I've read that browsers will only allow 2 simultaneous requests to a server.

My first problem is that while more than one request is still waiting for a response, the AJAX "add to basket" request won't work, since it has to wait for the others to complete.

My second (and more annoying) problem is that the two requests being handled simultaneously seem to be overwriting each other, so that when all the responses are complete only half of them are in the session array: either all the odd ones or all the even ones, depending on whether the final request was odd or even.

I'd prefer not to send the requests one at a time (i.e. only sending the next when the last has finished), as that would slow things down for the user a fair bit.

Is there a solution to this session overwriting or should I be using a completely different approach altogether?

Thanks all!


Edit:
It's for checking domain availability. The user searches for "mydomain" and results for com, net, org, etc are eventually presented.

Sending a single request and having the script search all TLDs in one go means that no response is returned until all the results are in. The results for some TLDs can take up to and over 30 seconds, during which the user gets no feedback except a spinning icon and "Please Wait" (this is what happens when JavaScript isn't enabled).

Separate requests allow me to display each domain's availability as it comes in.

I'm currently thinking along the lines of sending a single request and then using JavaScript's setInterval to repeatedly check the session until all the results are in.
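The polling idea above can be sketched like this. This is only an illustration: `fetchStatus(cb)` is a hypothetical wrapper around the AJAX call to a PHP script that reads the session and returns an object like `{ done: bool, results: ... }`.

```javascript
// Poll a status endpoint until every TLD result is in.
// fetchStatus(cb) is a hypothetical stand-in for the real AJAX call.
function pollResults(fetchStatus, onUpdate, onComplete, intervalMs) {
  var timer = setInterval(function () {
    fetchStatus(function (status) {
      onUpdate(status.results);      // update the page as results arrive
      if (status.done) {
        clearInterval(timer);        // stop polling once everything is in
        onComplete(status.results);
      }
    });
  }, intervalMs || 1000);
}
```

The page stays responsive because only one cheap status request is in flight at a time, leaving the browser's connection limit free for things like "add to basket".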

Nick
  • 229
  • 5
  • 12

8 Answers

4

I think you should start refactoring your solution:

  1. All the performance guidelines state that you should minimize the number of HTTP requests. 20 is too many.
  2. If you have a shared resource, you need to lock and unlock the parts where you manipulate it, to prevent two or more requests from updating it at the same time.
Daniel Silveira
  • 41,125
  • 36
  • 100
  • 121
  • Is sending so many AJAX request any different to, say, a gallery page which has to load numerous images? – Nick Dec 08 '08 at 15:47
  • 2
    It's VERY different from a performance standpoint. The amount of processing that your webserver has to do to answer requests for, and return the contents of, 20 static files (images or whatever) is absolutely minimal in comparison to what it must do to answer 20 discrete AJAX requests. The amount of processing that the browser has to do for static files is also comparatively minimal. There is a superficial similarity in that the number of HTTP requests issued from the client is the same in both cases, but really - that number is not what you should be basing your decisions on here. – glomad Jul 23 '09 at 17:18
  • 1
    It would be far more economical to issue one AJAX request, let your back end do whatever number of searches it needs, return the result set as JSON or some such, and deal with it in the browser. Imagine 500 users simultaneously using your current system - that's 10000 requests in a very short span of time. Your performance will become exponentially worse as you scale up. – glomad Jul 23 '09 at 17:19
1

Requests are processed in parallel, which puts you in the same territory as concurrent programming (with threads), race conditions included.

My suggestion would be to just send the search action (assuming the user performs only one search, and not 20) to the server, and split it there over the 20 actions that you want it to perform. That allows you to execute them in sequence, preventing them from overwriting each other.
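A minimal sketch of that suggestion: one request handles the whole search, and the 20 lookups run sequentially on the server, so no two requests race on the session. `check_availability()` is a hypothetical stand-in for the external registrar API call.

```php
<?php
session_start();

// Hypothetical wrapper; real code would query the external web API here.
function check_availability($domain) {
    return strlen($domain) % 2 === 0;  // stubbed result for the sketch
}

$name = isset($_GET['q']) ? $_GET['q'] : 'mydomain';
$tlds = array('com', 'net', 'org', 'info', 'biz');
$_SESSION['results'] = array();

foreach ($tlds as $tld) {
    // Results land in the session one at a time, in order; nothing is overwritten.
    $_SESSION['results'][$tld] = check_availability($name . '.' . $tld);
}
```

The trade-off, as the question's edit notes, is that the single response only arrives once every lookup has finished.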

Linor
  • 1,850
  • 2
  • 17
  • 17
1

http://php.net/manual/en/function.session-write-close.php

Store the data from your session in local variables, then call this function to unlock the session for other scripts (though it has to be said that 20 AJAX calls probably isn't the best solution).
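A sketch of that pattern: read what you need, release the session lock before the slow work, then reopen the session only for the brief moment needed to store the result. The slow-work step and the result string are placeholders.

```php
<?php
session_start();
$basket = isset($_SESSION['basket']) ? $_SESSION['basket'] : array();
session_write_close();   // other requests can now open the session in parallel

// ... slow work here, e.g. the external availability lookup ...
$result = 'example.com is available';   // placeholder result

// Reopen the session just long enough to store the result.
session_start();
$_SESSION['results'][] = $result;
session_write_close();
```

Because each request holds the session lock only for milliseconds at the start and end, the 20 lookups can genuinely overlap instead of serializing on the session file, and concurrent writes no longer clobber each other.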

I.devries
  • 8,747
  • 1
  • 27
  • 29
0

The problem is that each simultaneous request opens the same session: because they share the same cookie, every request reads the same stored session data. If one of the current requests finishes first, it closes the session, as always happens when a script ends, but the other script may still be running and is not aware that the session has already been written back.

One way to handle this: before ending a long task, copy the session data into a holder variable, restart the session, restore the values from the holder, and call session_write_close().

Which request will start or finish first? Sometimes there is no way to know; the first one to open the session may take the longest to finish while the others come and go. Take sessions for what they are: a stored variable, an array of key/value pairs. PHP makes this automatic by recovering the stored values at session_start(), letting you assign new values, and saving them again at the end of the script (or when you call session_write_close()). Be aware that if you restart a session without first holding onto its current values, they will be lost (overwritten if they exist), which is why you must hold the values, restart, reassign, then write-close. The other thing PHP does for a session is assign it a reference (the session ID), sent to the client usually via a cookie, so that when the cookie comes back the ID identifies the stored session and its previous values can be recovered.

Remember that repeated calls to the same script can cause the same problem: since AJAX is asynchronous, if one call is delayed and the user clicks again, the new request will not end the current one, and the same conflict occurs. So if you are using AJAX, be prepared to handle sessions as your application requires. Finally, remember that the calls may go to different files (one AJAX request to prices.php, another to products.php), but both will use the same stored session; there is only one session variable at a time for all your scripts. Thanks for reading.

Avenida Gez
  • 409
  • 5
  • 5
0

Sorry, it's not really the answer you're after, but 20 requests sounds like far too many for a single search. Having implemented something similar (a brief search history stored in the session), we opted not to use AJAX at all. There's a time and a place for it, but not if it's going to kill your server with requests when your traffic increases.

roborourke
  • 12,147
  • 4
  • 26
  • 37
0

Try building a request queue for your AJAX calls, so that each call is made only after the previous one ends.

After the 2nd request is made you cannot be sure what will happen, since, as you said, only 2 simultaneous requests can be sent. Beyond that number, the 3rd request will most likely replace the 2nd, etc.
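A minimal sketch of such a queue (the names here are illustrative, not from any library). Each task receives a `done` callback, and the next task starts only when the previous one calls it; in practice the task body would be the AJAX call, with `done` invoked from its completion handler.

```javascript
// Minimal sequential task queue: one task runs at a time, in FIFO order.
function createQueue() {
  var tasks = [];
  var running = false;

  function next() {
    if (tasks.length === 0) { running = false; return; }
    running = true;
    var task = tasks.shift();
    task(function onDone() { next(); });  // start the next task when this one finishes
  }

  return {
    enqueue: function (task) {
      tasks.push(task);
      if (!running) next();
    }
  };
}
```

This removes the race on the server side (requests arrive one at a time), at the cost of the serialization the asker wanted to avoid; a middle ground is running two or three such queues in parallel.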

0

I assume this is some sort of auto-complete search box. If you are using Scriptaculous' Ajax.Autocompleter, you can simply specify a "minChars" parameter. It won't send the AJAX request until at least that many characters have been typed in.

You could also adjust the 'frequency' parameter, which changes how frequently (in seconds) the input field should be polled for changes before firing an AJAX request.


lo_fye
  • 6,790
  • 4
  • 33
  • 49
0

As others have pointed out your first approach should be to reduce the number of requests. If this is not an option you can use subdomains to increase the number of parallel requests.

Configure your DNS (for say mydomain.com) to accept all subdomains *.mydomain.com and to send them to the same server.

You can then send the different AJAX requests to different subdomains (a.mydomain.com, b.mydomain.com, ...).

This approach is used by map servers such as Google Maps to increase the number of map tiles the browser downloads in parallel.
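The client-side half of this trick is just deterministic URL sharding. A sketch, assuming the wildcard DNS above is in place (the shard names and path are illustrative):

```javascript
// Spread requests across subdomains to raise the browser's per-host
// connection cap. All shards resolve to the same server via wildcard DNS.
var shards = ['a', 'b', 'c', 'd'];

function shardUrl(path, i) {
  return 'http://' + shards[i % shards.length] + '.mydomain.com' + path;
}
```

Note that subdomains usually do not share cookies by default, so the session cookie must be scoped to `.mydomain.com` for this to work with the session-based approach in the question.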

Tomas
  • 5,067
  • 1
  • 35
  • 39