2

I have a website that fetches approximately 20 pages, each of them different. Currently, it takes about 1.2 minutes to load.

A few days back this task took only 11–15 seconds; now it takes 1.2 minutes. What could be the reason for this sudden change?

Is there any solution other than merging some of these requests to reduce their number? Can the limit on the number of concurrent requests somehow be altered?

[Screenshot: Firebug Net panel showing the request waterfall]

The above screenshot is from Firebug. The gray portions of the bars represent "Blocking".
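For context on the "Blocking" bars: browsers cap the number of concurrent connections per hostname (around six in current browsers), so any requests beyond the cap queue up. One way to keep that queueing under your own control is to throttle the requests yourself. A minimal sketch, where the `limit` value and the promise-returning tasks are illustrative placeholders for the real page requests:

```javascript
// Run promise-returning tasks with at most `limit` in flight at once.
// This mirrors what the browser does internally with its per-host cap,
// but lets you pick the limit and the order explicitly.
function runLimited(tasks, limit) {
  return new Promise((resolve, reject) => {
    const results = new Array(tasks.length);
    let next = 0;    // index of the next task to start
    let active = 0;  // tasks currently in flight
    let done = 0;    // tasks finished

    function launch() {
      while (active < limit && next < tasks.length) {
        const i = next++;
        active++;
        tasks[i]().then((value) => {
          results[i] = value; // keep results in original order
          active--;
          done++;
          if (done === tasks.length) resolve(results);
          else launch();
        }, reject);
      }
    }

    if (tasks.length === 0) resolve(results);
    else launch();
  });
}
```

Each task would wrap one page request, e.g. `() => fetch(url).then((r) => r.text())` (assuming a `fetch`-style client; the original page used jQuery Ajax).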

Alejo
Prashant Singh
  • In jQuery, you can use the "beforeSend" event to abort an Ajax request if the limit has been reached. – Šime Vidas Aug 11 '12 at 20:32
  • Did the size of the pages (length in Bytes) change? Open the "Net" tab in the browser's dev tools to see where the time is spent. – Šime Vidas Aug 11 '12 at 20:33
  • Yeah, I tried to find out. The main problem seems to be that results are now coming in sequentially. Try it yourself: http://compare.buyhatke.com/products/apple-ipod-touch – Prashant Singh Aug 11 '12 at 20:38
  • That is true. The browsers are **blocking** the Ajax-requests. It seems that they are enforcing some limit. – Šime Vidas Aug 11 '12 at 20:52
  • So, what could be the reason? How can I avoid it? – Prashant Singh Aug 11 '12 at 21:11
  • If you could combine those requests into one, that would be great. Btw, you can get more info here: http://stackoverflow.com/questions/561046/how-many-concurrent-ajax-xmlhttprequest-requests-are-allowed-in-popular-browse The max number of connections per hostname appears to be 6 in most browsers. – Šime Vidas Aug 11 '12 at 21:13
  • @ŠimeVidas Could providing lots of conditions in .htaccess be a problem here? I mean, the server would go through those conditions again and again. Could that significantly lower performance? – Prashant Singh Aug 11 '12 at 21:20
  • The process is slow because the browsers do not allow more than 6 concurrent Ajax-requests to the same host name. – Šime Vidas Aug 11 '12 at 21:27
  • So I think using 3 sub-domains will resolve the issue. I will distribute the requests among three sub-domains, which as a result should enhance our performance and speed. – Prashant Singh Aug 11 '12 at 21:30
  • Yes, sub-domains should do the trick. – Šime Vidas Aug 11 '12 at 21:32
  • @ŠimeVidas But we forgot an important point: Ajax requests are not possible across domains. – Prashant Singh Aug 11 '12 at 22:15
  • It is possible with CORS. Your web server has to provide an `Access-Control-Allow-Origin` response header. – Šime Vidas Aug 11 '12 at 23:16
  • @ŠimeVidas I tried CORS. Can you please help me with this? http://stackoverflow.com/questions/11919732/setting-up-access-control-for-cross-domain-ajax-request – Prashant Singh Aug 12 '12 at 06:33
  • @ŠimeVidas How can CDN servers help me speed up the response, since CORS is not supported by my server? – Prashant Singh Aug 12 '12 at 16:23
  • You can manually implement CORS on your server. You just have to send the `Access-Control-Allow-Origin` header in the HTTP-response. Read the thread that I suggested in your other question. – Šime Vidas Aug 12 '12 at 18:02
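The CORS setup discussed in the comments above can be as small as one response header. A hypothetical Apache `.htaccess` fragment, assuming `mod_headers` is enabled and that the sub-domains serve the Ajax endpoints:

```apache
# Allow Ajax requests from the main site to this sub-domain.
# Restrict the origin to your own domain rather than using "*".
Header set Access-Control-Allow-Origin "http://compare.buyhatke.com"
```

Without this header, the browser will block cross-origin Ajax responses even though the requests reach the server.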

2 Answers

3

Someone should have suggested using multi-cURL instead. With it, I just need to make a call to a single PHP file, which then fetches each of the pages itself, in parallel.

Take a look here
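The answer's actual code was PHP's `curl_multi_*` functions and is not shown here; for illustration only, the same fan-out idea can be sketched in JavaScript. One aggregator endpoint fetches all the pages in parallel and returns a single combined result, so the browser makes one request instead of ~20. The `fetchPage` parameter is a placeholder for whatever HTTP client the server actually uses:

```javascript
// Aggregator sketch: fetch all pages in parallel and return one combined
// payload keyed by URL, so the client can route each page's data.
async function fetchAll(urls, fetchPage) {
  // Promise.all starts every request before awaiting any of them,
  // which is the same parallelism curl_multi provides in PHP.
  const pages = await Promise.all(urls.map((url) => fetchPage(url)));
  const combined = {};
  urls.forEach((url, i) => { combined[url] = pages[i]; });
  return combined;
}
```

The client then issues a single request to this aggregator, which sidesteps the per-host connection cap entirely.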

Prashant Singh
  • Please include the important parts of the linked content in your answer itself. – Nic Jan 10 '17 at 12:29
1

Most browsers only allow a limited number of simultaneous Ajax connections to the same hostname (around six in current browsers). The best practice would be to have the website receive all the data as a single JSON string, and for the JavaScript to then put each piece of data where it belongs. Given your screenshot, I would assume that the backend is all in PHP.

Start with json_encode() on the PHP side and JSON.parse() on the JavaScript side (prefer it over eval() for parsing JSON); from there it should be a walk in the park.
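A minimal sketch of the client side of that flow, assuming the PHP backend returns one json_encode()'d object covering all the pages. The key-to-target routing is hypothetical; the real page would map keys to its own DOM elements:

```javascript
// The PHP side would respond with something like:
//   echo json_encode(["title" => "...", "price" => 199]);
// On the client, parse the JSON string and work out where each
// piece of data belongs.
function applyPageData(jsonText) {
  const data = JSON.parse(jsonText); // safer than eval() for JSON
  // Hypothetical routing: each key names the element/section to fill in.
  const updates = [];
  for (const key of Object.keys(data)) {
    updates.push({ target: key, value: data[key] });
  }
  return updates;
}
```

In a real page, each `{ target, value }` pair would become a DOM update (e.g. setting an element's text), replacing the twenty separate Ajax round trips with one.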