377

In Firefox 3, the answer is 6 per domain: as soon as a 7th XMLHttpRequest (in any tab) to the same domain is fired, it is queued until one of the other 6 finishes.

What are the numbers for the other major browsers?

Also, are there ways around these limits without having my users modify their browser settings? For example, are there limits on the number of JSONP requests (which use script tag injection rather than an XMLHttpRequest object)?
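
For context, a JSONP request via script tag injection looks roughly like this. This is only a minimal sketch: the `jsonp` helper, the endpoint URL, and the `callback` parameter name are made up for illustration, and whether script-injected requests count against the same connection pool varies by browser.

// Sketch of a JSONP-style request: inject a <script> tag whose response calls
// back into a temporary global function, instead of using XMLHttpRequest.
function jsonp(url, callback) {
    var name = "jsonp_cb_" + new Date().getTime();    // unique global callback name
    var script = document.createElement("script");
    window[name] = function(data) {
        delete window[name];                          // clean up the global
        script.parentNode.removeChild(script);        // remove the injected tag
        callback(data);
    };
    script.src = url + (url.indexOf("?") < 0 ? "?" : "&") + "callback=" + name;
    document.getElementsByTagName("head")[0].appendChild(script);
}

// Usage (hypothetical endpoint):
jsonp("http://example.com/run?host=web01", function(result) {
    console.log(result);
});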

Background: My users can make XMLHttpRequests from a web page to the server, asking the server to run ssh commands on remote hosts. If the remote hosts are down, the ssh command takes a few minutes to fail, eventually preventing my users from performing any further commands.

Michael Gundlach
  • Thinking about your situation, what is the feasibility of pinging the remote host to see if it is up or down? This won't answer your question, but it may be a better workflow. – Bob Feb 18 '09 at 13:44
  • Thanks Bob, that's one of the two approaches I had planned to fix this problem -- I considered mentioning it in the question but decided it was off-topic. (Another approach is to have the server, which I control, time out the ssh requests.) – Michael Gundlach Feb 18 '09 at 13:56
  • I think you pretty much have your answer... it's more than safe to assume Safari and Chrome support at least 2, so you can always assume 2. – Rex M Feb 19 '09 at 01:12
  • Using Chrome 2.0.172.28 on Windows Vista I got 6 concurrent connections. – Callum May 29 '09 at 01:35
  • I just found this page http://www.stevesouders.com/blog/2008/03/20/roundup-on-parallel-connections/ which gives a few more numbers and a discussion of the topic. – David Johnstone Feb 15 '10 at 05:40
  • In FF, you can find the max connections in `about:config` under `network.http.max-connections-per-server`, and there's a `max-persistent-connections-per-server` sister value as well. – Marc B Apr 12 '11 at 22:45
  • [Also related](http://stackoverflow.com/q/5751515/693207) – Jürgen Thelen Aug 04 '12 at 16:13
  • Is this still true in an HTTP/2 context? – unknown_boundaries Oct 14 '21 at 03:07

9 Answers

148

One trick you can use to increase the number of concurrent connections is to host your images on a different subdomain. These will be treated as separate requests: the concurrent-connection limit is applied per domain, not per page.

IE6 and IE7 have a limit of two. IE8 allows 6 on a broadband connection, dropping back to 2 on dial-up.
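
For illustration, here is a minimal sketch of that domain-sharding idea, assuming you have set up `static1.example.com` and `static2.example.com` as aliases pointing at the same content (the hostnames and the hashing scheme are invented for this example). Note that this works for images, scripts, and stylesheets; XMLHttpRequests are constrained by the same-origin policy, as the OP notes in a comment further down.

// Sketch: spread asset URLs across a few subdomain aliases so the browser's
// per-domain connection limit applies to each alias separately.
var SHARDS = ["static1.example.com", "static2.example.com"];

function shardUrl(path) {
    // Hash the path so a given resource always maps to the same shard,
    // which keeps browser caching effective across page views.
    var hash = 0;
    for (var i = 0; i < path.length; i++) {
        hash = (hash * 31 + path.charCodeAt(i)) >>> 0;
    }
    return "http://" + SHARDS[hash % SHARDS.length] + path;
}

// Usage:
var img = new Image();
img.src = shardUrl("/images/logo.png");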

Bob
  • Thanks, Bob. So are you saying that FF actually limits *all* requests to 6, not just AJAX requests? – Michael Gundlach Feb 18 '09 at 13:36
  • No, the limits are imposed per domain. So you could technically get FF up to 12 connections if you had a subdomain in addition to your site. – Bob Feb 18 '09 at 13:39
  • Just to clarify, the browser does the limiting (the server can technically impose its own) based on the name of the domain. – Bob Feb 18 '09 at 13:40
  • So if I understand you, FF limits *all* requests (to a single domain) to 6 -- not just XMLHttpRequests to a single domain. And other browsers do the same thing with different limits. Correct? – Michael Gundlach Feb 18 '09 at 13:54
  • Oh yes, if you have a page with a thousand images, it will download them in groups of six. I believe most other mainstream browsers work the same way. – Bob Feb 18 '09 at 13:59
  • Wow. This is a good trick. This also explains why tile servers for map engines create a number of fake sub-domains (typically something like maps1.whatever.com, maps2.whatever.com, maps3.whatever.com) to accelerate things. – meawoppl May 23 '13 at 17:33
  • Will the browser wait for the whole group to finish loading before it continues to the next group, or will it keep the pool of concurrent requests full and replace a finished request with another? – AMember Nov 24 '13 at 12:16
  • @AMember, the browser keeps its maximum allowed number of concurrent AJAX requests in flight at all times. Try my answer below if you want to see it in action – Saic Siquot Jun 10 '14 at 19:12
  • Would be nice if this got updated with more recent browsers. – Tom Apr 18 '16 at 20:30
101

The network results at Browserscope will give you both Connections per Hostname and Max Connections for popular browsers. The data is gathered by running tests on users "in the wild," so it will stay up to date.

Kevin Hakanson
26

With IE6 / IE7 you can tweak the number of concurrent connections in the registry. Here's how to set both values to four.

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
"MaxConnectionsPerServer"=dword:00000004
"MaxConnectionsPer1_0Server"=dword:00000004
brianegge
  • -1. OP said `without having my users modify their browser settings`. Also, it's not practical since one would have to do this on each client. – Razort4x Apr 12 '13 at 06:30
  • This is nonetheless a very useful thing to know, related to this issue. Perhaps it would have been better posted as a comment than as an answer? – JD Smith Apr 29 '13 at 18:37
7

I just checked with www.browserscope.org: with IE9 and Chrome 24 you can have 6 concurrent connections to a single domain, and up to 17 across multiple domains.

LeftyX
xmorera
6

I wrote my own test. I tested the code on Stack Overflow and it works fine; it tells me that Chrome/FF can do 6:

var change = 0;       // number of requests currently in flight
var simultanius = 0;  // highest number seen in flight at once
var que = 20;         // number of tests

Array(que).join(0).split(0).forEach(function(a, i){
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/?" + i); // unique query string busts the cache
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 2) { // headers received: the connection is active
            change++;
            simultanius = Math.max(simultanius, change);
        }
        if (xhr.readyState == 4) { // request complete
            change--;
            que--;
            if (!que) {
                console.log(simultanius);
            }
        }
    };
    xhr.send();
});

It works for most websites that flush the response in chunks, so the readyState change events fire at different times.

I noticed on my node.js server that I had to output at least 1025 bytes to trigger the event/flush; otherwise the events would fire for all three states at once when the request completed. So here is my backend:

var app = require('express')();

app.get("/", function(req, res) {
    res.write(Array(1025).join("a")); // flush enough bytes that readyState 2/3 fire early
    setTimeout(function() {
        res.end("a");                 // finish the response half a second later
    }, 500);
});

app.listen(80);

Update

I noticed that you can have up to twice as many requests in flight if you use both XHR and the fetch API at the same time:

var change = 0;       // requests currently in flight
var simultanius = 0;  // highest number seen in flight at once
var tests = 30;       // number of requests per API
var que = tests * 2;  // total outstanding requests (fetch + xhr)

Array(tests).join(0).split(0).forEach(function(a, i){
    fetch("/?b" + i).then(r => {
        change++; // response headers received
        simultanius = Math.max(simultanius, change);
        return r.text()
    }).then(r => {
        change--; // body fully read
        que--;
        if (!que) {
            console.log(simultanius);
        }
    });
});

Array(tests).join(0).split(0).forEach(function(a, i){
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/?a" + i); // unique query string busts the cache
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 2) { // headers received
            change++;
            simultanius = Math.max(simultanius, change);
        }
        if (xhr.readyState == 4) { // request complete
            change--;
            que--;
            if (!que) {
                document.body.innerHTML = simultanius;
            }
        }
    };
    xhr.send();
});
Endless
6

I have written a single-file AJAX tester. Enjoy it! I built it because I was having problems with my hosting provider.

<?php /*

Author:   Luis Siquot
Purpose:  Check ajax performance and errors
License:  GPL
site5:    Please don't drop json requests (nor delay)!!!!

*/

$r = isset($_GET['r']) ? (int)$_GET['r'] : 0;  // request number (0 = serve the test page)
$w = isset($_GET['w']) ? (int)$_GET['w'] : 0;  // seconds to wait before answering
if($r) {
   sleep($w);                  // simulate a slow backend
   echo json_encode($_GET);    // echo the parameters back as JSON
   die();
}  // else fall through and serve the test page below
?><head>
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script type="text/javascript">

var _settimer;
var _timer;
var _waiting;

$(function(){
  clearTable();
  $('#boton').bind('click', donow);
})

function donow(){
  var w;
  var estim = 0;
  _waiting = $('#total')[0].value * 1;    // how many concurrent requests to fire
  clearTable();
  for(var r=1;r<=_waiting;r++){
       w = Math.floor(Math.random()*6)+2; // random server-side wait of 2..7 seconds
       estim += w;                        // total time if the requests ran one at a time
       dodebug({r:r, w:w});               // add a row to the debug table
       $.ajax({url: '<?php echo $_SERVER['SCRIPT_NAME']; ?>',
               data:    {r:r, w:w},
               dataType: 'json',
               type: 'GET',
               success: function(CBdata, status) {
                  CBdebug(CBdata);        // mark the row as done
               }
       });
  }
  doStat(estim);
  timer(estim+10);
}

function doStat(what){
    $('#stat').replaceWith(
       '<table border="0" id="stat"><tr><td>Request Time Sum=<th>'+what+
       '<td>&nbsp;&nbsp;/2=<th>'+Math.ceil(what/2)+
       '<td>&nbsp;&nbsp;/3=<th>'+Math.ceil(what/3)+
       '<td>&nbsp;&nbsp;/4=<th>'+Math.ceil(what/4)+
       '<td>&nbsp;&nbsp;/6=<th>'+Math.ceil(what/6)+
       '<td>&nbsp;&nbsp;/8=<th>'+Math.ceil(what/8)+
       '<td> &nbsp; (seconds)</table>'
    );
}

function timer(what){
  if(what)         {_timer = 0; _settimer = what;}
  if(_waiting==0)  {
    $('#showTimer')[0].innerHTML = 'completed in <b>' + _timer + ' seconds</b> (approx)';
    return ;
  }
  if(_timer<_settimer){
     $('#showTimer')[0].innerHTML = _timer;
     setTimeout("timer()",1000);
     _timer++;
     return;
  }
  $('#showTimer')[0].innerHTML = '<b>don\'t wait any more!!!</b>';
}


function CBdebug(what){
    _waiting--;
    $('#req'+what.r)[0].innerHTML = 'x';
}


function dodebug(what){
    var tt = '<tr><td>' + what.r + '<td>' + what.w + '<td id=req' + what.r + '>&nbsp;'
    $('#debug').append(tt);
}


function clearTable(){
    $('#debug').replaceWith('<table border="1" id="debug"><tr><td>Request #<td>Wait Time<td>Done</table>');
}


</script>
</head>
<body>
<center>
<input type="button" value="start" id="boton">
<input type="text" value="80" id="total" size="2"> concurrent json requests
<table id="stat"><tr><td>&nbsp;</table>
Elapsed Time: <span id="showTimer"></span>
<table id="debug"></table>
</center>
</body>

Edit:
r means the request number and w the server-side waiting time in seconds.
When you press the start button, 80 (or whatever number you enter) concurrent AJAX requests are launched by JavaScript, but as is known they are spooled by the browser and only a limited number are sent to the server in parallel (which is the subject of this question). The requests are resolved on the server with a random delay (set by w). At start time, the total time needed to resolve all the calls serially is calculated. When the test finishes, you can see whether it took a half, a third, a quarter, etc. of that total, and from that deduce the degree of parallelism of the calls to the server. For example, if the serial estimate is 240 seconds and the test completes in about 40, the browser was running roughly 6 requests in parallel. This is neither strict nor precise, but it is nice to watch in real time how the AJAX calls complete (seeing the incoming cross), and it is a very simple self-contained script that shows AJAX basics.
Of course, this assumes that the server side is not introducing any extra limit.
Preferably use it in conjunction with the Firebug net panel (or your browser's equivalent).

Saic Siquot
6

According to IE 9 – What’s Changed? on the HttpWatch blog, IE9 still has a 2-connection limit when running over a VPN.

Using a VPN Still Clobbers IE 9 Performance

We previously reported about the scaling back of the maximum number of concurrent connections in IE 8 when your PC uses a VPN connection. This happened even if the browser traffic didn’t go over that connection.

Unfortunately, IE 9 is affected by VPN connections in the same way:

Kevin Hakanson
0

I believe there is a maximum number of concurrent HTTP requests that browsers will make to the same domain, which is on the order of 4-8 requests depending on the user's settings and browser.

You could set up your requests to go to different domains, which may or may not be feasible. The Yahoo guys did a lot of research in this area, which you can read about (here). Remember that every new domain you add also requires a DNS lookup. The YSlow guys recommend between 2 and 4 domains to achieve a good compromise between parallel requests and DNS lookups, although this focuses on the page's load time, not subsequent AJAX requests.

Can I ask why you want to make so many requests? There are good reasons for browsers limiting the number of requests to the same domain. You will be better off bundling requests if possible.
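
For what it's worth, here is a minimal sketch of the bundling idea applied to the OP's scenario. The `/run-commands` endpoint and its JSON format are hypothetical; the point is that one request carries the whole batch and the server fans out the ssh commands, so only one connection slot is used no matter how many hosts are involved.

// Sketch only: send the whole batch of hosts in a single request instead of
// one XMLHttpRequest per remote host.
function runCommands(hosts, command, onDone) {
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "/run-commands");                     // hypothetical endpoint
    xhr.setRequestHeader("Content-Type", "application/json");
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4 && xhr.status == 200) {
            onDone(JSON.parse(xhr.responseText));          // e.g. one result per host
        }
    };
    xhr.send(JSON.stringify({ hosts: hosts, command: command }));
}

// Usage: a single connection regardless of how many hosts are queried.
runCommands(["host1", "host2", "host3"], "uptime", function(results) {
    console.log(results);
});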

cbp
  • My XMLHttpRequests cannot go to different domains as you suggest, due to the Same Origin Policy. (Perhaps this is an argument for using JSONP to get around this problem.) This page is a command-and-control dashboard for many computers; thus a request is spawned per execution requested by the user. – Michael Gundlach Feb 18 '09 at 13:52
0

A good reason to move to HTTP/2:

With HTTP/2 the maximum number of connections per host is virtually unlimited: Is the per-host connection limit raised with HTTP/2?
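
As a rough illustration of why the per-host limit stops mattering: over HTTP/2 the browser multiplexes all concurrent requests onto a single connection instead of opening ~6 parallel TCP connections. Below is a minimal Node.js HTTP/2 server sketch (the key.pem/cert.pem file names are assumptions; browsers require TLS for HTTP/2).

// Sketch: a bare-bones HTTP/2 server using Node's built-in http2 module.
const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
    key: fs.readFileSync('key.pem'),    // assumed certificate files
    cert: fs.readFileSync('cert.pem'),
});

server.on('stream', (stream, headers) => {
    // Every request arrives as a stream on the same multiplexed connection.
    stream.respond({ ':status': 200, 'content-type': 'text/plain' });
    stream.end('ok');
});

server.listen(8443);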

Raskolnikov