
In a comment on the question Jquery multiple Ajax Request in array loop, the author said that making AJAX requests in a loop may end up DDOS-ing ourselves.

Does that apply only to a loop, or to multiple AJAX calls in general? I mean, is there still a risk of DDOS-ing if I make multiple AJAX requests via a recursive function like the following?

ajax(0);

function ajax(index) {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function() {
        if(this.readyState == 4 && this.status == 200) {
            ajax(index + 1); // next request starts only after the previous one completes
        }
    };
    xhr.open('POST', 'http://example.com/ajax_handler.php');
    xhr.setRequestHeader('Content-type', 'application/x-www-form-urlencoded; charset=UTF-8');
    xhr.send();
}

P.S. I understand that we could "congregate all data together then send that in a single request to the server", but I need to generate static pages by passing data from the client to the server. So if there are dozens of thousands of pages I must pass to the server via AJAX, they can't be sent as one single request because of the POST request size limit.

Why so? I would just like to keep all the generator logic on the client and call only standard operations like reading and writing files on the server. That is, the client reads templates and content via AJAX and a server-side reading function, builds the page HTML according to its own logic, and passes the whole HTML to the server to be written to an HTML file.
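For reference, the recursive chain with an explicit stop condition can be sketched like this (a sketch only: the page count and the `sendPage` request function are placeholders, not part of the original code):

```javascript
// Sequentially send one request per page; the next request starts only
// after the previous one has finished. `sendPage(index, onDone)` is a
// placeholder for the real XMLHttpRequest/fetch call.
function generatePages(sendPage, totalPages, done) {
    function next(index) {
        if (index >= totalPages) { // stop condition
            done();
            return;
        }
        sendPage(index, function () {
            next(index + 1); // recurse only once this request completes
        });
    }
    next(0);
}
```

With a real XMLHttpRequest, `sendPage` would invoke its callback from `onreadystatechange`, just as the code above calls `ajax(index + 1)`.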

stckvrw
  • why would you need to pass dozens of thousands of pages from the client to the server? Shouldn't it be the other way around? Regarding the DDOS, it absolutely depends on the number of clients and requests as well as the server infrastructure, so for one client, doing several AJAX requests should be fine, but what if there were thousands of clients with thousands of concurrent requests? – john Smith May 11 '20 at 09:05
  • For recursive calls I don't think it will cause DOS-ing yourself, since the call stack has limits (the depth of functions called within one function, so to say); the usual error for infinite recursion is `Maximum call stack size exceeded` – ROOT May 11 '20 at 09:06
  • about how many AJAX requests per client are we talking? Looking at your code, it would actually never stop / infinite requests – john Smith May 11 '20 at 09:07
  • @johnSmith it's just a simplified example. I know we need a condition to stop the process at some point; I just haven't written it into the code. As for your first comment, see the last paragraph added to my post – stckvrw May 11 '20 at 09:19
  • Well, AJAX calls are asynchronous, so to do it the way you want, you would set the XMLHttpRequest to synchronous so that they are made in order. But without information about how many potential clients there are, the requests per client per minute, and the amount of payload, it's hard to tell (your concept seems inefficient to me; in my opinion you should post the desired configuration and generate the HTML server-side) – john Smith May 11 '20 at 09:43
  • `request.open('GET', '/bar/foo.txt', false); // false makes the request synchronous` – john Smith May 11 '20 at 09:43
  • @johnSmith but CertainPerformance has just mentioned in their comment that synchronous calls may cause the DDOS problem even via a recursive function – stckvrw May 11 '20 at 09:51

1 Answer


The problem Rory McCrossan was describing arises when you make multiple requests at once. If you have lots of requests in flight at the same time, you might overload the server (and/or your network connection), so you shouldn't fire off tons of requests simultaneously. (It's probably best not to send more than, say, 5 requests a second to a server.)
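If you did need to start many requests from a loop, a simple client-side throttle spaces them out instead of firing them all at once (a sketch; the 5-per-second figure is just the ballpark above, and `sendRequest` is a placeholder for the real call):

```javascript
// Start at most `perSecond` requests per second by scheduling each one
// on a staggered timer instead of firing the whole loop immediately.
function sendThrottled(sendRequest, count, perSecond) {
    const interval = 1000 / perSecond; // ms between request starts
    for (let i = 0; i < count; i++) {
        setTimeout(() => sendRequest(i), i * interval);
    }
}
```

Note this only limits how fast requests *start*; unlike the recursive chain, several can still be in flight at once if responses are slow.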

But in your code, you're not sending the requests all at once; you have at most one request active at any time, so the issue he was describing isn't something you need to worry about.
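The distinction is easier to see with promises: awaiting each request before starting the next gives at most one request in flight, while starting them all and collecting with `Promise.all` is the pattern that can overload a server (a sketch; `fetchPage` stands in for the real request):

```javascript
// Sequential: at most one request in flight at any time.
async function sendSequentially(fetchPage, count) {
    for (let i = 0; i < count; i++) {
        await fetchPage(i); // wait for this request before starting the next
    }
}

// Parallel: all requests start immediately -- this is the pattern
// the linked comment warns about.
function sendAllAtOnce(fetchPage, count) {
    const requests = [];
    for (let i = 0; i < count; i++) {
        requests.push(fetchPage(i));
    }
    return Promise.all(requests);
}
```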

That said,

dozens of thousands of pages I must pass to the server via AJAX

is a pretty odd requirement and will require a lot of bandwidth even if you don't overload the network. Consider whether there's a more elegant solution to the problem, such as generating/sending a page only when that page is requested.

CertainPerformance
  • Thanks. However, Rory McCrossan has just replied on the page and said that _"making [multiple ajax requests] in a recursive function has the same problem"_, that is, DDOS-ing – stckvrw May 11 '20 at 09:37
  • In a *synchronous* recursive function, yes, but not in an asynchronous one. – CertainPerformance May 11 '20 at 09:40
  • If the recursive function calls itself immediately after initializing the last request, you'll be making lots of parallel requests. But if the recursive function calls itself only after the last request *finishes*, as in your case, there'll only be one request going on at any time. You'll be consuming server resources, but there aren't any *inherent* problems with that – CertainPerformance May 11 '20 at 09:57