
I'm updating thousands of rows in my database, and I would like to know which approach is faster and will not overload my server.

I have an AJAX request that retrieves thousands of JSON records; inside its success: function(response){} handler I have another AJAX request that updates each record.

Just take a look at my code.

Should I execute the updates one by one, or should I send them all as a whole?

$.ajax({
    url: '', /* API URL of the JSON data from the hosts */
    type: 'GET',
    dataType: 'json',
    xhrFields: {
        withCredentials: true
    },
    success: function (response) {
        // Fires one update request per retrieved record.
        for (var i = 0; i < response.length; i++) {
            $.ajax({
                url: '/api/update_record',
                type: 'GET',
                dataType: 'json',
                data: { 'sample_data': response[i] },
                success: function (result) { // renamed so it does not shadow the outer response
                    console.log(result);
                }
            });
        }
    }
});
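
For comparison, here is a sketch of the batched alternative I have in mind, assuming a hypothetical /api/update_records endpoint that accepts the whole array in one POST body:

$.ajax({
    url: '', /* API URL of the JSON data from the hosts */
    type: 'GET',
    dataType: 'json',
    xhrFields: {
        withCredentials: true
    },
    success: function (response) {
        $.ajax({
            url: '/api/update_records', // hypothetical batch endpoint
            type: 'POST',               // POST so the payload travels in the request body
            contentType: 'application/json',
            data: JSON.stringify({ 'sample_data': response }),
            success: function (result) {
                console.log(result);
            }
        });
    }
});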

Thank You,

apelidoko
  • I would really suggest doing it in a single request, as multiple requests will incur excess bandwidth consumption. https://stackoverflow.com/questions/3138371/very-large-http-request-vs-many-small-requests – keysl Oct 20 '17 at 04:45
  • Thank you for your help. I have a follow-up question: assuming multiple users are using the system simultaneously, would there be a big delay on their side if I use only one request to update thousands of rows? I'm a little concerned about how other users will be affected if I use a single request with big data versus multiple requests with a one-by-one update approach. Thank you! – apelidoko Oct 20 '17 at 06:18
  • Well, let's say a user updates a thousand rows via multiple requests. Each request carries its own headers, and each request runs only after the preceding one completes (this depends on your setup; if your approach is parallel requests, the problem lessens). That SINGLE user will make a THOUSAND requests. Notice that I highlighted the words SINGLE and THOUSAND: if you have 100 users, you have to process 100,000 requests, which I find very heavy on the client side. – keysl Oct 20 '17 at 06:56
  • On the other hand, a single request gets better compression, is faster (if used correctly), and is easier to maintain. My point is that in your case multiple requests are more fault tolerant with the data. I mean, what if the network goes down while you're updating the 500th row? The first 500 will be updated and the rest will not. Whereas if you use a single request and the same thing happens, the data will be rolled back and nothing committed. – keysl Oct 20 '17 at 07:05
  • Thank you for your answer. So the right choice probably still depends on where and how I use it: single request or multiple requests, each has its pros and cons. In my case only the administrator has access to update thousands of records, and even if the network goes down it doesn't matter which rows are updated and which are not, as long as I can eventually update all of them. I'll probably use multiple requests since only the admin will be updating thousands of records (see the chunked sketch after these comments). Thank you for your help. – apelidoko Oct 20 '17 at 07:16
  • Firing hundreds or thousands of requests at the server is not good practice. Even if your server can handle the load, on a real machine in a real data center this could be interpreted as an attack and your requests may be blocked. Finally, do not consider firing thousands of requests. :) – Paun Narcis Iulian Oct 20 '17 at 11:43
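
A middle ground raised in the thread above can be sketched as follows: send the rows in fixed-size chunks, one request at a time, so bandwidth stays reasonable and a network failure stops at a known chunk boundary. This again assumes the hypothetical /api/update_records batch endpoint; the chunk size of 100 is arbitrary.

function updateInChunks(rows, chunkSize) {
    if (rows.length === 0) {
        console.log('All rows updated');
        return;
    }
    var chunk = rows.slice(0, chunkSize);
    $.ajax({
        url: '/api/update_records', // hypothetical batch endpoint
        type: 'POST',
        contentType: 'application/json',
        data: JSON.stringify({ 'sample_data': chunk }),
        success: function () {
            // Send the next chunk only after the previous one succeeds,
            // so requests are never fired in an uncontrolled burst.
            updateInChunks(rows.slice(chunkSize), chunkSize);
        },
        error: function () {
            console.log(rows.length + ' rows were not updated; retry from here');
        }
    });
}

// e.g. inside the outer success handler: updateInChunks(response, 100);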

0 Answers