
This is the code I am running to insert 10000 documents into MongoDB using node.js; it throws a `RangeError: Maximum call stack size exceeded` error.

var MongoClient = require('mongodb').MongoClient;
var mongoServer = require('mongodb').Server;
var async = require('async');

var serverOptions = {
    'auto_reconnect': true,
    'poolSize': 100
};

var i = 0;
var mongoClient = new MongoClient(new mongoServer('localhost', 27017, serverOptions));
var db = mongoClient.db('test');
var collection = db.collection('new_file_test');

mongoClient.open(function (err, mongoClient) {
    if (err) { console.log(err); }

    function start(i, call) {
        if (i < 10000) {
            call(start);
        }
    }

    function pass(callback) {
        Insert(save);
        i++;
        callback(i, pass);
    }

    start(i, pass);
});

function Insert(callback) {
    console.log("Inserting");
    var doc = {
        'trip_paramid': i,
        'tripid': '116',
        'lattitude': '12.8929183',
        'longitude': '77.63627',
        'speed': '2',
        'heading': '0',
        'altitude': '80469',
        'address': 'qwertyasdfgxcvbn',
        'engine_status': 'Normal',
        'oil_pressure': '83.12',
        'water_temp': '28',
        'fuel_content': '0',
        'brake': 'Normal',
        'creation_time': '2013-08-31 23:22:17',
        'brakelight_status': 'Normal',
        'battery_status': '12.68',
        'event_code': '8',
        'dbinsert_time': '2013-08-31 23:24:59',
        'gsm_status': '-51',
        'cell_id': '45',
        'vehicle_id': '123456',
        'distance': '0'
    };
    callback(doc);
}

function save(doc) {
    collection.insert(doc, function (err) {
        if (err) {
            console.log('Error occurred');
        } else {
            console.log("Saved");
        }
    });
}

Inserting 1000 rows works fine; the error is thrown only when the count goes up to 10000.

sabari

2 Answers


Looping 10000 times and performing an insert on each iteration is really a bad idea. But you can still do it with the async library, which might help you fix the issue. I have come across this situation before, and I used async.queue to overcome it.

Async.js module.
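To illustrate the queue idea, here is a minimal, dependency-free sketch of what async.queue does for you: tasks are pushed onto a queue and processed with limited concurrency, with each step scheduled via setImmediate so the call stack never grows with the number of inserts. `makeQueue` and `fakeInsert` are hypothetical stand-ins for illustration only; real code would use async.queue itself and `collection.insert`.

```javascript
// Minimal worker-queue sketch (stand-in for async.queue).
// fakeInsert pretends to write a document and completes asynchronously,
// so no continuation ever piles up on the call stack.
var saved = [];

function fakeInsert(doc, callback) {
    saved.push(doc);            // pretend we wrote to MongoDB
    setImmediate(callback);     // complete on a fresh stack frame
}

function makeQueue(worker, concurrency) {
    var tasks = [];
    var running = 0;
    var api = { drain: null };

    function next() {
        while (running < concurrency && tasks.length > 0) {
            running++;
            worker(tasks.shift(), function () {
                running--;
                if (tasks.length === 0 && running === 0 && api.drain) {
                    api.drain();            // everything processed
                } else {
                    setImmediate(next);     // schedule more work, off the stack
                }
            });
        }
    }

    api.push = function (task) {
        tasks.push(task);
        setImmediate(next);
    };
    return api;
}

var q = makeQueue(fakeInsert, 10);          // at most 10 inserts in flight
q.drain = function () {
    console.log('all ' + saved.length + ' docs saved');
};

for (var i = 0; i < 10000; i++) {
    q.push({ 'trip_paramid': i, 'tripid': '116' });
}
```

With the real async.queue the shape is the same: create the queue with `async.queue(worker, concurrency)`, `push` the documents, and set a drain callback for when the queue empties.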

mohamedrias

The problem comes from the recursive loop you made:

function start(i, call) {
    if (i < 10000) {
        call(start);
    }
}

function pass(callback) {
    Insert(save);
    i++;
    callback(i, pass);
}

start(i, pass);

You should change it to something like this:

for (var i = 0; i < 10000; i++) {
   Insert(save);
}

Simplifying your code, you have this:

var i = 0;
function pass() {
    Insert(save);
    i++;
    if (i < 10000) {
        pass();
    }
}

pass();

The problem is that you are calling this function recursively, and since JavaScript doesn't have tail call elimination, the call stack keeps growing. V8 (Node.js's JavaScript engine) has its limits: once the call stack reaches the maximum defined size, this error is thrown.

You can also have a look at the following questions for more information:

This is all about fixing the `Maximum call stack size exceeded` error. But 10000 looks like a huge number. I just ran it: finishing the loop took about 3 seconds on my machine using monk, and about 1 second using the mongo shell. If you are running a server, your application is unresponsive while the loop runs.

I suggest instead inserting in batches, and using node's setImmediate function to schedule the next batch to run after pending I/O events (like handling new web requests):

function insert10000(i) {
    insert100();
    i++;
    if (i < 100) {
        setImmediate(insert10000, i);
    }
}

function insert100() {
    for (var i = 0; i < 100; i++) {
        Insert(save);
    }
}

And since we're on the topic of batching insert calls: the collection.insert method supports an array of documents instead of just one.

So where we currently have something like the following:

collection.insert(doc1);
collection.insert(doc2);

It can be changed to this:

collection.insert([doc1, doc2]);

That is actually faster. So you can change the code to this:

function insert10000(i) {
    insert100(i);
    i++;
    if (i < 100) {
        setImmediate(insert10000, i);
    }
}

function insert100(i) {
    var docs = [];
    // batch i covers trip_paramid values [i * 100, (i + 1) * 100)
    for (var j = i * 100, l = j + 100; j < l; j++) {
        docs.push({
            'trip_paramid': j,
            'tripid': '116',
            'lattitude': '12.8929183',
            'longitude': '77.63627',
            'speed': '2',
            'heading': '0',
            'altitude': '80469',
            'address': 'qwertyasdfgxcvbn',
            'engine_status': 'Normal',
            'oil_pressure': '83.12',
            'water_temp': '28',
            'fuel_content': '0',
            'brake': 'Normal',
            'creation_time': '2013-08-31 23:22:17',
            'brakelight_status': 'Normal',
            'battery_status': '12.68',
            'event_code': '8',
            'dbinsert_time': '2013-08-31 23:24:59',
            'gsm_status': '-51',
            'cell_id': '45',
            'vehicle_id': '123456',
            'distance': '0'
        });
    }
    collection.insert(docs, function(err) {
        if (err) {
            console.log('Error occurred', err);
        }
    });
}

I measured this; it was about twice as fast as the original.

Farid Nouri Neshat
  • Farid, I tried this code, but still at a certain level, when I increase the volume of data to be inserted, mongod disconnects with a fatal assertion error, and I am using the default pool size of 5. Mostly I need to insert the data one by one, as in the code I mentioned above, setting a value on the doc and inserting it, and I need to insert about 1 crore (10 million) such records. Can you please suggest some ideas for reusing the established connection? – sabari May 16 '14 at 04:37
  • That's surprising. How much did you increase it to? The batch insert has a limit of 16 megabytes, so make your array size for each insert command less than that. I ran this example in safe mode, inserting a 35000-document array over and over again with setImmediate in between, and it worked fine until I ran out of disk space. What is the exact assertion error you got? Are you sure it wasn't a disk space error? :P Is this a server or a command-line app? – Farid Nouri Neshat May 16 '14 at 14:15
  • You can also call `setImmediate(insert10000, i)` in the callback of `collection.insert`, if you want to give the database some time to write and not let memory usage grow. But it will be slower. – Farid Nouri Neshat May 16 '14 at 14:24
  • I tried calling insert100 10000 times. The connection is lost when it tries to write 3048 kb into MongoDB: while writing to the db, a "connection not compatible" error is thrown and the connection is lost. – sabari May 17 '14 at 03:27
  • I can't replicate that. It worked just fine for me. It took a while, but it did insert all 1000000 documents successfully. – Farid Nouri Neshat May 17 '14 at 03:35
  • Maybe MongoDB goes down in between. Maybe it is using too much memory and is killed by the OOM killer, and that's why the connection is lost. Try calling the next insert100 in the callback of `collection.insert` in safe mode. – Farid Nouri Neshat May 17 '14 at 03:45
  • Farid, 10 lakh (1,000,000) records seems to be OK, but the maximum that gets inserted is 12 lakh 77 thousand (1,277,000) before the connection gets terminated. – sabari May 17 '14 at 04:38
  • I am running 32-bit MongoDB on a 32-bit Windows XP system. – sabari May 17 '14 at 04:45
  • I'm on 64-bit Linux. That makes sense, because I think you are hitting the 2GB storage limitation on 32-bit platforms: http://blog.mongodb.org/post/137788967/32-bit-limitations . Use a 64-bit system instead. Also, 32-bit Windows XP is very old now; you might run into more limitations like this. – Farid Nouri Neshat May 17 '14 at 05:50
  • Can you accept this answer and close this question, since I have already solved the `Maximum call stack size exceeded` error? Create a new question if you still have specific problems to solve, so experts will see it and can help too. – Farid Nouri Neshat May 17 '14 at 08:26
  • Sure Farid, I got past the maximum call stack size exceeded error, and found that only 2 GB of data can be stored using the 32-bit version of MongoDB, so I installed the 64-bit version and it's working perfectly fine. – sabari May 20 '14 at 05:06
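The chaining suggested in the comments above (kick off the next batch only from `collection.insert`'s callback) can be sketched like this. `stubInsert`, `buildBatch`, and `insertAll` are hypothetical stand-ins for illustration; real code would call `collection.insert` with the full document fields:

```javascript
var total = 0;

// Stand-in for collection.insert: records the batch size and
// completes asynchronously, like the real driver would.
function stubInsert(docs, callback) {
    total += docs.length;
    setImmediate(callback, null);
}

// Build one batch of documents; batch `batchNo` covers
// trip_paramid values [batchNo * size, (batchNo + 1) * size).
function buildBatch(batchNo, size) {
    var docs = [];
    for (var j = batchNo * size; j < (batchNo + 1) * size; j++) {
        docs.push({ 'trip_paramid': j, 'tripid': '116' });
    }
    return docs;
}

// Insert batches one after another: the next batch is scheduled
// only after the previous insert has called back, so memory use
// stays flat and the database gets time to write.
function insertAll(batchNo, batches, size, done) {
    if (batchNo >= batches) { return done(null); }
    stubInsert(buildBatch(batchNo, size), function (err) {
        if (err) { return done(err); }
        setImmediate(insertAll, batchNo + 1, batches, size, done);
    });
}

insertAll(0, 100, 100, function (err) {
    if (!err) {
        console.log('inserted ' + total + ' docs');
    }
});
```

This is slower than firing all inserts at once, but the stack never grows and at most one batch is in memory at a time.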