
I'm using the mssql package for Node.js (v4.4.5 on Windows). I have a query that returns 1,000,000 rows and it is very slow: more than 30 seconds. I have tried this:

var sql = require('mssql');

sql.connect(sqlConfig).then(function() {
    new sql.Request().query('select * from myTable').then(function(recordset) {
        //console.dir(recordset);
        console.log('done');
    }).catch(function(err) {
        // ... query error checks
        console.log('query error: ' + err);
    });
}).catch(function(err) {
    // ... connect error checks
    console.log('connection error: ' + err);
});
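
In case it matters, sqlConfig is roughly shaped like this (server, credentials and database are placeholders, not my real settings). Since mssql's requestTimeout defaults to 15 seconds and the query takes 30+ seconds, I assume the timeout has to be raised as well:

var sqlConfig = {
    user: 'myUser',           // placeholder
    password: 'myPassword',   // placeholder
    server: 'localhost',      // placeholder
    database: 'myDatabase',   // placeholder
    requestTimeout: 120000    // assumption: raised well above the 15000 ms default
};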

I also tried with a stored procedure, and with streaming:

var i = 0;

sql.connect(sqlConfig, function(err) {      // sql and sqlConfig as above
    if (err) {
        console.log('connection error: ' + err);
        return;
    }

    var request = new sql.Request();
    request.stream = true;                  // emit rows instead of buffering the whole result
    request.input('id', sql.Int, myId);     // myId is defined elsewhere
    request.execute('myStoredProcedure');

    request.on('recordset', function(columns) {
        // Emitted once for each recordset in a query
        //console.log('RECORDSET:');
        //console.log(columns);
    });

    request.on('row', function(row) {
        // Emitted for each row in a recordset
        if (i % 100000 == 0) console.log('ROWS: ' + i);
        ++i;
        //console.log(row);
    });

    request.on('error', function(err) {
        // May be emitted multiple times
        console.log('ERROR:');
        console.log(err);
    });

    request.on('done', function(affected) {
        // Always emitted as the last one
        console.log('DONE:');
        console.log(affected);
    });
});
sql.on('error', function(err) {
    // ... error handler 
    console.log(err);
});
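
I also sketched a batched variant, with the 'row' and 'done' handlers above replaced by something like the following, so the process isn't flooded with a million row events at once. This is only a sketch: it assumes the installed mssql version supports request.pause()/request.resume() on a streaming request (newer releases document them), and processBatch() is a placeholder for whatever actually handles the rows:

var batch = [];

function processBatch() {
    // placeholder: plot/aggregate the rows currently in `batch`
    batch = [];
}

request.on('row', function(row) {
    batch.push(row);
    if (batch.length >= 10000) {     // batch size is arbitrary
        request.pause();             // stop row events while this batch is handled
        processBatch();
        request.resume();            // let the driver emit the next rows
    }
});

request.on('done', function(affected) {
    processBatch();                  // handle the last partial batch
    console.log('DONE: ' + affected);
});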

But none of these get it working fast enough.

Am I doing something wrong? Or is there another way to do it, or a better package?

P.S. I am calling this same procedure from PHP with the mssql driver and it works fine (about 5 seconds).

Thanks! ;)

charly rl
  • You're loading *everything*, and for large tables this will be slow. There's no way around that. Is this strictly necessary? Usually that's a sign of a problem at the design level. Try [limiting how many rows you fetch](http://stackoverflow.com/questions/971964/limit-10-20-in-sql-server). – tadman Jun 21 '16 at 17:51
  • Thanks for your input. I need to plot all the points on a graph (1,000,000 is not too many points). What surprises me is that Node.js (or the node-mssql package) is so much slower than PHP. – charly rl Jun 21 '16 at 18:44
  • Have you tried using [Sequelize](http://docs.sequelizejs.com/en/latest/)? I've found it to be quite fast. Do try and select *only* the columns you need. – tadman Jun 21 '16 at 19:07
  • Already tried, it's the same. Thanks. – charly rl Jun 22 '16 at 07:28

0 Answers