
I work on a Node.js web application, and now I want to add a module that exports a massive number of records (10,000,000) from my SQL Server database to a .CSV file via a stream/socket.
If 100 users download data (.csv) from the server, memory usage rises sharply on both server and client.
If possible, I want the data (.csv) saved to the client's hard drive, not held in server or client memory (RAM).

Server.js

var http = require('http');
var express = require('express');
var sql = require('mssql');
var app = express();
var server = http.Server(app);
var io = require('socket.io')(server);

io.on('connection', function (socket) {
    console.log('Connection is ready!');
    socket.on('get_records', function (data) {
        // config is defined elsewhere
        var connection = new sql.Connection(config, function (err) {
            if (err) {
                console.log(err.message);
                return;
            }

            var request = new sql.Request(connection);
            request.stream = true;

            request.query("Select * from my_table");
            // ... error checks
            request.on('recordset', function (recordset) {
                // Emitted once for each recordset in a query
            });

            request.on('row', function (record) {
                // Emitted for each row in a recordset
                socket.emit('recieve_records', record); // send records one by one to the client
            });

            request.on('error', function (err) {
                console.log(err.message);
            });

            request.on('done', function (returnValue) {
                // Always emitted as the last one
            });
        });
    });
});

Edit: see the follow-up question:
StreamSaver.js - Error in downloading (Network failed)


1 Answer


If you transfer a file using socket.io, there is no easy/robust way to trigger the browser's download dialog. I found three ways to save the file:

  1. This answer. But you have to keep the whole file in RAM before saving, and it is limited by the maximum blob size.
  2. The FileSaver module. Same idea, wrapped in a module (5k stars on GitHub). Still limited by blob size, and keeps everything in memory.
  3. The StreamSaver module. No blob size restriction, but it doesn't work at all in Firefox, IE, or Edge.

That is why I suggest using plain HTTP for file transfers.

Then you can simply use an <a href="path/to/your/endpoint"> tag to download it, or use some tricks from here.
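For example, a minimal page fragment (hypothetical; it assumes the /csv route from the code below is mounted at the root):

```html
<!-- The download attribute hints the browser to save the response
     to disk instead of navigating to it. -->
<a href="/csv" download="sample.csv">Download CSV</a>
```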

So if you have a Node.js Readable stream that emits objects, you can use the csv module to convert it to CSV on the fly, then simply pipe it into the Express response object.

var express = require('express');
var sql = require('mssql');
var csv = require('csv');

var router = express.Router();

router.get('/csv', function (req, res, next) {
    // Handle the connection here; you might decide to use a connection pool if supported
    var connection = new sql.Connection(config, function (err) {
        if (err) {
            console.log(err.message);
            return next(err);
        }

        // You can pipe an mssql request as per the docs
        var request = new sql.Request(connection);
        request.stream = true;
        request.query("Select * from my_table");

        var stringifier = csv.stringify({header: true});

        // Then simply call attachment and pipe it to the response
        res.attachment('sample.csv');
        request.pipe(stringifier).pipe(res);
    });
});

Also check out the csv-stringify docs; there are useful options such as header: true (to add a header row), columns, and others.

Antonio Narkevich