66

I am struggling to find a way to write data to a CSV file in Node.js.

There are several CSV plugins available, however they only 'write' to stdout.

Ideally I want to write on a row-by-row basis using a loop.

Blaszard
Phil Bottomley
  • *"however they only 'write' to stdout"* That seems **really** surprising. They won't write to any writeable `Stream`, it **has** to be `stdout`?! – T.J. Crowder Apr 19 '12 at 11:35
  • Could you include links to the modules you've tested, so others can review them and/or know which alternates to suggest? – Joe White Apr 19 '12 at 11:54
  • There is a tutorial on generating CSV using Node.js: http://programmerblog.net/generate-csv-using-nodejs/ – Maz I Nov 07 '17 at 09:26

9 Answers

55

You can use fs (https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback):

var dataToWrite; // the CSV content, built up as a single string
var fs = require('fs');

fs.writeFile('form-tracking/formList.csv', dataToWrite, 'utf8', function (err) {
  if (err) {
    console.log('Some error occurred - file either not saved or corrupted file saved.');
  } else {
    console.log('It\'s saved!');
  }
});
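To show what `dataToWrite` could hold, here is an illustration (not from the original answer; the rows are made up) that joins an array of rows into one CSV string before the single `writeFile` call:

var rows = [
  ['id', 'name'],   // header row
  [1, 'Alice'],
  [2, 'Bob']
];

// join each row with commas and the rows with newlines
var dataToWrite = rows.map(function (row) {
  return row.join(',');
}).join('\n');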
John Vandivier
35

The docs for node-csv-parser (npm install csv) specifically state that it can be used with streams (see fromStream, toStream). So it's not hard-coded to use stdout.

Several other CSV parsers also come up when you run `npm search csv`; you might want to look at them too.
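As a sketch of the stream-based route (based on the node-csv project's csv-stringify package as documented today, so treat the exact import as an assumption if you are on an older release), you can pipe the stringifier into a file stream instead of stdout and write rows one at a time:

const fs = require('fs');
const { stringify } = require('csv-stringify');

// the stringifier is a Transform stream, so it can pipe into any writable stream
const stringifier = stringify({ header: true, columns: ['id', 'name'] });
stringifier.pipe(fs.createWriteStream('out.csv'));

// write one row at a time, e.g. inside a loop
stringifier.write([1, 'Alice']);
stringifier.write([2, 'Bob']);
stringifier.end();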

Joe White
28

Here is a simple example using csv-stringify to write a dataset that fits in memory to a CSV file using fs.writeFile.

// note: newer versions of csv-stringify use a named export instead:
// import { stringify } from 'csv-stringify';
import stringify from 'csv-stringify';
import fs from 'fs';

let data = [];
let columns = {
  id: 'id',
  name: 'Name'
};

for (var i = 0; i < 10; i++) {
  data.push([i, 'Name ' + i]);
}

stringify(data, { header: true, columns: columns }, (err, output) => {
  if (err) throw err;
  fs.writeFile('my.csv', output, (err) => {
    if (err) throw err;
    console.log('my.csv saved.');
  });
});
cbaigorri
  • What is the advantage here compared to John's solution? Make sure it is `string` that is written? – Timo May 15 '21 at 10:50
  • This solution helped me out. I needed to have different column names than the JSON field names. Defining columns as that object did the trick. – GoForth Dec 25 '22 at 22:21
23

If you want to use a loop, as you say, you can do something like this with Node's fs:

let fs = require("fs")

let writeStream = fs.createWriteStream('/path/filename.csv')

someArrayOfObjects.forEach((someObject, index) => {     
    let newLine = []
    newLine.push(someObject.stringPropertyOne)
    newLine.push(someObject.stringPropertyTwo)
    // ...push any other column values here

    writeStream.write(newLine.join(',')+ '\n', () => {
        // a line was written to stream
    })
})

writeStream.end()

writeStream.on('finish', () => {
    console.log('finish write stream, moving along')
}).on('error', (err) => {
    console.log(err)
})
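One caveat not covered above: `write()` returns `false` once the stream's internal buffer is full, so for very large arrays you may want to wait for the `drain` event before writing more. A rough sketch of the same loop with backpressure handling (the row-building is simplified to a plain `join`):

async function writeRows(someArrayOfObjects) {
    for (const someObject of someArrayOfObjects) {
        const newLine = [someObject.stringPropertyOne, someObject.stringPropertyTwo].join(',') + '\n'
        // write() returns false when the internal buffer is full
        if (!writeStream.write(newLine)) {
            // pause until the buffer has drained
            await new Promise(resolve => writeStream.once('drain', resolve))
        }
    }
    writeStream.end()
}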
Centillion
6

In case you don't want to use any library besides fs, you can do it manually.

const fs = require("fs")

let fileString = ""
let separator = ","
let fileType = "csv"
let file = `fileExample.${fileType}`

// header row from the keys of the first object
Object.keys(jsonObject[0]).forEach(value => fileString += `${value}${separator}`)
fileString = fileString.slice(0, -1) // drop the trailing separator
fileString += "\n"

// one line per object, using its values
jsonObject.forEach(transaction => {
    Object.values(transaction).forEach(value => fileString += `${value}${separator}`)
    fileString = fileString.slice(0, -1)
    fileString += "\n"
})

fs.writeFileSync(file, fileString, 'utf8')
Matt Ke
Gabriel Borges
3

Writing a CSV is pretty easy and can be done without a library.

import { writeFile } from 'fs/promises';
// you can use just fs module too

// Let's say you want to print a list of users to a CSV
const users = [
  { id: 1, name: 'John Doe0', age: 21 },
  { id: 2, name: 'John Doe1', age: 22 },
  { id: 3, name: 'John Doe2', age: 23 }
];

// A CSV file is formatted like this:
/*
  column1, column2, column3
  value1, value2, value3
  value1, value2, value3
*/
// which we can do easily by
const dataCSV = users.reduce((acc, user) => {
    acc += `${user.id}, ${user.name}, ${user.age}\n`;
    return acc;
  }, 
  `id, name, age\n` // column names for csv
);

// finally, write csv content to a file using Node's fs module
writeFile('mycsv.csv', dataCSV, 'utf8')
  .then(() => { /* handle success */ })
  .catch((error) => { /* handle error */ });

NOTE: If your CSV content has `,` in it, you must escape it or use another delimiter. If that's the case, I suggest using a library like csv-stringify.
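If you do want to stay dependency-free, the usual convention (RFC 4180) is to wrap a field in double quotes when it contains a comma, a quote or a newline, and to double any quotes inside it. A small helper along those lines (not part of the original answer):

const escapeField = (value) => {
  const str = String(value);
  // quote the field if it contains a comma, a double quote, or a newline
  return /[",\n]/.test(str) ? `"${str.replace(/"/g, '""')}"` : str;
};

// e.g. acc += `${user.id}, ${escapeField(user.name)}, ${user.age}\n`;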

Haseeb Anwar
  • Great, it works nicely, but this only writes one row of data. How do I append multiple rows to this file while writing the header row only once? – abdp May 04 '23 at 12:29
  • The reduce is doing exactly this; you must be doing something wrong with mapping the data for CSV. – Haseeb Anwar May 05 '23 at 07:44
1

For those who prefer fast-csv:

const { writeToPath } = require('@fast-csv/format');

const path = `${__dirname}/people.csv`;
const data = [{ name: 'Stevie', id: 10 }, { name: 'Ray', id: 20 }];
const options = { headers: true, quoteColumns: true };

writeToPath(path, data, options)
  .on('error', err => console.error(err))
  .on('finish', () => console.log('Done writing.'));
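If you specifically want the row-by-row loop from the question, `@fast-csv/format` also exposes a `format` stream that you can write to and pipe into a file; a minimal sketch (file name and rows are placeholders):

const fs = require('fs');
const { format } = require('@fast-csv/format');

const csvStream = format({ headers: true });
csvStream.pipe(fs.createWriteStream(`${__dirname}/people.csv`));

// write one row at a time, e.g. inside a loop
csvStream.write({ name: 'Stevie', id: 10 });
csvStream.write({ name: 'Ray', id: 20 });
csvStream.end();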
diogo.silva
Ron____
1

**In case you don't want to use any library besides fs, you can do it manually. Moreover, you can filter the data as you want before writing it to a file.**

router.get('/apiname', (req, res) => {
  const data = arrayOfObject; // you will get this from somewhere
  /*
  // Modify old data (new key names)
  let modifiedData = data.map(({ oldKey1: newKey1, oldKey2: newKey2, ...rest }) => ({ newKey1, newKey2, ...rest }));
  */
  const path = './test'
  writeToFile(path, data, (result) => {
    // get the result from the callback and process it
    console.log(result) // 'success' or 'error'
  });
});

const writeToFile = (path, data, callback) => {
  // note: JSON.stringify(data, null, 2) writes the data as pretty-printed JSON,
  // not CSV; build a delimiter-separated string instead if you need real CSV output
  fs.writeFile(path, JSON.stringify(data, null, 2), (err) => {
    if (!err) {
      callback('success'); // written successfully
    } else {
      callback('error'); // some error (catch this error)
    }
  });
}
0

This is the code that worked for me in NestJS:

import { Parser } from "json2csv";
import { appendFileSync } from "fs";

const csvtojson = require('csvtojson');

const csvFilePath = process.cwd() + '/' + file.path;

// read the data from the CSV into an array of JSON objects
// (this needs to run inside an async function)
let data = await csvtojson().fromFile(csvFilePath);

// from here: how to write the data back out as CSV

// ...inside whatever loop builds your rows:
data.push({
  label: value,
  // .......other fields
})

const fields = [
  'field1', 'field2', // ...
]

const parser = new Parser({ fields, header: false }); // remove `header: false` if you do want a header row
const csvOutput = parser.parse(data);
appendFileSync('./filename.csv', `${csvOutput}\n`); // remove \n if you don't want a newline at the end
Rahul Somaraj