
I want to create a CSV file using csv-writer and upload that CSV to Azure Blob Storage. I am able to create the CSV, store it on the local file system, and then read it from there and upload it to blob storage using the azure-storage npm package. But I don't want to create/store the CSV on the local file system (because of some issues I am running into in Prod). Is there any way to create the CSV and feed it directly to Azure Blob Storage, without writing the CSV to the local file system?

Some code for reference

const createCsvWriter = require("csv-writer").createObjectCsvWriter;

// csv-writer needs a local file path, so the CSV is currently written to disk first
const csvWriter = createCsvWriter({
  path: `${__dirname}/${blobName}`,
  header: [
    { id: "id", title: "name" },
  ],
});

await csvWriter.writeRecords(csvData);
console.log("file successfully written");

Once this CSV is created locally, I read it from there using the fs module and upload it to blob storage using the blobService.createBlockBlobFromStream function.
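
For reference, the read-and-upload step currently looks roughly like this (the container name, account credentials, and error handling are placeholders, not my exact code):

const fs = require("fs");
const azure = require("azure-storage");

const blobService = azure.createBlobService("account-name", "account-key");
const filePath = `${__dirname}/${blobName}`;

// Read the CSV that csv-writer just created and stream it to blob storage
const stream = fs.createReadStream(filePath);
const streamLength = fs.statSync(filePath).size;

blobService.createBlockBlobFromStream("container-name", blobName, stream, streamLength, (error) => {
  if (error) {
    console.log("failed to upload blob", error);
  } else {
    console.log("blob uploaded successfully");
  }
});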

Can you please suggest how I can point csvWriter directly at Azure Blob Storage, or is there any other way to achieve this?

JSDrogon
  • Have you tried [this](https://stackoverflow.com/questions/64041193/stream-upload-file-to-azure-blob-storage-with-node-and-graphql?noredirect=1&lq=1) ? – Apoorva Chikara Mar 16 '21 at 11:05
  • No actually, but how is that relevant? I am using the csv-writer package, which needs a path for where to create the CSV, and I want to upload that directly to Azure Blob Storage. Can you please elaborate? – JSDrogon Mar 16 '21 at 11:12

1 Answer


Please try the code below.

const {BlobServiceClient, StorageSharedKeyCredential} = require('@azure/storage-blob');
const createCsvStringifier = require('csv-writer').createObjectCsvStringifier;
const accountName = 'account-name';
const accountKey = 'account-key';
const container = 'container-name';
const blobName = 'text.csv';

const csvStringifier = createCsvStringifier({
    header: [
        {id: 'name', title: 'NAME'},
        {id: 'lang', title: 'LANGUAGE'}
    ]
});
const records = [
    {name: 'Bob',  lang: 'French, English'},
    {name: 'Mary', lang: 'English'}
];
const headers = csvStringifier.getHeaderString();
const data = csvStringifier.stringifyRecords(records);
const blobData = `${headers}${data}`;
const credentials = new StorageSharedKeyCredential(accountName, accountKey);
const blobServiceClient = new BlobServiceClient(`https://${accountName}.blob.core.windows.net`, credentials);
const containerClient = blobServiceClient.getContainerClient(container);
const blockBlobClient = containerClient.getBlockBlobClient(blobName);
const options = {
    blobHTTPHeaders: {
        blobContentType: 'text/csv'
    }
};
blockBlobClient.uploadData(Buffer.from(blobData), options)
.then((result) => {
    console.log('blob uploaded successfully!');
    console.log(result);
})
.catch((error) => {
    console.log('failed to upload blob');
    console.log(error);
});

Two things essentially in this code:

  1. Use createObjectCsvStringifier if you don't want to write the data to disk.

  2. Use the @azure/storage-blob node package instead of the azure-storage package, as the former is the newer one and the latter is being deprecated.
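
If the CSV gets large or is produced incrementally, the same block blob client can also take a Node stream instead of a buffer. Here's a rough sketch reusing blockBlobClient and blobData from the code above (the buffer size and concurrency values are just illustrative, and Readable.from requires Node 12+):

const { Readable } = require('stream');

// Wrap the in-memory CSV string in a readable stream and upload it without touching disk
const csvStream = Readable.from([blobData]);

blockBlobClient.uploadStream(csvStream, 4 * 1024 * 1024, 5, {
    blobHTTPHeaders: {
        blobContentType: 'text/csv'
    }
})
.then(() => {
    console.log('blob uploaded successfully via stream!');
})
.catch((error) => {
    console.log('failed to upload blob');
    console.log(error);
});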


Update

Here's the code using the azure-storage package.

const azure = require('azure-storage');
const createCsvStringifier = require('csv-writer').createObjectCsvStringifier;
const accountName = 'account-name';
const accountKey = 'account-key';
const container = 'container-name';
const blobName = 'text.csv';

const csvStringifier = createCsvStringifier({
    header: [
        {id: 'name', title: 'NAME'},
        {id: 'lang', title: 'LANGUAGE'}
    ]
});
const records = [
    {name: 'Bob',  lang: 'French, English'},
    {name: 'Mary', lang: 'English'}
];
const headers = csvStringifier.getHeaderString();
const data = csvStringifier.stringifyRecords(records);
const blobData = `${headers}${data}`;

const blobService = azure.createBlobService(accountName, accountKey);
const options = {
    contentSettings: {
        contentType: 'text/csv'
    }
}
blobService.createBlockBlobFromText(container, blobName, blobData, options, (error, result, response) => {
    if (error) {
        console.log('failed to upload blob');
        console.log(error);
    } else {
        console.log('blob uploaded successfully!');
        console.log(result);
    }
});
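
And if you'd rather keep the createBlockBlobFromStream function from your original code, the same in-memory string can be wrapped in a stream instead of being read from a file. A minimal sketch, reusing blobService, blobData and options from above:

const { Readable } = require('stream');

// No local file involved: turn the CSV string into a stream and pass its byte length explicitly
const csvStream = Readable.from([blobData]);
const streamLength = Buffer.byteLength(blobData);

blobService.createBlockBlobFromStream(container, blobName, csvStream, streamLength, options, (error, result, response) => {
    if (error) {
        console.log('failed to upload blob');
        console.log(error);
    } else {
        console.log('blob uploaded successfully!');
        console.log(result);
    }
});
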
Gaurav Mantri
  • Thanks this worked for me in local, but when I am deploying the solution on azure, it is running into some issues : Unhandled Rejection at: Promise Promise { TypeError: Cannot read property 'MAX_LENGTH' of undefined at Object. (D:\home\site\wwwroot\node_modules\@azure\storage-common\src\PooledBuffer.ts:11:52) at Module._compile (module.js:570:32) Can you please help me in this? – JSDrogon Mar 17 '21 at 11:14
  • I looked up the source code and it seems the error is coming from the following line of code `var maxBufferLength = require("buffer").constants.MAX_LENGTH;`. I am not sure why though. Where in Azure are you running the code - Function, WebApp, or something else? – Gaurav Mantri Mar 17 '21 at 12:23
  • You may want to create a new issue here: https://github.com/Azure/azure-sdk-for-js/issues. – Gaurav Mantri Mar 17 '21 at 12:30
  • Sure thanks, that helps. Can the same implementation be done with the "azure-storage" package? I was trying to convert the above code to use the "azure-storage" package, but couldn't. Can you help me with this one? – JSDrogon Mar 17 '21 at 12:35
  • Updated my answer and provided sample code using `azure-storage` package. Not sure if you would get the same error when the code runs in Azure. HTH. – Gaurav Mantri Mar 17 '21 at 13:21
  • How can we update an existing CSV file to add a new row? It is always getting overridden with the latest data – Sharath Jallu Apr 11 '22 at 17:17