
I'm trying to write a controller that uploads a file to an S3 location. However, before uploading I need to validate whether the incoming file is a CSV, and then read the file to check its header columns, data, and so on. I got the file's type with the snippet below:

    req.file('foo')._files[0].stream  

But how do I read the entire file stream and check the headers and data? There are similar questions, such as (Sails.js Skipper: How to read the uploaded file stream during upload?), but the solution suggested there is the skipper-csv adapter, which I can't use since I already use skipper-s3 for the upload to S3.

Can someone please post an example of how to read the upstream and perform validations before the upload?


1 Answer


Here is how my problem got solved: I make a copy of the stream before the actual upload, run my validations on the original stream, and once they pass, upload the copied stream to the desired location.

For reading the CSV stream, I found an npm package, csv-parser (https://github.com/mafintosh/csv-parser), which makes it easy to handle events like headers and data.
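For reference, here is a minimal sketch of how those events can be used for header validation. It assumes `source` is a readable stream of the CSV contents, and `expectedHeaders` is a placeholder for whatever columns your files must have:

    const csv = require('csv-parser');

    // Hypothetical list of columns the upload must contain.
    const expectedHeaders = ['id', 'name', 'email'];

    source
      .pipe(csv())
      .on('headers', (headers) => {
        // Fires once with the parsed header row.
        const missing = expectedHeaders.filter((h) => !headers.includes(h));
        if (missing.length) {
          console.error('Missing columns:', missing);
        }
      })
      .on('data', (row) => {
        // Fires once per data row, keyed by header name.
        console.log(row);
      })
      .on('end', () => console.log('CSV fully parsed'));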

For creating the copy of the stream, I used the following logic:

    const { PassThrough } = require('stream');
    const _ = require('lodash');

    const upstream = req.file('file');
    const fileStreamMap = {};
    const fileStreamMapCopy = {};
    _.each(upstream._files, (file) => {
      // Skipper attaches the original filename to each file stream.
      const fileName = file.stream.filename;
      // Tee the incoming stream into two PassThrough streams:
      // one to run validations on, one to actually upload.
      const stream = new PassThrough();
      const streamCopy = new PassThrough();
      file.stream.pipe(stream);
      file.stream.pipe(streamCopy);
      fileStreamMap[fileName] = stream;
      fileStreamMapCopy[fileName] = streamCopy;
    });
    // Validate and upload files to S3, if valid.
    validateAndUploadFile(fileStreamMap, fileStreamMapCopy);

validateAndUploadFile() contains my custom validation logic for the CSV upload.
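That function isn't shown here, but a rough sketch of its shape, given the stream maps above and treating `validateCsvStream` and `uploadToS3` as hypothetical helpers:

    // Rough sketch; validateCsvStream and uploadToS3 are hypothetical
    // helpers standing in for the real validation and upload logic.
    function validateAndUploadFile(fileStreamMap, fileStreamMapCopy) {
      _.each(fileStreamMap, (stream, fileName) => {
        validateCsvStream(stream, (err) => {
          if (err) {
            // Validation failed: drain the copy so it doesn't stall.
            fileStreamMapCopy[fileName].resume();
            return console.error('Invalid CSV:', fileName, err);
          }
          // Validation passed: upload the untouched copy.
          uploadToS3(fileName, fileStreamMapCopy[fileName]);
        });
      });
    }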

Also, we can use aws-sdk (https://www.npmjs.com/package/aws-sdk) for the S3 upload. Hope this helps someone.
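As a sketch of that last step: the SDK's s3.upload() accepts a stream as Body, so the copied stream can be passed straight through. The bucket name is a placeholder, and credentials are assumed to come from the environment:

    const AWS = require('aws-sdk');
    const s3 = new AWS.S3();

    // 'my-bucket' is a placeholder bucket name.
    s3.upload({
      Bucket: 'my-bucket',
      Key: fileName,
      Body: fileStreamMapCopy[fileName]
    }, (err, data) => {
      if (err) return console.error('Upload failed:', err);
      console.log('Uploaded to', data.Location);
    });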
