I am trying to write chunks of CSV-formatted data directly to a file in Amazon S3 using S3's upload() API, instead of first writing to a temporary file through a WriteStream, then creating a ReadStream on that file and sending it to S3. My program pulls rows of data from a database in batches, processes them, and formats each batch as a CSV string, like so:
const fs = require('fs');
const AWS = require('aws-sdk');

let recordsCSVFormatted;
let offset = 0;
const batchSize = 500;
const writer = fs.createWriteStream('./someFile.csv');
do {
  // gets a batch of records from the DB and formats them as a CSV string
  recordsCSVFormatted = await getRecords(batchSize, offset);
  writer.write(recordsCSVFormatted);
  offset += batchSize;
} while (recordsCSVFormatted && recordsCSVFormatted.length);
writer.end(); // close the stream so the file is fully flushed before reading
const reader = fs.createReadStream('./someFile.csv');
// just assume here that Key and Bucket are provided in upload(), they are in the actual code
await new AWS.S3({ ...s3Opts }).upload({ Body: reader }).promise(); // pass the readable stream as the Body
How can I skip the step of creating a temporary file and instead pass the data to AWS as a stream? I want to stream the chunks of CSV data to S3 directly as they are produced.
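From what I have read, I suspect something like a stream.PassThrough passed as the Body might work, since upload() seems to accept any readable stream, but I am not sure this is the right approach. Here is a rough sketch of what I have in mind (getRecords, s3Opts, Bucket, and Key are placeholders, as above):

const { PassThrough } = require('stream');
const AWS = require('aws-sdk');

async function uploadCSV() {
  const pass = new PassThrough();

  // start the upload immediately; upload() reads from the stream
  // as chunks are written to it
  const pending = new AWS.S3({ ...s3Opts })
    .upload({ Bucket, Key, Body: pass })
    .promise();

  let recordsCSVFormatted;
  let offset = 0;
  const batchSize = 500;
  do {
    recordsCSVFormatted = await getRecords(batchSize, offset);
    if (recordsCSVFormatted && recordsCSVFormatted.length) {
      pass.write(recordsCSVFormatted); // push each CSV chunk into the stream
    }
    offset += batchSize;
  } while (recordsCSVFormatted && recordsCSVFormatted.length);

  pass.end(); // signal end-of-data so upload() can complete
  await pending; // resolves once S3 has received everything
}

Is this the right way to do it, or is there a better pattern?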