The idea here is to create an XML file with xmlbuilder and use its streamWriter feature to pipe the output directly to S3.

My approach is to feed what is written to the writable stream into a readable stream and upload that readable stream to S3. I am using a PassThrough stream for this, as mentioned in this answer.
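
For reference, the pattern from that answer looks roughly like this (a minimal sketch; the bucket, key, and function names are placeholders, and it assumes the AWS SDK v2 s3.upload, which accepts a stream as Body):

const { PassThrough } = require('stream');

// s3 is an already-configured AWS.S3 instance
function uploadFromStream(s3, bucket, key) {
  const pass = new PassThrough();
  // s3.upload starts consuming the stream right away and resolves once it ends
  const promise = s3.upload({ Bucket: bucket, Key: key, Body: pass }).promise();
  return { writeStream: pass, promise };
}

// whatever is piped or written into writeStream ends up in the S3 object,
// e.g. someReadable.pipe(writeStream);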

But when I try to upload the stream that xmlbuilder writes to, the S3 upload function does nothing (even the error object is empty).

Nevertheless, I am able to use the stream as a readable stream when I pipe it to outStream.
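
To narrow this down, a few listeners can show whether the writer ever finishes the stream and whether the upload makes any progress (a sketch; pass1, pass2, and params1 refer to the code below, and httpUploadProgress/send are AWS SDK v2 APIs):

pass1.on('finish', () => console.log('pass1 finished (writer ended the stream)'));
pass1.on('error', (err) => console.error('pass1 error:', err));
pass2.on('error', (err) => console.error('pass2 error:', err));

s3.upload(params1)
  .on('httpUploadProgress', (progress) => console.log('progress:', progress))
  .send((err, data) => console.log('upload finished:', err, data));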

So here is the code:

const fs = require('fs');
const stream = require('stream');
const AWS = require('aws-sdk');
const xmlbuilder = require('xmlbuilder');

const s3 = new AWS.S3({
  secretAccessKey: 'SECRETACCESSKEY',
  accessKeyId: 'ACCESSKEYID',
  region: 'eu-central-1',
  apiVersion: '2006-03-01',
});

const BUCKET = 'mybucket';
/**
 * Code for writeable stream from:
 * https://medium.freecodecamp.org/node-js-streams-everything-you-need-to-know-c9141306be93
 */
const outStream = new stream.Writable({
  write(chunk, encoding, callback) {
    console.log(chunk.toString());
    callback();
  },
});

/**
 * Try to upload stream from XMLBuilder
 */
const pass1 = new stream.PassThrough();
const pass2 = new stream.PassThrough();

const writer = xmlbuilder.streamWriter(pass1);
const xml = xmlbuilder.create('root', { version: '1.0', encoding: 'UTF-8' }).end(writer);

// Fails
const params1 = { Bucket: BUCKET, Key: 'sitemap/test1.txt', Body: pass2 };
s3.upload(params1, function(err, data) {
  console.log(err, data);
});

// Convert writeable stream in readable
pass1.pipe(pass2);

// But why is this working?
pass2.pipe(outStream);