
I've been searching for a way to write to a JSON file in an S3 bucket from a pre-signed URL. From my research it appears it can be done, but the examples I found are not in Node.

Not having found a Node solution in my searches, and working with a 3rd-party API, I'm trying to write the callback data to a JSON file that is in an S3 bucket. I can generate the pre-signed URL with no issues, but when I try to write dummy text to the pre-signed URL I get:

Error: ENOENT: no such file or directory, open 'https://path-to-file-with-signed-url'

This happens when I try to use fs.writeFile:

const fs = require('fs')

fs.writeFile(testURL, `This is a write test: ${Date.now()}`, function(err) {
  if (err) return console.error(err)
  console.log("File written to")
})

My understanding of the fs documentation was that the file argument can be a URL. I'm starting to believe this might be a permissions issue, but I'm not having any luck with the documentation.
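Looking closer at the docs, the URL support appears to be limited to file: URLs, something like:

const fs = require('fs')

// fs accepts WHATWG URL objects, but only with the file: protocol;
// an https: URL fails because fs only touches the local filesystem.
fs.writeFile(new URL('file:///tmp/test.json'), 'test', (err) => {
  if (err) console.error(err)
})

So fs on its own can't reach an https:// pre-signed URL.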

After implementing node-fetch I still get an error (403 Forbidden) when writing to the file in S3 via the pre-signed URL. Here is the full code from the module I've written:

const aws = require('aws-sdk')
const config = require('../config.json')
const fetch = require('node-fetch')
const expireStamp = 604800 // 7 days, in seconds

const existsModule = require('./existsModule')

module.exports = async function(toSignFile) {
  let checkJSON = await existsModule(`${toSignFile}.json`)
  if (checkJSON == true) {
    let testURL = await s3signing(`${toSignFile}.json`)
    fetch(testURL, {
      method: 'PUT',
      body: JSON.stringify(`This is a write test: ${Date.now()}`),
    }).then((res) => {
      console.log(res)
    }).catch((err) => {
      console.log(`Fetch issue: ${err}`)
    })
  }
}

async function s3signing(signFile) {
  // Apply credentials before constructing the client; a config.update
  // made after new aws.S3() is not picked up by the existing instance
  aws.config.update({
    accessKeyId: config.aws.accessKey,
    secretAccessKey: config.aws.secretKey,
    region: config.aws.region,
  })
  const s3 = new aws.S3()
  const params = {
    Bucket: config.aws.bucket,
    Key: signFile,
    Expires: expireStamp // seconds until the signed URL expires
  }
  try {
    // let signedURL = await s3.getSignedUrl('getObject', params)
    // Note: called without a callback, getSignedUrl in aws-sdk v2 returns
    // the URL synchronously, so the await here resolves immediately.
    let signedURL = await s3.getSignedUrl('putObject', params)
    console.log('\x1b[36m%s\x1b[0m', `Signed URL: ${signedURL}`)
    return signedURL
  } catch (err) {
    return err
  }
}

Reviewing the permissions, I have no issues uploading directly, and write access has been set on the bucket. In Node, how can I write to a file in the S3 bucket using that file's pre-signed URL as the path?

DᴀʀᴛʜVᴀᴅᴇʀ
  • Do you know for sure that the AWS credentials you are using to sign the request actually allow s3:PutObject to that S3 bucket? Maybe verify this with the awscli. Also see https://aws.amazon.com/premiumsupport/knowledge-center/s3-403-forbidden-error/ for some advice on diagnosing 403 (it's console-focused, but the content is still of value). – jarmod Jul 01 '19 at 19:36
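Following up on that suggestion, a quick CLI sanity check might look like this (the bucket, key, and profile names are placeholders):

aws s3api put-object --bucket your-bucket --key test.json --body test.json --profile your-profile

If this call also returns a 403, the problem is in the credentials or bucket policy rather than in the Node code.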

2 Answers


fs is the filesystem module. You can't use it as an HTTP client.

You can use the built-in https module, but I think you'll find it easier to use node-fetch.

fetch('your signed URL here', {
  method: 'PUT',
  body: JSON.stringify(data),
  // more options and request headers and such here
}).then((res) => {
  // do something
}).catch((e) => {
  // do something else
});
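If you want to avoid the dependency, a rough equivalent with the built-in https module could look like the sketch below (signedUrl and body are placeholders for your values):

const https = require('https');

// Minimal sketch: PUT a string body to a pre-signed URL.
function putToSignedUrl(signedUrl, body) {
  return new Promise((resolve, reject) => {
    const req = https.request(signedUrl, {
      method: 'PUT',
      headers: { 'Content-Length': Buffer.byteLength(body) },
    }, (res) => {
      // S3 responds 200 OK when the PUT succeeds
      resolve(res.statusCode);
    });
    req.on('error', reject);
    req.end(body);
  });
}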
Brad

I was looking for an elegant way to transfer an S3 file to an S3 pre-signed URL using PUT. Most examples I found use PUT({ body: data }). I came across one suggestion to read the data into a readable stream and then pipe it to the PUT, but I still didn't like the notion of loading large files into memory before handing them to the PUT stream. Piping read to write is better for both memory and performance.

Since s3.getObject(params).createReadStream() returns a readable stream, which supports pipe, all we need to do is pipe it correctly to the PUT request, which exposes a write stream.

Get object function

async function GetFileReadStream(key) {
    const params = {
        Bucket: bucket,
        Key: key
    };
    // HEAD the object first to learn its size, so the PUT can send Content-Length
    const fileSize = await s3.headObject(params)
        .promise()
        .then(res => res.ContentLength);
    // Hand back the read stream itself; nothing is buffered in memory
    return { stream: s3.getObject(params).createReadStream(), fileSize };
}

Put object function

const request = require('request');

async function putStream(presignedUrl, readStream) {
    return new Promise((resolve, reject) => {
        const putRequestWriteStream = request.put({
            url: presignedUrl,
            headers: {
                'Content-Type': 'application/octet-stream',
                'Content-Length': readStream.fileSize
            }
        });
        putRequestWriteStream
            .on('response', (response) => {
                // Resolve with the ETag S3 returns for the uploaded object
                resolve(response.headers['etag']);
            })
            .on('error', reject) // surface network/request failures
            .on('end', () => console.log('put done'));
        // Pipe the S3 read stream straight into the PUT request body
        readStream.stream.pipe(putRequestWriteStream);
    });
}
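Putting the two together, usage looks roughly like this (the key and presignedUrl values here are placeholders):

(async () => {
    // Stream an existing S3 object to wherever the pre-signed PUT URL points
    const source = await GetFileReadStream('path/to/source.json');
    const etag = await putStream(presignedUrl, source);
    console.log(`Upload complete, ETag: ${etag}`);
})();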

This works great with a very small memory footprint. Enjoy.

user3484816