
I'd like to upload a file to AWS S3 via the POST interface, but I can't get it to work.

I've already made it work with PUT and getSignedUrl, but unfortunately that interface doesn't allow a direct file size restriction. So I tried the POST interface instead, because there I can use the 'content-length-range' condition.
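For context, the PUT flow I already had working looks roughly like this (a sketch; the bucket and key values are placeholders):

const aws = require('aws-sdk');
const s3 = new aws.S3({ signatureVersion: 'v4' });

// getSignedUrl only signs the request itself; there is no way to attach
// a policy condition like 'content-length-range' to it.
const putUrl = s3.getSignedUrl('putObject', {
    Bucket: 'my-bucket',        // placeholder
    Key: 'files/new/test.pdf',  // placeholder
    Expires: 60                 // seconds
});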

Here's how I create the POST request signature:

const aws = require('aws-sdk');

aws.config.update({
    signatureVersion: 'v4',
    region: 'eu-central-1',
    accessKeyId: config.aws.keyId,
    secretAccessKey: config.aws.keySecret
});

const s3 = new aws.S3();

return new Promise((resolve, reject) => {
    const params = {
        Bucket: config.aws.bucket,
        Fields: {
            key: filePath
        },
        Expires: config.aws.expire,
        Conditions: [
            ['acl', 'public-read'],
            ['content-length-range', 0, 10000000] // 10 MB
        ]
    };
    s3.createPresignedPost(params, (err, data) => {
        if (err) return reject(err); // surface signing errors instead of resolving undefined
        resolve(data); // data = { url, fields }
    });
});
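For reference, the `data` the callback receives is an object holding the POST URL and the signed form fields; with a v4 signature it looks roughly like this (values elided):

// {
//   url: 'https://s3.eu-central-1.amazonaws.com/my-bucket',
//   fields: {
//     key: 'files/new/test.pdf',
//     'X-Amz-Algorithm': 'AWS4-HMAC-SHA256',
//     'X-Amz-Credential': '...',
//     'X-Amz-Date': '...',
//     Policy: '...',
//     'X-Amz-Signature': '...'
//   }
// }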

This part seems to be OK, but I can't figure out how to use the returned signature to upload a file to S3.

Here are a few other attempts I made:

request.post({
    url: payload.url,
    body: payload,
    form: fs.createReadStream(__dirname + `/${filePath}`)
}, (err, response, body) => {});

Another attempt:

let formData = payload;
formData.file = fs.createReadStream(__dirname + `/${filePath}`);
request.post({ 
    url: payload.url,
    formData: formData
}, (err, response, body) => {});

With fetch:

const fetch = require('node-fetch');
const FormData = require('form-data');

const form = new FormData();
const fields = payload.fields;
for(const field in payload.fields) {
    form.append(field, payload.fields[field]);
}
form.append('file', fs.createReadStream(__dirname + `/${filePath}`));
fetch(payload.url, {
    method: 'POST',
    body: form.toString(),
    headers: form.getHeaders()
})
.then((response) => {})
.catch((err) => {});

None of these work; they fail with either 'Bad request' or 'Badly formed request'. One of them did upload something to the server, but the resulting file was unreadable.

How can I add a max file size limit to an S3 bucket?

Update: I think I've moved forward a little. With this code, I get the error response 'You must provide the Content-Length HTTP header.'

const fetch = require('node-fetch');
const FormData = require('form-data');

const form = new FormData();
form.append('acl', 'public-read');
for(const field in payload.fields) {
    form.append(field, payload.fields[field]);
}
form.append('file', fs.createReadStream(__dirname + `/${filePath}`));

fetch(payload.url, {
    method: 'POST',
    body: form,
    headers: form.getHeaders()
})
.then((response) => { return response.text(); })
.then((payload) => { console.log(payload); })
.catch((err) => console.log(`Error: ${err}`));
  • I believe that the S3 SDK sends extra metadata that is required for the server to process the file. Use a proxy such as Fiddler to inspect the requests sent by the SDK so you can begin to replicate it. Compare the SDK request and the `request.post` request, find the difference, then compensate for it. I would help you out further by doing this for you, but I'm not familiar with S3 from a JavaScript standpoint, as I've only used it via .NET. As for the max bucket size, read the AWS documentation. – Kieran Devlin Jul 03 '17 at 14:41

2 Answers


It finally works. Here's the code, in case anyone has the same problem.

A few things to note:

  • The request and form-data libraries have a bug: one of them doesn't set the 'Content-Length' header. See the issue https://github.com/request/request/issues/316
  • The order of the form fields is important; if 'acl' comes later, the upload fails.
  • There are different AWS signature versions out there; check which ones are available in your region. In my case, I had to set signatureVersion to 'v4' in the S3 constructor as well.

I'm not proud of the code quality, but at last it works.

const aws = require('aws-sdk');
const fs = require('fs');
const request = require('request');
const config = require('./config');

let s3;

const init = () => {
    aws.config.update({
        signatureVersion: 'v4',
        region: 'eu-central-1',
        accessKeyId: config.aws.keyId,
        secretAccessKey: config.aws.keySecret
    });

    s3 = new aws.S3({signatureVersion: 'v4'});
};

const signFile = (filePath) => {
    return new Promise((resolve, reject) => {
        const params = {
            Bucket: config.aws.bucket,
            Fields: {
                key: filePath
            },
            Expires: config.aws.expire,
            Conditions: [
                ['content-length-range', 0, 10000000], // 10 MB
                {'acl': 'public-read'}
            ]
        };
        s3.createPresignedPost(params, (err, data) => {
            if (err) return reject(err);
            resolve(data);
        });
    });
};

const sendFile = (filePath, payload) => {
    const fetch = require('node-fetch');
    const FormData = require('form-data');

    const form = new FormData();
    form.append('acl', 'public-read'); // 'acl' must come before the signed fields
    for(const field in payload.fields) {
        form.append(field, payload.fields[field]);
    }
    form.append('file', fs.createReadStream(__dirname + `/${filePath}`)); // the file must be the last field
    // Compute Content-Length ourselves; request/form-data don't set it (see the issue linked above)
    form.getLength((err, length) => {
        console.log(`Length: ${length}`);
        fetch(payload.url, {
            method: 'POST',
            body: form,
            headers: {
                'Content-Type': false,
                'Content-Length': length
            }
        })
        .then((response) => {
            console.log(response.ok);
            console.log(response.status);
            console.log(response.statusText);
            return response.text();
        })
        .then((payload) => {
            console.log(payload);
            console.log(form.getHeaders());
        })
        .catch((err) => console.log(`Error: ${err}`));
    });

};


init();

const file = 'test.pdf';
const filePath = `files/new/${file}`;
signFile(filePath)
.then((payload) => { sendFile(file, payload); });
  • Works! I don't have `content-length` and my `content-type` is `multipart/form-data`. I also don't have `signatureVersion` and `region` set for AWS config. – wao813 Sep 05 '17 at 20:31
  • would this be possible to do using JSON instead of form data? – Turner Houghton Mar 17 '18 at 04:08
  • I've been fighting with that upload for a while now as well. This almost works. I have to remove the line `'Content-Type': false,` to make it work. Otherwise I get an error: `PreconditionFailed: At least one of the pre-conditions you specified did not hold: Bucket POST must be of the enclosure-type multipart/form-data` – Andreas Dec 24 '18 at 22:33
  • I'm facing an issue with `'acl', 'public-read'`, I tried adding `{'acl': 'public-read'}` in conditions and sending `acl : public-read` while uploading from Postman :/ https://stackoverflow.com/questions/59484157/how-to-make-an-aws-s3-file-public-accessible-using-createpresignedpost?noredirect=1#comment105146844_59484157 – Dev1ce Dec 26 '19 at 09:04
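Following up on Andreas's comment above, a variant of the fetch call that drops 'Content-Type': false and instead spreads form.getHeaders(), so form-data supplies the multipart Content-Type with its boundary (a sketch based on that comment, not verified against every node-fetch version):

form.getLength((err, length) => {
    fetch(payload.url, {
        method: 'POST',
        body: form,
        // form.getHeaders() yields 'multipart/form-data; boundary=...',
        // which the S3 POST endpoint requires.
        headers: Object.assign({}, form.getHeaders(), { 'Content-Length': length })
    });
});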

Here is an example with axios.

  • `fields` is the fields map returned by S3
  • `url` is the URL returned by S3

    const axios = require("axios");
    const FormData = require("form-data");
    const fs = require("fs");
    const path = require("path"); // needed for path.join below

    const fileContents = fs.readFileSync(path.join(__dirname, "./img.png"));

    const form = new FormData();
    Object.keys(fields).forEach((key) => {
      form.append(key, fields[key]);
    });
    form.append("file", fileContents);

    await axios
      .post(url, form, {
        headers: {
          ...form.getHeaders(),
          "Content-Length": form.getLengthSync(),
        },
      })
      .catch((_error) => {
        console.error(`#20231214016872 _error: `, _error);
      });

To get the presigned result from S3 (I added a condition allowing uploads of at most 10 MB per file):


import AWS from "aws-sdk";

AWS.config.update({
  signatureVersion: "v4",
  region: "<your AWS region>",
});

const s3 = new AWS.S3(); // note: `new` is required


let result = (await new Promise((resolve) => {
      const params = {
        Bucket: config.s3.bucket,
        Expires: uploadUrlExpireInSeconds,
        Fields: {
          key: key,
        },
        // 1 byte to 10 MB
        Conditions: [["content-length-range", 1, config.s3.maxContentLength]],
      };

      s3.createPresignedPost(params, (error, data) => {
        if (error) {
          console.error(
            `#2023101119240 S3 get file upload url failed: `,
            error
          );
        }
        resolve(data);
      });
    })) as { url: string; fields: { [key: string]: string } } | undefined;
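Putting the two snippets together could look like this; getUploadUrl and uploadFile are hypothetical wrappers around the code above, not part of the SDK:

// Hypothetical glue: getUploadUrl(key) wraps the createPresignedPost call,
// uploadFile(url, fields, filePath) wraps the axios POST shown earlier.
const presigned = await getUploadUrl("uploads/img.png");
if (presigned) {
  await uploadFile(presigned.url, presigned.fields, "./img.png");
}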