
I am planning to deploy a Next.js static export to an S3 bucket with a Node.js script.

I have set up an S3 bucket for static website hosting.

I get the expected behaviour when I simply drag and drop the static export into the S3 bucket, so I am pretty sure I set up the bucket correctly.

But when I upload it with a Node.js script, even though all the files appear in the bucket, static website hosting seems to break.

My script is basically copied from the following question, with a twist: it reads the environment variables from .env:

Upload entire directory tree to S3 using AWS sdk in node js

More info and steps to reproduce the problem are in the testing repo:

https://github.com/vmia159/aws-upload-test

I would appreciate it if someone has an idea about the issue.

  • Where did you get the code for `uploadtoS3.js`? – DreamBold Nov 25 '22 at 06:44
  • It is a modified version of ofhouse's answer in https://stackoverflow.com/questions/27670051/upload-entire-directory-tree-to-s3-using-aws-sdk-in-node-js. I just transformed the ES module to CommonJS, read the variables from .env, and added a bit of console logging. – vmia159 Nov 25 '22 at 06:49
  • The code seems okay as long as the env variables are correct. – DreamBold Nov 25 '22 at 06:53
  • I think you omitted the S3 region – DreamBold Nov 25 '22 at 06:56
  • `const s3 = new AWS.S3({ accessKeyId: process.env.S3_ACCESS_KEY, secretAccessKey: process.env.S3_SECRET_KEY, region: process.env.S3_REGION, });` – DreamBold Nov 25 '22 at 07:03
  • It is still broken, even though the files still upload to the S3 bucket with or without the region param. – vmia159 Nov 25 '22 at 07:28
  • Can you send me the frontend URL or a screenshot, if possible? I don't understand what exactly you mean by `broken`. – DreamBold Nov 25 '22 at 07:29
  • When the error document is set to index.html, I expect it to serve the home page when you type random text after the website URL, which is what happens when I drag and drop into the S3 bucket. However, after uploading by script once, this behavior changes and no longer obeys the error document rule set in static website hosting (see the configuration sketch after these comments). http://stk-upload-test.s3-website-ap-northeast-1.amazonaws.com/ – vmia159 Nov 25 '22 at 07:54
  • Is it the correct link? – DreamBold Nov 25 '22 at 08:01
  • It is the correct link. You can only download the HTML files; it no longer serves a website, even if you delete everything and re-upload it by drag and drop. – vmia159 Nov 25 '22 at 08:18
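
For context, the error-document behaviour discussed in the comments is part of the bucket's static website hosting configuration. A minimal sketch of setting it with the same SDK (the bucket name is a placeholder; this assumes the hosting setup described in the question):

const { S3 } = require('aws-sdk');
const s3 = new S3();

// Serve index.html both as the index and as the error document,
// so unknown paths fall back to the home page
s3.putBucketWebsite({
  Bucket: 'stk-upload-test',
  WebsiteConfiguration: {
    IndexDocument: { Suffix: 'index.html' },
    ErrorDocument: { Key: 'index.html' }
  }
}).promise()
  .then(() => console.log('website configuration applied'))
  .catch(err => console.error(err));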

1 Answer


https://github.com/aws/aws-sdk-js/issues/4279

As chrisradek points out, you need to provide the content type for each object to make static website hosting work. Without an explicit ContentType, the uploaded files are stored with a generic binary content type, so the browser downloads them instead of rendering them:

require('dotenv').config();
const bucketName = process.env.BUCKET_NAME;
const { promises: fs, createReadStream } = require('fs');
const path = require('path');
const { S3 } = require('aws-sdk');
const mime = require('mime-types');

// Credentials are read from .env; a region can also be passed here
// (e.g. region: process.env.AWS_REGION) if your setup needs it
const s3 = new S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

const uploadDir = async (s3Path, bucketName) => {
  // Recursive getFiles from
  // https://stackoverflow.com/a/45130990/831465

  async function getFiles(dir) {
    const dirents = await fs.readdir(dir, { withFileTypes: true });
    const files = await Promise.all(
      dirents.map(dirent => {
        const res = path.resolve(dir, dirent.name);
        return dirent.isDirectory() ? getFiles(res) : res;
      })
    );
    return Array.prototype.concat(...files);
  }

  const files = await getFiles(s3Path);
  const uploads = files.map(filePath =>
    s3
      .putObject({
        // Build the key with forward slashes so it also works on Windows
        Key: path.relative(s3Path, filePath).split(path.sep).join('/'),
        Bucket: bucketName,
        Body: createReadStream(filePath),
        // mime.lookup() returns false for unknown extensions,
        // so fall back to a generic binary type
        ContentType: mime.lookup(filePath) || 'application/octet-stream'
      })
      .promise()
      .catch(err => {
        console.log(`failed to upload ${filePath}`, err);
      })
  );
  return Promise.all(uploads);
};

const uploadProcess = async () => {
  await uploadDir(path.resolve('./out'), bucketName);
  console.log('Upload finished');
};
uploadProcess();
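
After the upload, you can verify that the content type was stored correctly. A minimal sketch using headObject, reusing the s3 client and bucketName from the script above (index.html is just an example key):

// Inspect the stored metadata of one uploaded object
s3.headObject({ Bucket: bucketName, Key: 'index.html' })
  .promise()
  .then(res => console.log(res.ContentType)) // should print "text/html"
  .catch(err => console.error(err));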