
I upload a PDF blob object to S3:

      const params = {
        Bucket: "poster-print-bucket",
        Key: Date.now().toString() + ".pdf",
        Body: blob,
        ContentType: "application/pdf", // note: the SDK expects "ContentType"; lowercase "contentType" is silently ignored
      };

      const uploaded = await S3.upload(params).promise();

When I open the URL, e.g. https://poster-print-bucket.s3.ca-central-1.amazonaws.com/1633526785678.pdf, it downloads a blank PDF.

I thought maybe my blob was corrupted or something, but I managed to upload the same blob to Firebase Storage just fine.

Btw, I'm using a Next.js api/upload-poster route.

What's happening?

John Rotenstein
Karolis
  • Have you tried converting blob to base64? https://stackoverflow.com/questions/18650168/convert-blob-to-base64 – Justinas Oct 07 '21 at 12:16
  • Error `Failed to load PDF document.` When opening the url – Karolis Oct 07 '21 at 12:27
  • Throw the `ContentMD5` on there to also validate that it is actually getting uploaded. You probably want a `try/catch` as well to make sure there isn't an error and that the upload is actually completing before continuing. – Warren Parad Oct 07 '21 at 12:43
  • I'm not familiar with ContentMD5. How can it help validate that it is uploaded? I can see in my S3 bucket that it is uploaded, and the file size seems to be what I expect, but the PDF being blank is very strange. – Karolis Oct 07 '21 at 12:49

2 Answers


Using the AWS SDK v3 (current at the time of this post), you can use PutObjectCommand, which accepts a Uint8Array as its Body param (docs).

Convert your Blob instance to an ArrayBuffer (docs), then wrap the ArrayBuffer in a Uint8Array.

The code would look like:

const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const client = new S3Client(/* config */);

const arrayBuffer = await blob.arrayBuffer();
const typedArray = new Uint8Array(arrayBuffer);
await client.send(new PutObjectCommand({
  Bucket: /* ... */,
  Key: /* ... */,
  Body: typedArray,
}));

Jean-Baptiste Martin

I spent more time fixing this issue than I would like to admit. Here is the solution:

Frontend (converting blob to base64 before sending to backend):

            function toBase64(blob) {
              const reader = new FileReader();
              return new Promise((res, rej) => {
                reader.onload = () => res(reader.result);
                reader.onerror = () => rej(reader.error); // reject instead of hanging on read failure
                reader.readAsDataURL(blob);
              });
            }

            toBase64(currentBlob)
              .then((base64) => {
                // base64 is a data URL string here, not a Blob
                return axios
                  .post("/api/upload-poster", base64, {
                    headers: {
                      "Content-Type": "application/pdf",
                    },
                  })
                  .then(({ data }) => data.uploaded.Location);
              })

Backend:

      const base64 = req.body;
      const base64Data = Buffer.from(base64.replace(/^data:application\/\w+;base64,/, ""), "base64");
     
      const params = {
        Bucket: "poster-print-bucket",
        Key: nanoid() + ".pdf",
        Body: base64Data,
        ContentEncoding: "base64",
        ContentType: "application/pdf",
      };

      const uploaded = await S3.upload(params).promise();

Why is all this song and dance required? Could it be something easier?
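One likely reason the base64 round-trip is needed: Next.js API routes parse the request body (as text/JSON) by default, which mangles raw binary. A sketch of a simpler route that opts out of the built-in parser so the blob can be posted as-is, with no base64 step; `readRawBody` is a helper name I'm introducing here, and the S3 client setup mirrors the question's v2 SDK usage:

```javascript
// pages/api/upload-poster.js — a sketch, assuming a Next.js API route
// and the v2 aws-sdk client from the question.
import AWS from "aws-sdk";

const S3 = new AWS.S3();

// Opt out of Next.js's built-in body parsing so raw bytes arrive intact.
export const config = {
  api: { bodyParser: false },
};

// Hypothetical helper: buffer the raw request stream into a single Buffer.
function readRawBody(req) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    req.on("data", (chunk) => chunks.push(chunk));
    req.on("end", () => resolve(Buffer.concat(chunks)));
    req.on("error", reject);
  });
}

export default async function handler(req, res) {
  const body = await readRawBody(req); // the posted PDF bytes, unmodified
  const uploaded = await S3.upload({
    Bucket: "poster-print-bucket",
    Key: Date.now().toString() + ".pdf",
    Body: body,
    ContentType: "application/pdf",
  }).promise();
  res.status(200).json({ uploaded });
}
```

This is untested against the asker's setup, but it would let the frontend `axios.post` the Blob directly instead of converting to a data URL first.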

Karolis