
I am uploading image files to S3. Whenever I download one of them, either via its URL or from the S3 console, the file downloads, but it cannot be opened in an image viewer: it just shows as an incompatible file type.

myS3Function.uploadFile(request.body.fileName, request.files.myFileData, "image_folder").then(filename => {
 //success
})


const AWS = require('aws-sdk');

const s3 = new AWS.S3({
 region: process.env.REGION
});

exports.uploadFile  = (filename, data, folderName) => {
return new Promise((resolve, reject) => {
    const params = {
        Bucket: process.env.AWS_S3_BUCKET, 
        Key: folderName+'/'+filename,
        Body: data.data,
        ACL:'public-read',
        ContentType: "image/jpeg"
    };
    s3.upload(params, function(s3Err, data) {
        if (s3Err) return reject(s3Err) // return early so data.Location is not read on error
        console.log(`File uploaded successfully at ${data.Location}`)
        resolve(`${data.Location}`)

    });
});
}

I am uploading the files from Postman as form data. Text files uploaded with this code are stored correctly, so why do images have the issue? Also, the actual file size of images and PDFs increases a little after upload.

(screenshots of the Postman upload request were attached here)

KIRAN K J
  • what is the data type of `data.data`? – Ermiya Eskandary Nov 02 '21 at 12:52
  • @ErmiyaEskandary request.files.FORM_DATA_NAME is data. – KIRAN K J Nov 02 '21 at 13:07
  • @ErmiyaEskandary updated the question with some more details – KIRAN K J Nov 04 '21 at 06:49
  • So images uploaded via Postman are downloaded correctly? Is the issue only when uploading the file like above or do images just not download correctly at all? – Ermiya Eskandary Nov 06 '21 at 15:02
  • @ErmiyaEskandary Uploading is working. But after downloading it, I cannot view it in image viewer. Also, I can see some size increase happening after upload. That means, the original file size is 10kb, after upload, it will become 14kb. After if I download it, the size will remain 14kb – KIRAN K J Nov 06 '21 at 15:07
  • Via above code only or Postman too? – Ermiya Eskandary Nov 06 '21 at 15:12
  • @ErmiyaEskandary downloads tried only from the s3 console directly. Upload tried only from postman – KIRAN K J Nov 06 '21 at 15:14
  • Okay, so then what is the point of the above code if you haven't tested it? Does code + Postman **both** not work? – Ermiya Eskandary Nov 06 '21 at 16:14
  • @ErmiyaEskandary The above code is running with serverless. After that, I uploaded a file using postman. – KIRAN K J Nov 06 '21 at 16:24
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/238941/discussion-between-ermiya-eskandary-and-kiran-k-j). – Ermiya Eskandary Nov 06 '21 at 19:04
  • @KIRANKJ Probably not but for your comments, please, consider review [this SO question](https://stackoverflow.com/questions/57968048/how-to-upload-image-buffer-data-in-aws-s3) or the [AWS docs](https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings-configure-with-console.html) about how to deal with binary types, I think be could be of help. – jccampanero Nov 06 '21 at 22:11
  • @jccampanero I tried by adding binary types. But I am getting same result. – KIRAN K J Nov 07 '21 at 06:42
  • Thank you very much for the feedback @KIRANKJ. That is very strange indeed. You mentioned that you are posting the information with Postman: please, could you provide some screenshot or, in general, further information about how you are performing the actual upload with that tool? – jccampanero Nov 07 '21 at 18:21
  • @jccampanero I've uploaded screenshots. Could you please take a look – KIRAN K J Nov 08 '21 at 00:34
  • Have you set ContentType in your upload params e.g. `ContentType: "image/png"`? Also, the file size change could be because the file was base64-encoded (causing increase in size) and some component did not know that so did not decode it. A base64-encoded image would be about 1/3rd larger. – jarmod Nov 08 '21 at 00:35
  • @jarmod I tested by adding ContentType: "image/jpeg". Do I need to add any base64 decoding code? – KIRAN K J Nov 08 '21 at 00:40
  • I think you can also add `ContentEncoding: 'base64'` to params. – jarmod Nov 08 '21 at 00:42
  • @jarmod I checked that now. But the same is happening. Do you think "data.data" is correct for body data? Should I convert it to buffer or something another? – KIRAN K J Nov 08 '21 at 00:56
  • It should be simple enough to dump the first few hex bytes of the payload and see if they are the original file's bytes or if it's base64-encoded version of the original file bytes, or something else entirely. Also, is this an Express app or something else? is it running in AWS Lambda (guess not)? Are you using API Gateway? – jarmod Nov 08 '21 at 01:17
  • @jarmod This is the first few characters of data.data "ÿØÿà►JFIF☺☺☺☺ÿÛ ▬§↑▬▬▬↓↑↑↓∟∟∟→∟→∟▲→∟↔→→→∟→∟↓↓→∟!.%∟▲+!→∟&8&+/1555→$;@;3?.451☺♀♀♀►☼►▼↕↕" . I am currently running this code using serverless offline. This will deploy as lambda once it works fine. Under the API gateway, I created my project. In the binary media types I added "multipart/form-data" and "image/jpeg" – KIRAN K J Nov 08 '21 at 01:24
  • What headers are you sending from Postman? Did you explicitly add a Content-Type header in Postman (e.g. multipart/form-data)? If so, remove it and let Postman populate the header for you. – jarmod Nov 08 '21 at 01:35
  • @jarmod I am sending only auth token in headers – KIRAN K J Nov 08 '21 at 01:39
  • Thank you @KIRANKJ. It looks fine to me. Please, for testing purposes, instead of uploading the file to S3, could you save the file to a local directory? I assume you are using express fileupload, so you can see an example [here](https://github.com/richardgirges/express-fileupload/blob/e8d9b671842ee4bf0fe3f85ed988ce7e4e1b7aa5/example/server.js#L15-L37). Please, could you try and see if t works? – jccampanero Nov 08 '21 at 12:39
  • @jccampanero I tried the above. The file is created to an upload folder in my code. But the issue still exists like in s3. The file size is wrong like in s3. – KIRAN K J Nov 09 '21 at 00:49
  • Sorry for the late reply @KIRANKJ. That is great, because it means that the problem could be in either the Postman upload (I don't think so) or in the code that receives the actual request (probably). Please could you provide further details in your question about the code that handle the HTTP request sent from Postman? – jccampanero Nov 09 '21 at 14:25
  • @jccampanero Based on the code, I cannot see any place that modifies the request. But could serverless be the problem? I can see from the questions below that the issue happens with serverless, but I could not fix it with the solutions provided. https://forum.serverless.com/t/upload-image-to-s3-image-broken/12390 AND https://forum.serverless.com/t/file-uploaded-to-s3-but-it-doesnt-open-corrupted/11018 – KIRAN K J Nov 09 '21 at 15:51
  • Hi @KIRANKJ. It may be the problem, of course. In any way, I want to mean if you could further describe the code that actually invokes `myS3Function.uploadFile`. I mean, where does the `request` variable come from, for example? Sorry if I am missing something. – jccampanero Nov 09 '21 at 16:19
  • @jccampanero this is the handler app.post('/api/uploadProductImage', jwt_val.default(), images.uploadProductImage); From the handler it will go to the controller exports.uploadProductImage = (request, response) => { myS3Function.uploadFile(request.body.fileName, request.files.myFileData, "image_folder")..... – KIRAN K J Nov 09 '21 at 17:29
  • @jccampanero For simplicity, I skipped some unwanted/independent codes while posting here. Sorry for that. Also, the text file is working in both cases(s3 and express) – KIRAN K J Nov 09 '21 at 17:33
  • @KIRANKJ Thank you very much. There is no need to apologize, on the contrary, thank you for sharing the code. Everything looks fine to me... Sorry for asking but, the `1.jpg` you upload is right, isn't it? I mean, are you able to see it in your image software? Please, could you try using curl instead of postman? The command would be something like: `curl -X POST -H "Authorization: Bearer " -F 'image=@/path/to/your/image/1.jpg' -v http://localhost:3000/dev/api/uploadProductImage`. Please, adapt to your needs. Please, could you try? – jccampanero Nov 09 '21 at 19:01
  • @jccampanero Uploaded with curl. But unfortunately, the uploaded image still has the same issue with curl too. So Postman is not the problem. The image is working fine before upload. But after upload, it becomes broken. Same for pdf files too. But text file is working with S3, express and now curl too. – KIRAN K J Nov 10 '21 at 01:30
  • The weirdest part is the image at local was modified (size changed) after upload. It seems like something tries to write data to the image more than just read. I think Postman won't do this. What's your idea? @K – Pengson Nov 10 '21 at 08:34
  • Sorry for the late reply. Yes, it is very strange. As far I understand, the problem exists even with a simple express app, even without the serverless framework, is it right? In my opinion, there should be some kind of _byte_ or encoding conversion, it seems that your backend is not considering the incoming information as binary by any reason. Every time you send a request, Postman prints every detail in its Console. Please, could you include that information in the question? – jccampanero Nov 10 '21 at 11:29
  • @jccampanero The express code is also run by serverless offline. As we know that, a normal express code will work fine. If some issue was there with Postman, we will get the exact result with curl. Now all those combinations failed. Now I am sure that it is an issue with serverless. I posted the same question with little more details here https://forum.serverless.com/t/image-file-is-not-viewing-uploaded-by-s3/16217 – KIRAN K J Nov 10 '21 at 11:49
  • @Pengson That's why we tried the same with cUrl. But no use – KIRAN K J Nov 10 '21 at 11:50
  • @KIRANKJ Then, probably the issue will have to do with the server less framework configuration. I mentioned in my first comment the need to configure `multipart/form-data` as a binary type in API Gateway. As suggested as well in the two links you provided, did you update the configuration of the serverless framework accordingly? Did you deploy the new configuration? – jccampanero Nov 10 '21 at 12:08
  • @jccampanero I updated same in both serverless.yml file as well as aws console – KIRAN K J Nov 10 '21 at 12:38
  • Thank you @KIRANKJ. I am running out of ideas mate. Please, see these SO questions [1](https://stackoverflow.com/questions/59472692/aws-s3-after-uploading-image-is-broken) and [2](https://stackoverflow.com/questions/60017442/how-to-upload-multipart-form-data-in-serverless), both related to get rid of `aws-serverless-express` in some way, by removing the dependency, or by using the `aws-serverless-express-binary` package instead. Would it fill your requirements? Please, could you try? – jccampanero Nov 10 '21 at 13:10
  • @jccampanero My current dev dependency contains these "dependencies": { "amazon-cognito-identity-js": "^5.1.0", "aws-sdk": "^2.984.0", "body-parser": "^1.19.0", "cors": "^2.8.5", "express": "^4.17.1", "express-fileupload": "^1.2.1", "jsonwebtoken": "^8.5.1", "jwk-to-pem": "^2.0.5", "mysql2": "^2.3.0", "node-fetch": "^3.0.0", "request": "^2.88.2", "sequelize": "^6.6.5", "serverless-http": "^2.7.0", "serverless-offline": "^8.1.0" } – KIRAN K J Nov 10 '21 at 14:00
  • Thank you very much for sharing your dependencies @KIRANKJ. It seems to be a `serverless-offline`issue. Please, consider review this [Github issue](https://github.com/dougmoscrop/serverless-http/issues/81) in `serverless-http`, and these others [1](https://github.com/dherault/serverless-offline/pull/784) and [2](https://github.com/dherault/serverless-offline/issues/464) related to the mentioned `serverless-offline` library. – jccampanero Nov 10 '21 at 17:41
  • @KIRANKJ I posted an answer summarizing our comments. I will try expanding it later. – jccampanero Nov 11 '21 at 10:03

4 Answers


As indicated in the various comments on your question, there are several things that can cause your problem.

Please, be sure that you are providing the necessary configuration about the different content types that should be considered binary. The AWS documentation provides great detail about it; this related SO question can be valuable as well.

Because you are using the serverless framework, as indicated in the links (1, 2) you cited, please provide the necessary configuration there as well:

provider:
  apiGateway:
    binaryMediaTypes:
      - 'multipart/form-data'

In any case, it seems that even with this configuration you are still facing the problem. You said you were able to successfully upload text files, but your images get corrupted and increase in size: as indicated in the comments, that is a clear sign that somewhere the information is being converted from binary to a text encoding. In fact, according to your dependencies, this seems to be the actual problem, as reported in this issue of the serverless-http library and, especially, in these issues (1, 2) of the serverless-offline library.
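That binary-to-text conversion also explains the size increase you observed (a 10 KB file becoming 14 KB): bytes that are not valid UTF-8 get replaced with the multi-byte replacement character and the damage cannot be undone. A minimal Node.js sketch of the effect, using only Buffer:

```javascript
// Sketch: decoding raw image bytes as UTF-8 corrupts them irreversibly.
const original = Buffer.from([0xff, 0xd8, 0xff, 0xe0]); // JPEG magic bytes

// Pretend some middleware read the binary body as a UTF-8 string...
const asText = original.toString('utf8');

// ...and later wrote it back out as UTF-8 bytes.
const roundTripped = Buffer.from(asText, 'utf8');

// The invalid byte sequences were replaced with U+FFFD, so the buffer
// grew and no longer matches the original file.
console.log(original.length, roundTripped.length, original.equals(roundTripped));
```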

I think the issue is only local and that it will probably work without further problems in AWS.

In any case, as you can see in the first of the issues mentioned above, the one related to serverless-http, the library has the following code:

return Buffer.from(event.body, event.isBase64Encoded ? 'base64' : 'utf8'); 

So, as a workaround, submitting your information base64-encoded can solve the issue. It is not a straightforward task if you are using form submission in your HTML (see for instance this great example for some ideas), although it can do the trick if you interact with your API directly from code. The only necessary change is in your params variable:

const params = {
  Bucket: process.env.AWS_S3_BUCKET, 
  Key: folderName+'/'+filename,
  Body: Buffer.from(data.data, 'base64'),
  ContentEncoding: 'base64',
  ContentType: 'image/jpeg',
  ACL:'public-read',
};

Please, note the use of Buffer.from(..., 'base64') and the inclusion of ContentEncoding: 'base64'.
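As a sanity check of that decoding step, base64 round-trips losslessly through Buffer, so nothing is corrupted along the way; a small sketch (the PNG magic bytes here are just sample data):

```javascript
// Sketch: base64 encoding/decoding with Buffer is a lossless round trip.
const payload = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // PNG magic bytes

const asBase64 = payload.toString('base64');        // what the client would send
const decoded = Buffer.from(asBase64, 'base64');    // what goes into params.Body

// decoded is byte-identical to payload
```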

In any case, if the code works in AWS, I think the way to go would be to wait for the resolution of the serverless-offline issues.

jccampanero
  • If I am trying to upload an image file of more than 10MB, then myFileData will be huge and is there any chance to interrupt the upload? – KIRAN K J Nov 11 '21 at 12:41
  • @KIRANKJ `s3.upload` will return an instance of [`AWS.S3.ManagedUpload `](https://github.com/aws/aws-sdk-js/blob/345852e53a3a6599e76dd37ade4395e585700897/lib/s3/managed_upload.js). You can use the [`abort`](https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3/ManagedUpload.html#abort-property) method provided by this object to cancel an upload in progress. Do you mean that? If your objects are of big size, let's say greater than 100 Mb, you can try the [multipart upload feature](https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html) provided by S3. – jccampanero Nov 11 '21 at 13:00
  • I can see a nearly 30% increase in size when converting to base64, so a presigned URL might be the alternative to this entire problem. What is your opinion? One more important thing I found: I tried the base64 upload of a 1 MB file. It has been about 5 minutes and it is still uploading. The delay might be because of serverless offline, but if this is the scenario, I cannot develop the app with normal 1 to 2 MB files. So is it a good idea to switch? – KIRAN K J Nov 11 '21 at 13:09
  • Yes, the base 64 encoding definitively will increase your file size. Do you mean using [`createPresignedPost`](https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#createPresignedPost-property)? Something like [this](https://stackoverflow.com/questions/44888301/upload-file-to-s3-with-post)? Yes, certainly it could be an option. Honestly, I have never tested it. – jccampanero Nov 11 '21 at 13:28
  • s3.getSignedUrl('putObject', params, function(error, url){} This is the way. I tested that. It is very reliable. The draw back seen is we have two API request. One for generating the URL and another for uploading – KIRAN K J Nov 11 '21 at 13:44
  • Yes, of course @KIRANKJ. Well, it will depend on your actual use case. I am afraid there is no general rule; maybe the use of presigned URLs could be preferable if your file uploads are huge. But as I said, it will depend on your users, the network conditions, etcetera. Be aware only that [`getSignedUrl`](https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#getSignedUrl-property) will not allow you to set certain information like `ACL`, but it may not be a problem. – jccampanero Nov 11 '21 at 13:58
  • @KIRANKJ Although probably the best option will be the use of presigned URLs, just in case, consider reviewing [this SO question](https://stackoverflow.com/questions/14672746/how-to-compress-an-image-via-javascript-in-the-browser); if you are dealing with images, it can be an option for reducing the file size as well. I hope it helps. – jccampanero Nov 12 '21 at 12:01
  • I am looking at the pros and cons of the different possibilities with base64 – KIRAN K J Nov 12 '21 at 12:26
  • I understand @KIRANKJ. As I said, there is no general rule; it will greatly depend on your actual use case. If you think presigned URLs will do the trick, implement the solution like that, although I honestly think that if you have a proper UI that provides upload feedback to the final app users, unless you face some API Gateway limitation, a regular file upload will be fine. As a follow-up to the previous comment, you can use a compression library if necessary as well: see, for instance, [`compress.js`](https://github.com/alextanhongpin/compress.js), and these ... – jccampanero Nov 12 '21 at 12:51
  • [others](https://openbase.com/categories/js/best-nodejs-compression-libraries) as well. – jccampanero Nov 12 '21 at 12:51

I'm not sure what the problem is, but it seems like serverless offline has trouble processing data coming in as multipart/form-data. The easiest solution would be to encode the file as base64 and send the payload as application/json.

To encode the file as base64, you can use this website: https://base64.guru/converter/encode/file. It is trivial to do this locally and programmatically, but the website is cross-platform and good enough for testing.

Request

(screenshot of the Postman request)

Payload:

{
    "fileName": "sampleFile.jpg",
    "myFileData": "R0lGODlhPQBEAPeoAJosM//AwO/AwHVYZ/z595kzAP/s7P+goOXMv8+fhw/v739/f+8PD98fH/8mJl+fn/9ZWb8/PzWlwv///6wWGbImAPgTEMImIN9gUFCEm/gDALULDN8PAD6atYdCTX9gUNKlj8wZAKUsAOzZz+UMAOsJAP/Z2ccMDA8PD/95eX5NWvsJCOVNQPtfX/8zM8+QePLl38MGBr8JCP+zs9myn/8GBqwpAP/GxgwJCPny78lzYLgjAJ8vAP9fX/+MjMUcAN8zM/9wcM8ZGcATEL+QePdZWf/29uc/P9cmJu9MTDImIN+/r7+/vz8/P8VNQGNugV8AAF9fX8swMNgTAFlDOICAgPNSUnNWSMQ5MBAQEJE3QPIGAM9AQMqGcG9vb6MhJsEdGM8vLx8fH98AANIWAMuQeL8fABkTEPPQ0OM5OSYdGFl5jo+Pj/+pqcsTE78wMFNGQLYmID4dGPvd3UBAQJmTkP+8vH9QUK+vr8ZWSHpzcJMmILdwcLOGcHRQUHxwcK9PT9DQ0O/v70w5MLypoG8wKOuwsP/g4P/Q0IcwKEswKMl8aJ9fX2xjdOtGRs/Pz+Dg4GImIP8gIH0sKEAwKKmTiKZ8aB/f39Wsl+LFt8dgUE9PT5x5aHBwcP+AgP+WltdgYMyZfyywz78AAAAAAAD///8AAP9mZv///wAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACH5BAEAAKgALAAAAAA9AEQAAAj/AFEJHEiwoMGDCBMqXMiwocAbBww4nEhxoYkUpzJGrMixogkfGUNqlNixJEIDB0SqHGmyJSojM1bKZOmyop0gM3Oe2liTISKMOoPy7GnwY9CjIYcSRYm0aVKSLmE6nfq05QycVLPuhDrxBlCtYJUqNAq2bNWEBj6ZXRuyxZyDRtqwnXvkhACDV+euTeJm1Ki7A73qNWtFiF+/gA95Gly2CJLDhwEHMOUAAuOpLYDEgBxZ4GRTlC1fDnpkM+fOqD6DDj1aZpITp0dtGCDhr+fVuCu3zlg49ijaokTZTo27uG7Gjn2P+hI8+PDPERoUB318bWbfAJ5sUNFcuGRTYUqV/3ogfXp1rWlMc6awJjiAAd2fm4ogXjz56aypOoIde4OE5u/F9x199dlXnnGiHZWEYbGpsAEA3QXYnHwEFliKAgswgJ8LPeiUXGwedCAKABACCN+EA1pYIIYaFlcDhytd51sGAJbo3onOpajiihlO92KHGaUXGwWjUBChjSPiWJuOO/LYIm4v1tXfE6J4gCSJEZ7YgRYUNrkji9P55sF/ogxw5ZkSqIDaZBV6aSGYq/lGZplndkckZ98xoICbTcIJGQAZcNmdmUc210hs35nCyJ58fgmIKX5RQGOZowxaZwYA+JaoKQwswGijBV4C6SiTUmpphMspJx9unX4KaimjDv9aaXOEBteBqmuuxgEHoLX6Kqx+yXqqBANsgCtit4FWQAEkrNbpq7HSOmtwag5w57GrmlJBASEU18ADjUYb3ADTinIttsgSB1oJFfA63bduimuqKB1keqwUhoCSK374wbujvOSu4QG6UvxBRydcpKsav++Ca6G8A6Pr1x2kVMyHwsVxUALDq/krnrhPSOzXG1lUTIoffqGR7Goi2MAxbv6O2kEG56I7CSlRsEFKF
VyovDJoIRTg7sugNRDGqCJzJgcKE0ywc0ELm6KBCCJo8DIPFeCWNGcyqNFE06ToAfV0HBRgxsvLThHn1oddQMrXj5DyAQgjEHSAJMWZwS3HPxT/QMbabI/iBCliMLEJKX2EEkomBAUCxRi42VDADxyTYDVogV+wSChqmKxEKCDAYFDFj4OmwbY7bDGdBhtrnTQYOigeChUmc1K3QTnAUfEgGFgAWt88hKA6aCRIXhxnQ1yg3BCayK44EWdkUQcBByEQChFXfCB776aQsG0BIlQgQgE8qO26X1h8cEUep8ngRBnOy74E9QgRgEAC8SvOfQkh7FDBDmS43PmGoIiKUUEGkMEC/PJHgxw0xH74yx/3XnaYRJgMB8obxQW6kL9QYEJ0FIFgByfIL7/IQAlvQwEpnAC7DtLNJCKUoO/w45c44GwCXiAFB/OXAATQryUxdN4LfFiwgjCNYg+kYMIEFkCKDs6PKAIJouyGWMS1FSKJOMRB/BoIxYJIUXFUxNwoIkEKPAgCBZSQHQ1A2EWDfDEUVLyADj5AChSIQW6gu10bE/JG2VnCZGfo4R4d0sdQoBAHhPjhIB94v/wRoRKQWGRHgrhGSQJxCS+0pCZbEhAAOw==",
    "fileType": "image/jpeg",
    "productId": 12,
    "isDefault": false,
    "position": 2
}

product-image.js

const s3Functions = require('../services/s3Functions')
const { KEY_S3_PRODUCT_IMAGES_FOLDER } = require('../util/constants');
const { KEY_STATUS, KEY_DATA } = require('../util/constants');

exports.uploadProductImage = (body) => {
    return new Promise((resolve, reject) => {
        s3Functions.uploadFile(
            body.fileName,
            body.myFileData,
            body.fileType,
            KEY_S3_PRODUCT_IMAGES_FOLDER
        ).then(filename => {
            // Note: use a distinct key for the message; the original code repeated
            // [KEY_STATUS], so the status value was overwritten by the message string.
            resolve({ [KEY_STATUS]: 1, message: "Uploaded successfully", [KEY_DATA]: filename });
        }).catch(error => {
            reject({ [KEY_STATUS]: 0, message: "Upload Failed", [KEY_DATA]: error });
        });
    })
}

s3Functions.js

const AWS = require('aws-sdk');

const s3 = new AWS.S3({
    params: { Bucket: 'rename-me', ACL:'public-read' },
});

exports.uploadFile = (
    filename,
    data,
    contentType,
    folderName
) => {
    console.log('Uploading')
    return new Promise((resolve, reject) => {
        const params = {
            Key: folderName+'/'+filename,
            Body: Buffer.from(data, 'base64'),
            ContentEncoding: 'base64',
            ContentType: contentType,
        };
        s3.upload(params, function(s3Err, data) {
            if (s3Err) {
                console.error(s3Err);
                return reject(s3Err); // return early so data.Location is not read on error
            }
            console.log(`File uploaded successfully at ${data.Location}`)
            resolve(`${data.Location}`)
        });
    });
}

Here's the codebase you provided on the other thread, with these changes applied. I haven't cleaned it up, but I'm sharing it in case you run into any issues.

dhruvit-r
  • If I am trying to upload an image file of more than 10MB, then myFileData will be huge and is there any chance to interrupt the upload? – KIRAN K J Nov 11 '21 at 12:10
  • @KIRANKJ Right, Base64 increases the size of the data by about 30%. Not ideal, but it does get the job done. In my experience, forms are a pain to deal with except when you are submitting from an actual HTML form. Don't use Base64 if your users are expected to have very slow connections or if the file sizes are huge (greater than 25MB is my cut-off). Apart from that, ensure that there's proper error-handling in the FE. – dhruvit-r Nov 11 '21 at 13:34
  • @KIRANKJ I don't understand your second question. Are you asking if the upload is interrupt-able? Or are you asking if this would cause interruptions? – dhruvit-r Nov 11 '21 at 13:35
  • I mean, because of the large form data size, will it crash? Because I just uploaded a 1 MB file with the code shared. It took 8 minutes to upload, and afterwards the file size had decreased to 750 KB, with the bottom part (nearly 25%) of the image greyed out. It seems this is not a reliable way. Have we tried multipart upload? If not, would it resolve this? Also, can the entire problem be solved by the presigned URL method? – KIRAN K J Nov 11 '21 at 13:40
  • Hmm, I just uploaded a 3MB file in 11.90 seconds over a shared 10MBps connection. If you are working on a local (serverless offline) instance, could you try deploying it to an actual Lambda to rule out any issues caused by your machine? Also wanted to mention that I've seen this method deployed in production systems, so I don't think it is unreliable. Of course, if the pre-signed URL method works for you, then this is moot. It is a good option with a couple of drawbacks: increased development complexity and multiple requests. – dhruvit-r Nov 11 '21 at 18:05
  • Might be an sls offline issue; I need to deploy and test. I still have doubts about the size: in my case I might have files of up to 50 MB. Which method should I prefer, base64 or signed URL? Do you know whether POST request form data has any limits? Will it exponentially increase Lambda execution time and affect the bill? – KIRAN K J Nov 12 '21 at 00:54
  • @KIRANKJ None of the methods will increase the time exponentially, but the API Gateway has a 10MB limit on the size of the payload (source: https://docs.aws.amazon.com/apigateway/latest/developerguide/limits.html#http-api-quotas). Go for the pre-signed URL method if you need support for more than that. – dhruvit-r Nov 12 '21 at 04:14

You need to send Body as a Buffer or ReadableStream, so read request.files.image into a Buffer before sending it to the S3 API. Also, add the content type to the request headers.
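One cheap way to confirm the payload really is raw binary before handing it to S3 (a point also raised in the comments above) is to check the file's magic bytes; this helper is a sketch of that idea, not part of the answer's stack:

```javascript
// Sketch: JPEG files start with the bytes 0xFF 0xD8. If the uploaded buffer
// starts with anything else (for example the text "/9j/", which is the
// base64 form of those bytes), the data was re-encoded somewhere upstream.
function looksLikeJpeg(buf) {
  return Buffer.isBuffer(buf) && buf.length >= 2 && buf[0] === 0xff && buf[1] === 0xd8;
}
```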

I'm not using the same stack as you, but this is my working code.

CURL testing command

curl -X 'PUT' \
 'http://localhost/api' \
 -H 'accept: */*' \
 -H 'Content-Type: multipart/form-data' \
 -F 'image=@/Users/some/dir/userprofilesample1.jpg;type=image/jpeg'

S3 API call (note that I'm not specifying ContentType):

.upload({
        Body: content, // Buffer | ReadableStream
        Bucket: bucketName,
        Key: path,
        ACL:  'public-read',
      })

https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html

David Rearte

Hi, I would suggest using the binary option in Postman while uploading the picture, instead of multipart, and then trying the download. That should solve your issue. See the Postman example snapshot here.

If you want to use multipart upload for images on S3, you'll have to create a presigned URL for each part; only then can you use PUT to upload your file.

Here is a reference article where you can find a multipart implementation:

https://dev.to/traindex/multipart-upload-for-large-files-using-pre-signed-urls-aws-4hg4

  • This is good information but you should quote the most relevant content from that article as part of your answer. – Besworks Apr 05 '22 at 19:17