I am working on an Angular / JavaScript app. Is there a way to compress (zip) and download multiple files from an S3 bucket?
OR
Copy the selected files to a dynamic folder and then download that folder as a zip file from the S3 bucket?
One addition (unable to comment, too n00b on SO): you can now mount an EFS file system that your Lambda functions can access, which helps with extremely large files (low-latency file access without holding everything in the Lambda's memory).
Caveat: the EFS mount is shared among all Lambda functions you attach it to (i.e. two Lambda functions with the same EFS mounted have simultaneous access to the same files). This could be a problem... or it's a great way to maintain state between Lambda functions. Perspective, I suppose.
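A rough sketch of attaching EFS to a function with the AWS CLI, assuming the EFS access point already exists and the function runs in a VPC that can reach it (the function name, access-point ARN, and mount path below are placeholders, not values from the question):

```shell
# Hypothetical helper: attach an existing EFS access point to a Lambda
# function. $1 = function name, $2 = EFS access point ARN (placeholders).
# The mount path must live under /mnt/.
attach_efs() {
  aws lambda update-function-configuration \
    --function-name "$1" \
    --file-system-configs "Arn=$2,LocalMountPath=/mnt/shared"
}
```

Any function configured this way sees the same files under `/mnt/shared`, which is exactly the shared-state caveat described above.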
Since S3 is just an object storage service, it won't zip your "folders" on the fly. You can either use the `sync` command of the AWS CLI to download all files whose object keys start with a given prefix, or use another AWS service such as Lambda, EC2, or Fargate to zip those "folders" before downloading.
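The first option can be sketched as a small shell helper: sync everything under a key prefix to a temporary directory, then zip it locally with the Info-ZIP `zip` tool. The bucket and prefix are whatever you pass in; nothing here is specific to the asker's setup:

```shell
# Hypothetical helper: download every object under a key prefix and
# produce a local zip. $1 = bucket, $2 = key prefix, $3 = output zip path.
download_prefix_as_zip() {
  local bucket="$1" prefix="$2" out="$3" tmp
  tmp=$(mktemp -d)
  # Pull all objects whose keys start with the prefix.
  aws s3 sync "s3://${bucket}/${prefix}" "$tmp"
  # Zip the downloaded tree, writing the archive to stdout ("-"),
  # then redirect it to the requested output file.
  (cd "$tmp" && zip -qr - .) > "$out"
  rm -rf "$tmp"
}
```

This runs wherever the AWS CLI is configured; to hand the zip to an Angular client instead, the same two steps (list-and-fetch by prefix, then archive) would run in a Lambda/EC2/Fargate backend that streams the result back.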