
I am working on an Angular / JavaScript App:

Is there a way to compress (zip) and download multiple files from an S3 bucket?

OR

Copy the selected files to a dynamic folder and download that folder as a zip file from an S3 bucket.

Gautham Raj
  • https://stackoverflow.com/questions/43275575/how-to-zip-files-in-amazon-s3-bucket-and-get-its-url#43281552 – Gh05d Jun 01 '20 at 13:05
  • Does this answer your question? [Amazon S3 console: download multiple files at once](https://stackoverflow.com/questions/41764836/amazon-s3-console-download-multiple-files-at-once) – avocadoLambda Jun 01 '20 at 13:06
  • Does this answer your question? [How to zip files in Amazon s3 Bucket and get its URL](https://stackoverflow.com/questions/43275575/how-to-zip-files-in-amazon-s3-bucket-and-get-its-url) – Jan Jun 01 '20 at 18:43
  • This is what you want https://stackoverflow.com/a/73081269/10447654 – javrd Aug 02 '22 at 11:23

2 Answers


One addition (unable to comment, too n00b on SO): you can now mount an EFS file system that your Lambda functions can access, which helps with extremely large files (very low-latency access).

Caveat: the EFS mount is shared among all Lambda functions you attach it to (i.e. two Lambda functions with the same EFS mounted have simultaneous access to the same files). That could be a problem... or a great way to maintain state between Lambda functions. A matter of perspective, I suppose.
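As a rough sketch of that idea, assuming the Lambda has an EFS access point configured at some local mount path: the function can stream objects into a zip file on the mount instead of holding everything in RAM. The `mount_point`, `fetch_object` callback, and key names below are hypothetical stand-ins (a real handler would wrap boto3's `get_object` and use the configured mount path such as `/mnt/efs`):

```python
import os
import tempfile
import zipfile

def zip_to_efs(keys, fetch_object, mount_point, name="bundle.zip"):
    """Write the named objects into a zip file under mount_point.

    In a real Lambda, mount_point would be the configured EFS local
    mount path (e.g. /mnt/efs), and fetch_object(key) -> bytes would
    wrap an S3 read such as s3.get_object(...)["Body"].read(). Both
    are stand-ins here.
    """
    out_path = os.path.join(mount_point, name)
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as archive:
        for key in keys:
            # writestr keeps the sketch simple; for very large objects
            # you could stream chunks via archive.open(key, "w") instead.
            archive.writestr(key, fetch_object(key))
    return out_path

# Demo with a local temp directory standing in for the EFS mount.
fake_store = {"logs/a.log": b"alpha\n", "logs/b.log": b"beta\n"}
mount = tempfile.mkdtemp()
zip_path = zip_to_efs(sorted(fake_store), fake_store.__getitem__, mount)
```

Because the archive lives on EFS rather than in the function's memory, the 10 GB Lambda memory cap stops being the limiting factor for archive size.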

Justin Ito

S3 is just an object storage service, so it won't zip your "folders" on the fly. You can either use the sync command of the AWS CLI to download all objects whose keys start with a given prefix, or use another AWS service such as Lambda, EC2, or Fargate to zip those "folders" before downloading.
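A minimal sketch of the Lambda route, with the S3 read hedged out: zip the selected objects in memory with Python's stdlib zipfile, then (in a real function) upload the archive back to S3 and return a presigned URL for the Angular app to download. The `fetch_object` callback and key names are hypothetical stand-ins for boto3 calls such as `s3.get_object(Bucket=..., Key=key)["Body"].read()`:

```python
import io
import zipfile

def zip_objects(keys, fetch_object):
    """Zip the named objects into an in-memory archive and return its bytes.

    fetch_object(key) -> bytes is a stand-in for an S3 read; in a real
    Lambda it would wrap boto3's get_object call.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as archive:
        for key in keys:
            # Each object becomes one entry in the archive, named by its key.
            archive.writestr(key, fetch_object(key))
    return buf.getvalue()

# Demo with a dict standing in for the bucket.
fake_bucket = {"reports/a.csv": b"1,2,3\n", "reports/b.csv": b"4,5,6\n"}
archive_bytes = zip_objects(sorted(fake_bucket), fake_bucket.__getitem__)
```

This buffers everything in memory, so it only suits archives that fit within the Lambda's memory limit; for larger sets, write to disk (or an EFS mount) instead.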

jellycsc