3

I have a website that allows users to search for music tracks and download the ones they select as MP3s.

I have the site on my server and all of the MP3s on S3, distributed via CloudFront. So far so good.

The client now wishes for users to be able to select a number of music tracks and then download them all in bulk, as a batch, instead of one at a time.

Usually I would place all the files in a zip and then present the user a link to that new zip file to download. In this case, as the files are on S3, that would require me to first copy all the files from S3 to my web server, process them into a zip, and then serve the download from my server.
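Roughly, the copy-then-zip approach I mean would look something like this. This is just a sketch assuming boto3 on the server; build_zip, fetch_and_zip, and the bucket/key names are made up for illustration:

```python
import io
import zipfile

def build_zip(files):
    """Bundle (arcname, bytes) pairs into an in-memory zip archive."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for arcname, data in files:
            zf.writestr(arcname, data)
    return buf.getvalue()

def fetch_and_zip(bucket, keys, s3_client):
    # Pull each selected track from S3 (one GET per object),
    # then bundle everything into a single zip for the user.
    files = []
    for key in keys:
        obj = s3_client.get_object(Bucket=bucket, Key=key)
        files.append((key.rsplit("/", 1)[-1], obj["Body"].read()))
    return build_zip(files)
```

The concern is exactly that middle step: every selected track has to pass through my server before the zip exists.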

Is there any way I can create a zip on S3 or CloudFront, or is there some way to batch/group files into a zip?

Maybe I could set up an EC2 instance to handle this?

I would greatly appreciate some direction.

Best

Joe

arkleyjoe
  • I am facing exactly the same scenario, please let me know how you handled this scenario? thanks... – Siva Jun 22 '12 at 06:57
  • multi-threads is the answer http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/transfer/TransferManager.html – Nicholas DiPiazza Jun 01 '16 at 18:24

3 Answers

1

I am facing the exact same problem. So far the only thing I was able to find is the AWS CLI's s3 sync command:

https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html

In my case, I am using Rails + its Paperclip addon which means that I have no way to easily download all of the user's images in one go, because the files are scattered in a lot of subdirectories.

However, if you can group your user's files in a better way, say like this:

/users/<ID>/images/...
/users/<ID>/songs/...

...etc., then you can solve your problem right away with:

aws s3 sync s3://<your_bucket_name>/users/<user_id>/songs /cache/<user_id>

Do bear in mind you'll have to give your server the proper credentials so the S3 CLI tools can run without interactive authentication.

And that should sort you.
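If you need to hand the user a single file after the sync, a small wrapper can run the sync and then zip the cache folder. This is only a sketch assuming the AWS CLI is installed and credentialed on the server; sync_and_zip is a made-up name and the bucket/path layout follows the /users/&lt;ID&gt;/songs convention above:

```python
import pathlib
import shutil
import subprocess

def sync_and_zip(bucket, user_id, cache_root="/cache"):
    """Mirror a user's songs from S3 locally, then zip the folder.

    Assumes `aws s3 sync` works non-interactively on this server.
    """
    dest = pathlib.Path(cache_root) / str(user_id)
    subprocess.run(
        ["aws", "s3", "sync", f"s3://{bucket}/users/{user_id}/songs", str(dest)],
        check=True,
    )
    # shutil.make_archive appends ".zip" to the base name for us,
    # producing /cache/<user_id>.zip next to the synced folder.
    return shutil.make_archive(str(dest), "zip", root_dir=dest)
```

A nice side effect of sync (versus a plain copy) is that repeat requests for the same user only pull down new or changed tracks.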

Additional discussion here: Downloading an entire S3 bucket?

dimitarvp
1

I am afraid you won't be able to create the batches without additional processing. Firing up an EC2 instance might be an option to create a batch per user.

cloudberryman
0

S3 serves each object through a single HTTP request.

So the answer is to use multiple threads to achieve the same thing.

Java API: use TransferManager

http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/transfer/TransferManager.html

You can get great performance with multiple threads.
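In Python the same multi-threaded pattern looks roughly like this. It is a sketch, not an AWS API: download_all is a made-up name, and the fetch callable stands in for whatever your SDK's get-object call is:

```python
from concurrent.futures import ThreadPoolExecutor

def download_all(keys, fetch, max_workers=8):
    """Fetch many S3 objects concurrently.

    `fetch` is any callable taking a key and returning the object's
    bytes. S3 has no bulk GET, so running several single-object
    requests in parallel is how you overlap the per-request latency.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves the order of `keys`, so zip pairs
        # each key with its downloaded payload.
        return dict(zip(keys, pool.map(fetch, keys)))
```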

There is no bulk download, sorry.

Nicholas DiPiazza