
I have a .txt file listing folders from an S3 bucket that I want to download; the folders sit in different directories. I can use the cp command to download a single folder, but I am not sure how to run a CLI command that takes the .txt file and downloads the different folders from their different directories. Any guidance would be highly appreciated.

Update: My directory on S3 looks like this, where I want to download folders A1, A2, B5, B6, C9, and C11. I have a .txt file with the list of folders.

s3://storage-folder/Folder A/A1
s3://storage-folder/Folder A/A2
s3://storage-folder/Folder B/B5
s3://storage-folder/Folder B/B6
s3://storage-folder/Folder C/C9
s3://storage-folder/Folder C/C11

I want to get them locally on my machine as:

Folder A/A1
Folder A/A2
Folder B/B5
Folder B/B6
Folder C/C9
Folder C/C11

For only one folder, I am using the following cp command

aws s3 cp "s3://storage-folder/Folder A/A1" "./Folder A/A1/" --recursive
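
For clarity, the .txt file lists these folders as full S3 URIs, one per line:

s3://storage-folder/Folder A/A1
s3://storage-folder/Folder A/A2
s3://storage-folder/Folder B/B5
s3://storage-folder/Folder B/B6
s3://storage-folder/Folder C/C9
s3://storage-folder/Folder C/C11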
Udhy

1 Answer


Here is what your bash script could look like:

# Read file.txt line by line (the || handles a last line with no trailing newline).
while IFS="" read -r line || [ -n "$line" ]
do
    # Drop the "s3://bucket-name/" prefix to build the local path,
    # e.g. "s3://storage-folder/Folder A/A1" -> "./Folder A/A1"
    local_path="./$(echo "$line" | cut -d '/' -f4-)"
    # Quote the variables but do not embed literal escaped quotes,
    # or aws will look for paths that start and end with a " character.
    aws s3 cp "$line" "$local_path" --recursive
done < file.txt

While loop explained here: Looping through the content of a file in Bash
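
For example, for the first line of file.txt the loop runs the equivalent of:

aws s3 cp "s3://storage-folder/Folder A/A1" "./Folder A/A1" --recursive

which is the same command you already use for a single folder.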


Also, I strongly recommend not using spaces in your paths; that will help you avoid some nasty quote escaping.
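
A minimal sketch of what goes wrong (hypothetical path): an unquoted variable is word-split on the space, so aws receives two arguments instead of one path:

local_path="./Folder A/A1"
# Unquoted: word splitting turns the path into two arguments, "./Folder" and "A/A1"
aws s3 cp "s3://storage-folder/Folder A/A1" $local_path --recursive
# Quoted: aws receives the single path "./Folder A/A1"
aws s3 cp "s3://storage-folder/Folder A/A1" "$local_path" --recursive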

Dawid Fieluba