
I have a simple bash script to delete a list of files in s3 that looks like this:

```bash
#!/usr/bin/bash
filename="$1"   # input file: one S3 object name per line
s3folder="$2"   # subfolder under data/
while read -r line; do
    file="$line"
    aws s3 rm "s3://bucketname/data/$s3folder/$file"
done < "$filename"
```

When I run this, I see the stdout of the s3 delete command (for example: `delete: s3://s3bucketname/data/foldername/file.json`), so I know my s3 command is good and the data from my variables and input file are landing properly. There are no errors, but my files are still there after the script has finished.

If I run the individual aws s3 rm commands manually on the command line, they work.

I've tried every variant I could find of sourcing the bash profile, running the command as the user with S3 privileges, and so on.

I've also tried the `--recursive --exclude --include` variants of the aws s3 rm command, but those give the same result:

```bash
for file in $(cat "$filename"); do
    aws s3 rm "s3://s3bucketname/data/$s3folder/" --recursive --exclude "*" --include "$file"
done
```

If I run the aws s3 rm command in a script with a single file name as a command-line argument, that also works, so I guess it's something about the loop, but I'm not getting it.

  • Check your input file for [carriage return](https://stackoverflow.com/questions/39527571/are-shell-scripts-sensitive-to-encoding-and-line-endings) characters (see the checks below). – that other guy Nov 08 '20 at 21:33
  • Yup, that was it. I resaved the input file in Notepad++ after changing the EOL characters to Linux, pushed the file over, re-ran the script, and the files are deleted. Thanks @thatotherguy! – cliftonf Nov 08 '20 at 22:01
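
For anyone hitting the same symptom, here are a few quick ways to check an input file for carriage returns. This is a sketch; `files.txt` stands in for the actual list file, and `cat -A` assumes GNU coreutils:

```bash
# Show non-printing characters; CRLF endings appear as ^M$ at the end of each line
cat -A files.txt

# Count the lines that contain a carriage return
grep -c $'\r' files.txt

# 'file' also reports CRLF line terminators for text files
file files.txt
```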

1 Answer


The problem turned out to be Windows EOL characters in the input file. After converting them to Linux EOL characters, the aws s3 rm commands work as expected. Thanks to that other guy for the quick answer after my hours spent troubleshooting this!
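
For future readers, the loop can also be made tolerant of CRLF input by stripping a trailing carriage return from each line. A minimal sketch based on the script in the question (the bucket name and path are the question's placeholders):

```bash
#!/usr/bin/bash
filename="$1"
s3folder="$2"
while read -r line; do
    # Drop a trailing \r so Windows (CRLF) input files still work
    file="${line%$'\r'}"
    aws s3 rm "s3://bucketname/data/$s3folder/$file"
done < "$filename"
```

Alternatively, convert the file once before running the script, e.g. `dos2unix "$filename"` or `tr -d '\r' < files.txt > files-unix.txt`.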
