I am trying to create a script that finds specific log files in different folders and compresses them with tar. This is how I am doing it:
find /tmp/logs -type f -name '*.log*' -mtime -1 -print0 | tar -czf $target_folder1/$current_date.tar.gz --null -T -
find /home/second_logs -type f -name '*.log*' -mtime -1 -print0 | tar -czf $target_folder2/$current_date.tar.gz --null -T -
This works great: it finds all .log files under /tmp/logs and /home/second_logs that were modified within the last day and compresses them. However, when a process is writing to any log file under /tmp/logs, tar gives me a warning and then the script exits without running the second find/tar command:
tar: Removing leading `/' from member names
tar: /tmp/logs/somelog.log: file changed as we read it
So I get the first compressed tar file, but the second command never runs. How can I make sure both find/tar commands are executed in my bash script?
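I suspect this has to do with the -e in my shebang combined with tar's exit status: as far as I can tell, GNU tar exits with status 1 when it prints the "file changed as we read it" warning. A minimal sketch of that mechanism (not my real script, just an illustration):

#!/bin/bash -xe
# Under -e, the first pipeline whose last command exits non-zero aborts the
# whole script.
true | false          # the pipeline exits with status 1, like tar's warning
echo "never reached"  # stands in for my second find/tar line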
Full script:
#!/bin/bash -xe
serverhost=$(hostname -s)
current_date=$(date +"%Y-%m-%d")
target_folder=/tmp/pull-logs
app1_logs=/tmp/logs/app1
app2_logs=/tmp/logs/app2
tomcat_logs=/tmp/logs/tomcat
find $app1_logs -type f -name '*.log*' -mtime -1 -print0 | tar -czf $target_folder/$serverhost-app1-$current_date.tar.gz --null -T -
find $app2_logs -type f -name '*.log*' -mtime -1 -print0 | tar -czf $target_folder/$serverhost-app2-$current_date.tar.gz --null -T -
find $tomcat_logs -type f -name '*.log*' -mtime -1 -print0 | tar -czf $target_folder/$serverhost-tomcat-$current_date.tar.gz --null -T -
#aws s3 cp $target_folder s3://my/s3 --recursive
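To make it clearer what I am after, here is a rough sketch of the behaviour I would like. This is only one idea, not something I have settled on; it assumes GNU tar (where exit status 1 means "some files differ" and 2 means a fatal error), and the archive_logs helper name is just for illustration:

#!/bin/bash -xe
# Hypothetical helper: run one find/tar pair and treat tar exit status 1
# ("some files differ", e.g. a log changed while it was being read) as
# non-fatal, so the next archive is still attempted even with -e.
# A real tar failure (status 2) would still stop the script.
archive_logs() {
    local src_dir=$1 archive=$2
    find "$src_dir" -type f -name '*.log*' -mtime -1 -print0 |
        tar -czf "$archive" --null -T - || [ "$?" -eq 1 ]
}

archive_logs "$app1_logs"   "$target_folder/$serverhost-app1-$current_date.tar.gz"
archive_logs "$app2_logs"   "$target_folder/$serverhost-app2-$current_date.tar.gz"
archive_logs "$tomcat_logs" "$target_folder/$serverhost-tomcat-$current_date.tar.gz"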
Thank you!