I am working on CentOS with the bash shell.
I don't see anything wrong in the code, but it keeps printing an "arguments list too long"
error.
It does not happen only with aws cp:
cat /dev/null > $FILE_DIR/dailey_member.csv
also prints the same message.
echo " ===== dailey vote by member"
cat /dev/null > $FILE_DIR/dailey_member.csv
QUERY_DAILEY_MEMBER_RS=`mysql -h $MYSQL_URL -u$MYSQL_USER -p$MYSQL_PW $MYSQL_SCHEMA -e "SELECT voteDate, usrId, count(usrId) FROM tbl_vote GROUP BY voteDate, usrId;"`
IFS=$'\n' LINES=($QUERY_DAILEY_MEMBER_RS)
for i in "${!LINES[@]}"; do
LINE_IDX=$i
LINE=${LINES[$LINE_IDX]}
LINE=${LINE//$'\t'/,}
echo -e $LINE >> $FILE_DIR/dailey_member.csv
done
echo "aws s3 cp $FILE_DIR/dailey_member.csv " $S3_PATH # copy output works.
aws s3 cp "$FILE_DIR/dailey_member.csv" $S3_PATH
exit 0
How can I fix this?
Thanks.
FYI, what I am trying to do is:
- run a MySQL query
- parse the result and write it to a CSV file
- upload it to an S3 bucket, so people can download the file
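The three steps can be sketched as a single pipeline (a sketch only; the printf sample data stands in for the real mysql output, and tr replaces the per-line loop):

```shell
# Convert tab-separated rows (as mysql -e prints them) to CSV in one pass.
# The printf sample here stands in for the real query output.
printf '20180129\tuser01\t1\n20180129\tuser02\t3\n' | tr '\t' ','
```

Streaming like this also avoids holding the whole result set in a shell variable.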
============== edited #1 ================
I set my variables at the top of the file:
export FILE_DIR="/home/jyoh/applications18/cron/file"
export S3_PATH="s3://juneyoung/applications18/voteLogs/"
Console output:
[jyoh@ip-XX-XX-X-XXX cron]$ sh ./daileyVoteByMember.sh
===== dailey vote by member
aws s3 cp /home/jyoh/applications18/cron/file/dailey_member.csv s3://juneyoung/applications18/voteLogs/
./daileyVoteByMember.sh: line 27: /usr/bin/aws: arguments list too long
[jyoh@ip-XX-XX-X-XXX cron]$ aws s3 cp /home/jyoh/applications18/cron/file/dailey_member.csv s3://juneyoung/applications18/voteLogs/
upload: file/dailey_member.csv to s3://juneyoung/applications18/voteLogs/dailey_member.csv
Line 27 is the line with aws s3 cp "$FILE_DIR/dailey_member.csv" $S3_PATH.
============== edited #2 ================
Result of running the script with the -x option:
...SKIP...
+ LINE_IDX=265782
+ LINE='20180129 9>2387463256785462364 1'
+ LINE='20180129,9>2387463256785462364,1'
+ echo -e '20180129,9>2387463256785462364,1'
+ for i in '"${!LINES[@]}"'
+ LINE_IDX=265783
+ LINE='20180129 9>3658743656387567834 1'
+ LINE='20180129,9>3658743656387567834,1'
+ echo -e '20180129,9>3658743656387567834,1'
+ echo 'aws s3 cp /home/jyoh/applications18/cron/file/dailey_member.csv s3://juneyoung/applications18/voteLogs/'
+ aws s3 cp /home/jyoh/applications18/cron/file/dailey_member.csv s3://juneyoung/applications18/voteLogs/
./daileyVoteByMember.sh: line 27: /usr/bin/aws: arguments list too long
+ exit 0
============== edited #3 ================
I copied the aws command into a new bash file and executed it.
The file contents:
#! /bin/bash
aws s3 cp /home/jyoh/applications18/cron/file/dailey_member.csv s3://juneyoung/applications18/voteLogs/
exit 0
Console output:
[jyoh@ip-XX-XX-X-XXX cron]$ sh testSh.sh
upload: file/dailey_member.csv to s3://juneyoung/applications18/voteLogs/
============== edited #4 ================
This works fine and clean. Maybe the problem is with the for loop
and the >>
file redirection?
function sendS3 () {
echo "S3 send command : aws s3 cp $1 $S3_PATH"
aws s3 cp $1 $S3_PATH
}
echo " ===== daily vote"
mysql -h "$MYSQL_URL" -u"$MYSQL_USER" -p"$MYSQL_PW" "$MYSQL_SCHEMA" -e "SELECT QUERY #1;" | sed $'s/\t/,/g' > "$FILE_DIR"/daily.csv
sendS3 "$FILE_DIR"/daily.csv
echo " ===== dailey vote by member"
mysql -h "$MYSQL_URL" -u"$MYSQL_USER" -p"$MYSQL_PW" "$MYSQL_SCHEMA" -e "SELECT QUERY #2;" | sed $'s/\t/,/g' > "$FILE_DIR"/daily_member.csv
sendS3 "$FILE_DIR"/dailey_member.csv
echo " ===== daily vote by ip"
mysql -h "$MYSQL_URL" -u"$MYSQL_USER" -p"$MYSQL_PW" "$MYSQL_SCHEMA" -e "SELECT QUERY #3" | sed $'s/\t/,/g' > "$FILE_DIR"/daily_ip.csv
sendS3 "$FILE_DIR"/daily_ip.csv
echo " ===== daily vote by member,item"
mysql -h "$MYSQL_URL" -u"$MYSQL_USER" -p"$MYSQL_PW" "$MYSQL_SCHEMA" -e "SELECT QUERY #4" | sed $'s/\t/,/g' > "$FILE_DIR"/daily_item.csv
sendS3 "$FILE_DIR"/daily_item.csv
exit 0
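As a side note, a quoting-hardened variant of sendS3 (a sketch; the behavior is otherwise identical) keeps a path containing spaces together as one argument:

```shell
# Sketch: the same sendS3, but with "$1" and "$S3_PATH" quoted so a path
# containing spaces is passed to aws as a single argument.
sendS3 () {
  echo "S3 send command : aws s3 cp $1 $S3_PATH"
  aws s3 cp "$1" "$S3_PATH"
}
```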
============== edited #5 ================
This is the full script. I replaced the URLs and some variable values for security.
#! /bin/bash
export SLACK_URL="SLACK_API_URL"
export MYSQL_URL="MYSQL_CONNECTION_URL"
export MYSQL_USER="MYSQL_USER"
export MYSQL_PW="MYSQL_PW"
export MYSQL_SCHEMA="USER"
export FILE_DIR="FILE_PATH"
export QUERY_DAILY_RS=""
export QUERY_DAILY_MEMBER_RS=""
export QUERY_DAILY_IP_RS=""
export QUERY_DAILY_MEMBER_ITEM_RS=""
export S3_PATH="S3_URL_TO_STORE"
echo -e "Sending vote report to Slack"
echo $MYSQL_URL
echo $MYSQL_USER::$MYSQL_PW
function sendS3 () {
echo "S3 send command : aws s3 cp $1 $S3_PATH"
aws s3 cp $1 $S3_PATH
}
echo " ===== daily vote"
cat /dev/null > $FILE_DIR/daily.csv
QUERY_DAILY_RS=`mysql SELECT QUERY 1;`
IFS=$'\n' LINES=($QUERY_DAILY_RS)
for i in "${!LINES[@]}"; do
LINE_IDX=$i
LINE=${LINES[$LINE_IDX]}
LINE=${LINE//$'\t'/,}
echo -e $LINE >> "$FILE_DIR"/daily.csv
done
sendS3 "$FILE_DIR/daily.csv"
echo " ===== daily vote by member"
cat /dev/null > $FILE_DIR/daily_member.csv
QUERY_DAILY_MEMBER_RS=`mysql SELECT QUERY 2`
IFS=$'\n' LINES=($QUERY_DAILY_MEMBER_RS)
for i in "${!LINES[@]}"; do
LINE_IDX=$i
LINE=${LINES[$LINE_IDX]}
LINE=${LINE//$'\t'/,}
echo -e $LINE >> $FILE_DIR/daily_member.csv
done
sendS3 "$FILE_DIR/daily_member.csv"
echo " ===== daily vote by ip"
cat /dev/null > $FILE_DIR/daily_ip.csv
QUERY_DAILY_IP_RS=`mysql SELECT QUERY 3`
IFS=$'\n' LINES=($QUERY_DAILY_IP_RS)
for i in "${!LINES[@]}"; do
LINE_IDX=$i
LINE=${LINES[$LINE_IDX]}
LINE=${LINE//$'\t'/,}
echo -e $LINE >> $FILE_DIR/daily_ip.csv
done
sendS3 "$FILE_DIR/daily_ip.csv"
echo " ===== daily vote by member,item"
cat /dev/null > $FILE_DIR/daily_member_item.csv
QUERY_DAILY_MEMBER_ITEM_RS=`mysql SELECT QUERY 4`
IFS=$'\n' LINES=($QUERY_DAILY_MEMBER_ITEM_RS)
for i in "${!LINES[@]}"; do
LINE_IDX=$i
LINE=${LINES[$LINE_IDX]}
LINE=${LINE//$'\t'/,}
echo -e $LINE >> $FILE_DIR/daily_member_item.csv
done
sendS3 "$FILE_DIR/daily_member_item.csv"
curl -X POST -H 'Content-type: application/json' --data '{"text":"Vote data collecting done"}' $SLACK_URL
exit 0
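For reference, my guess (an assumption, not confirmed) is that the exported QUERY_*_RS variables are the problem: each one holds the full query result (hundreds of thousands of lines) and is exported, so it rides along in the environment of every child process, and execve then fails with E2BIG ("argument list too long") for any external command. That would also explain why cat /dev/null fails. One oversized exported variable reproduces the symptom:

```shell
# Sketch: a single ~3 MB exported variable is enough to make every external
# command fail with "argument list too long" on a typical Linux system,
# because the environment is counted against the kernel's exec limits.
export BIG=$(head -c 3000000 /dev/zero | tr '\0' 'x')
/bin/true 2>/dev/null || echo "exec fails while BIG is exported"
unset BIG
/bin/true 2>/dev/null && echo "exec works again after unset"
```

The edited #4 script avoids this by never storing the query results in (exported) variables at all.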