I came up with this one-liner to parse the JSON output of AWS CLI commands and use the values. It does work, though somehow it feels wrong :) The only material I could find was about converting known "key":"value" pairs; mine seemingly handles any of them. I'd appreciate any feedback.
rm -f tmp.file
cmd="aws sns list-subscriptions-by-topic --topic-arn arn:aws:sns:eu-west-1:${account}:${topic}"
#this is the command that produces JSON
eval "${cmd}" | jq . | tr -d '{}[], ' | grep -v '^$' | while read -r LINE; do
# The above line strips JSON-specific characters and whitespace, then feeds what is left line by line into the loop
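To illustrate what the stripping stage produces, here is a hypothetical pretty-printed line (the field and value are made up, but match the shape `jq .` emits) run through the same character deletion:

```shell
#!/bin/bash
# A made-up line as jq would pretty-print it inside "Subscriptions"
line='      "Protocol": "email",'
# Delete braces, brackets, commas and spaces in one pass,
# equivalent to the chain of tr -d calls in the pipeline
echo "$line" | tr -d '{}[], '
# -> "Protocol":"email"
```

So every surviving line is a bare `"key":value` pair, which is what the `cut` calls below rely on.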
varname=$(echo "${LINE}" | cut -d ":" -f1 | tr -d '"')
#key is whatever comes before ":"
value=$(echo "${LINE}" | cut -d ":" -f2-)
#value is whatever comes after ":"
if [ -n "${value}" ]; then
echo "${varname}+=(${value})" >> tmp.file
#we create a temp file that adds values to an array named by key elements
fi
done
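To make the array-building step concrete: each generated line has the form `key+=(value)`, so sourcing the file appends every value to a bash array named after its key. A minimal sketch with fabricated values:

```shell
#!/bin/bash
# Simulate two generated lines for the same key (values made up)
echo 'Protocol+=("email")' >  tmp.file
echo 'Protocol+=("sqs")'   >> tmp.file

# Sourcing runs the lines in the current shell, building the array
source ./tmp.file && rm -f tmp.file

echo "${#Protocol[@]}"   # number of collected values -> 2
echo "${Protocol[1]}"    # -> sqs
```

This is also why the temp file is needed at all: the `while` loop runs in a subshell (it is on the right side of a pipe), so arrays built directly inside it would vanish when the loop ends.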
#here we source the tmp.file and then remove it
source tmp.file && rm -f tmp.file
This does what I expect it to do; the question is: does it fit all possible shapes of AWS JSON output, or will I be stuck debugging it forever in the real world? :)
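For comparison, jq can extract values directly, so the character stripping and the temp file become unnecessary. A sketch, assuming the usual `list-subscriptions-by-topic` output shape (the sample JSON and ARN here are fabricated stand-ins for the real AWS response; `mapfile` needs bash 4+):

```shell
#!/bin/bash
# Fabricated sample standing in for the aws CLI output
json='{"Subscriptions":[{"Protocol":"email","Endpoint":"a@example.com"},
                        {"Protocol":"sqs","Endpoint":"arn:aws:sqs:eu-west-1:123456789012:q"}]}'

# -r prints raw strings (no quotes), one per line;
# mapfile -t collects them into a bash array, one element per line
mapfile -t Endpoint < <(echo "$json" | jq -r '.Subscriptions[].Endpoint')

echo "${Endpoint[0]}"    # first endpoint from the sample
```

Because jq walks the actual JSON structure, this stays correct for values that contain spaces, colons, or brackets, which the `tr`/`cut` approach would mangle.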