In jq, how can I convert a JSON object to a string of key=value pairs?
From:
{
  "var": 1,
  "foo": "bar",
  "x": "test"
}
To:
var=1
foo=bar
x=test
You could try:
jq -r 'to_entries|map("\(.key)=\(.value|tostring)")|.[]' test.json
Here's a demo:
$ cat test.json
{
  "var": 1,
  "foo": "bar",
  "x": "test"
}
$ jq -r 'to_entries|map("\(.key)=\(.value|tostring)")|.[]' test.json
foo=bar
var=1
x=test
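As a side note (not part of the original answer): this filter also tolerates nested objects or arrays as values; string interpolation simply renders them as compact JSON rather than expanding them:
$ echo '{"a": {"b": 1}}' | jq -r 'to_entries|map("\(.key)=\(.value|tostring)")|.[]'
a={"b":1}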
Is there any way I can do this recursively?
Here is a function which might do what you want:
# Denote the input of recursively_reduce(f) by $in.
# Let f be a filter such that for any object o, (o|f) is an array.
# If $in is an object, then return $in|f;
# if $in is a scalar, then return [];
# otherwise, collect the results of applying recursively_reduce(f)
# to each item in $in.
def recursively_reduce(f):
  if type == "object" then f
  elif type == "array" then map( recursively_reduce(f) ) | add
  else []
  end;
Example: emit key=value pairs
def kv: to_entries | map("\(.key)=\(.value)");
[ {"a":1}, [[{"b":2, "c": 3}]] ] | recursively_reduce(kv)
#=> ["a=1","b=2","c=3"]
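For reference, a sketch of running this from the command line (the filename input.json and the trailing .[] that emits one line per pair are my additions, not part of the answer):
jq -r 'def kv: to_entries | map("\(.key)=\(.value)");
       def recursively_reduce(f):
         if type == "object" then f
         elif type == "array" then map( recursively_reduce(f) ) | add
         else []
         end;
       recursively_reduce(kv) | .[]' input.json
With the example input above, this prints a=1, b=2 and c=3 on separate lines.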
UPDATE: After the release of jq 1.5, walk/1 was added as a jq-defined built-in. It can be used with the above-defined kv, e.g. as follows:
walk(if type == "object" then kv else . end)
With the above input, the result would be:
[["a=1"],[[["b=2","c=3"]]]]
To "flatten" the output, flatten/0 can be used. Here is a complete example:
jq -cr 'def kv: to_entries | map("\(.key)=\(.value)");
walk(if type == "object" then kv else . end) | flatten[]'
Input:
[ {"a":1}, [[{"b":2, "c": 3}]] ]
Output:
a=1
b=2
c=3
Incidentally, building off of @aioobe's excellent answer: if you need the keys to be all upper case, you can use ascii_upcase by modifying his example:
jq -r 'to_entries|map("\(.key|ascii_upcase)=\(.value|tostring)")|.[]'
I had a scenario similar to yours but wanted to uppercase all the keys when creating environment variables for accessing AWS.
$ okta-credential_process arn:aws:iam::1234567890123:role/myRole | \
jq -r 'to_entries|map("\(.key|ascii_upcase)=\(.value|tostring)")|.[]'
EXPIRATION=2019-08-30T16:46:55.307014Z
VERSION=1
SESSIONTOKEN=ABcdEFghIJ....
ACCESSKEYID=ABCDEFGHIJ.....
SECRETACCESSKEY=AbCdEfGhI.....
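If you then want those pairs in your environment, one possible follow-up (my sketch, not part of the answer; @sh quoting is added so values containing spaces or shell metacharacters survive the eval) is:
$ eval "$(okta-credential_process arn:aws:iam::1234567890123:role/myRole | \
  jq -r 'to_entries|map("export \(.key|ascii_upcase)=\(.value|tostring|@sh)")|.[]')"
$ echo "$VERSION"
1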
Without jq, I was able to export every item in the JSON using grep and sed, but this will only work for simple cases where we have plain key/value pairs:
for keyval in $(grep -E '": [^\{]' fileName.json | sed -e 's/: /=/' -e "s/\(\,\)$//"); do
echo "$keyval"
done
Here's a sample response:
❯ for keyval in $(grep -E '": [^\{]' config.dev.json | sed -e 's/: /=/' -e "s/\(\,\)$//"); do
echo "$keyval"
done
"env"="dev"
"memory"=128
"role"=""
"region"="us-east-1"
To add on top of @aioobe’s answer, here are a few things to make it safer / more flexible to use:
- null safety
- a jsn_ præfix for namespacing the resulting variables, to avoid them overwriting your normal variables
$ cat nil.jsn
null
$ jq -r 'to_entries? | map("set -A jsn_\(.key|@sh) -- \(.value|@sh)") | .[]' <nil.jsn
$ cat test.jsn
{
  "var": 1,
  "foo": [
    "bar",
    "baz",
    "space test"
  ],
  "x": "test"
}
$ jq -r 'to_entries? | map("set -A jsn_\(.key|@sh) -- \(.value|@sh)") | .[]' <test.jsn
set -A jsn_'var' -- 1
set -A jsn_'foo' -- 'bar' 'baz' 'space test'
set -A jsn_'x' -- 'test'
The above outputs Korn Shell arrays (for mksh and AT&T ksh). If you prefer/need GNU bash-style arrays (also supported by many other recent shell releases), use:
$ jq -r 'to_entries? | map("jsn_\(.key | @sh)=(\(.value | @sh))") | .[]' <nil.jsn
$ jq -r 'to_entries? | map("jsn_\(.key | @sh)=(\(.value | @sh))") | .[]' <test.jsn
jsn_'var'=(1)
jsn_'foo'=('bar' 'baz' 'space test')
jsn_'x'=('test')
In all subsequent examples I will be using the Korn Shell variant, but applying the difference between the above two variants will make them work with bash-style arrays as well.
In both cases, you get arrays, but dereferencing the array itself is the same as dereferencing its element #0, so this is safe even for single values:
$ eval "$(jq -r 'to_entries? | map("set -A jsn_\(.key|@sh) -- \(.value|@sh)") | .[]' <test.jsn)"
$ echo $jsn_x
test
$ echo $jsn_foo = ${jsn_foo[0]} but not ${jsn_foo[1]}
bar = bar but not baz
There is a downside though: scripts written for just sh do not have arrays, so you need to either target bash, ksh88, ksh93, mksh, zsh (possibly others), or their common subset (all of these support [[ … ]]; all but ksh88 should support GNU bash-style arrays).
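If you do have to target plain sh, a possible fallback (my sketch, not part of the original answer) is to emit only scalar assignments and silently skip array/object values, e.g. with the test.jsn from above:
$ jq -r 'to_entries? | map(select(.value | type != "array" and type != "object")
>     | "jsn_\(.key)=\(.value | tostring | @sh)") | .[]' <test.jsn
jsn_var='1'
jsn_x='test'
Note this also assumes the keys are valid shell identifiers.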
Two further improvements:
- restricting the output to a whitelist of permitted keys
- explicit null handling (see below)
$ jq -r 'if . == null then
> null
> else
> to_entries | map(
> select(IN(.key;
> "foo", "val", "var"
> )) | "set -A jsn_\(.key | @sh) -- \(.value | @sh)"
> ) | .[]
> end' <nil.jsn
null
$ jq -r 'if . == null then
> null
> else
> to_entries | map(
> select(IN(.key;
> "foo", "val", "var"
> )) | "set -A jsn_\(.key | @sh) -- \(.value | @sh)"
> ) | .[]
> end' <test.jsn
set -A jsn_'var' -- 1
set -A jsn_'foo' -- 'bar' 'baz' 'space test'
In this example, only the keys foo, val (nonexistent here) and var are permitted. This can be used e.g. to filter out keys whose values are something other than simple values or one-dimensional JSON arrays of simple values, so the result is safe¹.
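As an aside (not from the original answer), you could also filter by value type instead of by key name, keeping only simple values and one-dimensional arrays of simple values:
$ jq -r 'to_entries? | map(
>     select(.value | type != "object" and
>       (type != "array" or all(.[]?; type != "array" and type != "object")))
>     | "set -A jsn_\(.key | @sh) -- \(.value | @sh)"
>   ) | .[]' <test.jsn
set -A jsn_'var' -- 1
set -A jsn_'foo' -- 'bar' 'baz' 'space test'
set -A jsn_'x' -- 'test'
With the flat test.jsn this emits all three keys; with the nested variant shown in the footnote below, the foo key would simply be dropped instead of aborting jq.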
You would use this e.g. as follows in a shell snippet:
set -o pipefail
if ! vars=$(curl "${curlopts[@]}" "$url" | jq -r '
if . == null then
null
else
to_entries | map(
select(IN(.key;
"foo", "val", "var"
)) | "set -A jsn_\(.key | @sh) -- \(.value | @sh)"
) | .[]
end
'); then
echo >&2 "E: no API response"
exit 1
fi
if [[ $vars = null ]]; then
echo >&2 "E: empty API response"
exit 1
fi
eval "$vars"
echo "I: API response: var=$jsn_var"
for x in "${jsn_foo[@]}"; do
echo "N: got foo '$x'"
done
① While jq throws an error unless the question mark is used, the failure mode is not so nice if there are indeed multi-dimensional arrays or other shenanigans:
$ cat test.jsn
{
  "foo": [
    [
      "bar",
      "baz"
    ],
    "space test"
  ],
  "var": 1,
  "x": "test"
}
$ jq -r 'to_entries? | map("set -A jsn_\(.key|@sh) -- \(.value|@sh)") | .[]' <test.jsn
$ echo $?
0
$ jq -r 'to_entries | map("set -A jsn_\(.key|@sh) -- \(.value|@sh)") | .[]' <test.jsn
jq: error (at <stdin>:11): array (["bar","baz"]) can not be escaped for shell
$ echo $?
5
$ jq -r 'if . == null then
> null
> else
> to_entries | map(
> select(IN(.key;
> "foo", "val", "var"
> )) | "set -A jsn_\(.key | @sh) -- \(.value | @sh)"
> ) | .[]
> end' <test.jsn
jq: error (at <stdin>:11): array (["bar","baz"]) can not be escaped for shell
$ echo $?
5
Make sure to test for that if you don’t know your JSON will be fine. Heck, test for that, always; catch jq error exit codes. Note how the set -o pipefail in the above example, if supported by the shell (most recent ones do), makes the whole command substitution fail if either cURL or jq fail (or both, of course); otherwise you’d have to redirect cURL output to a temporary file, check curl’s errorlevel, then run jq on the temporary file in a command substitution and check its exit status (you’ll still have to do precisely that if you wish to distinguish between them in error messages).
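For completeness, roughly what that temporary-file variant could look like (my sketch, not from the answer), so cURL and jq failures can be reported separately:
# same jq filter as in the snippet above, kept in a variable for brevity
jq_filter='if . == null then null else
    to_entries | map(select(IN(.key; "foo", "val", "var"))
        | "set -A jsn_\(.key | @sh) -- \(.value | @sh)") | .[] end'
tmpf=$(mktemp) || exit 1
if ! curl "${curlopts[@]}" "$url" >"$tmpf"; then
    echo >&2 "E: cURL failed"
    rm -f "$tmpf"
    exit 1
fi
if ! vars=$(jq -r "$jq_filter" <"$tmpf"); then
    echo >&2 "E: jq failed"
    rm -f "$tmpf"
    exit 1
fi
rm -f "$tmpf"
# then continue with the null check and eval "$vars" as above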
Here's a compact solution: to_entries[]|join("=")
$ echo '{"var": 1, "foo": "bar", "x": "test"}' | \
jq -r 'to_entries[]|join("=")'
var=1
foo=bar
x=test
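If you plan to eval output like this and the values may contain spaces or shell metacharacters, a hedged variation (not from the answer) is to quote the value with @sh:
$ echo '{"var": 1, "foo": "bar", "x": "a b"}' | \
  jq -r 'to_entries[] | "\(.key)=\(.value | @sh)"'
var=1
foo='bar'
x='a b'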