
Need to query an API endpoint for specific parameters, but there's a limit of 20 parameters per request.

The params are gathered into an array stored in a JSON file, and referenced via a variable tacked onto the end of my curl command, which generates the full API request:

curl -s -g -X GET '/api/endpoint?parameters='"$myparams"

e.g.

curl -s -g -X GET '/api/endpoint?parameters=["1","2","3","etc"]'

This works fine when the params JSON is small and below the per-request parameter limit. The only problem is that the parameter list fluctuates and is many times larger than the request limit.

My normal thinking would be to iterate through the param lines one by one, but that would create many requests and probably get me blocked too.

What would a good approach be to parse the parameter-array JSON and generate curl API requests that respect the parameter limit, with the minimum number of requests? Say it's 115 params now; that would create 5 API requests of 20 params each, plus 1 of 15.

  • You could have an array of your parameters in JSON, parse and partition it out into fixed-size arrays, loop over those to download the content from the endpoint, and merge the downloaded content after the loop. You're trying to pass 115 or so parameters in batches... naturally that's going to take a lot of requests. – Jeff Mercado Jul 13 '22 at 17:09

2 Answers


You can chunk the array with jq's undocumented _nwise function and then use that, e.g.:

<<JSON jq -r '_nwise(3) | "/api/endpoint?parameters=\(.)"'
["1","2","3","4","5","6","7","8"]
JSON

Output:

/api/endpoint?parameters=["1","2","3"]
/api/endpoint?parameters=["4","5","6"]
/api/endpoint?parameters=["7","8"]
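Note that brackets, quotes, and commas are not strictly legal inside a URL query string; if the server or a proxy insists on percent-encoding, jq's @uri format can be applied to the interpolation. A variation on the snippet above, with the input inlined (the endpoint path is the asker's hypothetical one):

```shell
# Same chunking as above, but each chunk is percent-encoded via @uri.
# tojson turns the chunk back into its JSON text before encoding.
jq -rn '["1","2","3","4"] | _nwise(3) | @uri "/api/endpoint?parameters=\(tojson)"'
```

The first output line is /api/endpoint?parameters=%5B%221%22%2C%222%22%2C%223%22%5D.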

This will generate the URLs for your curl calls, which you can then save in a file or consume directly:

<input.json jq -r ... | while read -r url; do curl -s -g -XGET "$url"; done
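To tie this to the numbers in the question, here is a self-contained sketch (echo stands in for the real curl call, and seq simulates the 115-item list) that splits the input into five batches of 20 plus one of 15:

```shell
# Simulate the 115-item parameter list with seq; in reality this would
# come from the params JSON file. The endpoint path is hypothetical.
seq 115 |
  jq -Rn '[inputs]' |    # collect the 115 lines into one JSON array of strings
  jq -c '_nwise(20)' |   # split into arrays of at most 20 items (5x20 + 1x15)
  while read -r qs; do
    echo "/api/endpoint?parameters=$qs"   # real call: curl -s -g -X GET "/api/endpoint?parameters=$qs"
  done
```

This prints exactly 6 request URLs, the last one carrying the remaining 15 parameters.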

Or generate the query string only and use it in your curl call (pay attention to proper escaping/quoting):

<input.json jq -c '_nwise(3)' | while read -r qs; do curl -s -g -XGET "/api/endpoint?parameters=$qs"; done
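After the loop, the per-batch responses still have to be merged, as suggested in the comments. If each response is itself a JSON array, jq's slurp mode makes that a one-liner; a sketch with two stand-in responses (assumes the endpoint returns arrays):

```shell
# Stand-in for two batch responses, one JSON array per line. `jq -s add`
# slurps them into an array of arrays and concatenates into one array.
printf '%s\n' '["1","2"]' '["3"]' | jq -cs 'add'
```

This prints ["1","2","3"]; in practice you would pipe the output of the curl loop into it instead of printf.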
knittl

Depending on your input format and requirements regarding robustness, you might not need jq at all; sed and paste can do the trick:

<<IN sed 's/\\/&&/g;s/"/\\"/g' | sed 's/^/"/;s/$/"/' | paste -sd ',,\n' | while read -r items; do curl -s -g -XGET "/api/endpoint?parameters=[$items]"; done
1
2
3
4
5
6
7
8
IN

The generated curl invocations:

curl -s -g -XGET /api/endpoint?parameters=["1","2","3"]
curl -s -g -XGET /api/endpoint?parameters=["4","5","6"]
curl -s -g -XGET /api/endpoint?parameters=["7","8"]

Explanation:

  • sed 's/\\/&&/g;s/"/\\"/g': replace \ with \\ and " with \".
  • sed 's/^/"/;s/$/"/': wrap each line/item in double quotes.
  • paste -sd ',,\n': take 3 lines at a time and join them with commas (repeat the comma in the delimiter list as many times as items per chunk minus 1).
  • while read -r items; do curl -s -g -XGET "/api/endpoint?parameters=[$items]"; done: read each generated group of items, wrap it in brackets, and run curl.
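If neither jq nor paste is available, the same batching can be sketched in plain bash using array slicing (this assumes the tokens contain no quotes or backslashes, so the escaping steps above are skipped; the sample items are inlined for the demo):

```shell
# Chunk a bash array into groups of 3 and print one curl command per group.
items=(1 2 3 4 5 6 7 8)
chunk=3
for ((i = 0; i < ${#items[@]}; i += chunk)); do
  slice=("${items[@]:i:chunk}")
  qs=$(printf '"%s",' "${slice[@]}")   # quote each token and comma-join
  qs="[${qs%,}]"                       # drop the trailing comma, add brackets
  echo curl -s -g -XGET "/api/endpoint?parameters=$qs"
done
```

The loop prints the same three requests as the sed/paste pipeline; drop the echo to execute them directly.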
knittl