
I need to run the curl command below multiple times to get a list of names for certain teams. Each time, I have to supply a TeamName to get that team's list of users. I already have the team names in a .txt file. Is there a way to loop over all the team names from the .txt file and write the output to a file?

Something like a for loop in Bash, but it doesn't have to be a for loop; anything that will give me the list of users for the given team names will do:

for ((i=1;i<=10;i++)); do curl

The original command:

curl -X GET 'https://api.opsgenie.com/v2/teams/TeamName?identifierType=name' --header 'Authorization: GenieKey abcdefdh' | python -m json.tool | grep -E 'username|role|name'

  • Does this answer your question? [Looping through the content of a file in Bash](https://stackoverflow.com/questions/1521462/looping-through-the-content-of-a-file-in-bash) – Coder-256 Dec 28 '20 at 04:41
  • So I did something similar to one of the answers posted there. But what I want is to grab each team name, one at a time, from the file, substitute it for TeamName in the curl command, and loop until it has run through all the team names; like setting up a variable and using that variable. `cat tmp1.json | while read line; do curl -X GET 'https://api.opsgenie.com/v2/teams/TeamName?identifierType=name' --header 'Authorization: GenieKey 123456' | python -m json.tool | grep -E 'username|role|name'; done` – user14899123 Dec 28 '20 at 05:46
  • Sure, simply change the url to `"https://api.opsgenie.com/v2/teams/TeamName?identifierType=$line"` (using double quotes here is important) – Coder-256 Dec 28 '20 at 20:47
  • @Coder-256 Input, using double quotes: `while read line; do curl -X GET "https://api.opsgenie.com/v2/teams/TeamName?identifierType=$line" --header 'Authorization: GenieKey API KEY' | python -m json.tool | grep -E 'username|role|name'; done` – user14899123 Dec 29 '20 at 18:17
  • Whoops, maybe the url should be `"https://api.opsgenie.com/v2/teams/$line?identifierType=name"`. Also I would try just examining the raw output of curl to make sure it looks right before you pipe it to all those tools. There are also options like `-v` for verbosity while testing and `-sSf` for better error handling in the final version, check out the man page. – Coder-256 Dec 30 '20 at 00:10
  • @Coder-256 Thanks, that actually worked. But I have one small problem: if a team name has a space in it, it does not work for those teams, though the rest still work. Some of the team names have spaces, like "Team Name" or "Team- Name". For those I do curl -X GET 'https://api.opsgenie.com/v2/teams/Team%20Name?identifierType=name' --header. One way is to replace all the spaces with %20 in the .txt file. Is there a way to say in the script: if there is a space in the name, replace it with %20? Thanks for your help – user14899123 Jan 05 '21 at 02:37

2 Answers

while read team; do
    curl -X GET "https://api.opsgenie.com/v2/teams/$team?identifierType=name" --header 'Authorization: GenieKey abcdefdh' | python -m json.tool | grep -E 'username|role|name'
done < file.txt

This reads each line of file.txt into the variable team and substitutes it into the curl URL.
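
For illustration, file.txt is assumed to contain one team name per line, e.g. (these names are made up; names containing spaces need the encoding shown in the second answer):

TeamA
TeamB
Team-C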

Raman Sailopal
  • I tried giving a $ variable before but it didn't work; it basically looks for $team as the literal team name and gives this message: "No team exists with name [$team]". – user14899123 Dec 29 '20 at 18:06
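
That error is the classic symptom of single quotes: Bash does not expand variables inside single quotes, so the literal text $team is sent to the API. A minimal illustration (the team name is made up):

team="Platform"
echo 'https://api.opsgenie.com/v2/teams/$team'   # single quotes: prints $team literally
echo "https://api.opsgenie.com/v2/teams/$team"   # double quotes: expands to .../teams/Platform

The answer above avoids this by putting the URL in double quotes.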

Here is the final command, including URL-encoded spaces (via Bash pattern substitution):

while read TEAM_NAME; do
    curl \
        --header 'Authorization: GenieKey abcdefgh' \
        "https://api.opsgenie.com/v2/teams/${TEAM_NAME// /%20}?identifierType=name" \
    | python -m json.tool \
    | grep -E 'username|role|name'
done < file.txt

Note that this only escapes spaces in the team name; no other special characters are handled. curl does have complete built-in URL-encoding (via --data-urlencode), but it only applies to data/query parameters, not to the URL path itself.
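
If team names can contain characters beyond spaces, one option is to percent-encode the whole path segment before building the URL. A sketch, assuming python3 is available (it uses urllib.parse.quote; the key and file name are placeholders as above):

while IFS= read -r TEAM_NAME; do
    # Percent-encode the entire path segment, not just spaces.
    encoded=$(python3 -c 'import sys, urllib.parse; print(urllib.parse.quote(sys.argv[1], safe=""))' "$TEAM_NAME")
    curl -sSf \
        --header 'Authorization: GenieKey abcdefgh' \
        "https://api.opsgenie.com/v2/teams/$encoded?identifierType=name" \
    | python -m json.tool \
    | grep -E 'username|role|name'
done < file.txt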

Coder-256