
Objective:

I'm trying to write a script that will fetch two URLs from a GitHub release page and do something different with each one.

So far:

Here's what I've got so far.

λ curl -s https://api.github.com/repos/mozilla-iot/gateway/releases/latest | grep "browser_download_url.*tar.gz" | cut -d : -f 2,3 | tr -d \"

This will return the following:

"https://github.com/mozilla-iot/gateway/releases/download/0.8.1/gateway-8c29257704ddb021344bdaaa790909a0eacf3293bab94e02859828a6fd9b900a.tar.gz"
"https://github.com/mozilla-iot/gateway/releases/download/0.8.1/node_modules-921bd0d58022aac43f442647324b8b58ec5fdb4df57a760e1fc81a71627f526e.tar.gz"

I want to be able to create some directories, pull in the first one, navigate into the directories created by extracting the first archive, and then pull in the second.

Josh
  • The quotes could not possibly be printed in the output as you literally just obliterated them with `tr -d` – tripleee May 02 '19 at 10:57

2 Answers


Fetching the first line is easy by piping the output to `head -n1`. But to solve your problem, you need more than just the first URL of the cURL output. Give this a try:

#!/bin/bash

# fetch the release metadata and extract the tar.gz download URLs
answer=$(curl -s https://api.github.com/repos/mozilla-iot/gateway/releases/latest | grep "browser_download_url.*tar.gz" | cut -d : -f 2,3 | tr -d \")

# split out the two URLs and their file names
first_file=$(echo "$answer" | grep -Eo '.+\.tar\.gz' | head -n1 | tr -d " ")
second_file=$(echo "$answer" | grep -Eo '.+\.tar\.gz' | head -n2 | tail -1 | tr -d " ")
first_file_name=$(echo "$answer" | grep -Eo '[^/]+\.tar\.gz' | head -n1)
second_file_name=$(echo "$answer" | grep -Eo '[^/]+\.tar\.gz' | head -n2 | tail -1)

#echo "$first_file"
#echo "$first_file_name"
#echo "$second_file_name"
#echo "$second_file"

# download the first file
wget "$first_file"

# extract the first archive; this assumes it was downloaded into the
# current directory. Otherwise, cd there first, or prefix the path.
tar -xzf "$first_file_name"

# do your stuff with the second file
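For the second file, the same pattern applies. As a sketch, here is the file-name derivation on its own, using the node_modules URL from the question; the actual download and extraction steps are shown commented out, since they depend on where you want to unpack:

```shell
#!/bin/bash

# URL of the second archive, as listed in the question
second_file="https://github.com/mozilla-iot/gateway/releases/download/0.8.1/node_modules-921bd0d58022aac43f442647324b8b58ec5fdb4df57a760e1fc81a71627f526e.tar.gz"

# the local file name is just everything after the last slash
second_file_name=${second_file##*/}
echo "$second_file_name"

# then, mirroring the first download:
# wget "$second_file"
# tar -xzf "$second_file_name"
```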
meistermuh
  • I tried running that and changing it up a bit. I haven't gotten it to work yet. However, if I change the `wget` to `echo` and comment out `tar`, this is what I get: ``` https://github.com/mozilla-iot/gateway/releases/download/0.8.1/gateway-8c29257704ddb021344bdaaa790909a0eacf3293bab94e02859828a6fd9b900a.tar.gz [0] https://github.com/mozilla-iot/gateway/releases/download/0.8.1/gateway-8c29257704ddb021344bdaaa790909a0eacf3293bab94e02859828a6fd9b900a.tar.gz [1] ``` – Josh May 02 '19 at 10:23
  • ^ Basically, I just get the first URL repeated with [0] and [1] after them. – Josh May 02 '19 at 10:26
  • 1
    You should not use uppercase for your private variables. But in fact, there is no need to keep anything except the current URL in a variable here - just loop over the `curl` output lines. And of course, as always, [quote your variables.](/questions/10067266/when-to-wrap-quotes-around-a-shell-variable) – tripleee May 02 '19 at 10:55
  • 1
    @Josh : you are right - i tried it on my server now and recognized that there was a different in executing all commands within a script vs. executing them each on the command line itself. see the update...sorry – meistermuh May 02 '19 at 11:31
  • @tripleee : indeed quoting should not be forgotten ;) but looping would be best if there's no need for handling the n-th URL differently – meistermuh May 02 '19 at 11:40
  • That works beautifully! That's just what I was needing. Now I need to figure it all out. – Josh May 05 '19 at 05:27

You can simply pipe the URLs to `xargs curl`:

curl -s https://api.github.com/repos/mozilla-iot/gateway/releases/latest |
grep "browser_download_url.*tar.gz" |
cut -d : -f 2,3 | tr -d \" |
xargs curl -O
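As a side note (not part of the pipeline above): to preview what `xargs` would run without actually downloading anything, prepend `echo` to the command. With two made-up URLs:

```shell
printf '%s\n' \
    "https://example.com/a.tar.gz" \
    "https://example.com/b.tar.gz" |
xargs echo curl -O
# prints: curl -O https://example.com/a.tar.gz https://example.com/b.tar.gz
```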

Or if you want to do some more manipulation on each URL, perhaps loop over the results:

curl ... | grep ... | cut ... | tr ... |
while IFS= read -r url; do
    curl -O "$url"
    : maybe do things with "$url" here
done

The latter could easily be extended to something like

... | while IFS= read -r url; do
    d=${url##*/}
    mkdir -p "$d"
    ( cd "$d"
      curl -O "$url" 
      tar zxf *.tar.gz
      # end of subshell means effects of "cd" end
    )
 done
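The reason the `cd` does not leak out of the loop body is the subshell: changes to the working directory inside `( … )` are discarded when the subshell exits. A minimal stand-alone demonstration (no network; the directory name is made up):

```shell
#!/bin/sh
start=$(pwd)
mkdir -p demo-dir
( cd demo-dir
  pwd            # we are inside demo-dir here
)
# back in the original directory: the cd above did not escape the subshell
[ "$(pwd)" = "$start" ] && echo "cwd unchanged"
rmdir demo-dir
```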
tripleee