
I run this search on GitHub and get 881 Blazor & C# repos: https://github.com/search?l=C%23&q=blazor&type=Repositories

Is there a way to download all these repos easily instead of one by one?

Tony_Henrich
  • Why not parse the response html for repo paths and create download links which you run with curl or similar? – Johannes Stadler May 28 '19 at 05:12
  • Because I don't want to do all that work if there's something out there that does it already. – Tony_Henrich May 28 '19 at 05:16
  • A quick Google search finds this as an example: https://github.com/BeameryHQ/git-beam-it. But questions asking to recommend or find a book, tool, ... are not programming questions and seem off-topic. – tkruse May 28 '19 at 05:37
  • @tkruse I didn't ask for a tool or a recommendation. I asked for a way. A way could be a programming example or instructions for the GitHub API. If an answerer wants to mention a tool, that's their choice. The question is related to GitHub and its API. Not all SO questions must be purely about programming. – Tony_Henrich May 28 '19 at 06:35

1 Answer


Yes, your query can be run via the GitHub search API:
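
curl "https://api.github.com/search/repositories?q=blazor+language:C%23&per_page=100"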

That gives you one page of 100 repositories. You can loop over all pages, extract the ssh_url (or the HTTPS clone_url, if you prefer), and write the results to a file:

# cheating: we know the search currently returns 9 pages
for i in {1..9}
do
    # fetch one page of results and append each repo's SSH clone URL
    curl -s "https://api.github.com/search/repositories?q=blazor+language:C%23&per_page=100&page=$i" \
     | jq -r '.items[].ssh_url' >> urls.txt
done

# clone up to 8 repositories in parallel, one URL per git clone invocation
xargs -P8 -L1 git clone < urls.txt

You can optimize this by extracting the number of pages from the response headers instead of hard-coding it.
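
For example, a minimal sketch that reads the rel="last" entry from the Link header GitHub sends with paginated results (the exact grep/sed patterns are assumptions about how that header is laid out):

# read the number of the last page from the Link response header
last_page=$(curl -s -D - -o /dev/null "https://api.github.com/search/repositories?q=blazor+language:C%23&per_page=100" \
 | grep -i '^link:' \
 | sed -E 's/.*[?&]page=([0-9]+)>; rel="last".*/\1/')

for i in $(seq 1 "$last_page")
do
    curl -s "https://api.github.com/search/repositories?q=blazor+language:C%23&per_page=100&page=$i" \
     | jq -r '.items[].ssh_url' >> urls.txt
done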


tkruse