Is there a way to bulk/batch download all repos from GitHub based on a search result?
Question
I run this search on GitHub and get 881 repos (Blazor & C# repos): https://github.com/search?l=C%23&q=blazor&type=Repositories
Is there a way to download all of these repos easily, instead of one by one?
Answer
Yes, your query can be run via the GitHub Search API:
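The single-page query itself is not shown in the text above; assuming the same endpoint and parameters used in the loop below, a minimal sketch looks like this:

```shell
# Search query: Blazor repositories written in C# ("%23" is the URL-encoded "#")
query="blazor+language:C%23"
# Fetch one page of results; the Search API caps per_page at 100
curl "https://api.github.com/search/repositories?q=${query}&per_page=100" \
  | jq -r '.items[].ssh_url'
```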
That gives you one page of 100 repositories. You can loop over all pages, extract the ssh_url (or the HTTP clone URL if you prefer), and write the results to a file:
# cheating: we know we currently have 9 pages
for i in {1..9}
do
  curl "https://api.github.com/search/repositories?q=blazor+language:C%23&per_page=100&page=$i" \
    | jq -r '.items[].ssh_url' >> urls.txt
done
# clone up to 8 repositories in parallel, one URL per line
xargs -P8 -L1 git clone < urls.txt
You can optimize by extracting the number of pages from the Link response header instead of hard-coding it.
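A sketch of that optimization, using a hypothetical sample of the Link header the API returns (the rel="last" entry carries the final page number):

```shell
# Hypothetical Link header as returned by the Search API
link='<https://api.github.com/search/repositories?q=blazor&per_page=100&page=2>; rel="next", <https://api.github.com/search/repositories?q=blazor&per_page=100&page=9>; rel="last"'

# Pull the page number out of the rel="last" URL
last=$(printf '%s' "$link" \
  | grep -o 'page=[0-9]\+>; rel="last"' \
  | grep -o '[0-9]\+' \
  | head -1)
echo "$last"   # 9
```

With the real header you would capture it from curl (e.g. curl -sI or -D -) before parsing, then use $last as the upper bound of the loop.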
References:
- https://developer.github.com/v3/search/
- Parsing JSON with Unix tools
- How to apply shell command to each line of a command output?
- Running programs in parallel using xargs