How to clone all repos at once from GitHub?

I have a company GitHub account and I want to back up all of the repositories within it, accounting for anything new that might get created, for purposes of automation. I was hoping something like this:

git clone *.git

or similar would work, but it doesn’t seem to like the wildcard there.

Is there a way in Git to clone and then pull everything assuming one has the appropriate permissions?

15 solutions collected from the web for “How to clone all repos at once from GitHub?”

    I don’t think it’s possible to do it that way. Your best bet is to find and loop through a list of an Organization’s repositories using the API.

    Try this:

    • Create an API token by going to Account Settings -> Applications
    • Make a call to: https://${GITHUB_BASE_URL}/api/v3/orgs/${ORG_NAME}/repos?access_token=${ACCESS_TOKEN}
    • The response will be a JSON array of objects. Each object will include information about one of the repositories under that Organization. I think in your case, you’ll be looking specifically for the ssh_url property.
    • Then git clone each of those ssh_urls.

    It’s a little bit of extra work, but it’s necessary for GitHub to have proper authentication.
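    The steps above can be sketched in Python. The payload below is a made-up example shaped like the API response, keeping only the ssh_url field the answer points at:

```python
import json

def clone_commands(repos_json):
    """Build one `git clone` command per repository from the JSON array
    returned by the /orgs/<org>/repos endpoint."""
    return [["git", "clone", repo["ssh_url"]] for repo in json.loads(repos_json)]

# Invented example payload, truncated to the one field we need.
sample = '[{"ssh_url": "git@github.com:acme/widgets.git"}]'
print(clone_commands(sample))  # [['git', 'clone', 'git@github.com:acme/widgets.git']]
```

    Each of those commands can then be handed to a subprocess call or a shell loop.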

    On Windows and all UNIX/Linux systems, using Git Bash or any other terminal, set USER to your username and PAGE to the page number, then run:

    curl "https://api.github.com/users/$USER/repos?page=$PAGE&per_page=100" |
      grep -e 'git_url' |
      cut -d \" -f 4 |
      xargs -L1 git clone

    The maximum page size is 100, so you have to call this several times with the right page number to get all of your repositories (set PAGE to the number of the page you want to download).
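    That paging loop can be sketched in Python. Here fetch_page is a stand-in for the curl call above, and the two-page payload is invented to show the termination condition:

```python
def all_repo_urls(fetch_page):
    """Collect git_url values page by page; fetch_page(page) must return the
    list of repo dicts for that page, and an empty list past the last page."""
    urls, page = [], 1
    while True:
        batch = fetch_page(page)
        if not batch:
            break
        urls.extend(repo["git_url"] for repo in batch)
        page += 1
    return urls

# Invented two-page account with three repositories.
pages = {1: [{"git_url": "u1"}, {"git_url": "u2"}], 2: [{"git_url": "u3"}]}
print(all_repo_urls(lambda p: pages.get(p, [])))  # ['u1', 'u2', 'u3']
```

    The loop stops on the first empty page, which is how the real API signals you have run past the last repository.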

    This gist accomplishes the task in one line on the command line:

    curl -s "[your_org]/repos?per_page=200" | ruby -rubygems -e 'require "json"; JSON.load(STDIN.read).each { |repo| %x[git clone #{repo["ssh_url"]}] }'

    Replace [your_org] with your organization’s name. And set your per_page if necessary.


    As ATutorMe mentioned, the maximum page size is 100, according to the GitHub docs.

    If you have more than 100 repos, you’ll have to add a page parameter to your url and you can run the command for each page.

    curl -s "[your_org]/repos?page=2&per_page=100" | ruby -rubygems -e 'require "json"; JSON.load(STDIN.read).each { |repo| %x[git clone #{repo["ssh_url"]}] }'

    Note: The default per_page parameter is 30.

    Organisation repositories

    To clone all repos from your organisation, try the following shell one-liner:

    ORG=company; curl "$ORG/repos?per_page=100" | grep -o 'git@[^"]*' | xargs -L1 git clone

    User repositories

    Cloning all using Git repository URLs:

    USER=foo; curl "$USER/repos?per_page=100" | grep -o 'git@[^"]*' | xargs -L1 git clone

    Cloning all using Clone URL:

    USER=foo; curl "$USER/repos?per_page=100" | grep -w clone_url | grep -o '[^"]\+://.\+.git' | xargs -L1 git clone

    If you need to clone private repos, you have to supply your API token (the hub command can help with that), or include an Authorization header in the request by adding:

    -H "Authorization: token YOURTOKEN"

    Check the example here:

    • How to download GitHub Release from private repo using command line.
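    A minimal Python sketch of attaching the token as a header; YOURTOKEN and the URL target are placeholders:

```python
import urllib.request

def authed_request(url, token):
    """Build a request carrying the token in an Authorization header
    rather than as a query parameter."""
    return urllib.request.Request(url, headers={"Authorization": "token " + token})

req = authed_request("https://api.github.com/orgs/acme/repos", "YOURTOKEN")
print(req.get_header("Authorization"))  # token YOURTOKEN
```

    The request object can then be passed to urllib.request.urlopen in place of a bare URL.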


    – To increase speed, set number of parallel processes by specifying -P parameter for xargs (-P4 = 4 processes).

    – If you need to raise the GitHub limits, try authenticating by specifying your API key.

    – Add --recursive to recurse into the registered submodules, and update any nested submodules within.
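    The xargs -P style parallelism can be sketched in Python with a thread pool; the runner parameter is an invented hook so the example can dry-run without touching the network:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def clone_all(urls, workers=4, runner=None):
    """Run one clone per URL across `workers` threads, like `xargs -P4 -L1`."""
    if runner is None:
        runner = lambda url: subprocess.run(["git", "clone", "--recursive", url]).returncode
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(runner, urls))

# Dry run: inject a runner that just records each URL instead of cloning.
print(clone_all(["u1", "u2", "u3"], runner=lambda url: url))  # ['u1', 'u2', 'u3']
```

    pool.map preserves input order, so the results line up with the URL list even though the clones run concurrently.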

    Go to Account Settings -> Applications and create an API key.
    Then insert the API key, GitHub instance URL, and organization name in the script below.

    # Substitute variables here
    GITHUB_BASE_URL=github.example.com   # your GitHub (Enterprise) host
    ORG_NAME=your_org
    ACCESS_TOKEN=your_api_key
    URL="https://${GITHUB_BASE_URL}/api/v3/orgs/${ORG_NAME}/repos?access_token=${ACCESS_TOKEN}"

    curl -s "${URL}" | ruby -rjson -e 'JSON.load(STDIN.read).each { |repo| %x[git clone #{repo["ssh_url"]}] }'

    Save that in a file, chmod u+x the file, then run it.

    Thanks to Arnaud for the ruby code.

    I found a comment in the gist @seancdavis provided to be very helpful, especially because, like the original poster, I wanted to sync all the repos for quick access, though the vast majority of them were private.

    curl -u [[USERNAME]] -s "[[ORGANIZATION]]/repos?per_page=200" |
      ruby -rubygems -e 'require "json"; JSON.load(STDIN.read).each { |repo| %x[git clone #{repo["ssh_url"]}] }'

    Replace [[USERNAME]] with your GitHub username and [[ORGANIZATION]] with your GitHub organization. The output (JSON repo metadata) will be passed to a simple Ruby script:

    # bring in the Ruby json library
    require "json"
    # read from STDIN, parse into a Ruby array and iterate over each repo
    JSON.load(STDIN.read).each do |repo|
      # run a system command (re: "%x") of the style "git clone <ssh_url>"
      %x[git clone #{repo["ssh_url"]}]
    end

    I made a script with Python 3 and the GitHub API v3. Just run it.


    This python one-liner will do what you need. It:

    • checks github for your available repos
    • for each, makes a system call to git clone

      python3 -c "import json, os, urllib.request; [os.system('git clone ' + r['ssh_url']) for r in json.load(urllib.request.urlopen('https://api.github.com/orgs/<<ORG_NAME>>/repos?per_page=200'))]"

    There is also a very useful npm module to do this. It can not only clone, but pull as well (to update data you already have).

    You just create a config like this:

    {
       "username": "BoyCook",
       "dir": "/Users/boycook/code/boycook",
       "protocol": "ssh"
    }

    and run gitall clone, for example, or gitall pull.
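    The clone-versus-pull decision such a tool makes for each configured repository can be sketched in Python; sync_command and the paths are illustrative, not the module's actual API:

```python
import os

def sync_command(base_dir, name, clone_url):
    """Return the command a clone-or-update tool would run for one repo:
    clone it on first sight, pull inside it afterwards."""
    target = os.path.join(base_dir, name)
    if os.path.isdir(os.path.join(target, ".git")):
        return ["git", "-C", target, "pull"]
    return ["git", "clone", clone_url, target]

print(sync_command("/tmp/code", "demo", "git@github.com:acme/demo.git"))
```

    The presence of a .git directory is the usual test for "already cloned", so re-running the tool updates existing clones instead of failing.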

    In case anyone looks for a Windows solution, here’s a little PowerShell function to do the trick (it could be a one-liner/alias if it didn’t need to work both with and without a proxy).

    function Unj-GitCloneAllBy($User, $Proxy = $null) {
        (curl -Proxy $Proxy "https://api.github.com/users/$User/repos?page=1&per_page=100").Content |
          ConvertFrom-Json |
          %{ $_.clone_url } |
          # workaround for git printing to stderr, by @wekempf aka William Kempf
          %{ & git clone $_ 2>&1 } |
          % { $_.ToString() }
    }

    So, I will add my answer too. 🙂 (I found it simple.)

    Fetch the list (I’ve used the “magento” organization):

    curl -si | grep ssh_url | cut -d '"' -f4

    Use clone_url instead of ssh_url to use HTTPS access.

    So, let’s clone them all! 🙂

    curl -si | \
        grep ssh_url | cut -d '"' -f4 | xargs -i git clone {}

    If you are going to fetch private repos, just add the GET parameter ?access_token=YOURTOKEN

    You can get a list of the repositories by using curl and then iterate over said list with a bash loop:

    GIT_REPOS=$(curl -s "https://${GITHUB_BASE_URL}/api/v3/orgs/${ORG_NAME}/repos?access_token=${ACCESS_TOKEN}" | grep ssh_url | awk -F': ' '{print $2}' | sed -e 's/",//g' -e 's/"//g')

    for REPO in $GIT_REPOS; do
      git clone "$REPO"
    done
    So, in practice, if you want to clone all repos from the organization FOO which match BAR, you could use the one-liner below, which requires jq and common CLI utilities:

    curl '' |
      jq '.[] | .ssh_url' |
      awk '/BAR/ {print "git clone " $0 " & "}' |
      sh

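    The filter step can be sketched in Python; FOO/BAR and the URLs below are the placeholder names from the text:

```python
def matching_clone_commands(urls, pattern):
    """Keep repos whose URL contains `pattern` and emit one clone command
    for each, mirroring the awk '/BAR/' filter above."""
    return ["git clone " + url for url in urls if pattern in url]

urls = ["git@github.com:FOO/BAR-api.git", "git@github.com:FOO/other.git"]
print(matching_clone_commands(urls, "BAR"))  # ['git clone git@github.com:FOO/BAR-api.git']
```

    Note that awk matches a regular expression while this sketch does a plain substring test; for simple name fragments like BAR they behave the same.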
    You can use an open-source tool to clone a bunch of GitHub repositories:


    git_cloner --type github --owner octocat --login user --password user https://my_bitbucket

    It uses the JSON API from You can see code examples in the GitHub API documentation.

    The easiest way ever: I found a very nice GitHub repo with which you can do it with just a single command from your terminal.
