Git clone fails with out of memory error – “fatal: out of memory, malloc failed (tried to allocate 905574791 bytes) / fatal: index-pack failed”

I’m attempting to clone a large (1.4GB) Git repository onto a 32-bit Debian VM with 384MB of RAM, using git clone over the SSH protocol.

The clone fails with this message:

    remote: Counting objects: 18797, done.
    remote: warning: suboptimal pack - out of memory
    remote: Compressing objects: 100% (10363/10363), done.
    fatal: out of memory, malloc failed (tried to allocate 905574791 bytes)
    fatal: index-pack failed

    I’ve tried reducing the amount of memory Git uses to repack on the host repository end, and repacking:

    git config pack.windowMemory 10m
    git config pack.packSizeLimit 20m
    git repack -a -d
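For what it’s worth, the repack’s memory use can be capped further on the server side. All of the keys below are real git-config settings; the values are only illustrative and would need tuning for the machine in question:

```shell
# Run inside the repository on the server. Values are illustrative.
git config pack.windowMemory 10m        # cap delta-search memory per thread
git config pack.threads 1               # a single thread allocates only one window
git config pack.deltaCacheSize 1m       # shrink the delta base cache
git config core.packedGitLimit 32m      # cap total memory-mapped pack data
git config core.packedGitWindowSize 32m # cap each individual pack mapping
git repack -a -d
```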

    My questions are as follows:

    1. Is this a client-side (clone-side) problem, or should it be resolved in the repository I’m cloning from?
    2. In either case, is there anything I can do to make the clone succeed? A lot of the potential solutions online involve some/all of the following things, none of which are acceptable in this instance:

      • changing the contents of the repository substantively (i.e. deleting large files)
      • giving the VM which is doing the clone more RAM
      • giving the VM which is doing the clone a 64-bit virtual CPU
      • transferring out-of-band (e.g. using Rsync or SFTP to transfer the .git directory)

    Thanks in advance.

  4 Solutions collected from the web for “Git clone fails with out of memory error – “fatal: out of memory, malloc failed (tried to allocate 905574791 bytes) / fatal: index-pack failed””

    git clone will not look at your pack.packSizeLimit setting; it will transfer everything in a single pack regardless – unless that has changed since the last time I looked.

    Using SCP or rsync might indeed be a way to work around your issue. Removing the “useless” large files and then repacking the repository you are trying to clone could also help.

    Giving the VM more RAM might also help – I don’t think you’ll need a 64-bit address space to allocate 900MB… You could also give it enough swap space to handle the ~900MB pack instead of increasing the RAM.
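If the swap route is taken, a minimal sketch (requires root; the 1GB size and the /swapfile path are illustrative choices, sized to cover the ~900MB allocation):

```shell
# One-off swap file on the VM; lasts until reboot unless added to /etc/fstab.
sudo dd if=/dev/zero of=/swapfile bs=1M count=1024  # 1GB of zeroes
sudo chmod 600 /swapfile                            # swap must not be world-readable
sudo mkswap /swapfile                               # write the swap signature
sudo swapon /swapfile                               # enable it immediately
cat /proc/swaps                                     # verify the new swap area is active
```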

    I had a similar issue on Windows using 32-bit msysGit.
    The 64-bit Git from Cygwin did the job.
    Maybe you should use a 64-bit Debian VM (instead of a 32-bit one).

    My original answer is available on the question Git on Windows, “Out of memory – malloc failed”.

    Today I had the same issue. The Git server ran out of memory, but GitLab reported that memory was still available. We checked with htop (it reported none available), restarted GitLab, and everything went back to normal.

    sudo git pull

    I faced the same error message every time I pulled; sudo git pull actually helped me get past it, and the pull was successful.
