Git clone fails with out of memory error – “fatal: out of memory, malloc failed (tried to allocate 905574791 bytes) / fatal: index-pack failed”

I’m attempting to clone a large (1.4GB) Git repository onto a 32-bit Debian VM with 384MB of RAM, cloning over the SSH protocol with ‘git clone’.

The clone fails with this message:

    remote: Counting objects: 18797, done.
    remote: warning: suboptimal pack - out of memory
    remote: Compressing objects: 100% (10363/10363), done.
    fatal: out of memory, malloc failed (tried to allocate 905574791 bytes)
    fatal: index-pack failed

    On the host repository end, I’ve tried reducing the amount of memory Git uses for packing, and then repacking:

    git config pack.windowMemory 10m
    git config pack.packSizeLimit 20m
    git repack -a -d

    My questions are as follows:

    1. Is this a client-side (clone-side) problem, or should it be resolved in the repository that I’m cloning from?
    2. In either case, is there anything I can do to make the clone succeed? A lot of the potential solutions suggested online involve some or all of the following, none of which is acceptable in this instance:

      • changing the contents of the repository substantively (i.e. deleting large files)
      • giving the VM which is doing the clone more RAM
      • giving the VM which is doing the clone a 64-bit virtual CPU
      • transferring out-of-band (e.g. using Rsync or SFTP to transfer the .git directory)

    Thanks in advance.

  4 Solutions collected from the web for “Git clone fails with out of memory error – “fatal: out of memory, malloc failed (tried to allocate 905574791 bytes) / fatal: index-pack failed””

    git clone will not look at your pack.packSizeLimit setting; it will transfer everything in a single pack anyway, unless that has changed since the last time I looked.

    Using SCP or Rsync might indeed be a way to work around your issue. Removing the “useless” large files and then repacking the repository you are trying to clone could also help.
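
    For completeness, a minimal sketch of that out-of-band route (the host name and paths here are made up for illustration, not taken from the question):

    # On a machine with enough RAM, take a bare copy of the repository
    git clone --bare ssh://git.example.com/big-repo.git big-repo.git

    # Transfer it to the VM out of band (rsync shown; scp also works)
    rsync -av big-repo.git/ user@debian-vm:/home/user/big-repo.git/

    # On the VM, clone from the local copy; a local clone copies objects
    # directly instead of building and indexing a new pack over the wire
    git clone /home/user/big-repo.git big-repo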

    Giving more RAM to the VM might also help; I don’t think you’ll need a 64-bit address space to allocate 900MB. You could also give the VM enough swap space to handle the 900MB pack instead of increasing the RAM.
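
    If you go the swap route, a rough sketch on Debian (the 1GB size and the /swapfile path are arbitrary choices, not from the original answer):

    # Create and enable a 1GB swap file (dd rather than fallocate, which may be missing on older systems)
    sudo dd if=/dev/zero of=/swapfile bs=1M count=1024
    sudo chmod 600 /swapfile
    sudo mkswap /swapfile
    sudo swapon /swapfile

    # Confirm the extra swap is visible
    free -m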

    I got a similar issue on Windows using 32-bit msysGit.
    The 64-bit Git from Cygwin did the job.
    Maybe you should use a 64-bit Debian VM instead of a 32-bit one.

    My original answer is available on the question Git on Windows, “Out of memory – malloc failed”.
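
    To check which architecture you are actually running, a quick sketch with standard tools:

    # i686 means a 32-bit kernel/userland, x86_64 means 64-bit
    uname -m

    # Architecture of the git binary itself
    file "$(which git)"
    git --version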

    Today I had the same issue. The Git server had run out of memory, but GitLab reported that there was still memory available. We checked memory with htop (it reported none available), restarted GitLab, and everything went back to normal.
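
    A sketch of those checks and the restart, assuming an Omnibus GitLab install (source installs use a different restart command):

    # See how much memory is really free on the server
    free -m
    htop

    # Restart all GitLab services (Omnibus packages)
    sudo gitlab-ctl restart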

    sudo git pull

    I faced the same error message every time I pulled, and running sudo git pull actually helped me get past it; the pull was successful.
