Why is git add . slow, and how can I monitor it?

Assume a project where no add or commit has been done for a long time.
I run git add . but it takes too much time.
I would like to estimate which files/directories are the most expensive in the current case.
I have a good .gitignore file that works well enough, but I still sometimes have too much content, or something too unwieldy, to be added and committed to Git.
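One way to estimate this is to time a dry-run add for each top-level directory, or to let Git print its own phase timings via GIT_TRACE_PERFORMANCE. The following is a sketch; the throwaway demo repository and the directory names in it are placeholders for your real project.

```shell
#!/bin/sh
# Demo repo only so this sketch is self-contained; in practice,
# run the loop from your own repository root instead.
demo=$(mktemp -d) && cd "$demo" && git init -q
mkdir -p src assets && echo 'code' > src/main.c && echo 'blob' > assets/big.bin

# Time a dry-run add per top-level directory: --dry-run scans and
# reports what would be added without touching the index.
for d in */ ; do
    printf '%s: ' "$d"
    start=$(date +%s)
    git add --dry-run "$d" > /dev/null
    echo "$(( $(date +%s) - start ))s"
done

# Git can also report per-phase timings itself (recent Git versions);
# the trace output goes to stderr.
GIT_TRACE_PERFORMANCE=1 git add --dry-run . 2>&1 >/dev/null | head -n 3
```

The per-second resolution of date +%s is coarse, but for adds that take minutes it is enough to single out the expensive directories.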

I often have directories whose sizes range from 300 GB to 2 TB.
Even though I exclude them with directory/* and directory/ in .gitignore, the add is still slow.
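Before timing anything, it is worth confirming that the ignore rules actually match. This sketch uses git check-ignore, which reports the .gitignore line that matched (no output means the path is not ignored); "bigdata" is a placeholder for your large directory.

```shell
#!/bin/sh
# Throwaway demo repo; substitute your own repository and directory name.
demo=$(mktemp -d) && cd "$demo" && git init -q
mkdir bigdata && echo 'huge' > bigdata/blob.bin
printf 'bigdata/\n' > .gitignore

# -v shows which ignore file and line matched the path.
git check-ignore -v bigdata/blob.bin

# Count the untracked, non-ignored files that "git add ." would consider;
# a large number here means the scan itself is the cost.
git ls-files --others --exclude-standard | wc -l
```

If check-ignore prints nothing for a file inside the big directory, the pattern is wrong (for example, a leading slash or a typo), and git add . is scanning the whole tree.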

One solution:

Git slowness generally comes from large binary files. This is not because they are binary as such, but because binary files tend to be large and are expensive to compress and diff.

    Based on your edit indicating the file sizes, I suspect this is your problem.

The answers to this question offer a few solutions: removing the large files from source control, manually running git gc, and so on.
