Is there an upper limit to the number of commits a git repository can handle?

I’m wondering if there’s an upper limit to the number of commits that a git repository can handle.

In a solo project I’m working on right now, I’ve been coding locally, committing/pushing changes in git, then pulling the changes on my development server.

I treat this as an easier alternative to working locally and uploading changes via FTP… Fortunately/unfortunately, it’s such an easy workflow that I sometimes go through many edit/commit/push/pull/browser-refresh cycles while coding.

    I’m wondering if this is going to turn around and bite me somewhere down the line. If it’s likely to be a problem, I’m wondering how I can avoid that trouble … It seems like a rebase might be the way to go, especially since I won’t have to worry about conflicting branches etc.

  3 Solutions for “Is there an upper limit to the number of commits a git repository can handle?”

    Well, the “upper limit” would likely be the point at which a SHA-1 collision occurs, but since the SHAs are 40 hexadecimal digits long (16^40 = 2^160 ≈ 1.46×10^48 possibilities), the chance of a collision is so close to zero that it’s not even funny. So there’s roughly a zero percent chance you’ll have any problems for at least the next several millennia.

    Hyperbolic example (just for fun): at 1 commit/minute, changing one file creates three new SHAs (blob, tree, and commit) = 3 new SHAs/minute ≈ 1.6 million SHAs/year = 1.6 billion SHAs/millennium ≈ 1×10^-37 % of the space used per millennium. (At 1,000 files per commit per minute, it’s still only about 3.6×10^-35 %.)

    That being said, if you want to clean up your history, squashing them down with rebase is probably your best bet. Just make sure you understand the implications if you’ve shared the repo publicly at all.
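As a sketch, assuming you want to squash the last 20 commits (adjust the number to your own history):

```shell
# Interactive rebase over the last 20 commits: git opens an editor
# where you change "pick" to "squash" (or "fixup") on the commits
# you want folded into the one above them.
git rebase -i HEAD~20

# Editor-free alternative: move the branch pointer back 20 commits
# while keeping the combined changes staged, then recommit once.
# Only safe if nobody else has pulled the original commits.
git reset --soft HEAD~20
git commit -m "Combine 20 edit/push cycles"
```

After rewriting history this way you’ll need `git push --force` to update the remote, which is why the warning about shared repositories matters.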

    You might also want to garbage-collect after rebasing to free up some space. Make sure the rebase worked correctly first, though, and note that you may need to tell git to prune everything: by default it won’t collect objects newer than two weeks.
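A minimal sketch of that cleanup (run it only after checking that the rebased history looks right):

```shell
# By default, git gc keeps unreachable objects created in the last
# two weeks. --prune=now drops them immediately, reclaiming the
# space held by the pre-rebase commits.
git gc --prune=now
```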

    I’m pretty sure you don’t have to worry at all 🙂

    Git uses SHA-1 hashes to identify content, and the probability of a hash collision is near zero. So have fun !!

    I personally did around 30 commits a day without issue.

    But avoid versioning binary files 🙂 they make the repository really heavy for what they are.
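For example, you could keep binaries out with a `.gitignore` (the patterns and the file name below are made up; substitute whatever your project actually produces):

```shell
# Ignore common binary artifacts so they never get committed.
cat >> .gitignore <<'EOF'
*.zip
*.png
build/
EOF

# If a binary is already tracked, untrack it (the file stays on
# disk) so the ignore rule can take effect:
git rm --cached big-asset.zip   # hypothetical file name
```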

    I think there is no strong limit to the number of commits git can handle, only what you can personally digest. With larger projects and multiple developers, you’ll see more activity than you would ever generate on your own.

    You can keep a secondary branch that you merge to every week if you wish, but git will never care about how many commits you have. Go crazy as long as you can understand what you’re doing. You can always diff several commits back or use tools like bisect to figure out history problems.
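A bisect session looks roughly like this (`v1.0` stands in for whatever known-good commit or tag you have):

```shell
git bisect start
git bisect bad HEAD      # the current commit exhibits the bug
git bisect good v1.0     # hypothetical last known-good tag
# git checks out the midpoint; test it, then tell git the result:
git bisect good          # or: git bisect bad
# ...repeat until git prints the first bad commit, then clean up:
git bisect reset
```

If you can script the test, `git bisect run <command>` automates the whole search.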
