
I am going to use git/GitHub to track a writing project I'm doing. Later, I hope to use the repo's commit history for research purposes.

To get more observations/snapshots of the work in progress, I was thinking of writing a script that checks for changes every 10 minutes, and auto-adds, commits, and pushes them (assume no merge issues).
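Roughly, the script I have in mind would be something like this minimal Python sketch (the repo path, interval, and commit message format are just placeholders):

    import subprocess
    import time
    from datetime import datetime

    REPO = "/path/to/writing-project"   # placeholder: path to the local clone
    INTERVAL = 10 * 60                  # 10 minutes, in seconds

    def run(*args):
        """Run a git command inside the repo and return its stdout."""
        result = subprocess.run(["git", "-C", REPO, *args],
                                capture_output=True, text=True, check=True)
        return result.stdout

    while True:
        # Commit only if something actually changed since the last snapshot.
        if run("status", "--porcelain").strip():
            run("add", "-A")
            run("commit", "-m", "snapshot " + datetime.now().isoformat(timespec="seconds"))
            run("push")  # assumes the current branch already tracks a remote
        time.sleep(INTERVAL)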

If I work an average of 10 hours a week on the project for 6 months, that's roughly 240 hours, or about 14,400 minutes: at one commit every 10 minutes, that works out to around 1,440 small commits. Would I start to experience any sort of bottleneck or performance decline in git due to having that many total commits?

If not there, is there any breaking point? Millions of commits?

Data Skeptic
  • The Linux kernel has had about 70k commits/yr for the last several years, so you should be fine. That said, 10-minute snapshots that are not actually logical versions are not really what a version control system is for; you might as well just sync to Dropbox. – pvg Jan 14 '16 at 04:24
  • Following up on the comment by @pvg: just because you are making many small commits on a feature branch does not mean you have to bring them all into the remote. You can squash those commits. – Tim Biegeleisen Jan 14 '16 at 04:26
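For example, squashing a batch of snapshot commits into a single commit before pushing could look like this (branch names are only illustrative):

    # Collapse everything on the snapshot branch into one staged change...
    git checkout main
    git merge --squash snapshots
    # ...then record it as a single commit and push that instead of the many tiny ones.
    git commit -m "Writing session: 2016-01-14"
    git push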

1 Answer


The Linux kernel git repository has over 500,000 commits, so you should be fine. Performance issues with a git repository have more to do with the cumulative size of the committed files than with the number of commits. See this answer for more details.
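If you want to keep an eye on that cumulative size yourself as the snapshots pile up, git can report the size of its object database directly, for example:

    # Human-readable summary of the object store (loose and packed objects).
    git count-objects -vH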

Dave