
I have around 500,000 text files, mostly around 10 KB to 200 KB each (some can be up to around 10 MB), and the total size is around 3 GB.

I'd like to know how Git would perform when managing such a large number of files. Has anyone ever hosted a repo at this scale?

If a web interface like GitHub's is served on top of it and supports editing files online, would it take long to commit a change? Would it block while people push changes to the server?

Are there specific settings required for git to perform well on the server?


1 Answer

As I mention in "What are the file limits in Git (number and size)?" and "git with large files", Git is ill-suited to huge repos.

A large number of files can work (provided you don't change or add too many at a time, past the initial import).

But a large total size is problematic because of the structure of the packfile index (.idx) files and the cost of looking up a specific object in those indexes.
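
By way of illustration, here are a few general-purpose Git settings often suggested for repos with many files and large packs. The option names are real Git settings, but whether they help, and which values fit, depends on your repo; the values below are assumptions, not recommendations measured against your data.

```bash
# Illustrative tuning only; the values are examples, not measured recommendations.
git config core.preloadindex true      # refresh the index with parallel lstat() calls
git config core.untrackedCache true    # cache untracked-file scans (Git 2.8+)
git config pack.threads 0              # auto-detect CPU count when creating packs
git config pack.windowMemory 256m      # cap per-thread memory during delta search

# Repack everything into fewer, better delta-compressed packfiles.
git repack -a -d --window=250 --depth=50
```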

Plus, GitHub is likely to enforce its soft size limits if you try to store a huge repo on its servers.

It is best to split that repo into a collection of coherent smaller repos (which you can still group into one through submodules).
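
A minimal sketch of that submodule grouping, assuming hypothetical repo URLs and directory names:

```bash
# Hypothetical layout: each coherent subset lives in its own smaller repo,
# and an "umbrella" repo ties them together via submodules.
git init collection
cd collection
git submodule add https://example.com/texts-part1.git part1
git submodule add https://example.com/texts-part2.git part2
git commit -m "Group the smaller repos as submodules"

# Consumers can then fetch everything in one go:
git clone --recurse-submodules https://example.com/collection.git
```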
