I'll ask the question first and then give the explanation:
What problems will I run into if my pack file is very large, say 100-150 MB?
What am I doing?
I have created a forking model for my organization: every developer forks from blessed_repo, clones their fork onto their local machine, hacks away, and pushes back to the fork.
Now, every user is pushing virtually the same content to their fork, so the objects are duplicated across multiple users' forks.
Hence I decided to have a Shared_objects_Store that each fork's alternates file will point to.
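To make that concrete, here is roughly the setup I mean (the paths are made up for illustration; the alternates entry has to point at an objects directory):

    # Shared_objects_Store is a bare repo whose objects/ directory holds the
    # common history; each fork borrows from it via its alternates file.
    echo /srv/git/Shared_objects_Store.git/objects \
        > /srv/git/forks/alice.git/objects/info/alternates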
But here Junio C Hamano says I need to run git repack -adl in the borrowing repo every time, because git-gc will not remove duplicate objects from the borrowing repo if they exist in loose form in the alternate object store.
Now, if I keep running git repack -adl every time, it creates one and only one pack, and that pack is going to be huge. In the long run, will this give me problems like this, or anything else?
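For clarity, this is the maintenance step I would be running per fork (path again hypothetical):

    # -a repacks all local objects into a single pack, -d deletes the now-redundant
    # packs and loose objects, -l skips objects reachable via the alternate store.
    cd /srv/git/forks/alice.git
    git repack -a -d -l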
Thanks in advance!
Update-1
I have to use alternates, as disk space is a problem. Also, without them, backups get bulkier. (I have 100 forks of the same repo; each one pushing the same content would be a mess.)
My server runs CentOS. The initial fork creates hard links, but subsequent pushes from local repos to the personal forks create redundant objects.
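As an illustration of what I mean by redundant objects, this is how I inspect a fork after a push (hypothetical path; the commands themselves are standard):

    cd /srv/git/forks/alice.git
    # Objects still shared with the store via hard links have a link count > 1.
    find objects -type f -links +1 | wc -l
    # Loose/packed totals for this fork; freshly pushed duplicates show up here.
    git count-objects -v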