Our git repository has grown to an unwieldy size because binary files, images, and other large assets were accidentally committed to it. There are no large files in the current tree, but there are large files in the repo's history.
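To illustrate why the repo stays large even after the files are deleted, here is a minimal, self-contained sketch (assuming plain `git` and a POSIX shell; the repo name `demo` and the file `big.bin` are made up for the example): deleting a file in a later commit does not remove its blob from history.

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo
git config user.email you@example.com
git config user.name You

# Commit a 1 MiB "large" binary file, then delete it in a follow-up commit.
head -c 1048576 /dev/zero > big.bin
git add big.bin && git commit -qm "add big.bin"
git rm -q big.bin && git commit -qm "remove big.bin"

# The working tree no longer contains the file, but the blob is still
# reachable from history, so every clone still downloads it:
git rev-list --objects --all | grep big.bin   # prints "<blob-hash> big.bin"
```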
My plan is to remove these large files from our git history, and I have found a number of good resources and SO answers for doing so (https://rtyley.github.io/bfg-repo-cleaner/, How to remove/delete a large file from commit history in Git repository?, https://help.github.com/articles/removing-files-from-a-repository-s-history/).
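For reference, the workflow I'm planning to follow is roughly the one from the BFG docs (the repo URL and the 100M size threshold below are placeholders, not our real values):

```shell
# Mirror-clone the repo so all refs are rewritten, not just the current branch.
git clone --mirror git://example.com/some-big-repo.git

# Strip every blob larger than the chosen threshold from history.
java -jar bfg.jar --strip-blobs-bigger-than 100M some-big-repo.git

# Expire the reflog and garbage-collect so the stripped blobs are truly gone,
# then push the rewritten history back to the remote.
cd some-big-repo.git
git reflog expire --expire=now --all && git gc --prune=now --aggressive
git push
```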
My primary concern is that our repo (hosted on Bitbucket) has a number of contributors, and once I rip the large files out of the history, a contributor working from an old clone may push the history with the large files back up into the remote repo.
Specifically, the BFG Repo-Cleaner documentation states:
> At this point, you're ready for everyone to ditch their old copies of the repo and do fresh clones of the nice, new pristine data. It's best to delete all old clones, as they'll have dirty history that you don't want to risk pushing back into your newly cleaned repo.
So, my question is twofold:
- Is there a way to ensure that pushes from old clones of the repo won't re-introduce the large files?
- If not, is there a way to keep old clones from pushing and thus require all contributors to start with a fresh clone?
Thank you!