
For reasons outside my control, I am working with a repository that contains many copies of similar content (a small OS). This means that while the repo size is fairly small, the working directory takes multiple hours to check out on my system.

The specific task being performed is a git replace --graft to concatenate linear histories, followed by git filter-branch --tag-filter cat -- master to make the change permanent.
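Concretely, the sequence looks like this (the commit names are placeholders for the actual SHAs):

# pretend the first commit of the appended history has the old tip as its parent
git replace --graft <first-commit-of-appended-history> <tip-of-earlier-history>

# make the graft permanent on master; --tag-filter cat rewrites tags to
# point at the rewritten commits while keeping their names and contents
git filter-branch --tag-filter cat -- master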

My issue is that filter-branch requires a clean working copy (such as the one produced by git checkout . or a standard git clone <URL>). Producing that checkout pushes the run time to several hours, which is undesirable. Is there a faster way to do this? The actual rewrite takes about 30 seconds (all blobs stay the same; only the pointers change).

Emma Talbert

2 Answers


Some ideas, both sketched below:

1) Stash your local changes with git stash, then pop them back when you are done.

2) Create a new clone, but use git clone --reference to speed it up.
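A rough sketch of both ideas, assuming the rewrite from the question; the paths are placeholders:

# 1) set local changes aside, run the rewrite, then restore them
git stash
git filter-branch --tag-filter cat -- master
git stash pop

# 2) borrow objects from an existing local clone to make a fresh one quickly
git clone --reference /path/to/existing/clone <URL> fresh-clone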

Andreas Wederbrand
  • I am not making local changes; I just don't want to check out the files at all. I am already cloning from a local source, but the working copy is simply massive (copying the git objects takes about 10 seconds, but checking out the tree takes hours). – Emma Talbert Jul 01 '19 at 18:48

You can create a bare clone and do all filtering there:

git clone --bare source bare
cd bare
git filter-branch master   # this is a no-op here, but you get the idea
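For the question's workflow, the actual rewrite inside the bare clone would then be something like this (the commit names are placeholders, and the final fetch is just one possible way to get the result back into the original repository):

git replace --graft <first-commit-of-appended-history> <tip-of-earlier-history>
git filter-branch --tag-filter cat -- master

# a bare repository has no working tree, so nothing is ever checked out;
# afterwards, fetch the rewritten branch back into the original repo
git -C ../source fetch ../bare master:rewritten-master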
j6t
  • I am about 99% sure I had tried this previously and ran into an issue, but it works now, and I don't remember the issue in detail. Thank you. – Emma Talbert Jul 01 '19 at 19:10