
My team is working with two remotes: origin and backup.

backup is GitHub-based, so it does not support large files. Someone made a commit to origin with a large file and since then we have all made numerous commits.

We have since deleted that large file. (In fact, I never finished pulling the commit that introduced it, and I don't think I actually downloaded it, because it was removed in the succeeding commit.)

I have tried

git checkout master
git checkout --orphan tmp
git commit -m "Root"
git push backup master

but the push fails due to that large file exceeding the size limit.

How do I push the current state of origin to backup but not the large file?

ZX9

1 Answer


backup is GitHub-based, so it does not support large files. Someone made a commit to origin with a large file and since then we have all made numerous commits.

We have since deleted that large file.

If you ran git rm on it, you didn't delete it. Each commit stores a complete snapshot of the repository, so when you git rm a file it disappears from later commits but remains in every earlier one, and its content is still in the repository. You'd have to completely obliterate it from history. The simplest way to do that is the BFG Repo Cleaner.
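A minimal sketch of that approach, where large_file_name stands in for the offending file and the clone URL is a placeholder (the BFG docs recommend running it against a fresh bare mirror clone):

$ git clone --mirror git@example.com:team/repo.git   # bare mirror clone for the BFG to rewrite
$ java -jar bfg.jar --delete-files large_file_name repo.git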

But there's a better way to handle the large file problem: allow large files.

Use git-lfs to handle large files. It stores only a small pointer file in the repository; the actual content lives on a separate LFS server (GitHub runs one). Then you can put as many large files in your repository as you want.
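Setup for new files looks roughly like this; the '*.bin' pattern is just an example, track whatever matches your large files:

$ git lfs install            # enable the LFS filters for your user
$ git lfs track '*.bin'      # route matching files through LFS
$ git add .gitattributes     # the tracking rules are stored here
$ git commit -m "Track large files with git-lfs"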

There's still the problem of what to do with the large file that's already been committed. BFG to the rescue again! You can use the BFG to retroactively convert large files already in history into git-lfs pointers.

$ java -jar bfg.jar --convert-to-git-lfs 'large_file_name' --no-blob-protection

Then push the rewritten history and continue as normal. Keep in mind that the BFG rewrites history, so everyone on the team will need to re-clone or rebase their work onto the new commits.
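Concretely, the post-BFG cleanup and push might look like this. This is a sketch: the repository path and the backup URL are placeholders, and it assumes you ran the BFG in a bare mirror clone as above.

$ cd repo.git
$ git reflog expire --expire=now --all && git gc --prune=now --aggressive   # actually drop the old blobs
$ git remote add backup git@github.com:team/repo.git                        # placeholder URL for the GitHub remote
$ git push --force backup master                                            # --force because history was rewritten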

Schwern