Enumerating objects: 85, done.
Counting objects: 100% (85/85), done.
Delta compression using up to 8 threads
Compressing objects: 100% (78/78), done.
Writing objects:  15% (13/84), 41.26 MiB | 46.00 KiB/s

It does this every time. I removed some files, yet the pack size still increased, and the last time the push was nearing completion it errored out with:

Enumerating objects: 75, done.
Counting objects: 100% (75/75), done.
Delta compression using up to 8 threads
Compressing objects: 100% (69/69), done.
remote: fatal: pack exceeds maximum allowed size
error: remote unpack failed: index-pack abnormal exit
To ssh://github.com/adobug/SSST2022Hackathon.git
 ! [remote rejected] main -> main (failed)
error: failed to push some refs to 'ssh://github.com/adobug/SSST2022Hackathon.git'
adobug

1 Answer


When you make new commits, you add to the repository.

Suppose you have a Git repository with 100 commits. You make one new commit, in which you add a big file (say, a 400 gigabyte database). You now have 101 commits.

But—oops!—you didn't mean to commit the 400 gigabyte database. So you run git rm big-database.db and then git commit. You now have ... 102 commits, with the big file in the commit before the last one.
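As a sketch of that scenario (using a small stand-in file, since the `big-database.db` name and sizes here are illustrative), you can see that `git rm` plus a new commit leaves the blob in history:

```shell
# Throwaway demo repo; the file name big-database.db is hypothetical.
git init demo && cd demo
git config user.email you@example.com
git config user.name "You"
git commit --allow-empty -m "baseline"             # stand-in for the first 100 commits

dd if=/dev/zero of=big-database.db bs=1024 count=4 # tiny stand-in for the 400 GB file
git add big-database.db
git commit -m "accidentally commit big file"       # commit 101: blob enters history

git rm big-database.db
git commit -m "remove big file"                    # commit 102: gone from the worktree...

# ...but NOT from history: the previous commit still carries the blob,
# so a push must still upload it.
git rev-list --objects HEAD | grep big-database.db
```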

No matter what you do, as you add more commits, you add to the database. You can't make a big file in history go away by adding more history. You're going to have to remove history. This may horrify some folks (How to cope with mutable Git history?) but Git does allow you to "reset yourself back in time" (git reset) and then make new and improved commits to use instead of the old bad ones. The old bad commits are not yet gone, but your git push won't try to send those commits now; instead, it will send only your new and improved commits.
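A minimal sketch of that reset-and-redo, assuming the big file sits in the next-to-last commit (as in the example above, with the hypothetical name `big-database.db`) and the remote never accepted the bad commits:

```shell
# Step back two commits; --soft moves only the branch pointer, so your
# worktree and index are untouched (the big file is still staged).
git reset --soft HEAD~2

# Unstage the big file and keep it from being re-added.
git rm --cached big-database.db       # hypothetical file name
echo big-database.db >> .gitignore

# Re-create your work as a new, improved commit without the big blob.
git add .gitignore
git commit -m "redo work without the big database file"

# Since the remote rejected the old push, this is a normal fast-forward;
# if the bad commits HAD reached the remote, you would need
# git push --force-with-lease instead.
git push origin main
```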

For more, see, e.g., How to remove/delete a large file from commit history in the Git repository? and its many duplicates.

torek