
I made a lot of changes in my repository, and of course, I forgot my .gitignore file.

That file says the /vendor folder should not be included. This folder's size is 400 MB.

So I tried to push my project with:

git add -A
git commit -m "commit"
git push test master

It failed because the project was too big:

error: unpack failed: error Object too large (201,984,000 bytes), rejecting the pack. Max object size limit is 104,857,600 bytes.

I added the .gitignore file again, and ran:

git rm -r --cached .

But when I push again, I still get the same error (with the same sizes), even though the vendor folder is no longer included.

It's as if git were keeping an old version of the commit...

Do you have any idea on how to fix this?

BTW, I'm pushing it to springloops.

Vico

2 Answers


git rm -r --cached . modifies the index, but does not change the previous commit (or create a new one).

You would first need to reset HEAD to before your git add -A:

git reset @~ 

(mixed reset by default: reset HEAD and the index)

Then check your .gitignore and git status, add, commit and push.
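Assuming the vendor folder was only added in that last bad commit, the whole fix can be sketched in a throwaway repository (file names, commit messages, and the /vendor path are illustrative; the final push to the real remote is what would follow):

```shell
set -e
# Throwaway repo reproducing the situation (illustrative names).
tmp=$(mktemp -d) && cd "$tmp"
git init -q && git config user.email you@example.com && git config user.name you

# First commit: normal project files.
echo "code" > app.txt
git add -A && git commit -qm "initial"

# Mistake: vendor/ gets committed because .gitignore was missing.
mkdir vendor && echo "huge blob" > vendor/big.bin
git add -A && git commit -qm "oops, vendor included"

# Fix: mixed reset undoes the commit and the index, keeps the working tree...
git reset -q @~
# ...then ignore vendor/ and re-commit without it.
echo "/vendor" >> .gitignore
git add -A && git commit -qm "commit without vendor"

# vendor/ is no longer tracked; a push would no longer carry the big object.
git ls-files
# prints: .gitignore, then app.txt
```

After this, `git push test master` (the remote from the question) would send a pack without the oversized object.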


With Git 2.38 (Q3 2022), large objects read from a packstream can be streamed straight into a loose object file, without having to be kept in-core as a whole.

That should make the remote repository more resilient.

See commit aaf8122, commit 2b6070a, commit 97a9db6, commit a1bf5ca (11 Jun 2022) by Han Xin (chiyutianyi).
See commit 3c3ca0b, commit 21e7d88 (11 Jun 2022) by Ævar Arnfjörð Bjarmason (avar).
(Merged by Junio C Hamano -- gitster -- in commit 73b9ef6, 14 Jul 2022)

object-file.c: refactor write_loose_object() to several steps

Helped-by: Ævar Arnfjörð Bjarmason
Helped-by: Jiang Xin
Signed-off-by: Han Xin
Signed-off-by: Ævar Arnfjörð Bjarmason

When writing a large blob using "write_loose_object()", we have to pass a buffer with the whole content of the blob, and this behavior will consume lots of memory and may cause OOM.
We will introduce a stream version function ("stream_loose_object()") in a later commit to resolve this issue.

Before introducing that streaming function, do some refactoring on "write_loose_object()" to reuse code for both versions.

Rewrite "write_loose_object()" as follows:

  1. Figure out a path for the (temp) object file.
    This step is only used in "write_loose_object()".
  2. Move common steps for starting to write loose objects into a new function "start_loose_object_common()".
  3. Compress data.
  4. Move common steps for ending zlib stream into a new function "end_loose_object_common()".
  5. Close fd and finalize the object file.
VonC
  • I tried a git reset after a git rm -r --cached, but with no success... git status doesn't show the vendor directory as tracked. For some weird reason, my folder (only 40 MB without vendor) takes 200 MB in git – Vico Aug 20 '15 at 11:58
  • @Vico did you do a `git reset`, or a `git reset @~`?. In any case, check for any large files in the history with https://rtyley.github.io/bfg-repo-cleaner/ – VonC Aug 20 '15 at 12:00
  • I did both, and I don't have any large files in my repository. The largest file was a tar (200 MB) in the vendor directory, but git rm file_name gave no result, because it's not included in the repository... – Vico Aug 20 '15 at 23:15
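Following up on the comment thread above: to confirm which large blobs are still reachable in the history, one can list all objects by size (shown here in a throwaway repo so the pipeline is self-contained; in practice you would run only the last pipeline inside your own repository):

```shell
set -e
# Throwaway repo with one big and one small file (illustrative sizes).
tmp=$(mktemp -d) && cd "$tmp"
git init -q && git config user.email you@example.com && git config user.name you
head -c 100000 /dev/zero > big.bin && echo "small" > small.txt
git add -A && git commit -qm "two files"

# List blobs reachable from any ref, largest (uncompressed size) first.
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectsize) %(rest)' \
  | grep '^blob' | sort -k2 -rn | head -n 5
```

The top entries reveal exactly which paths still carry the oversized objects the server is rejecting, even when they are no longer in the working tree.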

What I did was simply remove the .git folder, then git init again. It's now working correctly.
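A sketch of that approach in a throwaway directory (file names are illustrative). Caution: this throws away all previous history, so it only suits cases where the history is disposable:

```shell
set -e
# Throwaway project dir standing in for the real one (illustrative names).
tmp=$(mktemp -d) && cd "$tmp"
echo "code" > app.txt
mkdir vendor && echo "huge" > vendor/big.bin
echo "/vendor" > .gitignore

# Caution: in the real repo this step is `rm -rf .git`, discarding history.
git init -q && git config user.email you@example.com && git config user.name you

# With .gitignore present from the start, vendor/ never enters any commit.
git add -A && git commit -qm "fresh start without vendor"
git ls-files
# prints: .gitignore, then app.txt
```

The fresh repository has a single commit that never referenced the large objects, so the push no longer exceeds the server's object-size limit.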

Vico