
I'm using GitHub to manage my repository and I'm getting the following errors while attempting to push a large commit (1.5 GB).

error: pack-objects died of signal 9
fatal: The remote end hung up unexpectedly
fatal: The remote end hung up unexpectedly
fatal: write error: Bad file descriptor

Any ideas how to resolve this?

David Jones

3 Answers


I had this problem with a large git pack file. To get around it, I repacked the repository and capped the maximum pack size:

git repack --max-pack-size=100M -a -d
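
If you want to sanity-check the result before retrying, you can list the packs the repack produced and then push again (a minimal sketch, assuming a standard non-bare repository and a remote named origin):

# Each pack file should now be at most ~100 MB
ls -lh .git/objects/pack/*.pack

# Retry the push; the remote and branch names here are only an example
git push origin master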
Will

GitHub drops the connection because of the large commit size. See this help page:

https://help.github.com/articles/working-with-large-files

If you push over SSH you will see something like this:

remote: warning: Large files detected.

remote: error: File giant_file is 123.00 MB; this exceeds GitHub's file size limit of 100 MB

In this case your push is rejected. Over HTTPS there is currently no way to transmit the error message to your client.
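
If a single oversized file such as giant_file is the culprit, one generic workaround (not part of this answer, just a hedged sketch of the history-rewriting recipe GitHub's help page also describes) is to strip the file from history before pushing:

# Remove giant_file from every commit on every branch (this rewrites history!)
git filter-branch --force --index-filter 'git rm --cached --ignore-unmatch giant_file' --prune-empty --tag-name-filter cat -- --all

# If any rewritten commits were already published, the push must be forced;
# coordinate with collaborators before doing this
git push origin --force --all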

Boris Brodski
  • I switched to SSH but I was getting the same errors. I just committed in several chunks and I was eventually able to get everything synced. – David Jones Dec 13 '13 at 10:08

I was working with git, trying to push to GitHub from a Bluehost shared host, but this could apply generically to a lot of hosting companies. The previous tips didn't work for me. I was getting errors such as:

Counting objects: 3532, done.
Delta compression using up to 24 threads.
fatal: unable to create thread: Resource temporarily unavailable
fatal: The remote end hung up unexpectedly
fatal: The remote end hung up unexpectedly
fatal: write error: Bad file descriptor

The shared host apparently forbids spawning additional threads, so limiting pack compression to a single thread let me push my commits from it:

git config --global pack.threads "1"
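
To confirm the setting took effect, or to undo it later on a machine without this restriction, the standard git config commands apply (a sketch, not part of the original tip):

# Show the current value (prints 1 after the command above)
git config --global --get pack.threads

# Remove the override so git auto-detects the thread count again
git config --global --unset pack.threads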

A related issue and solution are described here: git push fatal: unable to create thread: Resource temporarily unavailable

HongPong