I have a git repo that stores only PDFs as binary files. In total there are 70 PDFs totalling 130 MB, and a few of them are large (about 15 MB each). When I try to clone the repo to my work computer I get this error:
remote: Counting objects: 93, done.
remote: fatal: Out of memory, malloc failed (tried to allocate 36864964 bytes)
error: git upload-pack: git-pack-objects died with error.
fatal: git upload-pack: aborting due to possible repository corruption on the remote side.
remote: aborting due to possible repository corruption on the remote side.
fatal: protocol error: bad pack header
Answers to similar git questions suggest that repacking the repo fixes this.
But when I run git repack --max-pack-size=5M -a -d
or git gc
on the server repo, I get the same malloc error.
The git server is on my personal webspace hosted by 1and1, and my suspicion is that the host is not letting the process allocate those 36864964 bytes of memory.
How can I clone the server repo onto my local computer?
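In case it is relevant: these are the memory-related pack settings from the git-config documentation that I am considering setting on the server repo before repacking again. I have not verified whether they help on this host, and the 32m values are only a guess:
git config pack.windowMemory 32m    # cap the delta-window memory used per thread
git config pack.packSizeLimit 32m   # split the output into packs no bigger than this
git config pack.threads 1           # single thread, so the window memory is not multiplied
git config pack.deltaCacheSize 32m  # cap the cache of computed deltas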
Here is the output of ulimit -a run on the server:
core file size (blocks, -c) 0
data seg size (kbytes, -d) 131072
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 48165
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) unlimited
cpu time (seconds, -t) 1800
max user processes (-u) 90
virtual memory (kbytes, -v) 131072
file locks (-x) unlimited
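If I am reading the limits right, the data segment and virtual memory caps both work out to 128 MB, which would explain the failed ~35 MB allocation once git-pack-objects already has much of the 130 MB of PDFs in memory:
131072 kbytes x 1024 = 134217728 bytes  (about 128 MB allowed)
failed allocation    =  36864964 bytes  (about 35 MB)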
Thank you.