
I have a repo from which pulling takes forever, because the server has little free RAM and swaps heavily while

remote: Compressing objects: 24%

is happening (even if I clone locally on the server). The network is not that constrained, so it would be fine to send all data uncompressed. How can I do that?

Tamás Szelei
  • You can try playing with `pack.*` options in [git-config](http://man.he.net/man1/git-config). – Piotr Praszmo Jul 14 '12 at 11:38
  • `echo '*.zip -delta' >>.gitattributes` to disable compression for zip files. see also [git pull without remotely compressing objects](https://stackoverflow.com/questions/7102053/git-pull-without-remotely-compressing-objects) – milahu May 05 '23 at 14:00
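The `.gitattributes` approach from the comment above can be sketched as follows (the throwaway repo path and the `foo.zip` file name are illustrative):

```shell
# Create a throwaway repo and disable delta compression for zip files.
dir=$(mktemp -d)
git init --quiet "$dir"
echo '*.zip -delta' >> "$dir/.gitattributes"
# check-attr shows the delta attribute is now unset for zip files:
git -C "$dir" check-attr delta -- foo.zip
```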

1 Answer


From the git documentation:

 core.bigFileThreshold  

    Files larger than this size are stored deflated, without
    attempting delta compression.  Storing large files without
    delta compression avoids excessive memory usage, at the
    slight expense of increased disk usage.

    Default is 512 MiB on all platforms.  This should be reasonable
    for most projects as source code and other text files can still
    be delta compressed, but larger binary media files won't be.

    Common unit suffixes of 'k', 'm', or 'g' are supported.

So I guess setting this value to something very small, like 1, would do the trick.

Extended by comments: you can set this with `git config --add core.bigFileThreshold 1`. It works for bare repos as well.
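As a minimal sketch (the throwaway repo path is illustrative), the setting can be applied and read back like this; with a 1-byte threshold, effectively every file skips delta compression:

```shell
# Apply the threshold in a throwaway repository.
tmp=$(mktemp -d)
git init --quiet "$tmp"
git -C "$tmp" config --add core.bigFileThreshold 1
# Read the value back to confirm it took effect:
git -C "$tmp" config core.bigFileThreshold
```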

peterh
Peter van der Does