
What are the tradeoffs of the different compression algorithms?

The purpose is backup, transfer, and restore. I don't care about popularity, as long as a mature enough tool exists for unix. I care about:

  • time
  • cpu
  • memory
  • compression level

The algorithms I am considering are:

  • zip
  • bzip2
  • gzip
  • tar
  • others?
– flybywire

4 Answers


Tar is not a compression algorithm per se; it only bundles files into a single archive, which you then pass through one of the actual compressors.

Use zip/gzip when compression and decompression time is the most important factor.

Use bzip2 when you need a better compression ratio.

Use LZMA when you need an even better compression ratio and can accept the higher CPU time.

Have a look here.
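
For illustration, a minimal sketch of combining tar with each compressor (assuming GNU tar; backup and mydata/ are placeholder names):

    tar -czf backup.tar.gz  mydata/    # gzip: fastest, moderate ratio
    tar -cjf backup.tar.bz2 mydata/    # bzip2: slower, better ratio
    tar -cJf backup.tar.xz  mydata/    # xz (LZMA): slowest, best ratio

    # GNU tar detects the compression automatically on extraction:
    tar -xf backup.tar.xz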

– glmxndr
    This comparison site is outdated, because there have been a lot of changes in lzma and 7zip since 2005. – bill Jun 16 '09 at 07:12

The best way is to look at compression benchmark sites:

  • Maximumcompression
  • Compressionratings

– bill

It usually depends on your input data, but I've never found anything that gives better general compression than 7zip (http://www.7-zip.org).
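
For example, with the p7zip port on unix (backup.7z and mydata/ are placeholder names; LZMA is the default method for .7z archives):

    7z a backup.7z mydata/    # a = add (create) an archive
    7z x backup.7z            # x = extract with full paths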

– paxdiablo

It would be very simple to create a testbed for these cases.

Write a script that runs each tool in turn on a set of files representative of those you wish to compress, and measure the time, CPU, memory usage, and compression ratio achieved.

Rerun them a statistically significant number of times, and you'll have your answer.
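
A minimal sketch of such a testbed, assuming GNU time at /usr/bin/time and a placeholder input file sample.tar (on BSD/macOS, /usr/bin/time takes -l instead of -f):

    #!/bin/sh
    # Compare wall time, peak memory, and output size per compressor.
    INPUT=sample.tar
    ORIG=$(wc -c < "$INPUT")

    for tool in gzip bzip2 xz; do
        OUT="$INPUT.$tool"
        # %e = elapsed seconds, %M = peak resident memory in KB (GNU time)
        /usr/bin/time -f "$tool: %e s, %M KB peak" \
            "$tool" -9 -c "$INPUT" > "$OUT"
        echo "$tool: $ORIG -> $(wc -c < "$OUT") bytes"
    done

Wrap this in an outer repeat-and-average loop to get the statistically significant reruns mentioned above.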

– pauljwilliams