9

I'm trying to use deflate/gzip streams in C# but it appears that the files after compression are bigger than before.

For example, I compress a 900 KB docx file, but it produces a 1.4 MB one!

And it happens with every file I tried.

Maybe I'm doing something wrong? Here is my code:

  // requires: using System.IO; using System.IO.Compression;
  FileStream input = File.OpenRead(Environment.CurrentDirectory + "/file.docx");
  FileStream output = File.OpenWrite(Environment.CurrentDirectory + "/compressedfile.dat");

  GZipStream comp = new GZipStream(output, CompressionMode.Compress);

  while (input.Position != input.Length)
      comp.WriteByte((byte)input.ReadByte());

  input.Close();

  comp.Close(); // automatically calls Flush on closing
  output.Close();
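For reference, the byte-by-byte loop above works but is slow; a sketch of the same copy using `using` blocks and `Stream.CopyTo` (available since .NET 4), with placeholder file paths:

```csharp
// Equivalent copy using `using` blocks and Stream.CopyTo (.NET 4+),
// which buffers internally instead of copying one byte at a time.
// Paths are placeholders.
using System.IO;
using System.IO.Compression;

static class GzipHelper
{
    public static void Compress(string inputPath, string outputPath)
    {
        using (FileStream input = File.OpenRead(inputPath))
        using (FileStream output = File.Create(outputPath))
        using (GZipStream comp = new GZipStream(output, CompressionMode.Compress))
        {
            // Disposing the GZipStream flushes it and writes the gzip footer.
            input.CopyTo(comp);
        }
    }
}
```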
Lazarus
kite
  • You do realize that a compression method that will compress *any arbitrary* input by at least one byte cannot exist? So especially if you are trying to compress data that is close to random already, e.g. precompressed data, you may see a size increase. – Joey Oct 05 '10 at 13:30
  • .docx is already compressed using ZIP compression (try renaming to .zip and having an explore). I'd be surprised if a second level of compression would yield any benefit. – spender Oct 05 '10 at 13:33
  • it should effectively do compression only on the flush, so it shouldn't change a thing – kite Oct 05 '10 at 13:34
  • @spender > didn't know that, I'll try with another file format – kite Oct 05 '10 at 13:34
  • Have you tried compressing a .txt file? – Lazarus Oct 05 '10 at 13:35
  • well, it works with a txt. didn't know docx was already a compressed format – kite Oct 05 '10 at 13:36
  • There was a bug opened with Microsoft covering this phenomenon, in which DeflateStream increases the size of a previously compressed data stream: https://connect.microsoft.com/VisualStudio/feedback/details/93930/gzipstream-deflatestream-fail-to-check-for-incompressible-data It's currently marked "Closed - External". I don't know what that means. – Cheeso May 08 '11 at 16:02

5 Answers

7

Such a big difference seems strange to me, but keep in mind that docx is itself ZIP-compressed, so there is no reason to compress it again; the result is usually bigger.

Andrey
  • yes thanks, I didn't know it, and that is why it didn't work :) tried with .txt and other formats and it seems better. but it still doesn't work on a home-made serialized file type ... it doesn't matter in the end, I just wanted to see how to use those compression streams :) – kite Oct 05 '10 at 13:38
2

Firstly, deflate/gzip streams are remarkably bad at compression when compared to zip, 7z, etc.

Secondly, docx (and all of the MS document formats with an 'x' at the end) are just .zip files anyway. Rename a .docx to .zip to reveal the smoke and mirrors.

So when you run deflate/gzip over a docx, it will actually make the file bigger. (It's like running a zip at a low compression level over a file that was already zipped at a high compression level.)

However if you run deflate/gzip over HTML or a text file or something that is not compressed then it will actually do a pretty good job.
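A small self-contained sketch of that effect: gzip shrinks repetitive text dramatically, but running it a second time over its own output makes the data slightly larger, since the first pass already removed the redundancy. The class and method names here are illustrative:

```csharp
using System;
using System.IO;
using System.IO.Compression;

static class DoubleGzip
{
    // Gzip a byte array in memory and return the compressed bytes.
    public static byte[] Gzip(byte[] data)
    {
        using (var ms = new MemoryStream())
        {
            using (var gz = new GZipStream(ms, CompressionMode.Compress))
                gz.Write(data, 0, data.Length);
            // ToArray still works after the MemoryStream is closed.
            return ms.ToArray();
        }
    }

    static void Main()
    {
        byte[] text = new byte[100000];   // highly repetitive "text"
        byte[] once = Gzip(text);         // compresses very well
        byte[] twice = Gzip(once);        // no redundancy left to remove
        Console.WriteLine($"original {text.Length}, once {once.Length}, twice {twice.Length}");
    }
}
```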

DJA
  • yep thanks, as said in other comment didn't know that docx was already compressed. and sure 7z and other libraries are better, but just wanted to try these out to see what they were able to do – kite Oct 05 '10 at 13:41
  • This seems like a totally invalid comment: *deflate/gzip streams are remarkably bad at compression when compared to zip, 7z, etc*. Fact is, 99% of zip files use DEFLATE as the compression format. So zip can be *no better* than DEFLATE, because it augments the compressed stream with metadata. – Cheeso May 08 '11 at 15:58
  • The phenomenon in which a DeflateStream actually *increases* the size of the previously compressed data is the topic of a bug that was opened with Microsoft in 2006: https://connect.microsoft.com/VisualStudio/feedback/details/93930/gzipstream-deflatestream-fail-to-check-for-incompressible-data – Cheeso May 08 '11 at 15:59
0

Although it is true, as others have indicated, that the example files you specified are already compressed, the biggest issue is understanding how these classes differ from most compression utilities. The DeflateStream and GZipStream classes simply try to tokenize/compress a data stream without checking whether all the additional tokens (overhead) are actually increasing the amount of data required. Zip, 7z, etc. are smart enough to know that if data is largely random entropy (virtually incompressible), they should simply store the data "as-is" (stored, not compressed) instead of attempting to compress it further.
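That fallback can be sketched in a few lines: compress into memory first, and keep the original bytes when the compressed form comes out larger. The one-byte flag convention and the `Pack`/`Unpack` names are made up for this example, not part of any real format:

```csharp
using System;
using System.IO;
using System.IO.Compression;

static class SmartCompressor
{
    // Prefix byte: 1 = gzip'd payload, 0 = stored as-is (hypothetical convention).
    public static byte[] Pack(byte[] data)
    {
        byte[] compressed;
        using (var ms = new MemoryStream())
        {
            using (var gz = new GZipStream(ms, CompressionMode.Compress))
                gz.Write(data, 0, data.Length);
            compressed = ms.ToArray();
        }

        // Store uncompressed when compression would grow the data.
        return compressed.Length < data.Length
            ? Prefix(1, compressed)
            : Prefix(0, data);
    }

    public static byte[] Unpack(byte[] packed)
    {
        var payload = new byte[packed.Length - 1];
        Array.Copy(packed, 1, payload, 0, payload.Length);
        if (packed[0] == 0)
            return payload; // was stored as-is

        using (var input = new MemoryStream(payload))
        using (var gz = new GZipStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            gz.CopyTo(output);
            return output.ToArray();
        }
    }

    static byte[] Prefix(byte flag, byte[] payload)
    {
        var result = new byte[payload.Length + 1];
        result[0] = flag;
        payload.CopyTo(result, 1);
        return result;
    }
}
```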

Michael
  • This is not true: *Zip, 7z, etc. are smart enough to know that if data is largely random entropy (virtually uncompressable), that they simply store the data "as-is" (store, not compressed), instead of attempting to compress it further.* ZIP is merely a file format. It does not "know" anything. A program that produces a ZIP file may do what you describe, but the ZIP format does not. – Cheeso May 08 '11 at 16:00
  • The phenomenon in which DeflateStream actually *inflates* the size of previously compressed data is the topic of a bug that has been opened with Microsoft: https://connect.microsoft.com/VisualStudio/feedback/details/93930/gzipstream-deflatestream-fail-to-check-for-incompressible-data – Cheeso May 08 '11 at 16:02
  • Wasn't talking about the format (good grief). Was talking about the compression utilities that write data in their corresponding formats. – Michael Jun 13 '11 at 15:03
0

I had the same issue with compressing databases containing JPEG data. I tried DotNetZip, a drop-in replacement, and got decent compression (it supports the Compact Framework too!):

MS : 10MB -> 10.0MB
DNZ: 10MB ->  7.6MB
Nightfirecat
Andy Joiner
-2

I don't think GZipStream and DeflateStream are intended to compress files. You would probably have better luck with a file compressor like SharpZipLib.

Dave Swersky
  • they are made to compress and decompress. I'm currently reading MCTS 70-536 certification book and they are used like that there ^^ – kite Oct 05 '10 at 13:40
  • and what are they for? http://msdn.microsoft.com/en-us/library/system.io.compression.gzipstream.aspx "GZipStream Class Provides methods and properties used to compress and decompress streams." – Andrey Oct 05 '10 at 13:41
  • They're perfectly good at compressing files and for many cases handier than zip since they work straight on the file rather than creating an archive, and you can output them straight from a webserver instead of compressing on the fly every time. Appending .gz to the name (after the original extension rather than replacing it) is common with gzip files. Not to say that SharpZipLib isn't better in a lot of cases though. – Jon Hanna Oct 05 '10 at 13:44
  • @kite: I worked at Microsoft PSS and helped develop some of the testing. If it's done in an MS certification book, it's equally likely to be a HORRIBLE way of doing things :) Having said that, there is no compressor that can make an already-compressed file smaller. – Dave Swersky Oct 05 '10 at 13:45
  • @Dave Swersky: That's a rather bold statement. One could use Huffman coding to compress a file, and then zip it to make it even smaller. Depending on how bad your first compression technique is, a second compression technique could make it better or worse. – astellin Oct 05 '10 at 14:20
  • @Excel: I stand corrected. I suppose combining two different types of compression could increase the ratio overall, but I will say using ZIP twice will not work. – Dave Swersky Oct 05 '10 at 16:11