3

I'm trying to send a big chunk of data through WCF (some GBs). I would like to compress the file while reading it using Streams, but it looks like DeflateStream has two modes:

  • Compress (writes on the stream)
  • Decompress (reads the stream)

None of these modes works in my case. I would like to read an uncompressed file from disk and return a compressed stream through WCF.
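
For illustration, this is roughly what I'd like to do (the file path is just an example), but Compress mode expects a writable base stream:

// using System.IO; using System.IO.Compression;
Stream compressed = new DeflateStream(
    File.OpenRead(@"C:\MyDir\MyFile.dat"),  // readable, not writable
    CompressionMode.Compress);              // Compress mode wants to write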

Is there a way to do so, or do I have to use a temporary file (or MemoryStream)?

Is this a missing feature, or is it just not possible for some reason?

Kristian Hebert
  • 357
  • 1
  • 13
Olmo
  • 4,257
  • 3
  • 31
  • 35
  • Are you sure reading from `new DeflateStream(fileStream, CompressionMode.Compress)` does not work? – dtb May 09 '12 at 16:16
  • new DeflateStream(fi.OpenRead(), CompressionMode.Compress) throws "The base stream is not writeable" – Olmo May 09 '12 at 16:18
  • Also, DeflateStream.Read help says: "Reads a number of decompressed bytes into the specified byte array." DeflateStream.Write says: "Writes compressed bytes to the underlying stream from the specified byte array" – Olmo May 09 '12 at 16:20
  • Draw a picture on paper to see what you want to be compressed. (File -> Method that reads file -> WCF channel -> optional network -> receiver). There is a good chance that you are trying to compress at the wrong time (i.e. either sending a compressed byte array or properly doing streaming of the response). – Alexei Levenkov May 09 '12 at 16:56
  • It's quite simple: I want to read a stream from disk, compress it on the way while reading, and send the readable stream through WCF. The problem is that it only compresses when writing :S – Olmo May 09 '12 at 17:15

6 Answers

3

It looks like you are trying to compress while reading the file. The way DeflateStream is written, compression has to happen as part of a write. Try wrapping the stream that you are sending over the wire, not the stream that you are reading off disk. If they are the same, you need an intermediate stream.
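
For example, something along these lines (a rough sketch; the method name and `outgoingStream` are placeholders for whatever writable stream goes over the wire):

// using System.IO; using System.IO.Compression;
public static void WriteCompressed(string path, Stream outgoingStream)
{
    using (FileStream fileStream = File.OpenRead(path))
    using (DeflateStream compressor = new DeflateStream(outgoingStream, CompressionMode.Compress, true))
    {
        // Everything written to the DeflateStream comes out compressed on outgoingStream.
        fileStream.CopyTo(compressor);
    }
}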

Chris Shain
  • 50,833
  • 6
  • 93
  • 125
  • I was thinking so. I would need some kind of Stream Reverser that writes in a buffer when the caller tries to read. And another that reads and writes or something like this. Awful... – Olmo May 09 '12 at 16:34
  • Yes, and that's clearly a non-starter. Maybe this will help? http://msdn.microsoft.com/en-us/library/ms751458.aspx – Chris Shain May 09 '12 at 19:39
  • While not exactly the same, I've ended up enabling compression for WCF at the IIS level like this: http://www.hanselman.com/blog/EnablingDynamicCompressionGzipDeflateForWCFDataFeedsODataAndOtherCustomServicesInIIS7.aspx Thank you all – Olmo May 12 '12 at 17:42
3

Try using these methods for compressing and decompressing a byte array.

    private static byte[] Compress(byte[] data)
    {
        using (MemoryStream compressedMemoryStream = new MemoryStream())
        {
            // Leave the underlying MemoryStream open (third argument) so it can be read
            // after the DeflateStream is closed; closing the DeflateStream flushes the
            // remaining compressed bytes into it.
            using (DeflateStream compressStream = new DeflateStream(compressedMemoryStream, CompressionMode.Compress, true))
            {
                compressStream.Write(data, 0, data.Length);
            }
            return compressedMemoryStream.ToArray();
        }
    }

    private static byte[] Decompress(byte[] data)
    {
        using (MemoryStream compressedMemoryStream = new MemoryStream(data))
        using (DeflateStream decompressStream = new DeflateStream(compressedMemoryStream, CompressionMode.Decompress))
        using (MemoryStream decompressedMemoryStream = new MemoryStream())
        {
            decompressStream.CopyTo(decompressedMemoryStream);
            return decompressedMemoryStream.ToArray();
        }
    }
Vegard Innerdal
  • 349
  • 2
  • 10
  • 2
    This will mean holding GBs of data in memory. I would prefer saving the deflated file on disk. – Olmo May 09 '12 at 16:36
2

You should have something like:

public void CompressData(Stream uncompressedSourceStream, Stream compressedDestinationStream)
{
    using (DeflateStream compressionStream = new DeflateStream(compressedDestinationStream, CompressionMode.Compress))
    {
        uncompressedSourceStream.CopyTo(compressionStream);
    }
}

public void DecompressData(Stream compressedSourceStream, Stream uncompressedDestinationStream)
{
    using (DeflateStream decompressionStream = new DeflateStream(compressedSourceStream, CompressionMode.Decompress))
    {
        decompressionStream.CopyTo(uncompressedDestinationStream);
    }
}

using (FileStream sourceStream = File.OpenRead(@"C:\MyDir\MyFile.txt"))
using (FileStream destinationStream = File.OpenWrite(@"C:\MyDir\MyCompressedFile.txt.cp"))
{
    CompressData(sourceStream, destinationStream);
}

Also, be aware that you may have to change the WCF settings in your application's .config file to allow transferring very large messages.
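
As a rough sketch of the kind of settings involved (assuming basicHttpBinding; the equivalent attributes can also be set in the <bindings> section of the .config file):

// using System; using System.ServiceModel;
BasicHttpBinding binding = new BasicHttpBinding
{
    TransferMode = TransferMode.Streamed,    // stream the message body instead of buffering it
    MaxReceivedMessageSize = long.MaxValue,  // allow multi-GB transfers
    SendTimeout = TimeSpan.FromHours(1)      // big transfers need generous timeouts
};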

JamieSee
  • 12,696
  • 2
  • 31
  • 47
  • If you do so, when you call compressed.Read(...) it throws "Stream does not support reading.". You just delay the exception that was previously thrown in the constructor. – Olmo May 09 '12 at 16:31
  • Edited. I realized something about the documentation for DeflateStream that is unclear. The constructor stream for DeflateStream is actually your destination stream. – JamieSee May 09 '12 at 17:05
  • But then the destination stream is a writable stream. I need a readable stream in order to send it through WCF. http://www.codeproject.com/Articles/20364/Progress-Indication-while-Uploading-Downloading-Fi. I know I could store a file and delete it afterwards, but it looks clunky. – Olmo May 09 '12 at 17:14
  • Perhaps it would be clearer this way (edited again). As long as you already have a stream that is writeable, destination stream can be anything that you want. – JamieSee May 09 '12 at 17:32
  • Ok... so after looking over the project link you provided, I see that your problem is that in one direction, you need to be able to provide a stream that doesn't already exist to the RemoteFileInfo class. One solution would be to build a custom stream based on a queue to avoid having the entire contents in memory at once. It probably is easiest to just write the compressed file and then delete it when finished. – JamieSee May 09 '12 at 20:15
  • FYI, the SlidingStream implementation in this question (http://stackoverflow.com/questions/8221136/fifo-queue-buffer-specialising-in-byte-streams) looks potentially interesting. – JamieSee May 09 '12 at 22:20
1

You can wrap the DeflateStream in a stream of your own. Every time you want to read from the compressing stream, you have to feed bytes into the DeflateStream until it writes to a buffer. You can then return bytes from that buffer.

public class CompressingStream : Stream
{
    private readonly DeflateStream _deflateStream;
    private readonly MemoryStream _buffer;
    private Stream _inputStream;
    private readonly byte[] _fileBuffer = new byte[64 * 1024];

    public CompressingStream(Stream inputStream)
    {
        _inputStream = inputStream;
        _buffer = new MemoryStream();
        _deflateStream = new DeflateStream(_buffer, CompressionMode.Compress, true);
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        while (true)
        {
            // Serve whatever compressed bytes are already available in the output buffer.
            var read = _buffer.Read(buffer, offset, count);

            if (read > 0) return read;

            // Input is exhausted and the buffer is drained: end of stream.
            if (_inputStream == null) return 0;

            // Refill: compress the next chunk of the input into the output buffer.
            _buffer.Position = 0;
            read = _inputStream.Read(_fileBuffer, 0, _fileBuffer.Length);
            if (read == 0)
            {
                _inputStream.Close();
                _inputStream = null;
                // Closing the DeflateStream flushes the last compressed bytes into _buffer.
                _deflateStream.Close();
            }
            else
            {
                _deflateStream.Write(_fileBuffer, 0, read);
            }
            // Truncate the buffer to what was just written and rewind it for reading.
            _buffer.SetLength(_buffer.Position);
            _buffer.Position = 0;
        }
    }

    public override bool CanRead
    {
        get { return true; }
    }
#region Remaining overrides...
}

Whenever WCF reads from the stream, the compressing stream writes into the DeflateStream until it can read from the output buffer (_buffer). It's ugly, but it works.
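
For illustration, a minimal usage sketch from a WCF service operation (the operation name and file path are made up):

// using System.IO;
public Stream GetCompressedFile()
{
    // WCF reads from the returned stream; CompressingStream compresses on the fly
    // and closes the underlying FileStream once it has been read to the end.
    return new CompressingStream(File.OpenRead(@"C:\MyDir\MyFile.dat"));
}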

Kristian Hebert
  • 357
  • 1
  • 13
0

I was trying to create a Stream that, whenever Read is invoked:

  • Reads a chunk of data from the source file
  • Writes the chunk to a DeflateStream connected to a MemoryStream
  • Copies the content of the MemoryStream to the Read buffer parameter.

Of course, it is more complicated than that, since the sizes of the two streams are not the same.

In the end I dismissed this option, since I found no way to predict the size of the resulting compressed file without fully compressing it first.

Reading the plain file, on the other hand, lets you know its size in advance, so maybe it would be possible with another implementation of DeflateStream.

Hope it helps other lost souls out there...

Olmo
  • 4,257
  • 3
  • 31
  • 35
0

The Azure API for blobs offers an alternative to UploadStream(stream): you can get the destination stream with OpenWrite(). That way you are in control of pushing the bytes, and can therefore compress while streaming content to the service.

using (var uploadStream = blob.OpenWrite())
using (var deflateStream = new DeflateStream(uploadStream, CompressionMode.Compress))
{
    stream.CopyTo(deflateStream);
}

I haven't checked the WCF API, but I would be surprised if you cannot do the same.

Stig
  • 1,974
  • 2
  • 23
  • 50