
I'm working on some compression/decompression for a web-api application.

I've already implemented most of it thanks to multiple articles on the web and questions posted here.

However, I'm still stuck on one problem that, remarkably, doesn't seem to have been asked here yet.

In short, I need to support streaming of large amounts of data, both in responses and in requests, both compressed. I have already implemented the DelegatingHandler and created 2 HttpContent classes, one for compressed content (response) and the other for decompressed content (request).
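
For context, the handler wires these two content types up roughly along these lines (a simplified sketch; the class names `CompressedContent`, `DecompressedContent` and `GZipCompressor` are illustrative, not the exact implementation):

using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public class CompressionHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // If the client sent compressed content, wrap it so it gets
        // decompressed when something further down the pipeline reads it.
        if (request.Content != null &&
            request.Content.Headers.ContentEncoding.Contains("gzip"))
        {
            request.Content = new DecompressedContent(request.Content, new GZipCompressor());
        }

        HttpResponseMessage response = await base.SendAsync(request, cancellationToken);

        // If the client accepts gzip, wrap the response content so it is
        // compressed while being written to the output stream.
        if (response.Content != null &&
            request.Headers.AcceptEncoding.Any(e => e.Value == "gzip"))
        {
            response.Content = new CompressedContent(response.Content, new GZipCompressor());
        }

        return response;
    }
}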

Compressing the response works perfectly using the following code:

protected override Task SerializeToStreamAsync(System.IO.Stream stream, System.Net.TransportContext context)
{
    // Wrap the output stream in a compression stream and copy the original
    // content into it, so the response is compressed as it is written out.
    Stream compressionStream = this.Compressor.CreateCompressionStream(stream);

    return this.OriginalContent.CopyToAsync(compressionStream).ContinueWith(task =>
    {
        if (compressionStream != null)
        {
            compressionStream.Dispose();
        }
    });
}

I create a compression stream and copy the original content into it. However, when it comes to decompressing the request, I'm currently using the following code:

protected override Task SerializeToStreamAsync(System.IO.Stream stream, System.Net.TransportContext context)
{
    // ReadAsStreamAsync().Result blocks until the original request stream is available.
    Stream compressionStream =
        this.Compressor.CreateDecompressionStream(this.OriginalContent.ReadAsStreamAsync().Result);

    return compressionStream.CopyToAsync(stream).ContinueWith(task =>
    {
        if (compressionStream != null)
        {
            compressionStream.Dispose();
        }
    });
}

As you can see, I'm forced to read the original request as a stream before copying it through the decompression stream and sending it further down the pipeline.

This may not be good practice when large amounts of data are posted to the service. So, the question: is this the right way to do it? I've been looking for a more seamless way to do this.

I was thinking of implementing a blocking stream (adapted to the new TAP), but I got stuck again because HttpContent is completely TAP-based, meaning everything returns a Task, while I need a handle to the actual Stream.
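
For illustration, awaiting the original stream instead of blocking on .Result would look roughly like this (a sketch using the same Compressor and OriginalContent members as above; whether the data is actually streamed still depends on how the host buffers the request):

protected override async Task SerializeToStreamAsync(System.IO.Stream stream, System.Net.TransportContext context)
{
    // Await the original request stream instead of blocking on .Result,
    // then pump it through the decompression stream as it is read.
    System.IO.Stream originalStream = await this.OriginalContent.ReadAsStreamAsync();

    using (System.IO.Stream decompressionStream = this.Compressor.CreateDecompressionStream(originalStream))
    {
        await decompressionStream.CopyToAsync(stream);
    }
}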


1 Answer


I can answer part of your question. By the time you call this.OriginalContent.ReadAsStreamAsync(), your request's content has already been buffered, because by default Web API buffers all incoming requests. But yeah, for larger requests it makes sense not to use this buffered mode. However, you can change this default buffer policy.

Example below (if you are using WebHost; this does not work for SelfHost):

// e.g. in WebApiConfig.Register(HttpConfiguration config)
config.Services.Replace(typeof(IHostBufferPolicySelector), new CustomBufferPolicySelector());

public class CustomBufferPolicySelector : WebHostBufferPolicySelector
{
    // This method gets called for every incoming request. You can inspect the HttpContextBase instance
    // to decide whether you would want buffered/non-buffered way of handling individual requests.
    public override bool UseBufferedInputStream(object hostContext)
    {
        HttpContextBase contextBase = hostContext as HttpContextBase;

        //by default, this returns 'true'  
        return base.UseBufferedInputStream(hostContext);
    }

    // just fyi
    public override bool UseBufferedOutputStream(HttpResponseMessage response)
    {
        return base.UseBufferedOutputStream(response);
    }
}

Now, with the above custom policy in place and returning false for a given request, this.OriginalContent.ReadAsStreamAsync() would hand you an un-buffered stream.
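
For example, a sketch of how you could turn off input buffering only for compressed requests (the Content-Encoding check here is just an illustration) inside the class above:

public override bool UseBufferedInputStream(object hostContext)
{
    HttpContextBase contextBase = hostContext as HttpContextBase;

    // Illustration only: switch to streamed (un-buffered) mode when the
    // client indicates compressed content; keep the default otherwise.
    if (contextBase != null &&
        !string.IsNullOrEmpty(contextBase.Request.Headers["Content-Encoding"]))
    {
        return false;
    }

    return base.UseBufferedInputStream(hostContext);
}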

  • Interesting. I remember reading about this policy some weeks ago. But with that on, I would still be buffering it before pushing it through the decompression stream. I was hoping the new HttpClient and portable compression library would give me some insights, but so far not really. – Ronald Jun 18 '13 at 17:59