
I have some malformed (obfuscated) JSON traffic coming into our IIS 7.5 server from an external partner, and it causes an exception to be thrown when we try to deserialize it. This particular content has a recognizable signature in the JSON body, namely {"a":[...] }, where the ellipsis is a lot of other single-letter keys with values.

I have created an HttpModule to filter out these incoming requests, which is working fine, and the exceptions related to malformed JSON have gone away. The only issue is that I'm noticing a slow but steady rise in 500-level timeouts reported by the server whenever I dig into the content of the request body (as demonstrated below). They are not extensive, only about 5 per minute per server, and we're dealing with 10k+ requests per minute.

Here is most of the code from this module:

        // read in the stream
        byte[] streamBuffer = new byte[app.Context.Request.InputStream.Length];

        ...

        app.Context.Request.InputStream.Read(streamBuffer, 0, readLength);
        // rewind so downstream handlers can still read the full body
        app.Context.Request.InputStream.Position = 0;
        string body = "";
        body = Encoding.ASCII.GetString(streamBuffer, 0, readLength);

        ...

        // compare it to bad JSON
        matchesBadRequest = body.StartsWith(BAD_JSON_REQUEST, StringComparison.OrdinalIgnoreCase);

        ...

        // if signature matches, send back a 500 error.
        app.Context.Response.StatusCode = 500;
        app.Context.ApplicationInstance.CompleteRequest();
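
For context, this fragment runs inside the module's BeginRequest handler. A trimmed sketch of how it's wired up (simplified; the signature constant and method names here are illustrative, not the exact production code):

    using System;
    using System.Text;
    using System.Web;

    public class BadJsonFilterModule : IHttpModule
    {
        // illustrative value based on the signature described above
        private const string BAD_JSON_REQUEST = "{\"a\":[";

        public void Init(HttpApplication app)
        {
            // subscribe early in the pipeline so bad requests never reach the handler
            app.BeginRequest += OnBeginRequest;
        }

        private void OnBeginRequest(object sender, EventArgs e)
        {
            var app = (HttpApplication)sender;
            // ... stream read and signature check shown in the snippet above ...
        }

        public void Dispose() { }
    }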

Here is what I've done to isolate the problem:

  • I've commented out reading the body, and the timeouts don't occur (though, of course, there's never a match), so this gives me pretty high confidence that this is the key area.

  • I've taken out the match check, just in case slow string matching was what was slowing things down. I've also kept readLength at 6 chars, since that gives me enough confidence in the signature match, so I'm not trying to consume the entire body (see the sketch after this list).

  • I've taken out the response status code and completing the request when a match has been found (letting the exception go through), and it's still causing timeouts.

  • I've changed the response status code to 400 and even 200, and I'm still getting the same number of 500-level timeouts. As a side note, there are 300+ of these bad JSON calls coming through per minute, so their volume doesn't correlate with the timeouts, which are about 5 a minute.
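
To illustrate the reduced read from the second bullet, here's roughly what that variant looks like (a sketch, not the production code; the 6-byte prefix comes from the bullet above, and it avoids allocating a buffer sized to InputStream.Length):

    // only read as many bytes as the signature check actually needs
    const int signatureLength = 6;
    var input = app.Context.Request.InputStream;

    byte[] prefixBuffer = new byte[signatureLength];
    int bytesRead = input.Read(prefixBuffer, 0, prefixBuffer.Length);
    input.Position = 0;   // rewind so downstream handlers still see the full body

    string prefix = Encoding.ASCII.GetString(prefixBuffer, 0, bytesRead);
    bool matchesBadRequest = prefix.StartsWith(BAD_JSON_REQUEST, StringComparison.OrdinalIgnoreCase);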

With this in mind, I'm wondering if reading the stream is just slowing down the process a little too much for some requests that might have a short TTL, and we're starting to see the issues described because they've been waiting too long. I haven't been able to identify the exact requests which are being timed out, so I'm trying to see if there's anything that can be done at the stream-reading level.
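
One way I could try to pinpoint which step is actually slow is to time the individual operations and log only the outliers; something along these lines (a sketch, with an arbitrary 100 ms threshold):

    var sw = System.Diagnostics.Stopwatch.StartNew();
    app.Context.Request.InputStream.Read(streamBuffer, 0, readLength);
    long readMs = sw.ElapsedMilliseconds;

    app.Context.Request.InputStream.Position = 0;
    long rewindMs = sw.ElapsedMilliseconds - readMs;

    // log only slow requests to keep the volume manageable
    if (sw.ElapsedMilliseconds > 100)
    {
        System.Diagnostics.Trace.TraceWarning(
            "Slow body read: read={0}ms rewind={1}ms url={2}",
            readMs, rewindMs, app.Context.Request.RawUrl);
    }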

I'm not using StreamReader to parse the body for reasons described here.

Thanks in advance. Any help would be very welcome!

  • Have you tried reading the stream in smaller chunks instead of the entire length of the stream? – Oppositional Oct 18 '12 at 19:04
  • Yeah, in the second bullet point, I mentioned that I cut the readLength down to 6 characters limit when reading. Or did you mean other ways of getting the same data as InputStream.Read? – alex.zambila Oct 18 '12 at 19:08
  • So you read the first 6 chars, then reset position? I wonder if it is possible that the reset of the stream position consumes more time. You will probably need to instrument the module to isolate what line (read stream, set position, etc.) is actually taking the most time. – Oppositional Oct 18 '12 at 19:41
  • I'm wondering whether you need to have your `streamBuffer` be the full length of the input stream, rather than simply allocating it at the size you actually use. Because it's an array it'll be created as a contiguous object in memory and, if there's fragmentation and contention, you may be waiting on the creation of that space. It's unlikely to be _the_ issue, of course, but it's something to consider. – David T. Macknet Oct 19 '12 at 18:23

0 Answers