
I am having trouble consuming the response from my WebFlux server via JavaScript's new Streams API.

I can see via curl (with the help of --limit-rate) that the server is slowing down as expected, but when I try to consume the body in Google Chrome (64.0.3282.140), it is not slowing down like it should. In fact, Chrome downloads and buffers about 32 megabytes from the server even though only about 187 kB are passed to write().

Is there something wrong with my JavaScript?

async function fetchStream(url, consumer) {
    const response = await fetch(url, {
        headers: {
            "Accept": "application/stream+json"
        }
    });
    const decoder = new TextDecoder("utf-8");
    let buffer = "";
    await response.body.pipeTo(new WritableStream({
        async write(chunk) {
            // { stream: true } keeps multi-byte characters that span
            // chunk boundaries from being garbled.
            buffer += decoder.decode(chunk, { stream: true });
            const blocks = buffer.split("\n");
            if (blocks.length === 1) {
                return;
            }
            // Every block except the last is a complete line of JSON;
            // the last is carried over until its newline arrives.
            const indexOfLastBlock = blocks.length - 1;
            for (let index = 0; index < indexOfLastBlock; index++) {
                const block = blocks[index];
                const item = JSON.parse(block);
                await consumer(item);
            }
            buffer = blocks[indexOfLastBlock];
        }
    }));
}
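
For reference, here's roughly how I'm driving it; the endpoint and the one-second delay are placeholders, but even a consumer this slow doesn't hold the download back:

// Hypothetical usage: a deliberately slow consumer, which should
// (in theory) throttle the download via backpressure.
fetchStream("/api/items", async item => {
    console.log(item);
    // Simulate slow processing; write() awaits this before continuing.
    await new Promise(resolve => setTimeout(resolve, 1000));
});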

According to the Streams specification,

If no strategy is supplied, the default behavior will be the same as a CountQueuingStrategy with a high water mark of 1.

So the download should slow down when the promise returned by consumer(item) resolves slowly, right?
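
In other words, making that default explicit (sink body elided), this sketch should behave exactly like my code above:

// Per the spec, passing this strategy is equivalent to passing none:
// at most one chunk is queued ahead of the pending write().
await response.body.pipeTo(new WritableStream(
    {
        async write(chunk) {
            // ...same parsing logic as above...
        }
    },
    new CountQueuingStrategy({ highWaterMark: 1 })
));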

Ryan Holdren
  • Have you tried testing while throttling using the "Network conditions" tab in the Chrome dev tools? I'm seeing the appropriate chunkiness. – zero298 Feb 14 '18 at 19:36
  • I have tried that, yes. When I throttle to "Slow 3G", I see the same behaviour where all the items are downloaded, albeit slower. – Ryan Holdren Feb 14 '18 at 20:09
  • I'm not a browser expert, but maybe backpressure information is not part of the contract between the JavaScript engine and the browser's network stack. – Brian Clozel Feb 15 '18 at 08:55
  • I believe it is part of the contract, because it stops downloading eventually. It just buffers *way* too much data beforehand. – Ryan Holdren Feb 22 '18 at 00:08
  • I'm fairly certain your async function must return some type of value in order to `resolve` where it is currently returning `undefined`. It is possible that some odd async behavior (writer.ready() not fired) is occurring because of this. Try modifying your return statement in your `sink`. – Randy Casburn Feb 23 '18 at 03:54
  • I tried your suggestion, @RandyCasburn, and I can confirm that that was not the problem. The behaviour remains the same. – Ryan Holdren Feb 28 '18 at 17:12
  • Ever find a solution for this? I'm trying to find a way to do exactly this and am getting the same results as you. – JoshKraker Dec 23 '21 at 04:54

1 Answer


Looking at the backpressure support in the Streams API, it seems that backpressure information is communicated within the stream chain itself, not over the network. In that case we can assume there's an unbounded queue somewhere, and this would explain the behavior you're seeing.

This other GitHub issue suggests that the backpressure information indeed stops at the TCP level: the browser simply stops reading from the TCP socket, which, depending on the current TCP window size and configuration, means the buffers fill up before TCP flow control kicks in. As that issue states, they can't set the window size manually; they have to let the TCP stack handle things from there.
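
A rough way to observe this yourself (the URL is a placeholder): consume the stream very slowly with a reader and compare what your code has consumed against what the Network tab shows the browser has downloaded; the download races ahead until the socket buffers and TCP receive window fill, and only then stalls:

// Sketch: read deliberately slowly and log how much the page has
// consumed; compare with the Network tab to see how far ahead the
// browser has downloaded before TCP flow control stalls it.
const response = await fetch("/some-large-stream");
const reader = response.body.getReader();
let received = 0;
for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.byteLength;
    console.log(`consumed ${received} bytes so far`);
    await new Promise(resolve => setTimeout(resolve, 1000));
}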

HTTP/2 supports flow control at the protocol level, but I don't know if the browser implementations leverage that with the Streams API.

I can't explain the behavior difference you're seeing, but I think you might be reading too much into the backpressure support here, and that this works as expected according to the spec.

Brian Clozel
  • From what I can see, it seems like the buffer is **not** unbounded, just huge. Chrome seems to buffer about 2000 of my objects and they are about 16 kB so I think the buffer is about 32 megabytes. – Ryan Holdren Feb 28 '18 at 17:16