I am streaming data to the browser like this:
res.writeHead(200, {
  'Content-Type': 'text/event-stream',
  'Content-Disposition': 'attachment; filename="data.dat"',
  'Cache-Control': 'no-cache',
  'Connection': 'keep-alive'
});
stream1.pipe(res, {end: false});
stream1.on('end', () => {
  console.log("stream 1 finished");
  stream2.pipe(res, {end: false});
  stream2.on('end', () => {
    console.log("last stream finished");
    res.end();
  });
});
On the Firebase Functions emulator this works fine: the download starts immediately, and curl -v shows the response headers right away and starts downloading.
But when I deploy the function to production, the same code behaves differently: the download does not start immediately, and curl -v does not even show the response headers.
It seems that the download only starts on the client after the server has completely finished writing all streams. And when the streams are large, the client gets "Error: could not handle the request", while the logs show no errors and suggest that the cloud function finished writing all streams.
Could it be a buffering configuration problem like the one described here? https://stackoverflow.com/a/66656773/176336