[SOLVED] readable streams
Chunks (byte[]) are written (and nothing else) directly to the response OutputStream; on the client side, this code inside an async function reads the response's chunks nicely as they arrive, as Uint8Array:
const response = await fetch(url);
const reader = response.body.getReader();
while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  console.log('Received', value);
}
console.log('Response fully received');
Code taken from https://web.dev/fetch-upload-streaming/
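To actually play the stream, here is a minimal sketch of feeding those chunks into jMuxer instead of logging them. It assumes a <video id="player"> element, the JMuxer options used in jMuxer's own h264 example (node, mode, fps, flushingTime), and that feed() copes with H.264 NAL units split across fetch chunks; treat it as a starting point, not a verified implementation.

// Inside the same async function as the snippet above; assumes the JMuxer
// library is already loaded (e.g. via a <script> tag).
const jmuxer = new JMuxer({
  node: 'player',   // id of the <video> element to render into
  mode: 'video',    // H.264 video only, no audio
  flushingTime: 0,
  fps: 30
});

const response = await fetch(url);
const reader = response.body.getReader();
while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  jmuxer.feed({ video: value }); // value is already a Uint8Array of raw H.264
}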
[OLD]
I would like to do the same thing that <img src="stream/video.mjpeg"> does when that URL returns a response with Content-Type: multipart/x-mixed-replace. Somehow the browser reads "chunks" from an unfinished/ongoing response.
Why? I want to stream the .h264 format with jMuxer (/example/h264.html). There, WebSockets are used, but that "feels wrong" because sockets are meant for bidirectional communication (I suppose this is true).
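For reference, the WebSocket pattern in that jMuxer example looks roughly like this (a sketch; the endpoint and option values are placeholders, not the actual demo values):

const jmuxer = new JMuxer({ node: 'player', mode: 'video', fps: 30 });

const ws = new WebSocket('ws://localhost:8080'); // placeholder endpoint
ws.binaryType = 'arraybuffer';
ws.addEventListener('message', (event) => {
  jmuxer.feed({ video: new Uint8Array(event.data) }); // one batch of raw H.264 bytes per message
});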
So, the response which <img> consumes successfully, generated on the server side (Java), looks like:
for (byte[] bytes : listWithFrames) {          // each element is one JPEG frame
    responseOs.write((
        "--BoundaryString\r\n" +
        "Content-Type: image/jpeg\r\n" +
        "Content-Length: " + bytes.length +
        "\r\n\r\n").getBytes());
    responseOs.write(bytes);                   // the frame itself
    responseOs.write("\r\n\r\n".getBytes());
    responseOs.flush();                        // push the part to the client immediately
}
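Reading that same multipart response with fetch does deliver the bytes as they are flushed, but each chunk may contain boundary lines, part headers and JPEG data sliced arbitrarily, so the client would have to buffer and split on --BoundaryString itself. A minimal sketch, inside an async function, that only shows the chunks arriving:

const response = await fetch('stream/video.mjpeg'); // the same URL the <img> consumes
const reader = response.body.getReader();
const decoder = new TextDecoder();
while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  // Boundary/header chunks decode to readable text; JPEG bytes will look like noise.
  console.log('chunk of', value.length, 'bytes:', decoder.decode(value.subarray(0, 40)));
}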
First I tried writing just the bytes on the response and reading them using XMLHttpRequest, as in XMLHttpRequest/Sending_and_Receiving_Binary_Data, but oReq.response is only set when the response ends, and oReq.responseText is non-null only when oReq.responseType = "text";
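That attempt looked roughly like this; the sketch illustrates the limitation rather than a working solution:

const oReq = new XMLHttpRequest();
oReq.open('GET', url);
oReq.responseType = 'arraybuffer';
oReq.onprogress = (e) => {
  // Fires while data arrives, but gives no access to the binary received so far.
  console.log('received', e.loaded, 'bytes so far');
};
oReq.onload = () => {
  // Only reached once the server closes the stream, which never happens here.
  const data = new Uint8Array(oReq.response);
  console.log('complete response:', data.length, 'bytes');
};
oReq.send();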
Base64-encoding each bytes frame to make it visible in oReq.responseText and then decoding it can't be the way either, because it adds a lot of size and oReq.responseText grows forever (that could be avoided by making multiple requests and keeping a buffer to keep feeding jMuxer).
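For completeness, a sketch of that Base64 route, assuming the server sent one Base64-encoded frame per '\n'-terminated line (an assumption, not something shown above); the decoded bytes would then be fed to jMuxer:

const oReq = new XMLHttpRequest();
oReq.open('GET', url);
oReq.responseType = 'text';
let consumed = 0; // index into responseText that has already been decoded
oReq.onprogress = () => {
  const text = oReq.responseText; // keeps growing for the whole lifetime of the request
  let newlineAt;
  while ((newlineAt = text.indexOf('\n', consumed)) !== -1) {
    const base64Frame = text.slice(consumed, newlineAt);
    consumed = newlineAt + 1;
    const bytes = Uint8Array.from(atob(base64Frame), c => c.charCodeAt(0));
    console.log('decoded frame of', bytes.length, 'bytes'); // would go to jmuxer.feed({ video: bytes })
  }
};
oReq.send();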
Do I really need Node? (making-http-request-and-receiving-multipart-x-mixed-replace-response-in-node-js) I am looking for some kind of fetch that gives me access to each received binary part (ArrayBuffer/Uint8Array) as it arrives.