I am writing a JavaScript app that plays live video arriving as a fragmented MP4 file from a server. The intention is to play each piece as soon as it is received, simulating a live broadcast.
The setup code is:
myMediaSource = new MediaSource();
myVideo = document.getElementById('fragvideo');
myVideo.src = window.URL.createObjectURL(myMediaSource);
myMediaSource.addEventListener('sourceopen', function(e) {
    mySourceBuffer = myMediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
}, false);
The code which receives the binary data over the WebSocket is this:
function onmessage(event) {
    if (event.data instanceof ArrayBuffer) {
        var bytes = new Uint8Array(event.data);
        mySourceBuffer.appendBuffer(bytes);
        // Request the next packet from the server after a short delay.
        setTimeout(function() {
            sendmessage();
        }, 50);
    }
}
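In case it's relevant, here is my understanding of the append lifecycle: appendBuffer is asynchronous, and calling it again while the SourceBuffer's updating flag is still true throws an InvalidStateError. So I also sketched a queued-append wrapper that waits for 'updateend' before appending the next chunk (the AppendQueue name and shape are my own, not part of the MSE API):

```javascript
// Queue chunks while the SourceBuffer is busy and flush the next one
// on each 'updateend'. Sketch only; no error handling.
class AppendQueue {
  constructor(sourceBuffer) {
    this.sourceBuffer = sourceBuffer;
    this.queue = [];
    // Flush the next pending chunk once the previous append completes.
    sourceBuffer.addEventListener('updateend', () => this.flush());
  }

  append(chunk) {
    this.queue.push(chunk);
    this.flush();
  }

  flush() {
    // appendBuffer throws InvalidStateError if called while updating.
    if (this.queue.length > 0 && !this.sourceBuffer.updating) {
      this.sourceBuffer.appendBuffer(this.queue.shift());
    }
  }
}
```

With the fixed 50 ms timer above I may just be getting lucky on timing, which is why I wonder whether the appends themselves are the problem.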
I am sending the file from the server in 4K packets, one at a time. I don't get any errors on the JavaScript side, but the video does not play (the video area stays black).
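Since the stream is cut into arbitrary 4K packets, I also wondered whether the first chunk I append actually carries the MP4 initialization segment (the 'ftyp' and 'moov' boxes MSE needs before any 'moof'/'mdat' media data). Here is a small debugging helper I wrote to check; my own code, sketch only (it ignores 64-bit box sizes):

```javascript
// Walk the top-level MP4 boxes in a Uint8Array and collect their type codes.
// Each box starts with a big-endian 32-bit size followed by a 4-char type.
function listBoxTypes(bytes) {
  var types = [];
  var offset = 0;
  while (offset + 8 <= bytes.length) {
    var size = ((bytes[offset] << 24) | (bytes[offset + 1] << 16) |
                (bytes[offset + 2] << 8) | bytes[offset + 3]) >>> 0;
    var type = String.fromCharCode(bytes[offset + 4], bytes[offset + 5],
                                   bytes[offset + 6], bytes[offset + 7]);
    types.push(type);
    if (size < 8) break; // size 0/1 (to-end / 64-bit) not handled in this sketch
    offset += size;
  }
  return types;
}

// True if the chunk looks like an init segment: starts with 'ftyp'
// and contains a 'moov' box.
function looksLikeInitSegment(bytes) {
  var types = listBoxTypes(bytes);
  return types[0] === 'ftyp' && types.indexOf('moov') !== -1;
}
```

If the init segment is split across two 4K packets, the first appendBuffer call would hand MSE an incomplete 'moov', which might explain the black video.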
What am I doing wrong? Is the way I am feeding the data incorrect? Hoping someone can help here.
Thanks.