
I already looked at this question -

Concatenate parts of two or more webm video blobs

And tried the sample code here - https://developer.mozilla.org/en-US/docs/Web/API/MediaSource -- (without modifications) in hopes of transforming the blobs into arraybuffers and appending those to a sourcebuffer for the MediaSource WebAPI, but even the sample code wasn't working on my chrome browser for which it is said to be compatible.

The crux of my problem is that I can't combine multiple webm blob clips into one without incorrect playback after the first time it plays. To go straight to the problem, skip to the line after the first two chunks of code; for background, continue reading.

I am designing a web application that allows a presenter to record scenes of him/herself explaining charts and videos.

I am using the MediaRecorder WebAPI to record video on Chrome/Firefox. (Side question - is there any other way (besides Flash) that I can record video/audio via webcam and mic? MediaRecorder is not supported in user agents other than Chrome/Firefox.)

navigator.mediaDevices.getUserMedia(constraints)
    .then(gotMedia)
    .catch(e => { console.error('getUserMedia() failed: ' + e); });

function gotMedia(stream) {
    recording = true;
    theStream = stream;
    vid.srcObject = theStream; // URL.createObjectURL(MediaStream) is deprecated
    try {
        recorder = new MediaRecorder(stream);
    } catch (e) {
        console.error('Exception while creating MediaRecorder: ' + e);
        return;
    }

    theRecorder = recorder;
    recorder.ondataavailable = 
        (event) => {
            tempScene.push(event.data);
        };

    theRecorder.start(100);
}

function finishRecording() {
    recording = false;
    theRecorder.stop();
    theStream.getTracks().forEach(track => { track.stop(); });

    while(tempScene[0].size != 1) {
        tempScene.splice(0,1);
    }

    console.log(tempScene);

    scenes.push(tempScene);
    tempScene = [];
}

The function finishRecording gets called and a scene (an array of blobs of mimetype 'video/webm') gets saved to the scenes array. After it is saved, the user can record and save more scenes via this process. They can then view a given scene using the following chunk of code.

function showScene(sceneNum) {
    var sceneBlob = new Blob(scenes[sceneNum], {type: 'video/webm; codecs=vorbis,vp8'});
    vid.src = URL.createObjectURL(sceneBlob);
    vid.play();
}

In the above code, the blob array for the scene gets turned into one big blob, for which a URL is created and pointed to by the video's src attribute, so - [blob, blob, blob] => sceneBlob (an object, not an array)


Up until this point, everything works fine. Here is where the issue starts.

I try to merge all the scenes into one by combining the blob arrays for each scene into one long blob array. The point of this functionality is that the user can order the scenes however they see fit and can choose not to include a scene. So the scenes aren't necessarily in the same order they were recorded in:

scene 1: [blob-1, blob-1]
scene 2: [blob-2, blob-2]
final:   [blob-2, blob-2, blob-1, blob-1]

and then I make a blob of the final blob array, so - final: [blob, blob, blob, blob] => finalBlob. The code for merging the scene blob arrays is below:

function mergeScenes() {
    // Concatenate every scene's blob array into one flat array,
    // appended to the scenes array as a new final "scene".
    var merged = [];
    for (var i = 0; i < scenes.length; i++) {
        merged = merged.concat(scenes[i]);
    }
    scenes.push(merged);
    mergedScenes = merged;
    console.log(merged);
}

This final scene can be viewed by using the showScene function in the second small chunk of code because it is appended as the last scene in the scenes array. When the video is played with the showScene function it plays all the scenes all the way through. However, if I press play on the video after it plays through the first time, it only plays the last scene. Also, if I download and play the video through my browser, the first time around it plays correctly - the subsequent times, I see the same error.

What am I doing wrong? How can I merge the files into one video containing all the scenes? Thank you very much for your time in reading this and helping me, and please let me know if I need to clarify anything.

I am using a `<video>` element to display the scenes.

  • See [How to use Blob URL, MediaSource or other methods to play concatenated Blobs of media fragments?](https://stackoverflow.com/questions/45217962/how-to-use-blob-url-mediasource-or-other-methods-to-play-concatenated-blobs-of) – guest271314 Jul 30 '17 at 14:51

1 Answer


The file's headers (metadata) are only appended to the first chunk of data you've got.
You can't make a new video file by just pasting one recording after the other; these files have a structure.
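For reference, the MediaSource route the question tried would look roughly like this; it only plays back cleanly when the appended buffers together form one coherent stream (the header chunk first, then the media clusters from that same recording). A minimal sketch, where `playBuffers` and its parameters are illustrative names and `MediaSource.isTypeSupported(mime)` should be checked first:

```javascript
// Append each ArrayBuffer of webm data to a single SourceBuffer.
const mime = 'video/webm; codecs="vp8,vorbis"';

function playBuffers(videoEl, arrayBuffers) {
  const mediaSource = new MediaSource();
  videoEl.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', () => {
    const sb = mediaSource.addSourceBuffer(mime);
    let i = 0;
    // SourceBuffer appends are async: wait for each one to finish
    // before appending the next buffer.
    sb.addEventListener('updateend', () => {
      if (i < arrayBuffers.length) sb.appendBuffer(arrayBuffers[i++]);
      else mediaSource.endOfStream();
    });
    sb.appendBuffer(arrayBuffers[i++]); // kick off with the first buffer
  });
}
```

If the buffers come from two independent recordings, the second one starts with its own headers and timestamps, which is exactly why naive concatenation misbehaves.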

So how to workaround this ?

If I understood your problem correctly, what you need is to be able to merge all the recorded videos as if the recording had only been paused. This can be achieved thanks to the MediaRecorder.pause() method.

You can keep the stream open, and simply pause the MediaRecorder. At each pause event, you'll be able to generate a new video containing all the frames from the beginning of the recording, until this event.
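A minimal sketch of that approach, assuming the stream comes from getUserMedia as in the question (function names here are illustrative, not from any demo):

```javascript
// Keep one MediaRecorder running for the whole session and pause it
// between scenes instead of stopping it.
let recorder;
const chunks = [];

function startRecording(stream) {
  recorder = new MediaRecorder(stream);
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0) chunks.push(event.data);
  };
  recorder.start(100); // emit a chunk every 100 ms
}

// Call at the end of each scene instead of stop():
function endScene() {
  recorder.pause();
  // Every chunk since the very first one belongs to the same recording,
  // so this Blob is a single valid webm file up to this point.
  const sceneSoFar = new Blob(chunks, { type: 'video/webm' });
  return URL.createObjectURL(sceneSoFar);
}

// Call when the next scene starts:
function nextScene() {
  recorder.resume();
}
```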

Here is an external demo, because Stack Snippets don't work well with gUM...

And if you ever needed to also have shorter videos covering each resume-to-pause span, you could simply create new MediaRecorders for these smaller parts, while keeping the big one running.
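That two-recorder idea could be sketched like this, with `recordScene` and its parameters being hypothetical names layered on the approach above:

```javascript
// A long-running "master" recorder stays paused between scenes, while a
// short-lived per-scene recorder captures each individual clip from the
// same MediaStream.
function recordScene(stream, masterRecorder) {
  const sceneChunks = [];
  const sceneRecorder = new MediaRecorder(stream);
  sceneRecorder.ondataavailable = (e) => {
    if (e.data.size > 0) sceneChunks.push(e.data);
  };
  masterRecorder.resume();   // the full-length recording continues
  sceneRecorder.start(100);  // the per-scene recording starts fresh

  return {
    stop() {
      return new Promise((resolve) => {
        sceneRecorder.onstop = () =>
          resolve(new Blob(sceneChunks, { type: 'video/webm' }));
        sceneRecorder.stop();   // finalize this scene's standalone file
        masterRecorder.pause(); // freeze the full recording until next scene
      });
    },
  };
}
```

Each scene's Blob is then a complete webm file on its own, so it can be played or re-recorded individually, while the master recorder still yields one continuous final video.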

Kaiido
  • Thank you so much for your answer! I will try it out and let you know! I do need to preserve the individual scenes because I wanted to allow for the user to go back and re-record a particular scene and to choose the order of scene presentation in the final video - do you know if it is possible to achieve these functionalities with your implementation? I would have to experiment with clipping and rearranging the blob arrays associated with each scene, but I have yet to do so.. but now I know where to look so thanks – Upamanyu Sundaram Jun 22 '17 at 16:41
  • So it seems that I can't rearrange segments of the video.. I will leave the question unanswered because I haven't been able to achieve what I came here for, but thank you for the contribution. - PS I knew about the pause functionality haha but didn't use it because I couldn't split the video up into scenes, thanks for the clever idea of instantiating multiple MediaRecorders though – Upamanyu Sundaram Jun 22 '17 at 22:56
    I didn't get that you also wanted to be able to rearrange the different clips... I think it will be hard to get something solid enough across all implementations. I'll try during the weekend, but I think the best approach is to record each clip separately and rearrange them on the backend. There would always be a canvas solution to do it on the front end, but that would be a full re-encoding; not sure your users will be patient enough... – Kaiido Jun 23 '17 at 01:42
  • @Kaiido is the reason why this works because the source gets paused (and therefore modifies the chunks)? Is it correct that it's not possible to just join, say, the first chunk (with the headers) and another chunk other than the 2nd one and have it work on chrome without hanging at the time gap? Are you aware of any work-arounds to seek past such a gap on the playing side rather than the recording side? I'm asking because I'm trying to find a lightweight client-side only solution for my website. – Chris Chiasson Nov 13 '18 at 01:59
  • @Kaiido thanks very much, you are a true life saver <3 after many days trying to find perfect solution I came across your answer and finally resolved my problem. Thanks sooo much. – Filip Kováč Dec 09 '21 at 10:37