Our screen recording Chrome extension lets users record their screen with the getDisplayMedia API, which returns a MediaStream that we feed into the MediaRecorder API.
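For context, the stream is obtained roughly like this (a minimal sketch; the exact constraints we pass are an assumption here):

// Prompt the user to pick a screen, window, or tab to capture;
// getDisplayMedia resolves with a MediaStream we can hand to MediaRecorder.
const mediaStream = await navigator.mediaDevices.getDisplayMedia({
  video: true // assumed constraints; the real ones are omitted in this post
});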
Normally, we'd record this stream using the webm video container with the newer vp9 codec like so:
const mediaRecorder = new MediaRecorder(mediaStream, {
  mimeType: "video/webm; codecs=vp9"
});
However, Safari supports neither the webm container nor decoding of the vp9 codec. Since Chrome's MediaRecorder can only record into the webm container but does support the h264 codec (which Safari can decode), we instead record h264 inside a webm container:
const mediaRecorder = new MediaRecorder(mediaStream, {
  mimeType: "video/webm; codecs=h264"
});
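Before constructing the recorder, h264 support can be probed at runtime with MediaRecorder.isTypeSupported (a small sketch; the vp9 fallback is an illustrative assumption):

const preferredType = "video/webm; codecs=h264";
// Chrome reports support for h264-in-webm recording; fall back to vp9 otherwise.
const mimeType = MediaRecorder.isTypeSupported(preferredType)
  ? preferredType
  : "video/webm; codecs=vp9";
const mediaRecorder = new MediaRecorder(mediaStream, { mimeType });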
This works well for two reasons:
- since our recording app is a Chrome extension, we don't mind that it can only record in Chrome
- since the video data is already encoded as h264, we can remux it into an .mp4 container almost instantly (one possible remux approach is sketched below), so Safari viewers can watch the recordings without waiting for an expensive transcoding step (note that the videos can be viewed without the Chrome extension, in a regular web app)
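One way to do that container swap without re-encoding is a plain stream copy; the sketch below uses ffmpeg.wasm purely for illustration (the library, its 0.x API, and the file names are assumptions, not necessarily what our extension uses):

import { createFFmpeg, fetchFile } from "@ffmpeg/ffmpeg"; // assumption: ffmpeg.wasm 0.x

const ffmpeg = createFFmpeg({ log: false });

// Remux h264-in-webm into mp4 by copying the encoded stream (no transcode).
async function webmToMp4(webmBlob) {
  if (!ffmpeg.isLoaded()) await ffmpeg.load();
  ffmpeg.FS("writeFile", "in.webm", await fetchFile(webmBlob));
  await ffmpeg.run("-i", "in.webm", "-c", "copy", "out.mp4");
  const data = ffmpeg.FS("readFile", "out.mp4");
  return new Blob([data.buffer], { type: "video/mp4" });
}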
However, the MediaRecorder API has no method for getting the duration of the stream recorded so far, and measuring it manually with performance.now() proved imprecise (off by 25ms to 150ms). We therefore switched to feeding the recorder's data into a MediaSource, so that we can use sourceBuffer.buffered.end(sourceBuffer.buffered.length - 1) * 1000 to get an accurate reading of the duration recorded so far, in milliseconds.
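Concretely, the duration probe looks roughly like this (a sketch with illustrative names; error handling and the source buffer's "updating" state are ignored here):

const mediaSource = new MediaSource();
const probeVideo = document.createElement("video"); // never shown; only used to attach the MediaSource
probeVideo.src = URL.createObjectURL(mediaSource);

let sourceBuffer;
mediaSource.addEventListener("sourceopen", () => {
  // The source buffer's type must match the recorder's output;
  // this is the call that rejects "video/webm; codecs=h264" (see below).
  sourceBuffer = mediaSource.addSourceBuffer("video/webm; codecs=vp9");
});

mediaRecorder.ondataavailable = async (event) => {
  // Append each recorded chunk so the buffered range grows with the recording.
  sourceBuffer.appendBuffer(await event.data.arrayBuffer());
};

function recordedDurationMs() {
  const buffered = sourceBuffer.buffered;
  return buffered.length > 0 ? buffered.end(buffered.length - 1) * 1000 : 0;
}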
The issue is that the MediaSource rejects our "video/webm; codecs=h264" mime type when we try to create a source buffer for it.
Doing this:
mediaSourceBuffer = mediaSource.addSourceBuffer("video/webm; codecs=h264");
Results in:
Failed to execute 'addSourceBuffer' on 'MediaSource': The type provided ('video/webm; codecs=h264') is unsupported.
Why is the mime type supported by MediaRecorder but not by MediaSource? Since they are of the same API family, shouldn't they support the same mime types? How can we record with the h264 codec while passing the data to a MediaSource using addSourceBuffer?
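The mismatch can be seen directly from the two static support checks; the commented return values reflect the behaviour described above (Chrome accepts the type for recording, but its MSE implementation rejects it):

const type = "video/webm; codecs=h264";
console.log(MediaRecorder.isTypeSupported(type)); // true  (recording works)
console.log(MediaSource.isTypeSupported(type));   // false (addSourceBuffer would throw)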
The only workaround we can think of so far is to create two MediaRecorders: one recording in vp9, so we can read an accurate duration of the video recorded so far via the buffered.end API, and one recording in h264, so we can move the video data straight into an mp4 container without transcoding from vp9 to h264 for Safari users. However, this would be very inefficient, as it would effectively hold twice as much data in RAM.
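For clarity, that dual-recorder workaround would look roughly like this (a sketch only; the 1000 ms timeslice is an illustrative assumption):

// Recorder #1: vp9, used only to feed the MediaSource duration probe.
const durationRecorder = new MediaRecorder(mediaStream, {
  mimeType: "video/webm; codecs=vp9"
});

// Recorder #2: h264, whose output is what we actually keep and remux to mp4.
const exportRecorder = new MediaRecorder(mediaStream, {
  mimeType: "video/webm; codecs=h264"
});

// Both recorders encode the same MediaStream, so every moment of the recording
// is held in memory twice, hence the RAM concern above.
durationRecorder.start(1000);
exportRecorder.start(1000);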
Reproduction cases / CodeSandbox examples
- vp9 example (both work)
- h264 example (media recorder works, media source does not)