
When using WebRTC to record a video directly from the user's webcam, you get a Blob object containing the video.

video.src = window.URL.createObjectURL(blob);

When trying to assign the blob to the video element (for example with the code above, or with alternatives like webkitURL, etc., as shown here: https://stackoverflow.com/a/59934597/311188), it fails in Safari.

The video controls appear, but the duration shows 0 seconds, and pressing play does nothing.
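A frequently reported cause of this symptom is the container/codec of the recording: Safari typically cannot play the WebM blobs that Chrome and Firefox produce by default. Assuming the blob came from MediaRecorder (the question doesn't say, so this is an assumption), a minimal sketch of picking a container the current browser claims to support; `pickSupportedMimeType` is a hypothetical helper, not a standard API:

```javascript
// Hypothetical helper: return the first candidate the given predicate
// accepts, or "" if none are supported. In the browser the predicate
// would be MediaRecorder.isTypeSupported; Safari typically accepts
// "video/mp4" while Chrome/Firefox accept "video/webm" variants.
function pickSupportedMimeType(isTypeSupported, candidates) {
  return candidates.find((t) => isTypeSupported(t)) || "";
}

// Browser usage (sketch):
// const mimeType = pickSupportedMimeType(
//   (t) => MediaRecorder.isTypeSupported(t),
//   ["video/webm;codecs=vp9", "video/webm", "video/mp4"]
// );
// const recorder = new MediaRecorder(stream, mimeType ? { mimeType } : {});
```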

Somebody created an example that achieves this (https://bl.ocks.org/unRob/3bd07a012597aa959c92), but it uses File, and the File constructor is not available in Safari.

I have also been searching for how to create a File in Safari, but nothing works.
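Note that a File should not be strictly necessary here: File extends Blob, and `URL.createObjectURL` accepts a plain Blob, so where the File constructor is unavailable a Blob built from the recorded chunks can stand in. A minimal sketch (the byte data is made up for illustration):

```javascript
// A plain Blob is enough for URL.createObjectURL; the File constructor
// only adds a name and a lastModified timestamp on top of Blob.
const chunks = [new Uint8Array([0x1a, 0x45, 0xdf, 0xa3])]; // hypothetical recorded bytes
const blob = new Blob(chunks, { type: "video/webm" });

// In the browser you would then do:
// video.src = URL.createObjectURL(blob);
console.log(blob.size, blob.type);
```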

How can I put a blob directly into a video element so the user can play and preview what they have just recorded?

FlamingMoe

1 Answer


If you are using WebRTC, I assume you are getting the content by doing something like this:

const mediaStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true })

In that case, you simply need to attach that stream to your video element:

video.srcObject = mediaStream
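A slightly fuller sketch of that approach, with the dependencies injected so the logic can be exercised outside a browser; `attachLivePreview` is a hypothetical helper, and note it shows the *live* camera feed, not a recorded blob. Muting and `playsInline` are commonly needed for autoplay to work on iOS Safari:

```javascript
// Hypothetical helper: wire a MediaStream into a video element for live
// preview. Dependencies are passed in so the function itself is testable.
async function attachLivePreview(video, getUserMedia) {
  const stream = await getUserMedia({ audio: true, video: true });
  video.srcObject = stream;   // live preview: no Blob or object URL involved
  video.muted = true;         // autoplay policies generally require muted playback
  video.playsInline = true;   // keeps iOS Safari from forcing fullscreen
  await video.play();
  return stream;
}

// Browser usage (sketch):
// attachLivePreview(
//   document.querySelector("video"),
//   (c) => navigator.mediaDevices.getUserMedia(c)
// );
```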

  • That's working in Chrome and Firefox with no problems at all ... that's theory ... but iOS / Safari are in other universe ... that does not work at all, but thank you ;-) – FlamingMoe Jun 30 '21 at 21:01
  • That's weird, I understand what you are saying, but I got that working on Chrome, Firefox, and Safari. Have you added the [webrtc-adapter](https://www.npmjs.com/package/webrtc-adapter)? You can check the functionality [here](https://webrtc.github.io/samples/src/content/getusermedia/gum/) – Alejandro Pinola Jul 01 '21 at 14:38
  • Record a video, and then put the video blob recorded in the player ... you'll see in Safari does not work – FlamingMoe Jul 04 '21 at 14:46