Questions tagged [mediastream]

The two main components in the MediaStream API are the MediaStreamTrack and MediaStream interfaces. A MediaStreamTrack object represents media of a single type that originates from one media source in the User Agent, e.g. video produced by a web camera. A MediaStream groups several MediaStreamTrack objects into one unit that can be recorded or rendered in a media element. See the Media Capture and Streams specification.

348 questions
34
votes
2 answers

MediaStream Capture Canvas and Audio Simultaneously

I'm working on a project in which I'd like to: load a video and display it on a canvas; use filters to alter the appearance of the canvas (and therefore the video); use the captureStream() method and a MediaRecorder object to…
dsing7
  • 343
  • 1
  • 3
  • 4
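A minimal sketch of one common approach to the question above, assuming a playing `<video>` element whose frames are drawn onto a `<canvas>`; the element names, the 30 fps rate, and the WebM MIME type are illustrative assumptions:

```javascript
// Sketch: record a filtered canvas together with the source video's audio.
function recordCanvasWithAudio(videoEl, canvasEl) {
  const canvasStream = canvasEl.captureStream(30); // video track from the canvas
  // HTMLMediaElement.captureStream() (mozCaptureStream in older Firefox)
  const audioTracks = videoEl.captureStream().getAudioTracks();
  const mixed = new MediaStream([...canvasStream.getVideoTracks(), ...audioTracks]);

  const chunks = [];
  const recorder = new MediaRecorder(mixed, { mimeType: 'video/webm' });
  recorder.ondataavailable = (e) => { if (e.data.size > 0) chunks.push(e.data); };
  recorder.onstop = () => {
    const blob = new Blob(chunks, { type: 'video/webm' });
    // e.g. URL.createObjectURL(blob) to play back or download the result
  };
  recorder.start();
  return recorder; // call recorder.stop() when finished
}
```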
21
votes
3 answers

MediaSource vs MediaStream in Javascript

My JavaScript application gets a WebM video stream over a WebSocket connection. There is no delay between the remote peer sending video frames and the application receiving them. I create a MediaSource object in the application, to which I "append video…
Sergio
  • 241
  • 2
  • 9
21
votes
1 answer

How to addTrack in MediaStream in WebRTC

I'm using WebRTC to communicate between two peers. I want to add a new track to an already-generated stream, so that users can switch their microphones during an audio call. The code I'm using is below. Let "pc" be the…
Akshay Rathore
  • 819
  • 1
  • 9
  • 23
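For the microphone-switching use case above, one commonly used approach is RTCRtpSender.replaceTrack(), which swaps the outgoing track without renegotiation. A sketch, assuming `pc` is an existing RTCPeerConnection (as in the question) and `newDeviceId` was obtained elsewhere, e.g. from enumerateDevices():

```javascript
// Sketch: switch the microphone mid-call without renegotiating.
async function switchMicrophone(pc, newDeviceId) {
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: { deviceId: { exact: newDeviceId } },
  });
  const [newTrack] = stream.getAudioTracks();
  const sender = pc.getSenders().find((s) => s.track && s.track.kind === 'audio');
  if (sender) {
    await sender.replaceTrack(newTrack); // swap in place, no renegotiation
  } else {
    pc.addTrack(newTrack, stream);       // first audio track: fires negotiationneeded
  }
  return newTrack;
}
```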
18
votes
2 answers

Blob video duration metadata

I am writing software that manipulates a camera video stream in Firefox. I am generating a Blob of type video, recorded with the MediaRecorder API. What I am doing to save the blob as a video in local storage is using the FileSaver library: …
Valere
  • 201
  • 2
  • 9
17
votes
2 answers

Video stream sideways on some browser/device combinations

I'm calling getUserMedia() to get a video stream and simply setting the stream as the srcObject of a video element. Specifically on Chrome on 2 different Windows tablets, in portrait mode the video is sideways. I can't find any orientation info in…
Wilhelmina Lohan
  • 2,803
  • 2
  • 29
  • 58
16
votes
1 answer

Why is the 'ended' event not firing for this MediaStreamTrack?

I'd like to be informed about a MediaStreamTrack's end. According to MDN, an ended event is "Sent when playback of the track ends (when the value readyState changes to ended). Also available using the onended event handler property." So I should be…
wpp
  • 7,093
  • 4
  • 33
  • 65
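A likely explanation for the question above: per the spec, 'ended' does not fire when your own code calls track.stop(); it fires only when the source ends for another reason (device unplugged, permission revoked, remote peer stopped sending). A sketch of a common workaround, wrapping stop() so both cases are observed (this wrapper is illustrative, not an official API):

```javascript
// Sketch: observe a track's end from both external causes and our own stop().
function watchTrackEnd(track, onEnded) {
  track.addEventListener('ended', onEnded); // external causes only
  const originalStop = track.stop.bind(track);
  track.stop = () => {                      // workaround: also observe our own stop()
    originalStop();
    onEnded();
  };
}
```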
14
votes
5 answers

How to use web audio api to get raw pcm audio?

How do I use getUserMedia to access the microphone in Chrome and then stream it to get raw audio? I need to get the audio in linear16 (16-bit linear PCM).
jeo.e
  • 213
  • 1
  • 3
  • 10
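A sketch of one way to get linear16 audio: capture Float32 samples from the microphone and scale them to 16-bit signed integers. floatTo16BitPCM is a plain conversion helper; the capture glue below uses ScriptProcessorNode for brevity (deprecated, but widely supported; AudioWorklet is the modern replacement):

```javascript
// Pure helper: convert Float32 samples in [-1, 1] to 16-bit signed PCM.
function floatTo16BitPCM(float32) {
  const out = new Int16Array(float32.length);
  for (let i = 0; i < float32.length; i++) {
    const s = Math.max(-1, Math.min(1, float32[i])); // clamp to [-1, 1]
    out[i] = s < 0 ? s * 0x8000 : s * 0x7fff;        // scale to int16 range
  }
  return out;
}

// Sketch: stream microphone audio as linear16 chunks (buffer size is illustrative).
async function captureLinear16(onChunk) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const processor = ctx.createScriptProcessor(4096, 1, 1);
  processor.onaudioprocess = (e) => {
    onChunk(floatTo16BitPCM(e.inputBuffer.getChannelData(0)));
  };
  source.connect(processor);
  processor.connect(ctx.destination);
}
```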
13
votes
1 answer

webRTC convert webm to mp4 with ffmpeg.js

I am trying to convert WebM files to mp4 with ffmpeg.js. I am recording a video from a canvas (overlaid with some information) and recording the audio data from the video. stream = new MediaStream(); var videoElem =…
q-jack
  • 366
  • 2
  • 3
  • 17
11
votes
2 answers

Is it possible to add a stream as source to an html canvas element as to a html video element?

According to MDN: The HTMLMediaElement interface adds to HTMLElement the properties and methods needed to support basic media-related capabilities that are common to audio and video. HTMLMediaElement.captureStream(). It can be used with both…
sçuçu
  • 2,960
  • 2
  • 33
  • 60
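On the question above: a canvas has no srcObject, so the usual workaround is to attach the stream to an offscreen `<video>` element and copy frames across with drawImage() on each animation frame. A sketch:

```javascript
// Sketch: render a MediaStream onto a canvas via an offscreen video element.
function streamToCanvas(stream, canvas) {
  const video = document.createElement('video');
  video.srcObject = stream;
  video.muted = true;
  video.play(); // returns a promise; fine to ignore in a sketch
  const ctx = canvas.getContext('2d');
  (function draw() {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height); // copy current frame
    requestAnimationFrame(draw);
  })();
}
```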
11
votes
2 answers

MediaRecorder - How to play chunk/blob of video while recording?

I currently have a MediaStream which is being recorded using MediaRecorder. At the end of the recording, after recorder.stop(), it produces a Blob and I am able to play that video back. My goal is to play not the entire video at the end, but to play a…
Bernard
  • 181
  • 2
  • 7
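A sketch of the usual pattern for the question above: have MediaRecorder emit timesliced chunks and append them to a MediaSource SourceBuffer, so playback can start while recording continues. The MIME string and 1-second timeslice are assumptions; the MIME type must match what the recorder actually produces:

```javascript
// Sketch: play MediaRecorder chunks through a MediaSource while still recording.
function playWhileRecording(stream, videoEl) {
  const mime = 'video/webm; codecs="vp8,opus"'; // assumption: adjust per browser
  const mediaSource = new MediaSource();
  videoEl.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', () => {
    const sb = mediaSource.addSourceBuffer(mime);
    const queue = []; // appendBuffer is async; queue chunks while sb.updating
    sb.addEventListener('updateend', () => {
      if (queue.length) sb.appendBuffer(queue.shift());
    });
    const recorder = new MediaRecorder(stream, { mimeType: mime });
    recorder.ondataavailable = async (e) => {
      const buf = await e.data.arrayBuffer();
      if (sb.updating || queue.length) queue.push(buf);
      else sb.appendBuffer(buf);
    };
    recorder.start(1000); // emit a chunk roughly every second
  });
}
```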
10
votes
3 answers

Saving desktopCapturer to video file in Electron

The desktopCapturer api example shows how to write a screen capture stream to a
styfle
  • 22,361
  • 27
  • 86
  • 128
9
votes
3 answers

WebRTC video/audio streams out of sync (MediaStream -> MediaRecorder -> MediaSource -> Video Element)

I am taking a MediaStream and merging two separate tracks (video and audio) using a canvas and the WebAudio API. The MediaStream itself does not seem to fall out of sync, but after reading it into a MediaRecorder and buffering it into a video…
9
votes
1 answer

CanvasCaptureMediaStream / MediaRecorder Frame Synchronization

When using CanvasCaptureMediaStream and MediaRecorder, is there a way to get an event on each frame? What I need is not unlike requestAnimationFrame(), but I need it for the CanvasCaptureMediaStream (and/or the MediaRecorder) and not the window. …
Brad
  • 159,648
  • 54
  • 349
  • 530
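One answer-style sketch for the frame-synchronization question above: captureStream(0) creates a canvas track that only emits a frame when requestFrame() is called, so your own requestAnimationFrame loop becomes the per-frame event:

```javascript
// Sketch: push exactly one captured frame per draw, keeping recorder and
// canvas in lockstep. drawFrame is the caller's per-frame callback.
function recordFramePerDraw(canvas, drawFrame) {
  const stream = canvas.captureStream(0);  // 0 fps: frames are pushed manually
  const [track] = stream.getVideoTracks(); // a CanvasCaptureMediaStreamTrack
  const recorder = new MediaRecorder(stream);
  recorder.start();
  (function loop() {
    drawFrame();          // render this frame (the rAF analogue)
    track.requestFrame(); // hand exactly that frame to the stream
    requestAnimationFrame(loop);
  })();
  return recorder;
}
```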
8
votes
2 answers

enumerateDevices after getUserMedia: how to find the active devices?

Is there a way to detect which device (camera, microphone) is active, given a MediaStream instance? The app I'm currently working on does simply query for such a stream and attaches it to a
DMKE
  • 4,553
  • 1
  • 31
  • 50
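A sketch for the question above: each live track reports the device it came from via MediaTrackSettings.deviceId, which can be matched against enumerateDevices(). matchActiveDevices is a pure helper; the names are illustrative:

```javascript
// Pure helper: keep only the devices whose deviceId appears in activeIds.
function matchActiveDevices(activeIds, devices) {
  return devices.filter((d) => activeIds.includes(d.deviceId));
}

// Sketch: given a live MediaStream, find the device entries backing its tracks.
async function findActiveDevices(stream) {
  const activeIds = stream.getTracks().map((t) => t.getSettings().deviceId);
  const devices = await navigator.mediaDevices.enumerateDevices();
  return matchActiveDevices(activeIds, devices);
}
```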
8
votes
1 answer

What is a good set of constraints for lowest latency audio playback/monitoring with the MediaStream Recording API?

I'm currently spiking out a music application with HTML5/JS and am attempting to achieve the lowest latency I can with the MediaStream Recording API. The app allows a user to record music with a camera and microphone. While the camera and microphone…
Tomek
  • 4,689
  • 15
  • 44
  • 52
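A sketch of constraints often tried for low-latency monitoring: disabling the processing stages that add buffering. Browser support for each constraint varies, and the latency hint is not honored everywhere:

```javascript
// Sketch: getUserMedia constraints aimed at minimal input latency.
const lowLatencyAudio = {
  audio: {
    echoCancellation: false, // each processing stage adds buffering
    noiseSuppression: false,
    autoGainControl: false,
    latency: 0,              // a hint only; not honored by every browser
  },
};
// Usage: navigator.mediaDevices.getUserMedia(lowLatencyAudio)
```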