
So I am using navigator.getDisplayMedia() to let the user select which screen source to record. But this currently does not support audio, so I am using navigator.mediaDevices.getUserMedia({ audio: true }) to get an audio stream, then adding the video track from the screen capture to it, and passing the combined stream to MediaRecorder.
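
Roughly, what I am doing looks like this (a simplified sketch; I'm writing getDisplayMedia via navigator.mediaDevices here, and the variable names are just illustrative):

// Capture the screen (video only) and the microphone separately,
// combine the tracks into one stream, and record that stream.
const screenStream = await navigator.mediaDevices.getDisplayMedia({ video: true });
const micStream = await navigator.mediaDevices.getUserMedia({ audio: true });

const combined = new MediaStream([
  ...screenStream.getVideoTracks(),
  ...micStream.getAudioTracks(),
]);

const recorder = new MediaRecorder(combined);
const chunks = [];
recorder.ondataavailable = (e) => chunks.push(e.data);
recorder.onstop = () => {
  const blob = new Blob(chunks, { type: recorder.mimeType });
  // save or upload the blob here
};
recorder.start();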

Mine is a video conferencing web app. I manage to capture all kinds of audio, but when another user joins the room, I do not get their audio in the recordings.

Has anyone faced a similar issue?

aditya

1 Answer


getUserMedia gives you the microphone

navigator.mediaDevices.getUserMedia({audio: true}) gives you the user's microphone, not audio from their system.

Their microphone might pick up some system audio, but there's no guarantee. For instance, it won't if:

  1. They have their volume turned down
  2. They're using a headset

The specific reason you're not hearing other participants is that echoCancellation is on by default in all browsers.

Without echo cancellation, when your mic picks up a remote speaker, their voice is sent right back to them, and they hear an echo. If their mic doesn't cancel that echo either, you get a feedback loop that builds up even more echo.

If you know what you're doing, you can turn off echoCancellation with:

await navigator.mediaDevices.getUserMedia({audio: {echoCancellation: false}});

or after the fact with:

await stream.getAudioTracks()[0].applyConstraints({echoCancellation: false});

Use web audio instead

If this is your own web conferencing app, then you should already have all the necessary audio tracks. You should be able to feed at least the active speaker directly to MediaRecorder, or perhaps even mix audio using web audio.
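
For example, something along these lines (a rough sketch, not a drop-in solution; screenStream, micStream and remoteStreams stand for streams your app already has):

// Mix the local mic and all remote audio streams into one track with web audio,
// then record that mix together with the captured screen video.
const audioCtx = new AudioContext();
const mixDestination = audioCtx.createMediaStreamDestination();

for (const stream of [micStream, ...remoteStreams]) {
  audioCtx.createMediaStreamSource(stream).connect(mixDestination);
}

const recorder = new MediaRecorder(new MediaStream([
  ...screenStream.getVideoTracks(),
  ...mixDestination.stream.getAudioTracks(),
]));
recorder.start();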

Screen sharing audio is coming

Audio from screen-sharing is in the spec, but it's optional, both for browsers to implement and at the end user's discretion.

So far, I believe Chrome is working on implementing it.
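
Once a browser implements it, requesting it should look something like this (a sketch; whether you actually get an audio track depends on browser support and on what the user chooses to share):

const stream = await navigator.mediaDevices.getDisplayMedia({
  video: true,
  audio: true, // optional in the spec; may be ignored or declined
});

if (stream.getAudioTracks().length === 0) {
  // No screen-share audio was granted; fall back to the microphone / web audio mixing.
}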

jib