
A couple of years back I implemented a mechanism that signals, via a data channel message, that a remote user has muted their local video (e.g., set enabled to false), and then takes the appropriate action on the remote side (e.g., showing the remote user's avatar instead of the black video stream). While recently testing a not directly related feature, I started looking at the states of the video tracks (i.e., the video tracks on the receive stream of the peer connection) and noticed that the muted state on the remote video sometimes fluctuates between true and false, even though there is no actual change to the remote stream itself.
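
For context, a rough sketch of the kind of signaling I mean is below; the channel, message format, and element names are purely illustrative, not my actual implementation:

// Local side: the user mutes their camera, so disable the track and tell the peer
localVideoTrack.enabled = false
muteChannel.send(JSON.stringify({ video: false }))   // muteChannel is an RTCDataChannel

// Remote side: swap the video element for an avatar when the message arrives
muteChannel.onmessage = ({ data }) => {
    const { video } = JSON.parse(data)
    remoteVideoElement.style.display = video ? 'block' : 'none'
    avatarElement.style.display = video ? 'none' : 'block'
}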

It is hard to tell exactly when this occurs, but it MIGHT (I have no real idea whether this is actually the case) correlate with not attaching the media to a sink (e.g., an HTML video element for playback) for a long period of time (e.g., 10 seconds); if it is attached within a short period, the video track does not seem to show muted=true on the receiving side.
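
For what it is worth, this is roughly how I have been inspecting the receive-side state (pc stands for the receiving RTCPeerConnection; the names are illustrative):

// Log the muted flag of every track the peer connection is currently receiving
pc.getReceivers().forEach(receiver => {
    console.log(receiver.track.kind, 'muted =', receiver.track.muted)
})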

The W3C Media Capture and Streams spec (see https://w3c.github.io/mediacapture-main/#track-muted) says: "A MediaStreamTrack is muted when the source is temporarily unable to provide the track with data. A track can be muted by a user. Often this action is outside the control of the application. This could be as a result of the user hitting a hardware switch or toggling a control in the operating system / browser chrome. A track can also be muted by the User Agent." Beyond this, the spec does not seem to address what the causes are or might be.

In the case of WebRTC, can anyone provide some indication as to why the remote video track, as referenced from the WebRTC peer connection, might show a muted state of true when media from the remote is actually flowing? Also, what is the practical value or usage of the muted state on a remote video track when it is not actually reflective of the remote state but of some local processing?

Thanks for any thoughts on this.

SBG

2 Answers


As the documentation says, the muted state can change due to user action, the network, or even the browser itself. If the track shows as muted while data is actually flowing, it could be down to your browser or the other user's browser (among many other possible reasons that don't really matter in practice).

What is this for, you ask? Many of these properties are only really useful for testing purposes (when delving deep into WebRTC development) and are not suitable for driving production behaviour.

As it says in MDN:

When possible, avoid polling muted to monitor the track's muting status. Instead, add event listeners for the mute and unmute events.
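
For example, something along these lines (a minimal sketch; pc is assumed to be the receiving RTCPeerConnection and the show* functions are placeholders for your own UI handling):

pc.ontrack = (event) => {
    const track = event.track
    if (track.kind === 'video') {
        track.onmute = () => showAvatar()     // source temporarily cannot provide frames
        track.onunmute = () => showVideo()    // frames are flowing again
    }
}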

Helder Esteves
  • Yes, thanks. Some additional info in https://www.w3.org/TR/mediacapture-streams/#tracks-and-constraints, Best Practice Number 5, Device muting initiated by user agent: "...Best practice is to mute a camera or microphone track in the following instances: An OS-level event for which the user agent already suspends media playback globally, but JavaScript is not suspended. The rationale is users may otherwise be surprised if capture were to continue in this situation (unless they've intentionally configured it this way). ...." – SBG Oct 20 '20 at 20:11
  • Sorry, missed edit deadline - See also https://www.w3.org/TR/mediacapture-streams/#life-cycle-and-media-flow - "There can be several reasons for a MediaStreamTrack to be muted: the user pushing a physical mute button on the microphone, the user closing a laptop lid with an embedded camera, the user toggling a control in the operating system, the user clicking a mute button in the browser chrome, the User Agent (on behalf of the user) mutes, etc." – SBG Oct 20 '20 at 20:20

I just battled with this bug for the past 3 days and, now that it is solved, I think I have come to a conclusion about the cause.

Basically, I was writing a WebRTC app with more than two peers and needed to keep track of the different RTCPeerConnection objects by separate IDs (e.g., in a JS object keyed by peer ID). When a third peer joined, my code asynchronously initialized multiple RTCPeerConnection objects and added the local MediaStream tracks to them. This caused the video MediaStreamTrack on the receiving ends to keep fluctuating between muted and not muted.

Buggy code:

peerIDsArray.forEach(async (peerID) => {
    // Initialize RTCPeerConnection object and configure it

    // Runs in parallel for every peer
    localMediaStream.getTracks().forEach(track => {
        peerConnectionsObject[peerID].addTrack(track, localMediaStream)
    })
})

This kept producing the mute/unmute flapping on the MediaStreamTrack objects of the receiving peers. My guess is that either the local MediaStream is not meant to be touched from multiple asynchronous callbacks in parallel, or you are not supposed to add tracks to multiple RTCPeerConnection objects at once. Running it in a simple sequential loop solved the issue for me:

Working code:

for(let i = 0; i < peerIDsArray.length; i++) {
    let peerID = peerIDsArray[i]

    // Initialize RTCPeerConnection object and configure it

    // Runs synchronously
    localMediaStream.getTracks().forEach(track => {     
        peerConnectionsObject[peerID].addTrack(track, localMediaStream)
    })
}
Zoraiz