I'm using the latest native iOS GoogleWebRTC pod version 1.1.29400

I'm attempting to disable/enable the audio track on demand, as specified in the WebRTC spec here. In Objective-C, my understanding is that we call the isEnabled setter:

RTCMediaStreamTrack *track = self.localTracks[trackId];
track.isEnabled = NO;  // or YES to re-enable

In my app on iOS 13.1.3, this has no effect on the remote audio stream. It continues to play.

This is my understanding based on multiple examples:

I cannot find anywhere in the official WebRTC iOS example where they call isEnabled on a RTCMediaStreamTrack.

I do see them call RTCAudioSession.isAudioEnabled from the view controller, here. But when I built that app and experimented with RTCAudioSession, setting isAudioEnabled to false muted both the microphone input and the speaker output. I only want to disable the speaker output while keeping the mic hot.

Any guidance or tips would be appreciated :)

Corey Cole

3 Answers

In the WebRTC M80 release notes, they state that the precompiled binary mobile libraries are being deprecated.

To stay up to date with the latest bugfixes and features for the native mobile libraries (iOS and Android), we need to build from source.

After I built the AppRTCMobile example app against a WebRTC.framework built from source, I made a few changes and verified that I was able to mute the remote audio track on demand.

In ARDAppClient.h I add strong properties for the remote RTCMediaStream and its audio track, plus a declaration for a method that toggles the stream mute:

@property(nonatomic, strong) RTCMediaStream *remoteAudioStream;
@property(nonatomic, strong) RTCAudioTrack *remoteAudioTrack;
// ...
- (void)setRemoteAudioEnabled:(BOOL)enabled;

In ARDAppClient.m, in the RTCPeerConnectionDelegate section, I implement the didAddStream delegate method and save a reference to the remote stream:

- (void)peerConnection:(RTCPeerConnection *)peerConnection
          didAddStream:(RTCMediaStream *)stream {
  RTCLog(@"Stream with %lu video tracks and %lu audio tracks was added.",
         (unsigned long)stream.videoTracks.count,
         (unsigned long)stream.audioTracks.count);
  _remoteAudioStream = stream;
}

In ARDAppClient.m I also add a function to mute/unmute the stream we now have a reference to:

- (void)setRemoteAudioEnabled:(BOOL)enabled {
  if (_state == kARDAppClientStateDisconnected) {
    return;
  }
  RTCLog(@"Setting remote stream to be %s", enabled ? "Enabled" : "Not Enabled");
  RTCLog(@"Number of remote audio tracks = %lu", (unsigned long)_remoteAudioStream.audioTracks.count);
  if (_remoteAudioStream.audioTracks.count == 0) {
    RTCLog(@"ERROR no audio tracks to disable!");
    return;
  }
  _remoteAudioTrack = _remoteAudioStream.audioTracks[0];
  [_remoteAudioTrack setIsEnabled:enabled];
}

Finally, in ARDVideoCallViewController.m I override the switch-camera button to toggle the remote audio track mute:

- (void)videoCallViewDidSwitchCamera:(ARDVideoCallView *)view {
  // [_captureController switchCamera];
  self.audioEnabled = !self.audioEnabled;
  [_client setRemoteAudioEnabled:self.audioEnabled];
}
Corey Cole

Works with the current GoogleWebRTC pod. Mutes/unmutes the local mic and the remote stream, respectively:

func setMic(enabled: Bool) {
    for sender in peerConnection.senders {
        if sender.track?.kind == "audio" {
            sender.track?.isEnabled = enabled
        }
    }
}

func setRemoteTrack(enabled: Bool) {
    for receiver in peerConnection.receivers {
        if receiver.track?.kind == "audio" {
            receiver.track?.isEnabled = enabled
        }
    }
}
Elio_

I faced a similar problem on Android. Since the WebRTC SDKs for iOS and Android are pretty similar, you can use AudioTrack.setVolume(0.0) to mute and AudioTrack.setVolume({value greater than zero}) to unmute the remote audio track. The isEnabled property does not mute/unmute for some reason.
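For iOS readers, a rough Swift equivalent of this volume-based approach is sketched below. It is an assumption on my part that this fits your setup: it relies on the RTCAudioSource.volume property exposed by WebRTC.framework (which accepts values from 0 to 10), and `peerConnection` stands in for your own RTCPeerConnection instance:

```swift
import WebRTC

// Sketch: mute/unmute remote audio by setting the source volume
// instead of toggling isEnabled. `peerConnection` is assumed to be
// the active RTCPeerConnection.
func setRemoteVolume(_ volume: Double, on peerConnection: RTCPeerConnection) {
    for receiver in peerConnection.receivers {
        if let audioTrack = receiver.track as? RTCAudioTrack {
            // RTCAudioSource.volume ranges from 0 (silent) to 10.
            audioTrack.source.volume = volume
        }
    }
}

// Usage: setRemoteVolume(0.0, on: pc) to mute,
//        setRemoteVolume(10.0, on: pc) to restore.
```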

Arsenius