I am working on a WebRTC Android project in which I implemented screen sharing, and it works well. During screen sharing the user has two controls: mute the microphone and mute the phone (system) audio. When the user mutes the microphone, the receiver no longer hears the sender's voice; when the user mutes the phone audio, the receiver hears only the sender's voice, not the phone audio. Both of these work. My problem is the remaining case: when the sender mutes the microphone but leaves phone audio enabled, the receiver should hear the phone audio, not the sender's voice. For example, when the sender shares the screen and plays a YouTube video, the receiver should hear the YouTube audio, not the sender. How can I achieve this?
When I disable the audio track, the receiver hears neither audio (phone or microphone). I also tried muting the microphone through the AudioDeviceModule (adm.setMicrophoneMute(true)), which behaved the same way. These are the two things I tried:
1) Mute the microphone through the audio device module:

   final AudioDeviceModule adm = createJavaAudioDevice();
   adm.setMicrophoneMute(true);

2) Disable the local audio track:

   AudioTrack localAudioTrack; // the local WebRTC audio track from my setup
   localAudioTrack.setEnabled(false);
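For clarity, here is how I expect the two controls to map onto these calls. This is only a sketch of the intended logic (applyAudioFilters, micMuted, and phoneAudioMuted are names I made up); the third case in the comments is exactly the one I cannot get to work:

```java
// Sketch of the intended behaviour; adm and localAudioTrack are the
// AudioDeviceModule and local org.webrtc.AudioTrack from my setup.
void applyAudioFilters(boolean micMuted, boolean phoneAudioMuted) {
    // Stop the microphone from being captured when it is muted.
    adm.setMicrophoneMute(micMuted);

    // Keep the outgoing audio track enabled as long as at least one
    // source (microphone or phone audio) should still reach the receiver.
    localAudioTrack.setEnabled(!(micMuted && phoneAudioMuted));

    // Problem case: micMuted == true, phoneAudioMuted == false.
    // With the default audio device module only the microphone is
    // captured, so there is no phone audio to send in the first place.
}
```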
To summarize the expected behavior:

- Microphone muted: the receiver does not hear the sender's voice.
- Phone audio muted: the receiver does not hear any background audio from the phone.
- Microphone muted, phone audio enabled: the receiver hears the background audio from the phone, e.g. YouTube or any other app.

So far I have only been able to implement the microphone mute.
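From what I have found so far, the default WebRTC audio device module only captures the microphone, so phone (system) audio never enters the stream at all. On Android 10 (API 29) and above there is the AudioPlaybackCapture API, which can record audio played by other apps using the same MediaProjection I already hold for screen capture. My understanding (untested) is that I would then have to feed this AudioRecord into WebRTC through a custom audio path. A sketch of the capture side only, assuming mediaProjection is the projection granted for screen sharing:

```java
import android.media.AudioAttributes;
import android.media.AudioFormat;
import android.media.AudioPlaybackCaptureConfiguration;
import android.media.AudioRecord;
import android.media.projection.MediaProjection;

// Requires API 29+ and the RECORD_AUDIO permission. Note that apps can
// opt out of being captured, in which case their audio stays silent here.
AudioRecord buildPlaybackCapture(MediaProjection mediaProjection) {
    AudioPlaybackCaptureConfiguration config =
            new AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                    .addMatchingUsage(AudioAttributes.USAGE_MEDIA) // music/video apps
                    .addMatchingUsage(AudioAttributes.USAGE_GAME)
                    .build();

    return new AudioRecord.Builder()
            .setAudioFormat(new AudioFormat.Builder()
                    .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                    .setSampleRate(48000)
                    .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
                    .build())
            .setAudioPlaybackCaptureConfig(config)
            .build();
}
```

The remaining work would be routing the PCM buffers read from this AudioRecord into the outgoing WebRTC audio, e.g. by replacing the microphone data in a custom audio device module; I have not found a built-in way to do that, which is the part I am stuck on.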