
I am working on a WebRTC Android project in which I have implemented screen sharing, and it works well. During screen sharing the user has two filters: mute the microphone and mute the phone audio. When the user mutes the microphone, the receiver no longer hears the sender's voice, and when the user mutes the phone audio, the receiver hears only the sender's voice, not the phone audio; both of these work. My concern is the remaining case: when the sender mutes the microphone but allows phone audio, the receiver should hear the phone audio instead of the sender's voice. For example, when the sender shares the screen and plays a YouTube video, the receiver should hear the YouTube audio, not the sender. How can I achieve this?

I tried disabling the local audio track, but then the receiver could not hear either source, neither phone audio nor microphone. I also tried adm.setMicrophoneMute(true) on the AudioDeviceModule, and it behaved the same way as described above. These are the two approaches:

1) Mute the microphone via the AudioDeviceModule:

    final AudioDeviceModule adm = createJavaAudioDevice();
    adm.setMicrophoneMute(true);

2) Disable the local audio track:

    // localAudioTrack is the local org.webrtc.AudioTrack added to the peer connection
    localAudioTrack.setEnabled(false);

I expect that when I mute the microphone, the receiver does not hear the sender's voice; when I mute the phone audio, the receiver does not hear any background audio; and when I mute the microphone but enable the phone audio, the receiver hears the background audio, i.e. YouTube or any other audio playing on the phone. So far I have only been able to implement the microphone mute.
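To make the three filter states explicit, here is a minimal sketch of the behaviour I am after (adm and localAudioTrack are the objects from the snippets above; the method name is just for illustration):

    // Sketch of the intended filter behaviour; the last case is the open problem.
    void applyFilters(boolean muteMic, boolean mutePhoneAudio) {
      if (muteMic && mutePhoneAudio) {
        // Receiver hears nothing.
        adm.setMicrophoneMute(true);
        localAudioTrack.setEnabled(false);
      } else if (!muteMic && mutePhoneAudio) {
        // Receiver hears only my voice -- this already works.
        adm.setMicrophoneMute(false);
        localAudioTrack.setEnabled(true);
      } else if (muteMic && !mutePhoneAudio) {
        // Receiver should hear only the phone audio (e.g. a YouTube video);
        // this is the case I cannot implement with these two calls alone.
        adm.setMicrophoneMute(true);
        localAudioTrack.setEnabled(true);
      }
    }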

  • Hello, I'm working on a similar project with WebRTC and I would like to know if you succeeded in recording internal audio and sending it to the WebRTC server? Best regards – florian-do Nov 11 '19 at 22:14
  • I am looking for an answer too, but you should have root privileges – nima moradi Nov 12 '19 at 19:12
  • I don't think we need root privileges to be able to send the audio to WebRTC – florian-do Nov 12 '19 at 20:04
  • We need root access for system audio – nima moradi Nov 13 '19 at 11:41
  • I can send the microphone audio through WebRTC but I can't send the internal sound – florian-do Nov 13 '19 at 15:43
  • @nimamoradi I've just read that since Android 7, Google has removed the right to record internal audio and the only way is to root your phone, so yep, you're right! That's pretty sad. Thanks – florian-do Nov 13 '19 at 18:04
  • @florian-do there is a way to record audio using MediaProjection on Android 10 only, which needs no root – nima moradi Nov 14 '19 at 18:31
  • @nimamoradi oh thanks! Now we have to find a way to send the buffer directly to WebRTC – florian-do Nov 15 '19 at 16:34
  • @florian-do can you share your answer, please? – nima moradi Nov 15 '19 at 17:37
  • @nimamoradi hello, from this GitHub file we can get the stream of bytes directly from the internal audio source: https://github.com/julioz/AudioCaptureSample/blob/master/app/src/main/java/com/zynger/audiocapturesample/AudioCaptureService.kt – florian-do Nov 18 '19 at 16:04
  • I solved this issue by transferring the media projection to the WebRtcAudioRecord.java file. You can check this answer for more detail: stackoverflow.com/a/71716394/2209469 – Selim Emre Toy Apr 02 '22 at 09:40

1 Answer

  1. You can share the system audio with the AudioPlaybackCapture API, which is available on Android 10 (API 29) and above; see the Android documentation on audio playback capture for a more detailed implementation.

    In the WebRtcAudioRecord class, create an AudioRecord object with an AudioPlaybackCaptureConfiguration, and add a method that switches between the normal microphone AudioRecord and the playback-capture one. Note that only one audio source can be shared at a time, either the mic or the system audio, not both together.

    public void switchAudioTrack(MediaProjection mediaProjection) {
      if (mediaProjection != null) {
        // Build an AudioRecord that captures system playback through the MediaProjection.
        final int channelConfig = channelCountToConfiguration(channels);
        int minBufferSize = AudioRecord.getMinBufferSize(samplerate, channelConfig, audioFormat);
        int bufferSizeInBytes = Math.max(BUFFER_SIZE_FACTOR * minBufferSize, byteBuffer.capacity());
        AudioRecord screenShareAudioRecord = createAudioRecordOnMOrHigher(
            audioSource, samplerate, channelConfig, audioFormat, bufferSizeInBytes, mediaProjection);

        // Swap the recorder: stop the current capture and restart with playback capture.
        stopRecording();
        audioRecord = screenShareAudioRecord;
        startRecording();
      } else {
        // Fall back to the normal microphone AudioRecord.
        stopRecording();
        initRecording(samplerate, channels);
        startRecording();
      }
    }

    private static AudioRecord createAudioRecordOnMOrHigher(int audioSource, int sampleRate,
        int channelConfig, int audioFormat, int bufferSizeInBytes, MediaProjection mediaProjection) {
      if (mediaProjection != null) {
        // Capture the audio that other apps play with USAGE_MEDIA (API 29+ only).
        AudioPlaybackCaptureConfiguration config =
            new AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
                .build();
        return new AudioRecord.Builder()
            .setAudioPlaybackCaptureConfig(config)
            .setAudioFormat(new AudioFormat.Builder()
                .setEncoding(audioFormat)
                .setSampleRate(sampleRate)
                .setChannelMask(channelConfig)
                .build())
            .setBufferSizeInBytes(bufferSizeInBytes)
            .build();
      }

      // Default path: record from the given audio source (normally the microphone).
      return new AudioRecord.Builder()
          .setAudioSource(audioSource)
          .setAudioFormat(new AudioFormat.Builder()
              .setEncoding(audioFormat)
              .setSampleRate(sampleRate)
              .setChannelMask(channelConfig)
              .build())
          .setBufferSizeInBytes(bufferSizeInBytes)
          .build();
    }
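    For context, a minimal sketch of the calling side (my assumption, not part of the original answer): onAudioFiltersChanged, webRtcAudioRecord, audioDeviceModule, and mediaProjection are hypothetical names; the MediaProjection is the one already obtained for screen capture.

    // Hypothetical caller that wires the two filters to the modified WebRtcAudioRecord.
    void onAudioFiltersChanged(boolean muteMic, boolean mutePhoneAudio) {
      if (muteMic && !mutePhoneAudio) {
        // Mic muted, phone audio allowed: capture system playback instead of the mic.
        webRtcAudioRecord.switchAudioTrack(mediaProjection);
      } else {
        // Otherwise go back to the normal microphone AudioRecord...
        webRtcAudioRecord.switchAudioTrack(null);
        // ...and mute it at the device-module level if the mic filter is on.
        audioDeviceModule.setMicrophoneMute(muteMic);
      }
    }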

  • Do you mean we need to create a class by extending WebRtcAudioRecord? I am trying to get the audio source from the AudioRecord to create an audio track, but the app is crashing! – prashanthns Jul 28 '21 at 14:41
  • Can you share better/more code? With the code you provided I don't know where to go. I got the part about using MediaProjection to obtain access to system audio, but can you elaborate on how to get this into WebRTC? – yomi Jan 13 '22 at 10:56
  • Have a look at https://github.com/ant-media/WebRTC-Android-SDK/pull/1, referenced from another thread: https://stackoverflow.com/questions/68576071/system-audio-streaming-on-android-over-webrtc – selvan Dec 19 '22 at 06:08