
I'm trying to use the native WebRTC SDK (libjingle) for Android. So far I can send streams from Android to the web (or other platforms) just fine. I can also receive the remote MediaStream from a peer (in the onAddStream callback).

The project I'm working on requires only audio streams; no video tracks are created or sent to anyone.

My question is: how do I play the MediaStream object that I get from remote peers?

@Override
public void onAddStream(MediaStream mediaStream) {
    Log.d(TAG, "onAddStream: got remote stream");
    // Need to play the audio here
}

Again, the question is about audio; I'm not using video. Apparently all the native WebRTC examples use video tracks, so I had no luck finding any documentation or examples on the web.

Thanks in advance!

Sagi Dayan

1 Answer


You can get the remote audio track using the code below:

import org.webrtc.AudioTrack;
import org.webrtc.MediaStream;

private AudioTrack remoteAudioTrack;

@Override
public void onAddStream(final MediaStream stream) {
    // Playback of the remote audio starts automatically; keep a
    // reference to the track in case you need to control it later.
    if (stream.audioTracks.size() > 0) {
        remoteAudioTrack = stream.audioTracks.get(0);
    }
}
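
Keeping that reference is useful if you later want to mute or rescale the remote audio. A minimal sketch, assuming `remoteAudioTrack` was stored as above; `setEnabled` is inherited from `MediaStreamTrack`, and `setVolume` is available on `org.webrtc.AudioTrack` in recent SDK builds:

// Mute/unmute remote audio without tearing down the peer connection.
private void setRemoteAudioEnabled(boolean enabled) {
    if (remoteAudioTrack != null) {
        remoteAudioTrack.setEnabled(enabled);
    }
}

// Scale remote playback volume (the native implementation accepts
// roughly 0.0 to 10.0, where 1.0 leaves the signal unchanged).
private void setRemoteAudioVolume(double volume) {
    if (remoteAudioTrack != null) {
        remoteAudioTrack.setVolume(volume);
    }
}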

> Apparently all the native WebRTC examples use video tracks, so I had no luck finding any documentation or examples on the web.

Yes. As app developers we only have to take care of video rendering; audio needs no renderer. Once the remote audio track has been received, it plays automatically through the default output device (earpiece, loudspeaker, or wired headset), based on proximity and audio-routing settings.

Check the code below from AppRTCAudioManager.java to enable/disable the speakerphone:

/** Sets the speaker phone mode. */
private void setSpeakerphoneOn(boolean on) {
  boolean wasOn = audioManager.isSpeakerphoneOn();
  if (wasOn == on) {
    return;
  }
  audioManager.setSpeakerphoneOn(on);
}

Reference Source: [AppRTCAudioManager.java](https://chromium.googlesource.com/external/webrtc/+/refs/heads/master/examples/androidapp/src/org/appspot/apprtc/AppRTCAudioManager.java)
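
If you are using pure libjingle without the AppRTC demo classes, you can obtain the `AudioManager` yourself from any `Context`. A minimal sketch (the helper class and method name here are hypothetical; `MODE_IN_COMMUNICATION` is the routing mode the AppRTC demo uses during calls):

import android.content.Context;
import android.media.AudioManager;

public class CallAudioHelper {
    // Hypothetical helper: fetch the system AudioManager from any
    // Context (e.g. your Activity) and toggle the loudspeaker.
    public static void enableSpeakerphone(Context context, boolean on) {
        AudioManager audioManager =
                (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
        audioManager.setSpeakerphoneOn(on); // false -> earpiece/headset
    }
}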

Ajay
  • Thanks for the answer!! How can I get an instance of the audioManager? I'm using pure libjingle. – Sagi Dayan Aug 31 '16 at 12:55
  • `AudioManager` is an Android framework class (a system service) – Ajay Aug 31 '16 at 13:08
  • So simple - Thanks! – Sagi Dayan Aug 31 '16 at 14:14
  • @SagiDayan I can't do anything with the received audio track, and it is not played on my Android device's speaker automatically: `public void onAddRemoteStream(final MediaStream remoteStream, final PnPeer peer) { CallActivity.this.runOnUiThread(new Runnable() { public void run() { Toast.makeText(CallActivity.this, "Connected to " + peer.getId(), Toast.LENGTH_SHORT).show(); } }); super.onAddRemoteStream(remoteStream, peer); // Will log values` – kroky Sep 25 '16 at 15:54
  • @Ajay I am kind of stuck on the same problem as above. Where should I use `remoteAudioTrack` after filling it? – ADM Feb 20 '18 at 11:18
  • The audio track will play automatically if the connection is established properly between peers. You can control the output volume/device with AudioManager. – Ajay Feb 21 '18 at 02:58
  • OK, thanks for your attention. So, in other words, we don't have to use the `remoteAudioTrack` object or any video-related renderer. I am also using `AppRTCAudioManager` but no luck setting up the call. I think I need to dig into the connection process. Thanks for the support; will get back if the problem persists. – ADM Feb 21 '18 at 07:14
  • updated link - [AppRTCAudioManager.java](https://chromium.googlesource.com/external/webrtc/+/refs/heads/master/examples/androidapp/src/org/appspot/apprtc/AppRTCAudioManager.java) – Shahid Kamal Feb 19 '19 at 19:45