I'm trying to use the native WebRTC SDK (libjingle) for Android. So far I can send streams from Android to the web (or other platforms) just fine, and I can also receive the remote MediaStream from a peer (via the onAddStream callback).
The project I'm working on requires only audio streams; no video tracks are created or sent to anyone.
My question is: how do I play the MediaStream object that I get from remote peers?
@Override
public void onAddStream(MediaStream mediaStream) {
    Log.d(TAG, "onAddStream: got remote stream");
    // Need to play the audio here
}
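My best guess so far is something like the following, using the stream's audioTracks list to enable the remote track (I'm assuming here that libjingle routes enabled remote audio to the device speaker on its own, but I can't confirm that's the right approach):

```java
import org.webrtc.AudioTrack;
import org.webrtc.MediaStream;

// Inside the observer's callback:
@Override
public void onAddStream(MediaStream mediaStream) {
    Log.d(TAG, "onAddStream: got remote stream");
    if (!mediaStream.audioTracks.isEmpty()) {
        // Grab the first (and in my case only) remote audio track
        AudioTrack remoteAudioTrack = mediaStream.audioTracks.get(0);
        // Enable it, hoping playback starts automatically
        remoteAudioTrack.setEnabled(true);
    }
}
```

Is enabling the track enough, or do I need to attach it to some renderer/sink the way the video examples attach VideoTrack to a VideoRenderer?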
Again, the question is about audio; I'm not using video. All of the native WebRTC examples seem to use video tracks, so I've had no luck finding any documentation or examples on the web.
Thanks in advance!