I am trying to create an Android app that uses the libjingle WebRTC native Android library to project the user's Android screen to a peer over WebRTC. To that end, I have successfully used the pristine.io libjingle mirror to recreate the Android apprtc example application using:
compile 'io.pristine:libjingle:10531@aar'
in my app's build.gradle file. The apprtc example works fine with the https://apprtc.appspot.com/ demo web site. I have also created a separate app that records the user's screen to an H.264-encoded mp4 file using the MediaProjection API introduced in Android API 21, following the example posted here.
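For context, the core of my screen-recording setup looks roughly like the following (condensed sketch; the `MediaProjection` instance is obtained in `onActivityResult` after firing `MediaProjectionManager.createScreenCaptureIntent()`, and the size/bitrate constants are just values I picked):

```java
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.MediaRecorder;
import android.media.projection.MediaProjection;

public class ScreenRecorder {
    // Arbitrary capture parameters chosen for testing.
    private static final int WIDTH = 1280, HEIGHT = 720, DPI = 320;

    private final MediaProjection projection;
    private MediaRecorder recorder;
    private VirtualDisplay display;

    public ScreenRecorder(MediaProjection projection) {
        this.projection = projection;
    }

    public void start(String outputPath) throws Exception {
        recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setVideoSize(WIDTH, HEIGHT);
        recorder.setVideoFrameRate(30);
        recorder.setVideoEncodingBitRate(4_000_000);
        recorder.setOutputFile(outputPath);
        recorder.prepare();

        // The screen content is mirrored into the recorder's input Surface.
        display = projection.createVirtualDisplay("screen-rec",
                WIDTH, HEIGHT, DPI,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                recorder.getSurface(), null, null);
        recorder.start();
    }

    public void stop() {
        recorder.stop();
        recorder.release();
        display.release();
        projection.stop();
    }
}
```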
Now, I would like to marry these two ideas into an app that uses the raw stream from MediaProjection and MediaRecorder, or at least the H.264-encoded file, as the video/audio stream for the WebRTC PeerConnection. Is this even possible? The PeerConnection.addStream method in libjingle expects an instance of MediaStream. How can you create an object of type MediaStream from the raw stream or from the resulting mp4 file?
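For reference, the only MediaStream-creation path I know of is the camera one from the apprtc example, abridged below (the "ARDAMS" track/stream labels are the ones the example uses; `peerConnection` is assumed to already exist). There is no obvious hook here for a MediaProjection surface or an mp4 file:

```java
import org.webrtc.MediaConstraints;
import org.webrtc.MediaStream;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.VideoCapturerAndroid;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

PeerConnectionFactory factory = new PeerConnectionFactory();

// Camera-backed capturer; I want something equivalent that is backed
// by the MediaProjection output instead.
VideoCapturerAndroid capturer = VideoCapturerAndroid.create(
        VideoCapturerAndroid.getNameOfFrontFacingDevice());
VideoSource videoSource =
        factory.createVideoSource(capturer, new MediaConstraints());
VideoTrack videoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);

MediaStream localStream = factory.createLocalMediaStream("ARDAMS");
localStream.addTrack(videoTrack);
peerConnection.addStream(localStream); // expects a MediaStream instance
```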
Thank you for any insight you might be able to provide!