Overview
I would like to use a custom video source to live-stream video via the WebRTC Android implementation. If I understand correctly, the existing implementation only supports the front- and back-facing cameras on Android phones. The classes in the snippet below are the relevant ones in this scenario.
Currently, to use the front-facing camera on an Android phone, I do the following:
CameraEnumerator enumerator = new Camera1Enumerator(false);
VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
VideoSource videoSource = peerConnectionFactory.createVideoSource(/* isScreencast= */ false);
videoCapturer.initialize(surfaceTextureHelper, this.getApplicationContext(), videoSource.getCapturerObserver());
videoCapturer.startCapture(width, height, framerate); // needed before frames flow to the VideoSource
VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VideoTrackID, videoSource);
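For reference, deviceName above comes from the enumerator; for example, picking the first front-facing camera via CameraEnumerator's getDeviceNames() and isFrontFacing():

String deviceName = null;
for (String name : enumerator.getDeviceNames()) {
    if (enumerator.isFrontFacing(name)) {
        deviceName = name; // use the first front-facing camera found
        break;
    }
}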
My scenario
I have a callback handler that receives the video buffer as a byte array from the custom video source:
public void onReceive(byte[] videoBuffer, int size) {}
How would I be able to send this byte-array buffer? I'm not sure about the solution, but I think I would have to implement a custom VideoCapturer. A sketch of that idea follows.
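Here is the direction I'm considering: a minimal sketch, assuming the custom source delivers NV21 frames of a known, fixed size, and a recent org.webrtc release where CapturerObserver is a top-level interface. ByteArrayVideoCapturer and its constructor parameters are hypothetical names of mine, not part of the library.

import android.content.Context;
import org.webrtc.CapturerObserver;
import org.webrtc.NV21Buffer;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

// Sketch: pushes byte[] frames from a custom source into a VideoSource.
// Assumes NV21 pixel format and fixed, known frame dimensions.
public class ByteArrayVideoCapturer implements VideoCapturer {
    private final int frameWidth;
    private final int frameHeight;
    private CapturerObserver capturerObserver;

    public ByteArrayVideoCapturer(int frameWidth, int frameHeight) {
        this.frameWidth = frameWidth;
        this.frameHeight = frameHeight;
    }

    @Override
    public void initialize(SurfaceTextureHelper surfaceTextureHelper, Context applicationContext,
            CapturerObserver capturerObserver) {
        // No SurfaceTextureHelper needed: frames arrive as byte arrays, not camera textures.
        this.capturerObserver = capturerObserver;
    }

    // Wire the custom source's callback to this method.
    // size is assumed to equal frameWidth * frameHeight * 3 / 2 (NV21).
    public void onReceive(byte[] videoBuffer, int size) {
        VideoFrame.Buffer buffer =
                new NV21Buffer(videoBuffer, frameWidth, frameHeight, /* releaseCallback= */ null);
        VideoFrame frame = new VideoFrame(buffer, /* rotation= */ 0, System.nanoTime());
        capturerObserver.onFrameCaptured(frame);
        frame.release(); // the observer retains the frame; drop our reference
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        capturerObserver.onCapturerStarted(true);
    }

    @Override
    public void stopCapture() throws InterruptedException {
        capturerObserver.onCapturerStopped();
    }

    @Override
    public void changeCaptureFormat(int width, int height, int framerate) {}

    @Override
    public void dispose() {}

    @Override
    public boolean isScreencast() {
        return false;
    }
}

It would then be wired up like the camera capturer above, minus the SurfaceTextureHelper (640x480 at 30 fps are placeholder values):

ByteArrayVideoCapturer capturer = new ByteArrayVideoCapturer(640, 480);
capturer.initialize(null, getApplicationContext(), videoSource.getCapturerObserver());
capturer.startCapture(640, 480, 30);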
Existing questions
This question might be relevant, though I'm not using the libjingle library, only the native WebRTC Android package.
Similar questions/articles:
- a question for the iOS platform, but unfortunately the answers didn't help me.
- a question for the native C++ platform.
- an article about a native implementation.