
I'm working on a WebRTC-based app for Android using the native implementation (org.webrtc:google-webrtc:1.0.24064), and I need to send a series of bitmaps along with the camera stream.

From what I understand, I can derive from org.webrtc.VideoCapturer, do my rendering in a separate thread, and send video frames to the observer; however, it expects them in YUV420 format and I'm not sure I'm doing the conversion correctly.

This is what I currently have: CustomCapturer.java

Are there any examples I can look at for doing this kind of thing? Thanks.

n1xx1
  • I managed to assemble a working version using the YuvConverter included in WebRTC, and it works as expected. However, it would be cool if there were a way to do it without this OpenGL trickery when I'm already using a software renderer. [New code for reference](https://gist.github.com/n1xx1/2cd38043838e259969bce983ce21ffaa) – n1xx1 Jul 24 '18 at 10:53
  • Hey, I combine two VideoFrames into one bitmap and then try to turn it back into a VideoFrame, so it's similar to your task, but for me the resulting VideoFrames come out black. Did you encounter a similar issue? – Kyryl Zotov Jun 16 '20 at 11:52
  • Hi @n1xx1, can you share your sample code? I am also looking to stream bitmaps; it would be helpful if you could share it. I am ready to offer a bounty as well. – Ranjithkumar Sep 22 '20 at 11:34
  • Hey @RanjithKumar. The first comment has a working version; the idea is that in the while loop you can replace the Canvas rendering part (lines 47 through 52) with something that writes the bitmap you get from your stream. For my project I was sending an HTTP request to an IP camera at that point; you should be able to use MJPEG streams or similar as well. – n1xx1 Sep 22 '20 at 14:02
  • @n1xx1 After switching to the latest WebRTC library it's working, but it produces a corrupted bitmap, as in https://stackoverflow.com/questions/62261947/android-bitmap-to-webrtc-i420-frame-corrupted. Can you guide me? – Ranjithkumar Sep 26 '20 at 21:51
  • See: https://chromium.googlesource.com/external/webrtc/+/refs/heads/master/sdk/android/api/org/webrtc/YuvHelper.java#106 ABGRToI420 seems to be what you want; a sketch of that conversion is shown right after these comments. Also relevant: https://cloud.tencent.com/developer/article/1597125 – Rick Sanchez Sep 26 '20 at 22:42
  • @n1xx1 Can you give an answer? – Ranjithkumar Sep 29 '20 at 09:49
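
For reference, here is a rough sketch of the YuvHelper.ABGRToI420 approach suggested in the last comment: a CPU-only conversion that avoids the GL path entirely. It assumes the source Bitmap is ARGB_8888; the helper name bitmapToI420Frame is just illustrative, not part of the WebRTC SDK.

    import android.graphics.Bitmap
    import java.nio.ByteBuffer
    import org.webrtc.JavaI420Buffer
    import org.webrtc.VideoFrame
    import org.webrtc.YuvHelper

    // Sketch: CPU-only conversion of an ARGB_8888 Bitmap into an I420 VideoFrame.
    fun bitmapToI420Frame(bitmap: Bitmap, rotationDegrees: Int, timestampNs: Long): VideoFrame {
        // copyPixelsToBuffer writes the pixels in R,G,B,A byte order with a stride of rowBytes,
        // which is the layout libyuv (and therefore YuvHelper) calls "ABGR".
        val src = ByteBuffer.allocateDirect(bitmap.rowBytes * bitmap.height)
        bitmap.copyPixelsToBuffer(src)
        src.rewind()

        // Allocate the destination I420 buffer and let YuvHelper do the colour-space conversion.
        val i420 = JavaI420Buffer.allocate(bitmap.width, bitmap.height)
        YuvHelper.ABGRToI420(
            src, bitmap.rowBytes,
            i420.dataY, i420.strideY,
            i420.dataU, i420.strideU,
            i420.dataV, i420.strideV,
            bitmap.width, bitmap.height
        )
        return VideoFrame(i420, rotationDegrees, timestampNs)
    }

The resulting frame can then be handed to the capturer's CapturerObserver.onFrameCaptured as usual.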

2 Answers

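In short: upload the bitmap into an OpenGL texture, wrap it in a TextureBufferImpl, and let WebRTC's YuvConverter produce an I420 buffer from it. Note that this has to run on a thread with a live GL context, for example the SurfaceTextureHelper's handler thread.
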
    // Upload the bitmap into a new OpenGL texture.
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
    GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);

    // Wrap the texture in a TextureBuffer and let YuvConverter produce an I420 buffer from it.
    YuvConverter yuvConverter = new YuvConverter();
    TextureBufferImpl buffer = new TextureBufferImpl(
            bitmap.getWidth(), bitmap.getHeight(),
            VideoFrame.TextureBuffer.Type.RGB,
            textures[0], new Matrix(),
            textureHelper.getHandler(), yuvConverter, null);
    VideoFrame.I420Buffer i420Buf = yuvConverter.convert(buffer);

    // 180 is the frame rotation in degrees; the timestamp is reused from the incoming camera frame.
    VideoFrame convertedFrame = new VideoFrame(i420Buf, 180, videoFrame.getTimestampNs());
Yang Peiyong

I've tried rendering it manually with GL as in Yang's answer, but that ended up with some tearing and framerate issues when dealing with a stream of images.

Instead, I've found that the SurfaceTextureHelper class helps simplify things quite a bit, as you can also use regular canvas drawing to render the bitmap into a VideoFrame. I'm guessing it still uses GL under the hood, as the performance was otherwise comparable. Here's an example VideoCapturer that takes in arbitrary bitmaps and outputs the captured frames to its observer:

import android.content.Context
import android.graphics.Bitmap
import android.graphics.Matrix
import android.graphics.Paint
import android.os.Build
import android.view.Surface
import org.webrtc.CapturerObserver
import org.webrtc.SurfaceTextureHelper
import org.webrtc.VideoCapturer

/**
 * A [VideoCapturer] that can be manually driven by passing in [Bitmap].
 *
 * Once [startCapture] is called, call [pushBitmap] to render images as video frames.
 */
open class BitmapFrameCapturer : VideoCapturer {
    private var surfaceTextureHelper: SurfaceTextureHelper? = null
    private var capturerObserver: CapturerObserver? = null
    private var disposed = false

    private var rotation = 0
    private var width = 0
    private var height = 0

    private val stateLock = Any()

    private var surface: Surface? = null

    override fun initialize(
        surfaceTextureHelper: SurfaceTextureHelper,
        context: Context,
        observer: CapturerObserver,
    ) {
        synchronized(stateLock) {
            this.surfaceTextureHelper = surfaceTextureHelper
            this.capturerObserver = observer
            surface = Surface(surfaceTextureHelper.surfaceTexture)
        }
    }

    private fun checkNotDisposed() {
        check(!disposed) { "Capturer is disposed." }
    }

    override fun startCapture(width: Int, height: Int, framerate: Int) {
        synchronized(stateLock) {
            checkNotDisposed()
            checkNotNull(surfaceTextureHelper) { "BitmapFrameCapturer must be initialized before calling startCapture." }
            capturerObserver?.onCapturerStarted(true)
            surfaceTextureHelper?.startListening { frame -> capturerObserver?.onFrameCaptured(frame) }
        }
    }

    override fun stopCapture() {
        synchronized(stateLock) {
            surfaceTextureHelper?.stopListening()
            capturerObserver?.onCapturerStopped()
        }
    }

    override fun changeCaptureFormat(width: Int, height: Int, framerate: Int) {
        // Do nothing.
        // These attributes are driven by the bitmaps fed in.
    }

    override fun dispose() {
        synchronized(stateLock) {
            if (disposed) {
                return
            }

            stopCapture()
            surface?.release()
            disposed = true
        }
    }

    override fun isScreencast(): Boolean = false

    fun pushBitmap(bitmap: Bitmap, rotationDegrees: Int) {
        synchronized(stateLock) {
            if (disposed) {
                return
            }

            checkNotNull(surfaceTextureHelper)
            checkNotNull(surface)
            if (this.rotation != rotationDegrees) {
                surfaceTextureHelper?.setFrameRotation(rotationDegrees)
                this.rotation = rotationDegrees
            }

            if (this.width != bitmap.width || this.height != bitmap.height) {
                surfaceTextureHelper?.setTextureSize(bitmap.width, bitmap.height)
                this.width = bitmap.width
                this.height = bitmap.height
            }

            surfaceTextureHelper?.handler?.post {
                val canvas = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
                    surface?.lockHardwareCanvas()
                } else {
                    surface?.lockCanvas(null)
                }

                if (canvas != null) {
                    canvas.drawBitmap(bitmap, Matrix(), Paint())
                    surface?.unlockCanvasAndPost(canvas)
                }
            }
        }
    }
}

https://github.com/livekit/client-sdk-android/blob/c1e207c30fce9499a534e13c63a59f26215f0af4/livekit-android-sdk/src/main/java/io/livekit/android/room/track/video/BitmapFrameCapturer.kt
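
For completeness, wiring this capturer into a video track follows the usual WebRTC capturer setup. Here is a rough sketch, where peerConnectionFactory, eglBase, appContext, and the track id and capture format are assumed to come from your own setup code:

    // Sketch: hooking BitmapFrameCapturer up to a VideoSource/VideoTrack.
    // peerConnectionFactory, eglBase and appContext come from your existing setup code.
    val surfaceTextureHelper = SurfaceTextureHelper.create("BitmapCaptureThread", eglBase.eglBaseContext)
    val videoSource = peerConnectionFactory.createVideoSource(/* isScreencast = */ false)
    val capturer = BitmapFrameCapturer()

    capturer.initialize(surfaceTextureHelper, appContext, videoSource.capturerObserver)
    capturer.startCapture(1280, 720, 30)
    val videoTrack = peerConnectionFactory.createVideoTrack("bitmap_track", videoSource)

    // Later, whenever a new image is available:
    capturer.pushBitmap(bitmap, 0)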

David Liu