
I want to get the frames from CameraX (the Preview use case) and encode them as H.264 video using MediaCodec. How can I achieve this? What I tried was to use the Surface returned by MediaCodec.createInputSurface() in Preview.Builder() via Preview.setSurfaceProvider(): I implement Preview.SurfaceProvider, set up and configure my encoder inside it, and override onSurfaceRequested() to hand back the Surface from createInputSurface(). Is this expected to work? Can I really share a Surface like this and expect CameraX to write to it and feed input to my encoder?

Is there a more efficient way to encode the live CameraX feed?

Note: I am using Kotlin.

aispark

1 Answer


I finally solved this with the OpenGLRenderer from the CameraX OpenGL test code. This applies to the beta07 release of CameraX.

Set up CameraX as usual, but with two Preview use cases:

val preview: Preview = Preview.Builder().apply {
    setTargetResolution(targetSize)
    setTargetRotation(rotation)
}.build()

val encoderPreview: Preview = Preview.Builder().apply {
    setTargetResolution(targetSize)
    setTargetRotation(rotation)
}.build()

cameraProvider.unbindAll()

camera = cameraProvider.bindToLifecycle(
        lifecycleOwner,
        cameraSelector,
        preview,
        encoderPreview
)

preview.setSurfaceProvider(viewFinder.createSurfaceProvider())

Then initialize the encoder:

val format = MediaFormat.createVideoFormat(
        "video/avc", resolution.width, resolution.height
)

format.setInteger(
        MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface
)

format.setInteger(MediaFormat.KEY_BIT_RATE, 500 * 1024)
format.setInteger(MediaFormat.KEY_FRAME_RATE, 25)
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 3)

encoder = MediaCodec.createEncoderByType("video/avc")

encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)

And connect both:

private val glRenderer = OpenGLRenderer()

surface = encoder.createInputSurface()

glRenderer.attachInputPreview(encoderPreview)

glRenderer.setFrameUpdateListener(executor, Consumer<Long> {
    // when frame is written to output surface
    publishFrame()
})

encoder.start()

glRenderer.attachOutputSurface(surface, resolution, 0)
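
When you are done, the pipeline has to be torn down in the right order. This shutdown sequence is my own hedged sketch, not part of the original answer; `detachOutputSurface()` is assumed to exist on OpenGLRenderer (the CameraX test class exposes a detach call), and `encoder`, `glRenderer`, and `surface` are the objects created above:

```kotlin
// Hypothetical shutdown sketch (assumptions noted above): stop feeding
// frames first, then flush an end-of-stream marker through the input
// surface and let the encoder drain before releasing everything.
fun stopEncoding() {
    glRenderer.detachOutputSurface()      // assumed API: stop rendering into the encoder surface
    encoder.signalEndOfInputStream()      // sends EOS through the input surface
    // Drain remaining output here, e.g. keep calling publishFrame()
    // until BUFFER_FLAG_END_OF_STREAM is observed.
    encoder.stop()
    encoder.release()
    surface.release()
}
```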

Publish frame function:

// `info` is a reusable MediaCodec.BufferInfo and `isRunning` is an
// AtomicBoolean field toggled when encoding starts and stops.
private val info = MediaCodec.BufferInfo()
private val isRunning = AtomicBoolean(true)

private fun publishFrame() {
    val index: Int = try {
        encoder.dequeueOutputBuffer(info, 10 * 1000) // timeout in microseconds
    } catch (e: Exception) {
        -1
    }

    if (!isRunning.get()) {
        return
    }

    if (index >= 0) {
        val outputBuffer = encoder.getOutputBuffer(index) ?: return

        if (info.size > 0) {
            outputBuffer.position(info.offset)
            outputBuffer.limit(info.offset + info.size)
            info.presentationTimeUs = System.nanoTime() / 1000

            // do something with frame
        }

        encoder.releaseOutputBuffer(index, false)

        if (info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM != 0) {
            return
        }
    }
}
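
For the "do something with frame" step, one common destination is a MediaMuxer writing an MP4 file. The wiring below is a hedged sketch of that option, not code from the original answer; `outputPath` is an assumed caller-supplied value:

```kotlin
// Hypothetical muxer wiring (assumption: outputPath is provided by the caller).
private val muxer = MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
private var trackIndex = -1

// When dequeueOutputBuffer returns MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
//     trackIndex = muxer.addTrack(encoder.outputFormat)
//     muxer.start()
// In place of "do something with frame":
//     muxer.writeSampleData(trackIndex, outputBuffer, info)
// After the end-of-stream flag:
//     muxer.stop(); muxer.release()
```

Note that the muxer must be started only after the format-changed event, and writeSampleData must not be called before muxer.start().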

Note that the FRAME_RATE parameter in the encoder format is not respected: the actual frame rate depends on how many frames are published to the output surface (i.e. how many times publishFrame is called). To control the frame rate, change the private void renderLatest() function in OpenGLRenderer (drop frames instead of calling renderTexture).
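
A minimal sketch of such frame dropping, as a standalone helper (my own illustration, not code from OpenGLRenderer): remember the timestamp of the last rendered frame and skip any frame that arrives before the minimum interval has elapsed.

```kotlin
// Passes at most `targetFps` frames per second. Call shouldRender() with
// each frame's timestamp (nanoseconds) before invoking renderTexture;
// drop the frame when it returns false.
class FrameRateThrottle(targetFps: Int) {
    private val minIntervalNs = 1_000_000_000L / targetFps
    private var lastFrameNs = Long.MIN_VALUE

    fun shouldRender(timestampNs: Long): Boolean {
        if (lastFrameNs != Long.MIN_VALUE && timestampNs - lastFrameNs < minIntervalNs) {
            return false // too soon since the last rendered frame: drop it
        }
        lastFrameNs = timestampNs
        return true
    }
}
```

For example, with targetFps = 25 the minimum interval is 40 ms, so a frame arriving 10 ms after the last rendered one is dropped.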

Edit: A newer solution, which came out of a conversation on the CameraX Google group, can be found here

zoki
  • thank you for this. Where is encoderPreview used when connecting the two? I see you used `preview` in `glRenderer.attachInputPreview(preview)`. If you have a sample I'd appreciate it; I've been really struggling to get this to work with my encoder, the screen is just black. – nymeria May 18 '21 at 20:06
  • Hi! I made a mistake. In glRenderer.attachInputPreview it should be encoderPreview. I have fixed it above. – zoki May 19 '21 at 13:53
  • I'm not using the above solution anymore. It worked, but I had problems with transformations of the preview, because the (encoded) stream was different from the one in the UI, which could be changed dynamically. I have now implemented everything with only one preview and an additional context in OpenGLRenderer. It's also necessary to rework some code in cpp. Too complex to post here :) – zoki May 19 '21 at 13:57
  • ah thank you. I wish there was a simpler way to tell the camera preview surface to use the MediaCodec input surface. I'll look more at using OpenGLRenderer. Part of the encoding I am doing is in cpp, but I was hoping to keep that separate from the camera; to be determined, I guess. Thanks anyway! – nymeria May 19 '21 at 23:46
  • @nymeria, check the link I added as part of the last edit to see that new solution. – zoki May 20 '21 at 07:13
  • wow, thank you so much @zoki, this really helped! The preview on the screen is still black for me, but that might just be a mistake in my implementation. I'm going to spend some time this weekend reading the code you shared. I hope the CameraX team will adopt this into the API; I don't think camera streaming is uncommon! – nymeria May 21 '21 at 18:12
  • oh, I fixed the black screen issue by using the original solution of including an encoderPreview! – nymeria May 21 '21 at 18:48