
I am developing an H.264 decoder using the MediaCodec API. I am trying to call the MediaCodec Java API from the JNI layer inside a function like:

void Decompress(const unsigned char *encodedInputdata, unsigned int inputLength, unsigned char **outputDecodedData, int &width, int &height) {
    // encodedInputdata is encoded H.264 remote stream
    // .....
    // outputDecodedData = call JNI function of MediaCodec Java API to decode
    // .....
}

Later I will send outputDecodedData to my existing video rendering pipeline and render it on a Surface.

I hope I will be able to write a Java function to decode the input stream, but these would be the challenges:

  1. This resource states that:

...you can't do anything with the decoded video frame but render them to surface

Here a Surface has been passed to decoder.configure(format, surface, null, 0) to render the output ByteBuffer on the surface, and it is claimed that we can't use this buffer except to render it, due to an API limit.

So, will I be able to send the output ByteBuffer to the native layer, cast it as unsigned char*, and pass it to my rendering pipeline instead of passing a Surface to configure()?

Kaidul

1 Answer


I see two fundamental problems with your proposed function definition.

First, MediaCodec operates on access units (NAL units for H.264), not arbitrary chunks of data from a stream, so you need to pass in one NAL unit at a time. Once the chunk is received, the codec may want to wait for additional frames to arrive before producing any output. You cannot in general pass in one frame of input and wait to receive one frame of output.
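As an illustration of that contract, here is a minimal sketch of feeding one NAL unit at a time and draining whatever output happens to be ready (the helper name feedNalUnit and the timeout values are my assumptions, not a definitive implementation):

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;

    // Hypothetical helper: queue one NAL unit, then drain any output that
    // is ready -- which may be nothing at all.
    void feedNalUnit(MediaCodec decoder, byte[] nal, long ptsUs) {
        int inIndex = decoder.dequeueInputBuffer(10000); // 10 ms timeout
        if (inIndex >= 0) {
            ByteBuffer inBuf = decoder.getInputBuffer(inIndex);
            inBuf.clear();
            inBuf.put(nal);
            decoder.queueInputBuffer(inIndex, 0, nal.length, ptsUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = decoder.dequeueOutputBuffer(info, 0); // don't block
        if (outIndex >= 0) {
            // A decoded frame is available; if a Surface was configured,
            // passing 'true' here would render it.
            decoder.releaseOutputBuffer(outIndex, false);
        }
        // outIndex can also be INFO_TRY_AGAIN_LATER or
        // INFO_OUTPUT_FORMAT_CHANGED; getting no frame back is normal.
    }

Note that several NAL units may go in before the first frame comes out.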

Second, as you noted, the ByteBuffer output is YUV-encoded in one of several color formats. The format varies from device to device; Qualcomm devices notably use their own proprietary format. (It has been reverse-engineered, though, so if you search around you can find some code to unravel it.)
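If you do go the ByteBuffer route, you can at least detect which layout you received. A sketch using the SDK's color-format constants (the cases shown are just the common ones):

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;

    // Call after dequeueOutputBuffer() returns INFO_OUTPUT_FORMAT_CHANGED.
    void checkColorFormat(MediaCodec decoder) {
        MediaFormat outFormat = decoder.getOutputFormat();
        int colorFormat = outFormat.getInteger(MediaFormat.KEY_COLOR_FORMAT);
        switch (colorFormat) {
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:     // 19, I420
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar: // 21, NV12
                // Standard layouts you can convert yourself.
                break;
            case MediaCodecInfo.CodecCapabilities.COLOR_QCOM_FormatYUV420SemiPlanar:
                // Qualcomm-specific layout; needs device-specific handling.
                break;
            default:
                // Some other vendor-specific format.
                break;
        }
    }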

The common workaround is to send the video frames to a SurfaceTexture, which converts them to GLES "external" textures. These can be manipulated in various ways, or rendered to a pbuffer and extracted with glReadPixels().
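The read-back step at the end of that chain is short; the EGL and shader setup around it is the bulky part. A sketch of just the extraction, assuming a pbuffer EGL context is current and the frame has already been drawn into it (readBackFrame is a hypothetical helper):

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import android.opengl.GLES20;

    // Assumes an EGL pbuffer surface of size width x height is current and
    // the SurfaceTexture frame has been rendered into it with an
    // external-texture shader (setup elided).
    ByteBuffer readBackFrame(int width, int height) {
        ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
                .order(ByteOrder.nativeOrder());
        GLES20.glReadPixels(0, 0, width, height,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);
        return pixels; // RGBA pixels, bottom-up in GL convention
    }

A direct ByteBuffer like this can then be handed to native code and read as an unsigned char* via JNI's GetDirectBufferAddress().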

fadden
  • Thanks sir! So currently I am receiving NAL units of H.264 from the native layer in the Java layer using a custom listener, and trying to decode them using `MediaCodec` onto a `Surface` (of `TextureView`). Let's see how far I can go! :) – Kaidul Sep 17 '15 at 18:50
  • See also http://stackoverflow.com/questions/13307086/decoding-raw-h264-stream-in-android – fadden Sep 18 '15 at 05:20
  • Thanks, sir, for the link! Fortunately I am acquainted with the H.264 structure; I implemented the encoding-decoding feature in the native layer using ffmpeg, openh264, and other third-party proprietary libraries. I hope I will get it working, and will open new threads on SO if I need advice from consultants like you :) – Kaidul Sep 18 '15 at 15:32
  • Sir, I tried and now facing some problems. Can you please take a look at this thread? http://stackoverflow.com/questions/32723393/video-rendering-is-broken-mediacodec-h-264-stream – Kaidul Sep 22 '15 at 17:41