
I'm looking for the fastest way to take an image frame received from the MediaCodec decoder and draw it to the Android device screen. The important constraints and explanations are:

  1. Cannot use MediaPlayer. No intermediate app allowed.

  2. Must draw output frames from the MediaCodec decoder to the screen as quickly as possible (minimize latency).

  3. The available decoder output formats are as follows:
    ColorFormat[0] 0x00000013 COLOR_FormatYUV420Planar
    ColorFormat[1] 0x00000015 COLOR_FormatYUV420SemiPlanar
    ColorFormat[2] 0x7F000001 OMX_SEC_COLOR_FormatNV12TPhysicalAddress
    ColorFormat[3] 0x7FC00002 OMX_SEC_COLOR_FormatNV12Tiled

  4. The video resolution, and thus the resolution of each output frame, is 960x720.

  5. The target platform is Galaxy Note II and the approach can be specific to that platform (e.g. take advantage of available hardware functionality). This does not need to work on other platforms or be a generic solution.

An approach that takes less than 66ms would be good. Less than 33ms would be great. My current approach takes 80-90ms, which sucks. (I won't bother describing it since I don't want to skew the answers in any particular direction.)

Andrew Cottrell

1 Answer


Your best bet is to decode directly to a Surface. Decoding to a ByteBuffer is going to slow you down quite a bit. A number of examples on bigflake (e.g. ExtractMpegFramesTest) send the output of a decoder to an off-screen surface and examine it with GLES, but it's a simple change to make it work with an on-screen SurfaceView.

Update: Grafika has two different MediaCodec-based video players that send the output to SurfaceView and TextureView, respectively.
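To make the recommendation concrete, here is a minimal sketch of the Surface-based path, in the pre-API-21 buffer-index style current when this answer was written. It assumes a `MediaExtractor` already positioned on the video track and a `SurfaceView` from the layout; the input-feeding side is omitted, and all names besides the framework APIs are illustrative, not taken from the answer or from Grafika.

```java
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;
import android.view.SurfaceView;

public class SurfaceDecoder {
    public void decodeToSurface(MediaExtractor extractor, SurfaceView surfaceView)
            throws Exception {
        MediaFormat format = extractor.getTrackFormat(0);
        String mime = format.getString(MediaFormat.KEY_MIME);
        MediaCodec decoder = MediaCodec.createDecoderByType(mime);

        // Passing a Surface here selects the direct-to-Surface path; in this
        // mode the decoded frames never appear as accessible ByteBuffers.
        Surface surface = surfaceView.getHolder().getSurface();
        decoder.configure(format, surface, null, 0);
        decoder.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean done = false;
        while (!done) {
            // (feeding extractor samples into the decoder's input buffers
            // is omitted here for brevity)
            int index = decoder.dequeueOutputBuffer(info, 10000 /* us */);
            if (index >= 0) {
                // "true" asks MediaCodec to render this buffer to the Surface.
                decoder.releaseOutputBuffer(index, true);
                done = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            }
        }
        decoder.stop();
        decoder.release();
    }
}
```

The key calls are `configure(format, surface, null, 0)` and `releaseOutputBuffer(index, true)`: the decoder hands buffers to the Surface's BufferQueue directly, which is what avoids the slow ByteBuffer copy.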

fadden
  • 51,356
  • 5
  • 116
  • 166
  • Is it necessary to use GLSurfaceView if I want to decode directly to a Surface? I'd like to avoid the GL stuff if possible since that's a huge subject area and all-new territory to me. All I want to do is draw a 2D image to the screen. It looks like using GLSurfaceView requires that I also create a Renderer, which again is new and unfamiliar territory. (Or, if GLSurfaceView is required, do you know of an example of using it with MediaCodec, or perhaps of someone who could throw one together?) – Andrew Cottrell Nov 27 '13 at 21:14
  • You could try using a TextureView instead. The code would work like ExtractMpegFramesTest, but instead of creating a SurfaceTexture in CodecOutputSurface you'd just use the one obtained from the TextureView. Doesn't avoid GLES entirely, but keeps it to a minimum. – fadden Nov 28 '13 at 00:45
  • Sorry to be a pain but I'm still struggling with this. Here's what I've done: I create a TextureView, register the TextureView.SurfaceTextureListener, and get the onSurfaceTextureAvailable() callback. I create a Surface using the SurfaceTexture provided by that callback and pass it to the MediaCodec decoder.configure(). I register the onFrameAvailableListener. Fire up the decoder, get the first onFrameAvailable() callback, tell it to updateTexImage(), and get this error in LogCat: E/GLConsumer(10118): [unnamed-10118-0] checkAndUpdateEglState: invalid current EGLDisplay – Andrew Cottrell Dec 02 '13 at 23:03
  • That means the current thread doesn't have a valid `EGLDisplay` configured. The `EGLDisplay` and `EGLContext` must be set and must have the same value whenever `updateTexImage()` is called. These are set per-thread with `eglMakeCurrent()`. See the last paragraph in the `SurfaceTexture` doc; in particular, note that you can't call `updateTexImage()` directly from the `onFrameAvailable()` callback. – fadden Dec 03 '13 at 00:05
  • Getting closer! Now I get this delightful error blob: E/(5159): egl_android_pixel_format* _egl_android_get_native_buffer_format(android_native_buffer_t*) unsupported native buffer format (0x13) E/GLConsumer(5159): [unnamed-5159-1] error creating EGLImage: 0x3003 W/GLConsumer(5159): [unnamed-5159-1] releaseAndUpdate: unable to createImage on display=0x1 slot=0 E/AndroidRuntime(5159): java.lang.RuntimeException: Error during updateTexImage (see logcat for details) (I sincerely appreciate your help with this. I owe you pizza and Scotch or your favorite equivalent.) – Andrew Cottrell Dec 03 '13 at 20:03
  • That's a weird message -- only place I see a message like that is in Mesa code. Are you using the emulator? (I'm attempting to cobble together an APK with code that demonstrates this; might be another day or so.) – fadden Dec 04 '13 at 02:38
  • Any chance I can just e-mail or chat with you about this? Mine is my name as one word at gmail. I know that's a lot to ask but it'd be a huge help. I think that previous error was due to using a pbuffer instead of a window in my EGL config. I switched over and the new problem is that MediaCodec and EGL can't seem to share the surface. If I call EGL14.eglCreateWindowSurface with my SurfaceTexture then MediaCodec can't use it, and likewise vice-versa. I get a "BufferQueue - connect: already connected (cur=1, req=3)" error. I think I'm missing a fundamental concept here. – Andrew Cottrell Dec 04 '13 at 18:15
  • A source code snippet is temporarily available here: http://bigflake.com/mediacodec/PlayMovieActivity.java.txt . Hopefully to be replaced with a full project. The Activity has a layout with a "play" button and the TextureView. The movie is scaled to the size of the TextureView, which is determined by the view layout in this example, so that would need to be adjusted to make the video look right. – fadden Dec 04 '13 at 19:26
  • I have things working now. First I switched to the approach I found here: https://github.com/vecio/MediaCodecDemo It's simple and avoids GLES/EGL. When I still didn't see successful output I realized decoding of "video/mp4v-es" to a Surface wasn't working, so I switched my encode and decode to "video/avc" and that works. The color is hosed because NV21 has blue/red backwards compared to the H.264 encoder's YUV420SemiPlanar setting, but I can work around that. Thanks again for all your help. I didn't go the GLES/EGL approach but I'm still glad to have learned about it. – Andrew Cottrell Dec 04 '13 at 21:17
  • FWIW, the vecio SurfaceView approach is nearly identical (in terms of the code in the Activity) to the TextureView approach. – fadden Dec 04 '13 at 23:34
  • The full SDK app with `PlayMovieActivity` is now available -- see Grafika on GitHub (https://github.com/google/grafika). In particular, "Play video (TextureView)" is what I described above. – fadden Dec 20 '13 at 02:29
  • @fadden, you say "but it's a simple change to make it work with an on-screen SurfaceView". Could you maybe point out these changes? Or better yet check out my related question: https://stackoverflow.com/questions/49967214/modify-extractmpegframestest-example-to-render-decoded-output-on-screen. – Guy S Apr 23 '18 at 09:44
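The TextureView flow the comments converge on can be sketched as below: wait for the TextureView's `SurfaceTexture`, wrap it in a `Surface`, and hand that to `decoder.configure()`. The TextureView does the GLES compositing internally, so no explicit EGL setup is needed on the app side. This is an illustrative sketch only; the class, field names, and threading note are assumptions, not code from PlayMovieActivity.

```java
import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import android.view.TextureView;

public class TextureViewPlayer implements TextureView.SurfaceTextureListener {
    private final MediaFormat format; // e.g. "video/avc" at 960x720
    private MediaCodec decoder;

    public TextureViewPlayer(TextureView view, MediaFormat format) {
        this.format = format;
        view.setSurfaceTextureListener(this);
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int w, int h) {
        try {
            // Wrap the TextureView's own SurfaceTexture rather than creating
            // one ourselves (the mistake discussed in the comments above).
            Surface surface = new Surface(st);
            decoder = MediaCodec.createDecoderByType(
                    format.getString(MediaFormat.KEY_MIME));
            decoder.configure(format, surface, null, 0);
            decoder.start();
            // ...then feed input and call releaseOutputBuffer(index, true)
            // from a separate playback thread, as in the SurfaceView case.
        } catch (java.io.IOException e) {
            throw new RuntimeException(e);
        }
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) {}
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }
    @Override public void onSurfaceTextureUpdated(SurfaceTexture st) {}
}
```

Because only one producer may be connected to a BufferQueue at a time, the app must not also call `eglCreateWindowSurface()` on the same SurfaceTexture while MediaCodec is connected; doing both produces the "BufferQueue - connect: already connected" error quoted in the comments.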