
Based on reviewing the samples available in the Android NDK, it appears that in order to use hardware decoders (OMX.xxx) we can use either the MediaCodec or the OpenMAX AL interface.

My questions are:

  1. Is this the best way to use hardware decoders on mobile (Snapdragon 820 on Android)?

  2. Is there any advantage to using OpenMAX over MediaCodec? The current Android headers only support the MPEG-2 TS MIME type. I am hoping that streaming decoding of other video containers (MP4 etc.) is possible via OpenMAX; is that a correct assessment?

My requirements are:

  1. Support streaming audio and video playing for common containers.
  2. Leverage mobile hardware (GPU + DSP) for decoding.
  3. Perform image processing on per frame basis.
  4. Avoid writing my own time sync for audio and video.

I have written a basic player using FFmpeg, but I have not been able to use hardware decoders with it, so I am not pursuing that approach.

I am open to any other framework (free or commercial) that would accomplish the above.

Ketan

1 Answer


In general, MediaCodec is the one that would be recommended.

The OpenMAX AL API was added as a stopgap measure in Android 4.0, before MediaCodec became available in Android 4.1. It is practically deprecated (even though I'm not sure if there's any official statement saying that).

They operate on slightly different levels of abstraction, and for most cases, MediaCodec is less work.

With OpenMAX AL, you need to provide an MPEG TS stream of the data to decode and play back. It does not support other container formats. It does not give you direct access to the decoded data either, but it is played back directly. It does, however, take care of sync of audio and video.
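To make the framing requirement concrete: MPEG TS consists of fixed 188-byte packets, each starting with the sync byte 0x47. A minimal, illustrative check in plain Java (this is just the framing OpenMAX AL expects as input; it is not an Android API):

```java
// Minimal sanity check that a buffer looks like MPEG TS: fixed
// 188-byte packets, each starting with the sync byte 0x47.
// Illustrative only -- OpenMAX AL performs the real demuxing.
public class TsCheck {
    static final int TS_PACKET_SIZE = 188;
    static final byte TS_SYNC_BYTE = 0x47;

    public static boolean looksLikeTs(byte[] data) {
        if (data.length == 0 || data.length % TS_PACKET_SIZE != 0) return false;
        for (int off = 0; off < data.length; off += TS_PACKET_SIZE) {
            if (data[off] != TS_SYNC_BYTE) return false;
        }
        return true;
    }
}
```

If your source is anything other than TS, this is the framing you would have to repackage into (e.g. with libavformat) before OpenMAX AL will accept it.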

Pros of OpenMAX AL:

  • If your input is MPEG TS, you avoid a whole extra step
  • Handles sync automatically

Cons:

  • Everything else

With MediaCodec, you need to provide individual packets of data to decode. It does not support any container format at all on its own, but you as a caller are supposed to take care of that. It does give you direct access to the decoded output data, but to present it, you need to handle sync manually. (In Android 6.0, there's a new class MediaSync, which can help with this though.)
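The feed-packets/drain-frames cycle described above looks roughly like the following sketch (Android-only, so not compilable off-device; `format`, `outputSurface`, and `extractor` stand in for whatever supplies the container metadata and compressed packets):

```java
// Hedged sketch of the MediaCodec decode loop. Container parsing
// and A/V sync are omitted; "extractor" is whatever supplies
// individual compressed packets (MediaExtractor, libavformat, ...).
MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
codec.configure(format, outputSurface, null, 0); // format from the container
codec.start();

MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean done = false;
while (!done) {
    // Feed one compressed packet, if an input buffer is free.
    int inIndex = codec.dequeueInputBuffer(10_000);
    if (inIndex >= 0) {
        ByteBuffer in = codec.getInputBuffer(inIndex);
        int size = extractor.readSampleData(in, 0);
        if (size < 0) {
            codec.queueInputBuffer(inIndex, 0, 0, 0,
                    MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        } else {
            codec.queueInputBuffer(inIndex, 0, size,
                    extractor.getSampleTime(), 0);
            extractor.advance();
        }
    }
    // Drain one decoded frame, if available.
    int outIndex = codec.dequeueOutputBuffer(info, 10_000);
    if (outIndex >= 0) {
        // The manual sync decision belongs here, before rendering.
        codec.releaseOutputBuffer(outIndex, /*render=*/ true);
        done = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
    }
}
codec.stop();
codec.release();
```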

Pros of MediaCodec:

  • Generic, flexible
  • Works equally well with any container (doesn't require repacking into MPEG TS)

Cons of MediaCodec:

  • Requires you to handle sync manually
  • Quite low level, requires you to do a lot of work

For extracting individual packets of data, there's the MediaExtractor class, which will be useful with some common file formats for static files. I don't think it is usable for streaming input, e.g. fragmented MP4, though.

So if you want to do streaming playback of a format other than MPEG TS, you need to handle extracting of the packets yourself (or use some other library, such as libavformat, for that task). If you use OpenMAX AL, you then would need to package the individual packets back into MPEG TS (using e.g. libavformat). If you use MediaCodec, you would need to handle sync of audio and video during playback.
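The manual sync MediaCodec leaves to you boils down to a per-frame decision: compare each decoded video frame's PTS against the audio clock (audio is usually the master) and render, wait, or drop. A sketch in plain Java; the threshold values are illustrative, not from any Android API:

```java
// Per-frame A/V sync decision, with audio as the master clock.
// Thresholds are illustrative assumptions, not Android constants.
public class AvSync {
    public enum Action { RENDER, WAIT, DROP }

    static final long LATE_DROP_US = 40_000;  // more than 40 ms late: drop
    static final long EARLY_WAIT_US = 10_000; // more than 10 ms early: wait

    // videoPtsUs: presentation timestamp of the decoded video frame
    // audioClockUs: current audio playback position in microseconds
    public static Action decide(long videoPtsUs, long audioClockUs) {
        long diff = videoPtsUs - audioClockUs;
        if (diff < -LATE_DROP_US) return Action.DROP;
        if (diff > EARLY_WAIT_US) return Action.WAIT;
        return Action.RENDER;
    }
}
```

On Android 6.0+, MediaSync encapsulates this kind of logic for you.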

If you need to do processing of the decoded frames, MediaCodec is probably the only way to go. You can either get the decoded image data as raw YUV, or get it in a GL surface that you can modify using shaders. (The latter might be possible using OpenMAX AL as well, though.)

mstorsjo
  • Hi mstorsjo, thank you for the quick pros and cons analysis. I have implemented basic streaming using FFmpeg + OpenCV, which does all the things that I need, but sync is not 100% as I have to drop audio frames, and hardware acceleration is absent. I thought MediaCodec could stream mp4, webp, etc. formats, as mentioned online. I will double check. It appears that MediaCodec will be the way to go. Thanks again. – Ketan Nov 02 '16 at 17:11
  • One more question: does MediaCodec guarantee that it will use hardware decoders, and fall back to software mode if that fails? – Ketan Nov 02 '16 at 17:18
  • With MediaCodec, you can manually choose what codec to use; if it is named `OMX.google.*`, it is a SW codec, otherwise it's probably a HW codec. If you use `MediaCodec.createDecoderByType`, you should in practice always get a HW codec first if that is available. See http://stackoverflow.com/questions/37715529/how-to-know-android-decoder-mediacodec-createdecoderbytypetype-is-hardware-or for more details. – mstorsjo Nov 02 '16 at 19:27
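The naming heuristic from the last comment can be captured in a small helper. Note this is a convention, not a documented guarantee (the `OMX.google.` prefix identifying software codecs comes from the comment above):

```java
// Heuristic from the comment above: codecs named "OMX.google.*" are
// Android's bundled software codecs; anything else is, in practice,
// usually a hardware codec. A convention, not a documented guarantee.
public class CodecName {
    public static boolean isLikelySoftware(String name) {
        return name.startsWith("OMX.google.");
    }
}
```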