
I am working on a video conferencing project. We have been using a software codec to encode and decode video frames, which works fine for lower resolutions (up to 320p). We now plan to support higher resolutions as well, up to 720p, and I learned that hardware acceleration handles this job fairly well.

As the hardware codec API MediaCodec is available from Jelly Bean onward, I have used it for encoding and decoding and it works fine. But my application has to be supported from 2.3, so I need hardware-accelerated video decoding of 720p H.264 frames at 30 fps on those older versions as well.

While researching, I came across the idea of using OMXCodec by way of the Stagefright framework. I have read that the hardware decoder for H.264 is available from 2.1 and the encoder from 3.0. I have gone through many articles and questions on this site and confirmed that I can go ahead.

I have read about the Stagefright architecture here: architecture, and here: stagefright how it works.

And I read about OMXCodec here: use-android-hardware-decoder-with-omxcodec-in-ndk.

I am having trouble getting started and have some confusion about the implementation. I would like some information on the following:

  1. To use OMXCodec in my code, should I build my project against the whole Android source tree, or can I do it by adding some files from the AOSP source (if so, which ones)?
  2. What steps should I follow from scratch to achieve this?

Can someone give me a guideline on this?

Thanks...

Kevin K
    _"What are the steps from scratch I should follow to achieve it."_ is really quite broad. For a single StackOverflow question you should probably narrow down the scope quite a bit. For example, identify _one particular_ step that you feel is necessary but uncertain about how to perform. Then ask a question about that. – Michael Jun 04 '14 at 10:21
  • @Michael Okay, I will do that. But can you check my first question: should I take the full source code of Android? – Kevin K Jun 04 '14 at 11:30
  • @androkid.. Do you require access to the codec from the `Java` layer or is it sufficient to have access to the codec in native layer which is abstracted by a `JNI` implementation of your application? – Ganesh Jun 05 '14 at 03:50
  • @Ganesh I was expecting an answer from you. I want to work at the JNI level itself. – Kevin K Jun 05 '14 at 04:26
  • @Ganesh I don't have any experience editing AOSP source code, which is why I am a bit confused about how to start the implementation. If you give me some suggestions, I will work on them. I know you are a master at it. – Kevin K Jun 05 '14 at 04:39
  • @androkid...Did the suggestion help?? – Ganesh Jun 06 '14 at 13:04
  • @Ganesh Thanks a lot for your follow-up. I am going ahead step by step. I am trying to download the AOSP source code; I needed some extra things per the requirements stated on the developer site, which delayed things a bit. I have got an idea and am working on it. By the way, will I have to build a separate .so for each Android version? – Kevin K Jun 06 '14 at 13:29
  • @Ganesh Can you check my question [here](http://stackoverflow.com/questions/24201178/android-how-to-build-and-replace-modified-aosp-code?) – Kevin K Jun 16 '14 at 07:26
  • @androkid.. I am not sure it would be necessary if you are building in a fashion very similar to the command-line `stagefright` executable. You could start with a single `.so` file and then build others if necessary. – Ganesh Jun 16 '14 at 13:25

1 Answer


The best example of integrating OMXCodec in the native layer is the command-line utility stagefright, which can be observed here in Gingerbread itself. This example shows how an OMXCodec is created.
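
As a starting point, here is a minimal sketch of the wiring that utility performs, assuming you want a decoder for 720p H.264. The function name, the 1280×720 values, and `mySource` (your own MediaSource implementation, see the points below) are placeholders, and error handling is kept to a minimum; on newer releases OMXCodec::Create also accepts a native window argument for output buffer allocation (see point 3 below).

```cpp
// Sketch only: creating a hardware H.264 decoder through OMXCodec,
// modeled on frameworks/base/cmds/stagefright/stagefright.cpp.
#include <binder/ProcessState.h>
#include <media/stagefright/MediaDefs.h>
#include <media/stagefright/MediaSource.h>
#include <media/stagefright/MetaData.h>
#include <media/stagefright/OMXClient.h>
#include <media/stagefright/OMXCodec.h>

using namespace android;

sp<MediaSource> createHwH264Decoder(const sp<MediaSource> &mySource) {
    // The OMX components live in the media server, reached over Binder.
    ProcessState::self()->startThreadPool();

    OMXClient client;
    if (client.connect() != OK) {
        return NULL;
    }

    // Input format: a matching (hardware) component is selected from
    // media_codecs.xml based on the MIME type.
    sp<MetaData> meta = new MetaData;
    meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);
    meta->setInt32(kKeyWidth, 1280);   // example resolution
    meta->setInt32(kKeyHeight, 720);

    sp<MediaSource> decoder = OMXCodec::Create(
            client.interface(),
            meta,
            false,       // createEncoder = false, we want a decoder
            mySource);   // your source; the codec pulls frames via read()
    if (decoder == NULL || decoder->start() != OK) {
        return NULL;
    }
    return decoder;  // decoded frames are pulled with decoder->read()
}
```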

Some points to note:

  1. The input to OMXCodec should be modeled as a MediaSource, so you should ensure that your application meets this requirement. An example of creating a MediaSource-based source can be found in the record utility file as DummySource; a minimal sketch along the same lines follows this list.

  2. The input to the decoder, i.e. the MediaSource, should provide the data through its read method, so your application should supply an individual frame for every read call.

  3. The decoder can be created with a NativeWindow for output buffer allocation. In that case, if you wish to access the buffers from the CPU, you should probably refer to this query for more details.
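
Putting points 1 and 2 together, here is a minimal sketch of such a source. It is modeled loosely on DummySource but feeds encoded H.264 access units rather than raw frames; EncodedFrameSource, fetchNextAccessUnit, and the 30 fps timestamping are hypothetical names and values standing in for your application's own frame queue.

```cpp
// Sketch only: a MediaSource that hands the decoder one encoded H.264
// access unit per read() call. fetchNextAccessUnit() is a hypothetical
// hook into your application's network/jitter-buffer code.
#include <media/stagefright/MediaBuffer.h>
#include <media/stagefright/MediaBufferGroup.h>
#include <media/stagefright/MediaDefs.h>
#include <media/stagefright/MediaErrors.h>
#include <media/stagefright/MediaSource.h>
#include <media/stagefright/MetaData.h>

using namespace android;

struct EncodedFrameSource : public MediaSource {
    EncodedFrameSource(int32_t width, int32_t height)
        : mWidth(width), mHeight(height), mNumFramesOutput(0) {}

    virtual sp<MetaData> getFormat() {
        sp<MetaData> meta = new MetaData;
        meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);
        meta->setInt32(kKeyWidth, mWidth);
        meta->setInt32(kKeyHeight, mHeight);
        return meta;
    }

    virtual status_t start(MetaData * /* params */) {
        // One reusable buffer, generously sized for a 720p access unit.
        mGroup.add_buffer(new MediaBuffer(mWidth * mHeight * 3 / 2));
        return OK;
    }

    virtual status_t stop() { return OK; }

    virtual status_t read(MediaBuffer **buffer,
                          const MediaSource::ReadOptions * /* options */) {
        status_t err = mGroup.acquire_buffer(buffer);
        if (err != OK) {
            return err;
        }

        // Hypothetical call: blocks until the next complete access unit
        // arrives, copies it into the buffer, and returns its length.
        size_t length = fetchNextAccessUnit((*buffer)->data(), (*buffer)->size());
        if (length == 0) {
            (*buffer)->release();
            *buffer = NULL;
            return ERROR_END_OF_STREAM;
        }

        (*buffer)->set_range(0, length);
        (*buffer)->meta_data()->clear();
        (*buffer)->meta_data()->setInt64(
                kKeyTime, (mNumFramesOutput * 1000000LL) / 30 /* fps */);
        ++mNumFramesOutput;
        return OK;
    }

private:
    int32_t mWidth, mHeight;
    int64_t mNumFramesOutput;
    MediaBufferGroup mGroup;

    // Supplied by your application; declared here only for completeness.
    size_t fetchNextAccessUnit(void *dest, size_t capacity);
};
```

On the output side, the object returned by OMXCodec::Create is itself a MediaSource: you pull decoded frames by calling its read() in a loop, and when it returns INFO_FORMAT_CHANGED you re-query getFormat() for the new width, height, and color format before continuing.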

Ganesh
  • Could you please look at my question [here](http://stackoverflow.com/questions/24950470/android-dequeuing-native-buffer-returns-error-from-omxcodec-dequeued-unrecogni) – Kevin K Jul 28 '14 at 04:53