
I'm trying to hardware-decode H.264 video with the Stagefright library.

I have used an example from here. I'm getting the decoded data in a MediaBuffer. To render MediaBuffer->data() I tried AwesomeLocalRenderer from AwesomePlayer.cpp,

but the picture on screen is distorted.

Here is the link to the original and the corrupted picture.

I also tried this in the example:

sp<MetaData> metaData = mVideoBuffer->meta_data();
int64_t timeUs = 0;
metaData->findInt64(kKeyTime, &timeUs);
// kKeyTime is in microseconds; the native window expects nanoseconds.
native_window_set_buffers_timestamp(mNativeWindow.get(), timeUs * 1000);
err = mNativeWindow->queueBuffer(mNativeWindow.get(),
                                 mVideoBuffer->graphicBuffer().get(), -1);

But my native code crashes. I can't get a real picture; it is either corrupted or a black screen.

Thanks in advance.

Arsen Davtyan
  • FWIW, there are public APIs for this in Android 4.1+. The bad image looks like a stride/alignment problem in a YUV buffer. – fadden Feb 20 '14 at 17:22
  • Thanks very much for your reply. But I want to support Android 4.0+; that's why I'm using the Android NDK. – Arsen Davtyan Feb 21 '14 at 09:03

1 Answer


If you are using a HW accelerated decoder, then the allocation on the output port of your component would have been based on a Native Window. In other words, the output buffer is basically a gralloc handle which has been passed by the Stagefright framework. (Ref: OMXCodec::allocateOutputBuffersFromNativeWindow). Hence, the MediaBuffer being returned shouldn't be interpreted as a plain YUV buffer.

In case of AwesomeLocalRenderer, the framework performs a software color conversion when mTarget->render is invoked as shown here. If you trace the code flow, you will find that the MediaBuffer content is directly interpreted as YUV buffer.
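The kind of skewed, smeared image shown in the question is typical of treating a stride-padded gralloc buffer as a tightly packed YUV plane: each row of the source carries extra alignment bytes that must be skipped. A minimal standalone sketch of a stride-aware plane copy (plain C++, no Android dependencies; `copyPlane` and the sizes are illustrative, not Stagefright APIs):

```cpp
#include <cstdint>
#include <cstring>

// Copy one plane from a buffer whose rows are padded to `srcStride` bytes
// (as gralloc allocations typically are) into a tightly packed destination
// of `width` bytes per row. Copying row by row skips the padding; a single
// memcpy of width*height bytes would drag the padding into the image and
// produce the diagonal "shearing" seen in the question.
void copyPlane(const uint8_t* src, int srcStride,
               uint8_t* dst, int width, int height) {
    for (int y = 0; y < height; ++y) {
        std::memcpy(dst + y * width, src + y * srcStride, width);
    }
}
```

For a full YUV420 frame the same copy is repeated for the chroma planes with halved width, height, and (usually) stride; the exact stride value has to come from the buffer's own metadata, not from the video width.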

For HW accelerated codecs, you should be employing AwesomeNativeWindowRenderer. If you have any special conditions for employing AwesomeLocalRenderer, please do highlight the same. I can refine this response appropriately.

P.S: For debug purposes, you could also refer to this question which captured the methods to dump the YUV data and analyze the same.
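Dumping the decoded frames to a file is the quickest way to separate decoder problems from rendering problems. A hedged sketch (`dumpFrame` is an illustrative helper, not a Stagefright API) that appends each frame to a raw file, which a YUV viewer can open once you enter the width, height, and pixel format by hand:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>

// Append one decoded frame to a raw dump file. If the dump looks correct in
// a YUV viewer, the decoder output is fine and the fault is in rendering;
// if the dump is already skewed, suspect stride or color-format handling.
bool dumpFrame(const char* path, const uint8_t* data, size_t size) {
    FILE* f = std::fopen(path, "ab");   // append: one file holds all frames
    if (f == nullptr) return false;
    size_t written = std::fwrite(data, 1, size, f);
    std::fclose(f);
    return written == size;
}
```

On device the path would typically be somewhere writable such as the app's data directory or /sdcard.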

Ganesh
  • Thanks very much for your reply. First, when I tried AwesomeNativeWindowRenderer, the native code crashed. Here is how I used it: `mVideoRenderer = new AwesomeNativeWindowRenderer(mNativeWindow, 0); mVideoRenderer->render(mVideoBuffer); // in the while loop`. This is the backtrace: `#00 pc 00000000 #01 pc 00005678 mylib.so (AwesomeNativeWindowRenderer::render(android::MediaBuffer*)+260)`. When I use addr2line, the error is on this line: `status_t err = mNativeWindow->queueBuffer(mNativeWindow.get(), buffer->graphicBuffer().get(), -1);`. The error is probably in the -1. – Arsen Davtyan Feb 21 '14 at 09:01
  • @user3215358.. In your code, how did you create `mNativeWindow`? The -1 is present in `AOSP` code and hence, I will not doubt it. Can you share your code for a review? `mNativeWindow` should point to a `Surface` or `SurfaceTextureClient` which internally has a `SurfaceTexture` attached to the same. My doubt is `mNativeWindow` may not be populated properly. – Ganesh Feb 21 '14 at 09:12
  • @Ganesh.. I'm getting mNativeWindow from a SurfaceView. Here is the code for how I get it: [link](http://pastebin.com/ZefHRW2z). – Arsen Davtyan Feb 21 '14 at 09:21
  • @Ganesh.. I'm using [this](https://github.com/roman10/android-ffmpeg-tutorial/blob/master/android-ffmpeg-tutorial02/jni/tutorial02.c) tutorial, and I'm changing the reading and decoding part. – Arsen Davtyan Feb 21 '14 at 09:26
  • @user3215358.. I will require some time to review.. but can you quickly enable logs in `BufferQueue` and check? – Ganesh Feb 21 '14 at 09:30
  • @Ganesh.. How do I enable the logs? Uncomment `//#define LOG_NDEBUG 0` and recompile? – Arsen Davtyan Feb 21 '14 at 09:48
  • @user3215358..Yes that's correct.. If you are enabling for `BufferQueue`, consider doing so for `SurfaceTextureClient` and `SurfaceTexture` also – Ganesh Feb 21 '14 at 10:27
  • @Ganesh.. I have added `#define LOG_NDEBUG 0`, but nothing changed. What else should I do to see the problem? – Arsen Davtyan Feb 21 '14 at 11:50
  • @user3215358.. Did you capture the logs in `logcat`? If so, can you share the same? – Ganesh Feb 21 '14 at 15:15
  • @Ganesh.. [Here](http://pastebin.com/FCYUxrRr) is the paste of the logcat, but I don't see any logs about BufferQueue. And I have a question: am I doing something wrong in my code? – Arsen Davtyan Feb 21 '14 at 15:50
  • @Ganesh.. In `queueBuffer`, the second parameter is `NULL`: `buffer->graphicBuffer()` is `NULL`. What can the problem be? – Arsen Davtyan Feb 25 '14 at 09:24
  • @user3215358.. Sorry I couldn't read your code. Can you please reshare your code and logcat? I can have a quick look – Ganesh Feb 25 '14 at 09:49
  • @Ganesh.. Now I'm using [this](https://vec.io/posts/use-android-hardware-decoder-with-omxcodec-in-ndk) example. For rendering I can use either local rendering or native-window rendering; what do you advise? [This](http://pastebin.com/iD81dSFY) is the main part of my code. [This](http://pastebin.com/WAS2DwDd) is the renderers. – Arsen Davtyan Feb 25 '14 at 10:04