
I have tried to use MediaCodec to decode 1080p H.264 raw data, but I found the latency is between 45 ms and 65 ms on my Sony Z3 (Android 5.1.1). Is it possible to reduce the latency? My stream is IPPP with a GOP size of 15. Is there any H.264 SPS flag that would affect the latency?
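
For reference, the decode loop is essentially the standard synchronous MediaCodec pattern. A simplified sketch follows (the full code is in the linked project; exception handling and imports are omitted, and nextAccessUnit() / surface are placeholders for my real frame reader and SurfaceView surface):

    // Simplified sketch of the decode loop; not the exact project code.
    MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
    decoder.configure(MediaFormat.createVideoFormat("video/avc", 1920, 1080),
            surface, null, 0);
    decoder.start();

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    long ptsUs = 0;
    long queuedAtMs = 0;

    while (!Thread.interrupted()) {
        int inIndex = decoder.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            ByteBuffer inBuf = decoder.getInputBuffers()[inIndex];  // pre-API-21 style
            byte[] frame = nextAccessUnit();                        // one H.264 access unit from res/raw
            inBuf.clear();
            inBuf.put(frame);
            queuedAtMs = SystemClock.elapsedRealtime();
            decoder.queueInputBuffer(inIndex, 0, frame.length, ptsUs, 0);
            ptsUs += 1000000 / 30;                                  // assuming a 30 fps stream
        }

        int outIndex = decoder.dequeueOutputBuffer(info, 10000);
        if (outIndex >= 0) {
            // The 45-65 ms figure is the time from queueing the most recent
            // access unit until the decoded frame is released to the Surface.
            Log.e("DecodeActivity", "latency "
                    + (SystemClock.elapsedRealtime() - queuedAtMs) + " ms");
            decoder.releaseOutputBuffer(outIndex, true);
        }
    }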

I have another question as well: how can I change the orientation of the Surface so that the frame is shown rotated by 90, 180, or 270 degrees?

I put my test project source at the following Google Drive link; the H.264 raw data is in test_code\res\raw. The logcat below is from a test run.

https://drive.google.com/file/d/0B688fdS1LxF4REtra0YteDh6TkE/view?usp=sharing

02-16 18:05:52.804: I/Process(10590): Sending signal. PID: 10590 SIG: 9
02-16 18:05:54.618: W/ResourceType(10706): Found multiple library tables, ignoring...
02-16 18:05:54.677: D/OpenGLRenderer(10706): Use EGL_SWAP_BEHAVIOR_PRESERVED: true
02-16 18:05:54.684: D/Atlas(10706): Validating map...
02-16 18:05:54.717: I/Adreno-EGL(10706): <qeglDrvAPI_eglInitialize:410>: EGL 1.4 QUALCOMM build: AU_LINUX_ANDROID_LA.BF.1.1.1_RB1.05.01.00.042.030_msm8974_LA.BF.1.1.1_RB1__release_AU ()
02-16 18:05:54.717: I/Adreno-EGL(10706): OpenGL ES Shader Compiler Version: E031.25.03.06
02-16 18:05:54.717: I/Adreno-EGL(10706): Build Date: 07/13/15 Mon
02-16 18:05:54.717: I/Adreno-EGL(10706): Local Branch: mybranch11906725
02-16 18:05:54.717: I/Adreno-EGL(10706): Remote Branch: quic/LA.BF.1.1.1_rb1.26
02-16 18:05:54.717: I/Adreno-EGL(10706): Local Patches: NONE
02-16 18:05:54.717: I/Adreno-EGL(10706): Reconstruct Branch: AU_LINUX_ANDROID_LA.BF.1.1.1_RB1.05.01.00.042.030 + 6151be1 + a1e0343 + 002d7d6 + 7d0e3f7 +  NOTHING
02-16 18:05:54.718: I/OpenGLRenderer(10706): Initialized EGL, version 1.4
02-16 18:05:54.733: D/OpenGLRenderer(10706): Enabling debug mode 0
02-16 18:05:54.800: I/Timeline(10706): Timeline: Activity_idle id: android.os.BinderProxy@1970952c time:27341192
02-16 18:05:56.804: I/OMXClient(10706): Using client-side OMX mux.
02-16 18:05:56.819: D/MediaCodec(10706): MediaCodec[kWhatConfigure]: video-output-protection: 00000000, audio-output-protection: 00000000
02-16 18:05:56.821: I/ACodec(10706): [OMX.qcom.video.decoder.avc] DRC Mode: Dynamic Buffer Mode
02-16 18:05:56.827: I/ExtendedCodec(10706): Decoder will be in frame by frame mode
02-16 18:05:56.830: D/ACodec(10706): Found video-output-protection flags set to 00000000
02-16 18:05:56.845: E/(10706): inputBuffers.size:4
02-16 18:05:56.845: E/(10706): outputBuffers.size:23
02-16 18:05:56.858: E/DecodeActivity(10706): dequeueOutputBuffer timed out!
02-16 18:05:56.870: E/DecodeActivity(10706): dequeueOutputBuffer timed out!
02-16 18:05:56.883: E/DecodeActivity(10706): dequeueOutputBuffer timed out!
02-16 18:05:56.884: E/DecodeActivity(10706): INFO_OUTPUT_BUFFERS_CHANGED
02-16 18:05:56.892: E/DecodeActivity(10706): New format {mime=video/raw, crop-top=0, crop-right=1919, slice-height=1088, color-format=2141391876, height=1088, width=1920, what=1869968451, crop-bottom=1079, crop-left=0, stride=1920}
02-16 18:05:56.898: E/DecodeActivity(10706): Receive first decode frame after 51 ms
Weian
  • Possibly related: http://stackoverflow.com/questions/21440820/how-to-reduce-latency-in-mediacodec-video-avc-decoding/ – fadden Feb 16 '16 at 17:01
  • Hi fadden, I have checked that question thread before. It seems MediaCodec needs to be fed several frames before it starts to decode. But I want to use MediaCodec for live streaming, so I can't feed those initial frames in quickly. That's why I added this question; maybe someone has faced the same problem and has a tip to reduce the latency. – Weian Feb 17 '16 at 08:39
  • Did you solve it? – kar May 07 '19 at 14:15

1 Answer


I see in your code that you pass 10000 µs as the timeout when waiting for an input buffer. That's a short amount of time.

    while (!Thread.interrupted()) {
        if (!isEOS) {
            int inIndex = decoder.dequeueInputBuffer(10000);
In dequeueInputBuffer, use a much longer timeout, for example ten seconds (10000000 µs), so the call can wait longer for a free input buffer. Passing -1 as the timeout blocks until an input buffer becomes available.
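
For illustration, keeping the same names as the snippet above:

    // Wait up to 10 s (10,000,000 µs) for a free input buffer:
    int inIndex = decoder.dequeueInputBuffer(10000000L);

    // Or block until an input buffer becomes available:
    // int inIndex = decoder.dequeueInputBuffer(-1);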

You should probably handle connectivity problems outside of your feeding loop, and have your own helper buffer the initial data before the feeding loop starts.
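
Something along these lines could work. This is a rough sketch only; onFrameReceived(), startWhenPrimed() and startFeedingLoop() are hypothetical names rather than anything from your project, and the queue is a java.util.concurrent.LinkedBlockingQueue:

    // Rough sketch: collect a few access units from the network before the
    // feeding loop starts, so the decoder is not starved right after start().
    private final BlockingQueue<byte[]> pending = new LinkedBlockingQueue<>();

    void onFrameReceived(byte[] accessUnit) {     // called by your network layer
        pending.offer(accessUnit);
    }

    void startWhenPrimed(int minFrames) throws InterruptedException {
        while (pending.size() < minFrames) {      // e.g. wait for 3-4 frames
            Thread.sleep(5);
        }
        startFeedingLoop();                       // your existing loop, now draining 'pending'
    }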

Léon Pelletier
  • Hi Léon, I don't think enlarging the timeout can shorten the latency; it will just block the thread in that function for a long time. – Weian Mar 02 '16 at 03:09
  • This was more about your comment to fadden: you said "I can't feed the initial frames in quickly". This setting is a timeout rather than a latency, so if your device is faster than the timeout, changing it won't change anything. 1080p does seem like a huge input to handle, though. Anyway, this looks very similar to the existing question about reducing latency when decoding. If the problem is just the first frame, then I can't help much, since I don't know a lot about micro-optimisation with MediaCodec, which is already very performant. – Léon Pelletier Mar 02 '16 at 22:14
  • OK, thanks for your reply. Do you know how to change the orientation of the Surface? I want to show the frame at different angles. – Weian Mar 08 '16 at 11:17
  • It depends on how you are designing your app. Are you using OpenGL ES and shaders? If so, search for how to rotate the result in your vertex shader (a non-shader sketch follows these comments). – Léon Pelletier Mar 08 '16 at 14:34
  • No, I didn't use OpenGL in this case. I output the MediaCodec result directly to the Surface. – Weian Mar 09 '16 at 13:26
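
For the direct-to-Surface case, here is a possible non-OpenGL sketch. It is not verified on the Z3: whether the decoder honours the "rotation-degrees" format key varies by device and Android version, and textureView is a hypothetical replacement for the SurfaceView.

    // Option A: ask the decoder to rotate when rendering to the Surface.
    // The public constant MediaFormat.KEY_ROTATION only exists from API 23,
    // but the string key can be set directly; support depends on the decoder.
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1920, 1080);
    format.setInteger("rotation-degrees", 90);    // 90, 180 or 270
    decoder.configure(format, surface, null, 0);

    // Option B: render into a TextureView instead of a SurfaceView and
    // rotate the view itself (for 90/270 you also have to swap/scale the
    // view's dimensions yourself).
    textureView.setRotation(90f);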