
What would be the best way to calculate the decode time of a frame decoded by MediaCodec? The execution time of the code below is clearly not the correct time. Is there any way to know how long each frame (or group of frames) took to decode?

Thanks.

startTime = ...;  // record start time

inIndex = dequeueInputBuffer(...);
inputBuffer = getInputBuffer(inIndex);
// copy frame into the input buffer
queueInputBuffer(inIndex, ...);

outIndex = dequeueOutputBuffer(...);
releaseOutputBuffer(outIndex, ...);

stopTime = ...;   // record stop time
execTime = stopTime - startTime;

1 Answer


It's difficult to get a meaningful measurement of the time required to decode a single frame, because you'll be measuring latency as well as throughput. Data has to be passed from the app to the mediaserver process, into the driver, decoded, and then the decoded data has to make the same journey in reverse. There can be additional pipelining in the driver itself.

You can get a reasonable approximation by decoding a few hundred frames and then dividing the total time by the number of frames.
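To make that concrete, here is a rough sketch of that kind of measurement in Java (illustrative only, not from the original post; the same structure applies to the NDK AMediaCodec calls). It assumes `codec` is an already configured and started MediaCodec and `extractor` is a MediaExtractor positioned at the video track; error handling is omitted.

    // Illustrative sketch: average decode time over a whole clip.
    // Requires android.media.MediaCodec, android.media.MediaExtractor,
    // java.nio.ByteBuffer.  `codec` must be configured and started,
    // `extractor` positioned at the video track.
    static double averageDecodeTimeMs(MediaCodec codec, MediaExtractor extractor) {
        final long TIMEOUT_US = 10_000;
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false, outputDone = false;
        int framesDecoded = 0;

        long startNs = System.nanoTime();
        while (!outputDone) {
            // Keep the input side stuffed; don't wait for one output per input.
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(TIMEOUT_US);
                if (inIndex >= 0) {
                    ByteBuffer inBuf = codec.getInputBuffer(inIndex);
                    int size = extractor.readSampleData(inBuf, 0);
                    if (size < 0) {
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            // Drain whatever output is ready.
            int outIndex = codec.dequeueOutputBuffer(info, TIMEOUT_US);
            if (outIndex >= 0) {
                if (info.size > 0) {
                    framesDecoded++;
                }
                codec.releaseOutputBuffer(outIndex, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            }
        }
        long totalMs = (System.nanoTime() - startNs) / 1_000_000;
        return totalMs / (double) framesDecoded;
    }

It can also help to skip the first handful of frames, since codec startup tends to skew the average.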

What is it you're trying to accomplish?

fadden
  • Thanks for the response. I am trying to do real-time video streaming (the code above is inside the NDK). I just needed to see how long decoding takes so I can measure latency. I am not sure whether the start and stop timers are on the correct lines, given how the input is queued. – MyNameisAwesome May 06 '16 at 14:27
    The time when input is queued shouldn't matter -- you want to keep the input stuffed as full as possible. Remember that some video formats allow encoded frames to appear out of order... if you try to feed one frame at a time, it would stall completely. You manage the pacing of the video at the point when the output buffer is released. Some additional notes about latency can be found in http://stackoverflow.com/questions/21440820/ – fadden May 06 '16 at 14:54
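As a footnote to that last comment: when decoding to a Surface on API 21+, the pacing can be handed to releaseOutputBuffer() itself by passing a target render time. A minimal sketch (editorial illustration, not from the comment), with `info` being the BufferInfo from dequeueOutputBuffer() and `playbackStartNs` an assumed System.nanoTime() value captured when playback began:

    // Render this frame when its presentation time comes up, measured
    // against System.nanoTime().  `playbackStartNs` is assumed to be the
    // System.nanoTime() value recorded at the start of playback.
    long renderTimeNs = playbackStartNs + info.presentationTimeUs * 1000L;
    codec.releaseOutputBuffer(outIndex, renderTimeNs);  // timestamp variant (API 21+)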