
I'm working on an Android app that uses OpenCV.

I have an MP4 video file, and I need to read 300 frames of 1920x1080 from it and do some image-processing manipulation on them.

After a long search, this example is the only one I found.

What I need is simple: I just want to read the frames and either save them to device storage or convert them to an OpenCV matrix.

This is my attempt (explanation at the end):

public void run() {

        extractor = new MediaExtractor();
        extractor.setDataSource(SAMPLE);

        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime.startsWith("video/")) {
                extractor.selectTrack(i);
                decoder = MediaCodec.createDecoderByType(mime);
                decoder.configure(format, surface, null, 0);

                break;
            }
        }

        if (decoder == null) {
            Log.e("DecodeActivity", "Can't find video info!");
            return;
        }

        decoder.start();

        ByteBuffer[] inputBuffers = decoder.getInputBuffers();
        ByteBuffer[] outputBuffers = decoder.getOutputBuffers();

        BufferInfo info = new BufferInfo();

        boolean isEOS = false;
        long startMs = System.currentTimeMillis();

        while (!Thread.interrupted()) {
            if (!isEOS) {
                int inIndex = decoder.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer buffer = inputBuffers[inIndex];
                    int sampleSize = extractor.readSampleData(buffer, 0);
                    if (sampleSize < 0) {
                        Log.d("DecodeActivity", "InputBuffer BUFFER_FLAG_END_OF_STREAM");
                        decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        isEOS = true;
                    } else {
                        decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }

                }
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            switch (outIndex) {
            case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
                outputBuffers = decoder.getOutputBuffers();
                break;
            case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
                break;
            case MediaCodec.INFO_TRY_AGAIN_LATER:
                Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
                break;
            default:
                ByteBuffer buffer = outputBuffers[outIndex];

                Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + buffer);
                byte[] b = new byte[buffer.remaining()];

                // We use a very simple clock to keep the video FPS, or the video
                // playback will be too fast
                while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                    try {
                        sleep(10);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                        break;
                    }
                }
                decoder.releaseOutputBuffer(outIndex, true);
                break;
            }

            // All decoded frames have been rendered, we can stop playing now
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                Log.d("DecodeActivity", "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
                break;
            }
        }

        decoder.stop();
        decoder.release();
        extractor.release();

    }

In this example I read the frames and render them to a Surface.

What do I need to change in order to save each frame as a Bitmap/Mat, or save it to the device?
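For the ByteBuffer route (configuring the decoder with a null Surface so output buffers come back as raw YUV), here is a minimal sketch of turning an I420 frame into ARGB pixels that could feed `Bitmap.setPixels` or an OpenCV `Mat`. The class and method names are my own, and real devices may emit vendor-specific color formats instead of plain I420, so treat this as an illustration, not a drop-in solution:

```java
// Hypothetical helper: convert a planar YUV420 (I420) frame, as a decoder
// might place it in an output ByteBuffer, into packed ARGB pixels.
public class I420Converter {

    // An I420 frame occupies width*height*3/2 bytes: a full-resolution
    // Y plane followed by quarter-resolution U and V planes.
    public static int i420Size(int w, int h) {
        return w * h * 3 / 2;
    }

    public static int[] toArgb(byte[] i420, int w, int h) {
        int[] argb = new int[w * h];
        int ySize = w * h;
        int uOff = ySize;              // U plane starts after Y
        int vOff = ySize + ySize / 4;  // V plane starts after U
        for (int j = 0; j < h; j++) {
            for (int i = 0; i < w; i++) {
                int y = i420[j * w + i] & 0xff;
                // Chroma is subsampled 2x2, so four Y samples share one U/V.
                int cIdx = (j / 2) * (w / 2) + (i / 2);
                int u = (i420[uOff + cIdx] & 0xff) - 128;
                int v = (i420[vOff + cIdx] & 0xff) - 128;
                int r = clamp(y + (int) (1.402f * v));
                int g = clamp(y - (int) (0.344f * u + 0.714f * v));
                int b = clamp(y + (int) (1.772f * u));
                argb[j * w + i] = 0xff000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private static int clamp(int x) {
        return x < 0 ? 0 : (x > 255 ? 255 : x);
    }
}
```

The resulting `int[]` can go into `Bitmap.createBitmap(argb, w, h, Bitmap.Config.ARGB_8888)`, or the raw I420 bytes can be wrapped directly in a single-channel `Mat` of `height*3/2` rows and converted with `Imgproc.cvtColor(..., Imgproc.COLOR_YUV2BGR_I420)`, which avoids the hand-rolled math above.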

Thanks

Itay
  • Did you see http://bigflake.com/mediacodec/#ExtractMpegFramesTest ? – fadden Dec 03 '13 at 15:49
  • Yes, but I don't understand from that example how to get what I want. I want to know whether I can get the frames (not as a Surface) from the example above, or whether I need a different example. Thanks – Itay Dec 04 '13 at 05:33
  • The ExtractMpegFramesTest converts the first 10 frames from Surface to Bitmap, and then saves each to disk as a PNG. How did you create the Surface used in your code? – fadden Dec 04 '13 at 06:00
  • Thanks for the direction, I'm going to check this. In this line I configure Surface decoder.configure(format, surface, null, 0); – Itay Dec 04 '13 at 06:18
  • I'm trying to call to this class from my Main activity like this: try { ExtractMpegFramesTest etmf = new ExtractMpegFramesTest(); etmf.testExtractMpegFrames(); } catch (Throwable e) { // TODO Auto-generated catch block e.printStackTrace(); } but it doesn't work. I've got an empty bitmaps. what do I miss? I need to do a pre configuration? – Itay Dec 04 '13 at 12:35
  • Check the logcat output for errors, or step through it with a debugger to see what it's doing. Setting `VERBOSE` to true may be useful. – fadden Dec 04 '13 at 15:40
  • The code example gets stuck at this function: outputSurface.awaitNewImage(); Is all of the example code recommended, or is there another, easier example? – Itay Dec 05 '13 at 07:22
  • Nothing works for me! I'm trying really hard to understand what goes on there, but I've got nothing. ExtractMPEGFramesTest.java doesn't work for me at all. If I figure out how to do this, I'm going to write a tutorial for the next generation :) – Itay Dec 05 '13 at 11:36
  • Is the OnFrameAvailable callback firing? If not, make sure the test code is running in its own thread (see the note above ExtractMpegFramesWrapper for why this is necessary). – fadden Dec 05 '13 at 15:32
  • It reaches the "set" of onFrameAvailable, but it never gets inside that event. You can see above how I call the "testExtractMpegFrames" function. – Itay Dec 08 '13 at 08:57

1 Answer


I see two paths for your code:

  1. For Android 4.3 and up, you can use the examples from Grafika, as fadden suggested. The MediaCodec decoder should be given a surface in the configure method, and releaseOutputBuffer should be called with the render boolean set to true so that rendering goes to that surface. You can then perform your manipulations with shaders while rendering to a second surface that feeds a MediaCodec encoder, which encodes the result back to video. This solution is fast on most devices, with some exceptions, but it is new and bugs still show up in device hardware.
  2. If you can accept a slower solution, OpenCV has a nice Android port/integration, with code samples and all, even encoding back to H.264 using ffmpeg. For decoding you can still use MediaCodec (if you render the results to a surface, you will need glReadPixels, which is slow, to get the data back to the CPU). This also works on earlier versions of Android.
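One detail of the glReadPixels path in option 2 that trips people up: GL returns rows bottom-up, while Bitmap and OpenCV expect the top row first, so the pixel buffer must be flipped vertically. A minimal sketch of just that flip (the flip itself is plain array shuffling; the GL call is only indicated in the comment):

```java
// Hypothetical helper for the glReadPixels route. After
// glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buf) fills the
// buffer with the bottom row first, reverse the row order so the result
// matches the top-down layout that Bitmap and OpenCV use.
public class GlPixelFlip {
    public static int[] flipVertically(int[] pixels, int w, int h) {
        int[] out = new int[w * h];
        for (int row = 0; row < h; row++) {
            // Row `row` from the bottom-up image becomes row h-1-row.
            System.arraycopy(pixels, row * w, out, (h - 1 - row) * w, w);
        }
        return out;
    }
}
```

The flipped array can then be handed to `Bitmap.createBitmap(out, w, h, Bitmap.Config.ARGB_8888)` or wrapped in a `CV_8UC4` Mat; note that GL's RGBA channel order may also need swapping depending on the consumer.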

Either way, you will need time, patience, and energy, as it won't go without some struggle.

bog