
I am trying to use MediaCodec to encode a series of images, stored as byte arrays in a file, into a video file. I have tested these images on a SurfaceView (playing them in sequence) and I can see them fine. I have looked at many examples using MediaCodec, and here is what I understand (please correct me if I am wrong):

Get InputBuffers from MediaCodec object -> fill it with your frame's image data -> queue the input buffer -> get coded output buffer -> write it to a file -> increase presentation time and repeat
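One unit detail worth pinning down in that pipeline: MediaCodec presentation times are expressed in microseconds, so at a fixed frame rate each frame's timestamp is its index scaled by 1,000,000 / fps. A minimal arithmetic sketch (the `fps` value is just an example):

```java
public class PtsCalc {
    // Presentation time in microseconds for frame n at a fixed frame rate.
    static long ptsUs(int frameIndex, int fps) {
        return frameIndex * 1_000_000L / fps;
    }

    public static void main(String[] args) {
        int fps = 10; // example rate, matching KEY_FRAME_RATE below
        for (int i = 0; i < 3; i++) {
            // frame 1 at 10 fps -> 100000 us, i.e. 100 ms apart
            System.out.println("frame " + i + " -> " + ptsUs(i, fps) + " us");
        }
    }
}
```

So at 10 fps consecutive frames should be 100,000 µs apart, not 100.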

However, I have tested this a lot and I end up with one of two cases:

  • All the sample projects I tried to imitate caused the media server to die when queueInputBuffer was called for the second time.
  • I tried calling codec.flush() at the end (after saving the output buffer to file, although none of the examples I saw did this) and the media server did not die; however, I cannot open the output video file with any media player, so something is wrong.

Here is my code:

        MediaCodec codec = MediaCodec.createEncoderByType(MIMETYPE);
        MediaFormat mediaFormat = null;
        if(CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)){
            mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 1280 , 720);
        } else {
            mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 720, 480);
        }


        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 700000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 10);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        ByteBuffer[] outputBuffers = codec.getOutputBuffers();
        boolean sawInputEOS = false;
        int inputBufferIndex=-1,outputBufferIndex=-1;
        BufferInfo info=null;

        while(true){
            //loop to read YUV byte array from file into dat[], setting bytesread

            inputBufferIndex = codec.dequeueInputBuffer(WAITTIME);
            if(bytesread<=0)sawInputEOS=true;

            if(inputBufferIndex >= 0){
                if(!sawInputEOS){
                    int samplesiz=dat.length;
                    inputBuffers[inputBufferIndex].put(dat);
                    codec.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
                    presentationTime += 100;

                    info = new BufferInfo();
                    outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);
                    Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
                    if(outputBufferIndex >= 0){
                        byte[] array = new byte[info.size];
                        outputBuffers[outputBufferIndex].get(array);

                        if(array != null){
                            try {
                                dos.write(array);
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                        }

                        codec.releaseOutputBuffer(outputBufferIndex, false);
                        inputBuffers[inputBufferIndex].clear();
                        outputBuffers[outputBufferIndex].clear();

                        if(sawInputEOS) break;
                    }
                }else{
                    codec.queueInputBuffer(inputBufferIndex, 0, 0, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);

                    info = new BufferInfo();
                    outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);

                    if(outputBufferIndex >= 0){
                        byte[] array = new byte[info.size];
                        outputBuffers[outputBufferIndex].get(array);

                        if(array != null){
                            try {
                                dos.write(array);
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                        }

                        codec.releaseOutputBuffer(outputBufferIndex, false);
                        inputBuffers[inputBufferIndex].clear();
                        outputBuffers[outputBufferIndex].clear();
                        break;
                    }
                }


            }
        }

        codec.flush();

        try {
            dos.flush();
            dos.close();
            fstream2.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        codec.stop();
        codec.release();
        codec = null;

        return true;

    }

My question is: how can I get a working video from a stream of images using MediaCodec? What am I doing wrong?

Another question (if I am not being too greedy): I would like to add an audio track to this video. Can it be done with MediaCodec as well, or must I use FFmpeg?

Note: I know about MediaMuxer in Android 4.3; however, it is not an option for me, as my app must work on Android 4.1+.

Update: Thanks to fadden's answer, I was able to reach EOS without the media server dying (the code above is after modification). However, the file I am getting produces gibberish. Here is a snapshot of the video I get (it only plays as a .h264 file).

[Screenshot: garbled video output]

My input image format is YUV (NV21 from the camera preview). I can't get the output into any playable format. I tried all the COLOR_FormatYUV420 formats and got the same gibberish output. And I still can't find a way (using MediaCodec) to add audio.

Mohamed_AbdAllah
  • Hey Mohamed, buddy, can you make a blog with example code for all this work? I don't find anyone who has made this much progress to date. – Harpreet Dec 02 '13 at 10:11
  • @Harpreet Sure, I am planning to do so. There are many working examples on stackoverflow and on Github, however, I agree that a blog is needed for more clear documentation. – Mohamed_AbdAllah Dec 02 '13 at 11:29
  • I tried to use every bit of code available, but each time it gets an `IllegalStateException` at either `MediaCodec.CONFIGURE_FLAG_ENCODE` or at `codec.getOutputBuffers()`. I can't find any valid reason for this to happen when you have the same code running successfully. – Harpreet Dec 02 '13 at 11:44
  • The supported Color Formats differ from one phone to another. You can post a question on stackoverflow and I will help if I can – Mohamed_AbdAllah Dec 02 '13 at 11:46
  • Hi Mohamed, I want to do the same thing and I just get a similar gibberish result. Have you managed to produce the right video? I think it's related to the MediaFormat. If a separate question is needed, I am happy to do so. – Fanglin Dec 15 '13 at 17:51
  • Yes, I was able to encode the video correctly. However, for adding audio I had to use the Jcodec and MP4Parser APIs (since I had to develop for Android < 4.3 and could not use MediaMuxer). I faced 3 kinds of problems that produced this kind of gibberish: one happened when using a KEY_COLOR_FORMAT that needed the YUV frames adjusted; the second when I used a width x height for the encoder that did not match the camera output; the third when I passed the wrong buffer size to the encoder. Correcting these problems produced correct output. – Mohamed_AbdAllah Dec 15 '13 at 21:36
  • OK, I finally closed the loop, thanks to this SO question. For those who want a muxer below 4.3 to encapsulate raw H.264 into an mp4 container, take a look here: http://stackoverflow.com/a/23000620/727768 – X.Y. Apr 10 '14 at 23:27
  • Hi Mohamed_AbdAllah, I am also facing a similar problem. I have encoded the video from frames (provided by onPreviewFrame()), but I am not able to play the resulting video (I have tested on a Samsung Galaxy S4 - Android 4.4.2). Any help or guidance will be very helpful. – abhishek kumar gupta Oct 08 '14 at 15:44
  • Hi @mohamed-abdallah, can you post working code? – Uma Achanta Jan 23 '17 at 13:17
  • Have you solved your problem? If yes, then please share your answer with me; this is a big issue for me. Thanks. – Abhishek Bhardwaj Aug 26 '17 at 15:58

1 Answer


I think you have the right general idea. Some things to be aware of:

  • Not all devices support COLOR_FormatYUV420SemiPlanar. Some only accept planar. (Android 4.3 introduced CTS tests to ensure that the AVC codec supports one or the other.)
  • It's not the case that queueing an input buffer will immediately result in the generation of one output buffer. Some codecs may accumulate several frames of input before producing output, and may produce output after your input has finished. Make sure your loops take that into account (e.g. your inputBuffers[].clear() will blow up if it's still -1).
  • Don't try to submit data and send EOS with the same queueInputBuffer call. The data in that frame may be discarded. Always send EOS with a zero-length buffer.
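The drain pattern those bullets describe, reduced to its control flow: keep pulling output buffers, independently of input, until the EOS flag appears, and treat a "no buffer yet" return as "try again", not "done". This sketch uses a hypothetical `Encoder` stub in place of the real `MediaCodec` API (so it runs anywhere), but the loop shape is the same:

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class DrainLoop {
    static final int FLAG_EOS = 4; // same value as MediaCodec.BUFFER_FLAG_END_OF_STREAM

    // Hypothetical stand-in for MediaCodec: may return -1 (no output yet) and
    // may emit several outputs after input has finished.
    interface Encoder {
        int dequeueOutputBuffer(int[] flagsOut, long timeoutUs);
    }

    // Keep draining until the encoder sets the EOS flag; an attempt that
    // returns -1 just means "no output available yet".
    static int drain(Encoder enc) {
        int frames = 0;
        int[] flags = new int[1];
        while (true) {
            int index = enc.dequeueOutputBuffer(flags, 10_000);
            if (index >= 0) {
                frames++; // real code: write outputBuffers[index], then releaseOutputBuffer()
                if ((flags[0] & FLAG_EOS) != 0) break;
            }
            // index < 0: loop again rather than giving up
        }
        return frames;
    }

    public static void main(String[] args) {
        // Stub script: no output twice, then three buffers, the last flagged EOS.
        Queue<int[]> script = new ArrayDeque<>();
        script.add(new int[]{-1, 0});
        script.add(new int[]{-1, 0});
        script.add(new int[]{0, 0});
        script.add(new int[]{1, 0});
        script.add(new int[]{2, FLAG_EOS});
        Encoder enc = (flagsOut, timeoutUs) -> {
            int[] step = script.poll();
            flagsOut[0] = step[1];
            return step[0];
        };
        System.out.println("drained " + drain(enc) + " frames"); // drained 3 frames
    }
}
```

Note how the per-frame bookkeeping (e.g. `inputBuffers[inputBufferIndex].clear()`) must only happen on the `index >= 0` path, which is exactly the failure mode the second bullet warns about.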

The output of the codecs is generally pretty "raw", e.g. the AVC codec emits an H.264 elementary stream rather than a "cooked" .mp4 file. Many players won't accept this format. If you can't rely on the presence of MediaMuxer you will need to find another way to cook the data (search around on stackoverflow for ideas).
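One quick sanity check on that raw output: an Annex-B H.264 elementary stream is just a sequence of NAL units separated by 0x000001 start codes (often with a leading extra zero byte), so counting start codes tells you whether the encoder is emitting something stream-shaped at all. A minimal sketch, assuming Annex-B framing:

```java
public class NalScan {
    // Count Annex-B NAL start codes (0x000001, possibly preceded by an extra 0x00).
    static int countNalUnits(byte[] stream) {
        int count = 0;
        for (int i = 0; i + 2 < stream.length; i++) {
            if (stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1) {
                count++;
                i += 2; // skip past this start code
            }
        }
        return count;
    }
}
```

A healthy stream starts with SPS and PPS NAL units followed by slice data; a file with no start codes at all means something upstream is broken.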

It's certainly not expected that the mediaserver process would crash.

You can find some examples and links to the 4.3 CTS tests here.

Update: As of Android 4.3, MediaCodec and Camera have no ByteBuffer formats in common, so at the very least you will need to fiddle with the chroma planes. However, that sort of problem manifests very differently (as shown in the images for this question).

The image you added looks like video, but with stride and/or alignment issues. Make sure your pixels are laid out correctly. In the CTS EncodeDecodeTest, the generateFrame() method (line 906) shows how to encode both planar and semi-planar YUV420 for MediaCodec.
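For an NV21 camera frame specifically (the questioner's input), the Y plane can be copied as-is, but the interleaved chroma must be rearranged: each V,U pair is swapped for semi-planar (NV12) input, and de-interleaved into separate planes for planar (I420) input. A minimal sketch of both conversions (plain byte shuffling, no Android APIs, and no stride padding handled):

```java
public class Nv21Convert {
    // NV21 layout: Y plane (w*h bytes), then interleaved V,U pairs (w*h/2 bytes).

    // NV21 -> NV12 (COLOR_FormatYUV420SemiPlanar): swap each V,U pair to U,V.
    static byte[] toNv12(byte[] nv21, int w, int h) {
        byte[] out = new byte[nv21.length];
        int ySize = w * h;
        System.arraycopy(nv21, 0, out, 0, ySize);
        for (int i = ySize; i < nv21.length; i += 2) {
            out[i] = nv21[i + 1];   // U
            out[i + 1] = nv21[i];   // V
        }
        return out;
    }

    // NV21 -> I420 (COLOR_FormatYUV420Planar): de-interleave into U plane, then V plane.
    static byte[] toI420(byte[] nv21, int w, int h) {
        byte[] out = new byte[nv21.length];
        int ySize = w * h;
        int qSize = ySize / 4;
        System.arraycopy(nv21, 0, out, 0, ySize);
        for (int i = 0; i < qSize; i++) {
            out[ySize + i] = nv21[ySize + 2 * i + 1];       // U plane
            out[ySize + qSize + i] = nv21[ySize + 2 * i];   // V plane
        }
        return out;
    }
}
```

Pick the variant matching the KEY_COLOR_FORMAT the device's encoder actually supports (queryable via MediaCodecInfo.getCapabilitiesForType()); feeding NV21 chroma ordering to an encoder expecting either of these formats produces exactly the swapped-color gibberish shown above.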

The easiest way to avoid the format issues is to move the frames through a Surface (like the CameraToMpegTest sample), but unfortunately that's not possible in Android 4.1.

fadden
  • I have tested Planar as well, with the same output. For the second point, how do I know when to use the output buffer? Also, I tried renaming the file to .h264 and it did not work either. And is it possible to add audio to the output using MediaCodec? – Mohamed_AbdAllah Sep 14 '13 at 01:06
  • H.264 elementary streams only represent the video. You'd need the MediaMuxer class (from 4.3) or equivalent external library to combine audio. Use the output buffer whenever one is returned; just continue to do so until you see the explicit EOS flag. – fadden Sep 14 '13 at 05:48
  • I got the video to produce an output (thanks to your comments), however, it is gibberish. Please see my update. – Mohamed_AbdAllah Sep 15 '13 at 20:43
  • Thank you very much :) I am almost there. I got the video to work finally (.h264 format with correct colors), but the frame rate is too fast. I tried playing with presentationTime, but as if it was not there. I am also checking jCodec & Mp4Parser and I think they can do what I want for the Audio, but their documentation is very bad. – Mohamed_AbdAllah Sep 16 '13 at 23:39
  • I don't believe the H.264 video stream includes embedded presentation time stamps. They only get passed through the `MediaCodec` encoder API because the encoder might choose to reorder the frames -- you need to maintain the pairing of encoded data frame and time stamp yourself. The way this works with `MediaMuxer` is the `writeSampleData()` method takes a `MediaCodec.BufferInfo` argument, which includes the presentation time, so if you feed the data and `BufferInfo` directly to `MediaMuxer` you get properly time-stamped .mp4. – fadden Sep 17 '13 at 06:06
  • Can I use a Surface for recording frames with MediaCodec, like with MediaRecorder? https://stackoverflow.com/questions/51332386/mediarecorder-and-videosource-surface-stop-failed-1007-a-serious-android-bug – user25 Jul 14 '18 at 08:31
  • also `CameraToMpegTest` sample doesn't work: `W/System.err: java.lang.IllegalStateException: Can't stop due to wrong state. at android.media.MediaMuxer.stop(MediaMuxer.java:242)` – user25 Jul 14 '18 at 08:36
  • @fadden Can you please help me for this **[Question](https://stackoverflow.com/questions/53312998/android-grafika-continuouscapture-activity-issues)** – Ali Nov 16 '18 at 13:45