
I have a list of bitmap files on my SD card, and I want to create a video from them using MediaCodec. I have checked the MediaCodec documentation but could not find a way to do this, and I don't want to use FFmpeg. I have tried the code below. Any help would be appreciated!

protected void MergeVideo() throws IOException {
    MediaCodec mMediaCodec = MediaCodec.createEncoderByType("video/avc");
    MediaFormat mMediaFormat = MediaFormat.createVideoFormat("video/avc", 320, 240);
    mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
    mMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    mMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
    mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    mMediaCodec.configure(mMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    mMediaCodec.start();
    ByteBuffer[] mInputBuffers = mMediaCodec.getInputBuffers();

    //for (int i = 0; i < 50; i++) {
    int i = 0;
    int j = String.valueOf(i).length() < 1 ? Integer.parseInt("0" + i) : i;
    File imagesFile = new File(Environment.getExternalStorageDirectory() + "/VIDEOFRAME/", "frame-" + j + ".png");

    Bitmap bitmap = BitmapFactory.decodeFile(imagesFile.getAbsolutePath());
    ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
    bitmap.compress(Bitmap.CompressFormat.PNG, 100, byteArrayOutputStream); // image is the bitmap
    byte[] input = byteArrayOutputStream.toByteArray();

    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = mInputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(input);
        mMediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
    }
}
Bhoomika Brahmbhatt

1 Answer


You're missing a few pieces. The answer to this question has some of the information you need, but it was written for someone specifically wanting support in API 16. If you're willing to target API 18 and later, your life will be easier.

The biggest problem with what you have is that MediaCodec input from a ByteBuffer is always in uncompressed YUV format, but you seem to be passing compressed PNG images in. You will need to convert the bitmap to YUV. The exact layout and best method for doing this varies between devices (some use planar, some use semi-planar), but you can find code for doing so. Or just look at the way frames are generated in the buffer-to-buffer parts of EncodeDecodeTest.
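As an illustration, here is a minimal sketch of a planar (I420) conversion that operates on the `int[]` you would get from `Bitmap.getPixels()`. The class and method names here are my own, and the coefficients are the common BT.601 integer approximation; the exact plane layout your device's encoder expects may differ, so check the codec's supported color formats first:

```java
// Sketch: convert ARGB_8888 pixels (e.g. from Bitmap.getPixels()) to I420
// (COLOR_FormatYUV420Planar). Assumes even width and height. Class/method
// names are illustrative, not from any Android API.
public class YuvConverter {
    public static byte[] argbToI420(int[] argb, int width, int height) {
        byte[] yuv = new byte[width * height * 3 / 2];
        int yIndex = 0;
        int uIndex = width * height;              // U plane follows the Y plane
        int vIndex = uIndex + width * height / 4; // V plane follows the U plane
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int c = argb[row * width + col];
                int r = (c >> 16) & 0xFF, g = (c >> 8) & 0xFF, b = c & 0xFF;
                int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                yuv[yIndex++] = (byte) Math.max(0, Math.min(255, y));
                // Chroma is subsampled: one U and one V sample per 2x2 block
                if (row % 2 == 0 && col % 2 == 0) {
                    int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                    int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                    yuv[uIndex++] = (byte) Math.max(0, Math.min(255, u));
                    yuv[vIndex++] = (byte) Math.max(0, Math.min(255, v));
                }
            }
        }
        return yuv;
    }
}
```

You would then `put()` the resulting byte array into the codec's input buffer instead of the compressed PNG bytes.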

Alternatively, use Surface input to the MediaCodec. Attach a Canvas to the input surface and draw the bitmap on it. The EncodeAndMuxTest does essentially this, but with OpenGL ES.

One potential issue is that you're passing in 0 for the frame timestamps. You should pass a real (generated) timestamp in, so that the value gets forwarded to MediaMuxer along with the encoded frame.
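As a sketch (the helper name is mine, not from the Android API), the timestamp for frame `i` at a fixed frame rate is just:

```java
// Sketch: presentation timestamp in microseconds for frame i at a fixed
// frame rate. Pass this as the timestamp argument to queueInputBuffer()
// instead of 0.
public class Timestamps {
    public static long presentationTimeUs(int frameIndex, int frameRate) {
        return frameIndex * 1_000_000L / frameRate;
    }
}
```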

On very recent devices (API 21+), MediaRecorder can accept Surface input. This may be easier to work with than MediaCodec.

fadden
    _"Attach a Canvas to the input surface and draw the bitmap on it. The EncodeAndMuxTest does essentially this, but with OpenGL ES."_ - this is the first answer out of a hundred here that actually answers the question. – Attila Tanyi Dec 02 '16 at 11:02
  • "The Surface must be rendered with a hardware-accelerated API, such as OpenGL ES. Surface.lockCanvas(android.graphics.Rect) may fail or produce unexpected results." - You can't use a canvas to do this. And using OpenGL ES will require you to use EGL_RECORDABLE_ANDROID to record a video from OpenGL ES. That means you have to use API level 26, which as of today is only used on 60% of devices. – Johann Apr 26 '21 at 16:37