36

I want to call a function that builds a video out of a list of images and then saves it locally on the device:

public void CreateAndSaveVideoFile(List<Bitmap> MyBitmapArray)
{
   // ..
}

Trials:

  • Following java/xuggle - encode array of images into a movie, the link in the answer is dead

  • Following How to encode images into a video file in Java through programming?, the suggested library in the accepted answer does not support Android.

  • The next answer there has an approach for Android users, but it is not clear to me what the input and output of that function are (where does he pass the images, and where does he get the video?) - I left a comment asking

  • The next answer there provides a whole class, but the required library's download is a corrupted file (when I try to download it from the provided link) - I left a comment asking

  • Following Java: How do I create a movie from an array of images?, the tool suggested in the top answer uses commands that I am not familiar with and do not know how to invoke. For example:

Creating an MPEG-4 file from all the JPEG files in the current directory:

mencoder mf://*.jpg -mf w=800:h=600:fps=25:type=jpg -ovc lavc -lavcopts vcodec=mpeg4:mbd=2:trell -oac copy -o output.avi

I don't know how I could use a command like that from a Java / Android project.
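
For reference, on a desktop JVM a command-line tool like this can be launched as an external process with ProcessBuilder. The following is only a rough sketch, assuming mencoder is installed and on the PATH and that the JPEG frames live in a local directory (the "frames" path is a placeholder); it does not apply on Android, where mencoder is not available:

import java.io.File;
import java.io.IOException;

public class MencoderRunner {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Launch mencoder with the same arguments as the command above.
        // The mf://*.jpg pattern is expanded by mencoder itself, not by a shell.
        ProcessBuilder pb = new ProcessBuilder(
                "mencoder", "mf://*.jpg",
                "-mf", "w=800:h=600:fps=25:type=jpg",
                "-ovc", "lavc",
                "-lavcopts", "vcodec=mpeg4:mbd=2:trell",
                "-oac", "copy",
                "-o", "output.avi");
        pb.directory(new File("frames")); // placeholder: directory holding the JPEG frames
        pb.inheritIO();                   // forward mencoder's console output
        int exitCode = pb.start().waitFor();
        System.out.println("mencoder exited with code " + exitCode);
    }
}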

Can anyone help by guiding me and/or providing an approach for this task? Thanks in advance.

Khalil Khalaf
  • Do not make bitmaps of the frames first. And don't put them in a list, as you will soon be out of memory. Just write your frames to a file directly. – greenapps Oct 29 '16 at 09:02
  • Please compare the frame size/length with the memory a bitmap needs and report back. If you want to use a list, then better to put the frames in it. Just compare the needed memory. – greenapps Oct 29 '16 at 09:06
  • @greenapps On button click RecordVideo, you want me to start saving ReceivedImages locally on the device. Would you recommend this way of saving: [Saving and Reading Bitmaps/Images](http://stackoverflow.com/questions/17674634/saving-and-reading-bitmaps-images-from-internal-memory-in-android)? I will go ahead and implement saving a single image, then report its size to you. – Khalil Khalaf Oct 29 '16 at 17:43
  • @greenapps Hi, I am still [not succeeding in saving an image](http://stackoverflow.com/questions/40323126/where-do-i-find-the-saved-image-in-android). Would this help instead? https://s11.postimg.org/4a1kdregz/Capture.png – Khalil Khalaf Oct 29 '16 at 20:36
  • Host a service whose input is a zip of photos and whose output is the file name of an MP4. When called, it unpacks the photos from the zip and calls ffmpeg (or something else) with the unpacked media as inputs; the output file is an MP4. This is a generic media-muxing service that can probably be installed as a Node library with minimal revisions, so you would not have to write it yourself. When the MP4 is ready on the server, it is POSTed to your CDN (serving MP4 media), and the client then requests the MP4 from the CDN. The reason for the alternative architecture: a server is a much better place to create/host/serve the MP4 from media captured on a mobile device. – Robert Rowntree Oct 31 '16 at 18:36
  • http://stackoverflow.com/questions/16695485/node-js-realtime-conversion-from-jpeg-images-to-video-file for example... – Robert Rowntree Oct 31 '16 at 19:57

7 Answers

36

You can use jcodec's SequenceEncoder to convert a sequence of images to an MP4 file.

Sample code:

import java.awt.image.BufferedImage;
import java.io.File;
import org.jcodec.api.awt.SequenceEncoder;
...
SequenceEncoder enc = new SequenceEncoder(new File("filename"));
// GOP size will be supported in 0.2
// enc.getEncoder().setKeyInterval(25);
for(...) {
    BufferedImage image = ... // Obtain an image to encode
    enc.encodeImage(image);
}
enc.finish();

It's a pure Java library, so it's easy to import into an Android project; unlike ffmpeg, it doesn't require the NDK.

Refer to http://jcodec.org/ for sample code and downloads.

Abhishek V
  • Awesome, looks like what I am looking for. I will give it a try, thanks! – Khalil Khalaf Nov 02 '16 at 18:49
  • @KhalilKhalaf, were you able to get it working? I am trying the same code and it generates a corrupt .mp4 file. – nishant1000 Mar 22 '18 at 15:03
  • @nishant1000 This feature was dropped and I got redirected to work on something else. Did you try the "Bitmap to BufferedImage" method in the other answer before executing JCodec on the images? – Khalil Khalaf Mar 23 '18 at 16:12
  • @Abhishek can we also add Audio to video and transition animations while changing frames in video? – Mayank Pandya Aug 05 '18 at 10:20
  • java awt is not available in Android it seems, so I don't know that BufferedImage is a viable solution here – caitcoo0odes Oct 09 '19 at 03:38
  • I think it should be clearly underlined that, as the library developers say, the expectations about encoding efficiency/speed should be set very low. – Antonio Jan 24 '23 at 13:46
18

Using JCodec, as demonstrated by Stanislav Vitvitskyy here:

public static void main(String[] args) throws IOException {
    SequenceEncoder encoder = new SequenceEncoder(new File("video.mp4"));
    for (int i = 1; i < 100; i++) {
        BufferedImage bi = ImageIO.read(new File(String.format("img%08d.png", i)));
        encoder.encodeImage(bi);
    }
    encoder.finish();
}

Now, to convert your Bitmap to a BufferedImage, you can use this class:

import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.awt.image.DataBufferInt;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

/**
 * Utility class for loading Windows bitmap files
 * <p>
 * Based on code from author Abdul Bezrati and Pepijn Van Eeckhoudt
 */
public class BitmapLoader {

/**
 * Static method to load a bitmap file based on the filename passed in.
 * Based on the bit count, this method will either call the 8 or 24 bit
 * bitmap reader methods
 *
 * @param file The name of the bitmap file to read
 * @throws IOException
 * @return A BufferedImage of the bitmap
 */
public static BufferedImage loadBitmap(String file) throws IOException {
    BufferedImage image;
    InputStream input = null;
    try {
        input = new FileInputStream(file); // the original code used a ResourceRetriever helper here

        int bitmapFileHeaderLength = 14;
        int bitmapInfoHeaderLength = 40;

        byte bitmapFileHeader[] = new byte[bitmapFileHeaderLength];
        byte bitmapInfoHeader[] = new byte[bitmapInfoHeaderLength];

        input.read(bitmapFileHeader, 0, bitmapFileHeaderLength);
        input.read(bitmapInfoHeader, 0, bitmapInfoHeaderLength);

        int nSize = bytesToInt(bitmapFileHeader, 2);
        int nWidth = bytesToInt(bitmapInfoHeader, 4);
        int nHeight = bytesToInt(bitmapInfoHeader, 8);
        int nBiSize = bytesToInt(bitmapInfoHeader, 0);
        int nPlanes = bytesToShort(bitmapInfoHeader, 12);
        int nBitCount = bytesToShort(bitmapInfoHeader, 14);
        int nSizeImage = bytesToInt(bitmapInfoHeader, 20);
        int nCompression = bytesToInt(bitmapInfoHeader, 16);
        int nColoursUsed = bytesToInt(bitmapInfoHeader, 32);
        int nXPixelsMeter = bytesToInt(bitmapInfoHeader, 24);
        int nYPixelsMeter = bytesToInt(bitmapInfoHeader, 28);
        int nImportantColours = bytesToInt(bitmapInfoHeader, 36);

        if (nBitCount == 24) {
            image = read24BitBitmap(nSizeImage, nHeight, nWidth, input);
        } else if (nBitCount == 8) {
            image = read8BitBitmap(nColoursUsed, nBitCount, nSizeImage, nWidth, nHeight, input);
        } else {
            System.out.println("Not a 24-bit or 8-bit Windows Bitmap, aborting...");
            image = null;
        }
    } finally {
        try {
            if (input != null)
                input.close();
        } catch (IOException e) {
        }
    }
    return image;
}

/**
 * Static method to read a 8 bit bitmap
 *
 * @param nColoursUsed Number of colors used
 * @param nBitCount The bit count
 * @param nSizeImage The size of the image in bytes
 * @param nWidth The width of the image
 * @param input The input stream corresponding to the image
 * @throws IOException
 * @return A BufferedImage of the bitmap
 */
private static BufferedImage read8BitBitmap(int nColoursUsed, int nBitCount, int nSizeImage, int nWidth, int nHeight, InputStream input) throws IOException {
    int nNumColors = (nColoursUsed > 0) ? nColoursUsed : (1 & 0xff) << nBitCount;

    if (nSizeImage == 0) {
        nSizeImage = ((((nWidth * nBitCount) + 31) & ~31) >> 3);
        nSizeImage *= nHeight;
    }

    int npalette[] = new int[nNumColors];
    byte bpalette[] = new byte[nNumColors * 4];
    readBuffer(input, bpalette);
    int nindex8 = 0;

    for (int n = 0; n < nNumColors; n++) {
        npalette[n] = (255 & 0xff) << 24 |
                (bpalette[nindex8 + 2] & 0xff) << 16 |
                (bpalette[nindex8 + 1] & 0xff) << 8 |
                (bpalette[nindex8 + 0] & 0xff);

        nindex8 += 4;
    }

    int npad8 = (nSizeImage / nHeight) - nWidth;
    BufferedImage bufferedImage = new BufferedImage(nWidth, nHeight, BufferedImage.TYPE_INT_ARGB);
    DataBufferInt dataBufferByte = ((DataBufferInt) bufferedImage.getRaster().getDataBuffer());
    int[][] bankData = dataBufferByte.getBankData();
    byte bdata[] = new byte[(nWidth + npad8) * nHeight];

    readBuffer(input, bdata);
    nindex8 = 0;

    for (int j8 = nHeight - 1; j8 >= 0; j8--) {
        for (int i8 = 0; i8 < nWidth; i8++) {
            bankData[0][j8 * nWidth + i8] = npalette[((int) bdata[nindex8] & 0xff)];
            nindex8++;
        }
        nindex8 += npad8;
    }

    return bufferedImage;
}

/**
 * Static method to read a 24 bit bitmap
 *
 * @param nSizeImage size of the image  in bytes
 * @param nHeight The height of the image
 * @param nWidth The width of the image
 * @param input The input stream corresponding to the image
 * @throws IOException
 * @return A BufferedImage of the bitmap
 */
private static BufferedImage read24BitBitmap(int nSizeImage, int nHeight, int nWidth, InputStream input) throws IOException {
    int npad = (nSizeImage / nHeight) - nWidth * 3;
    if (npad == 4 || npad < 0)
        npad = 0;
    int nindex = 0;
    BufferedImage bufferedImage = new BufferedImage(nWidth, nHeight, BufferedImage.TYPE_4BYTE_ABGR);
    DataBufferByte dataBufferByte = ((DataBufferByte) bufferedImage.getRaster().getDataBuffer());
    byte[][] bankData = dataBufferByte.getBankData();
    byte brgb[] = new byte[(nWidth + npad) * 3 * nHeight];

    readBuffer(input, brgb);

    for (int j = nHeight - 1; j >= 0; j--) {
        for (int i = 0; i < nWidth; i++) {
            int base = (j * nWidth + i) * 4;
            bankData[0][base] = (byte) 255;
            bankData[0][base + 1] = brgb[nindex];
            bankData[0][base + 2] = brgb[nindex + 1];
            bankData[0][base + 3] = brgb[nindex + 2];
            nindex += 3;
        }
        nindex += npad;
    }

    return bufferedImage;
}

/**
 * Converts bytes to an int
 *
 * @param bytes An array of bytes
 * @param index
 * @returns A int representation of the bytes
 */
private static int bytesToInt(byte[] bytes, int index) {
    return (bytes[index + 3] & 0xff) << 24 |
            (bytes[index + 2] & 0xff) << 16 |
            (bytes[index + 1] & 0xff) << 8 |
            bytes[index + 0] & 0xff;
}

/**
 * Converts bytes to a short
 *
 * @param bytes An array of bytes
 * @param index
 * @returns A short representation of the bytes
 */
private static short bytesToShort(byte[] bytes, int index) {
    return (short) (((bytes[index + 1] & 0xff) << 8) |
            (bytes[index + 0] & 0xff));
}

/**
 * Reads the buffer
 *
 * @param in An InputStream
 * @param buffer An array of bytes
 * @throws IOException
 */
private static void readBuffer(InputStream in, byte[] buffer) throws IOException {
    int bytesRead = 0;
    int bytesToRead = buffer.length;
    while (bytesToRead > 0) {
        int read = in.read(buffer, bytesRead, bytesToRead);
        bytesRead += read;
        bytesToRead -= read;
    }
}
}
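
If your frames are already android.graphics.Bitmap objects in memory rather than .bmp files on disk, a much shorter conversion is possible. This is only a sketch of the idea; the helper name is made up, and it still relies on java.awt.image.BufferedImage, which (as noted in a comment on the accepted answer) a stock Android runtime does not provide:

import android.graphics.Bitmap;
import java.awt.image.BufferedImage;

public class BitmapConverter {
    // Hypothetical helper: copies the ARGB pixels of an Android Bitmap into an AWT BufferedImage.
    public static BufferedImage toBufferedImage(Bitmap bitmap) {
        int width = bitmap.getWidth();
        int height = bitmap.getHeight();
        int[] pixels = new int[width * height];
        // Read the ARGB pixel data out of the Android bitmap...
        bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
        // ...and write it into an equally sized AWT image.
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        image.setRGB(0, 0, width, height, pixels, 0, width);
        return image;
    }
}
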
Vinicius DSL
13

If your application's minimum Android SDK version is greater than or equal to 16 (Android 4.1), the best way to encode the video is to use the Android MediaCodec API.

From the Android 4.3 APIs documentation:

When encoding video, Android 4.1 (SDK 16) required that you provide the media with a ByteBuffer array, but Android 4.3 (SDK 18) now allows you to use a Surface as the input to an encoder. For instance, this allows you to encode input from an existing video file or using frames generated from OpenGL ES.

MediaMuxer was added in Android 4.3 (SDK 18), so for a convenient way of writing the MP4 file with MediaMuxer you need SDK >= 18.

Using the MediaCodec API you get hardware-accelerated encoding, and you can easily encode at up to 60 FPS.

You can start from 1) How to encode Bitmaps into a video using MediaCodec? or use 2) Google Grafika or 3) Bigflake.

Start from Grafika's RecordFBOActivity.java: replace the Choreographer event with your own event containing the bitmap to encode, remove the on-screen drawing, load your bitmap as an OpenGL ES texture, and draw it on the MediaCodec input Surface.
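
For orientation only, here is a rough setup sketch of the MediaCodec/MediaMuxer pieces involved. It is not Grafika's code; width, height, and outputPath are placeholders, and the per-frame rendering onto the input Surface (e.g. with OpenGL ES) plus draining the encoder's output into the muxer are omitted:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.view.Surface;
import java.io.IOException;

public class EncoderSetupSketch {
    // Rough setup of an H.264 encoder fed from a Surface (API 18+). Per-frame rendering
    // and draining of the encoder output into the muxer are intentionally left out.
    public static void setUpEncoder(int width, int height, String outputPath) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = encoder.createInputSurface(); // draw each bitmap here, e.g. via OpenGL ES
        encoder.start();

        MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        // For every frame: render the bitmap onto inputSurface, then drain the encoder's output
        // buffers and write them to the muxer. When done, call encoder.signalEndOfInputStream(),
        // drain the remaining output, and stop()/release() both the encoder and the muxer.
    }
}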

AndreyICE
10

jCodec has added Android support.

You need to add these to your gradle...

implementation 'org.jcodec:jcodec:0.2.3'
implementation 'org.jcodec:jcodec-android:0.2.3'

...and

android {
    ...
    configurations.all {
        resolutionStrategy.force 'com.google.code.findbugs:jsr305:3.0.2'
    }
}

I can confirm this works as expected, but with caveats. First, I tried some full-size images and the file was written, but it gave an error on playback. When I scaled the images down, I would get an error if the width or height was not even, because the YUV420J colorspace requires dimensions that are a multiple of 2 (see the helper sketch after the code below).

Also worth noting: this makes your package very heavy. My small project went over the dex limit just by adding this and required enabling multidex.

FileChannelWrapper out = null;
File dir = context.getFilesDir(); // or whatever directory you use
File file = new File(dir, "test.mp4");

try {
    out = NIOUtils.writableFileChannel(file.getAbsolutePath());
    AndroidSequenceEncoder encoder = new AndroidSequenceEncoder(out, Rational.R(15, 1));
    for (Bitmap bitmap : bitmaps) {
        encoder.encodeImage(bitmap);
    }
    encoder.finish();
} finally {
    NIOUtils.closeQuietly(out);
}
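
Regarding the even width/height caveat above, something like the following helper (the name is mine; it is not part of jcodec) could force both dimensions to an even value before handing each frame to the encoder:

// Scales a bitmap down by at most one pixel per axis so both dimensions are even,
// which the YUV420 encoding path requires.
private static Bitmap toEvenDimensions(Bitmap source) {
    int evenWidth = source.getWidth() & ~1;   // clear the lowest bit -> nearest even value below
    int evenHeight = source.getHeight() & ~1;
    if (evenWidth == source.getWidth() && evenHeight == source.getHeight()) {
        return source; // already even, reuse as-is
    }
    return Bitmap.createScaledBitmap(source, evenWidth, evenHeight, true);
}

Each frame in the loop above would then be encoded as encoder.encodeImage(toEvenDimensions(bitmap)).
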
a54studio
  • I was using CameraX and `ImageAnalysis.Analyzer` to analyze the images, and at the same time I wanted to store what the user sees into a video file to be used later for debugging; this piece of code really helped. Thanks. – JaydeepW Sep 25 '20 at 07:30
  • It takes a lot of time; in my case, calling this function with 145 images (low-quality images, ~200 KB each) takes around 2 minutes. Is there any way to speed this up? – Ankush Shrivastava Oct 06 '20 at 21:02
5

You can use Bitmp4 to convert a sequence of images to an MP4 file.

Sample code:

...

val encoder = MP4Encoder()
encoder.setFrameDelay(50)
encoder.setOutputFilePath(exportedFile.path)
encoder.setOutputSize(width, width)

startExport()

addFrame(bitmap) // called at intervals, once per frame

stopExport()

It's a Java library, so it's easy to import into an Android project; unlike ffmpeg, it doesn't require the NDK.

Refer to https://github.com/dbof10/Bitmp4 for sample code and downloads.

Ken Zira
3

I created a project that should be able to handle this. The code is light and fairly straightforward.

https://github.com/dburckh/bitmap2video

Dustin
1

Abhishek V is right; for more information about jcodec's SequenceEncoder, see Android make animated video from list of images.

Recently I built a real-time video system using a Raspberry Pi and Android devices and ran into the same problem as yours. Instead of saving a list of image files, I used real-time streaming protocols such as RTP/RTCP to transfer the data stream to the user. If your requirement is something like this, maybe you could change your strategy.

Another suggestion: you could explore C/C++ libraries, using the NDK/JNI to get around the limitations of Java.

Hope the suggestions make sense to you :)

Tommy
  • Yes it made sense to me :) Thanks for your suggestions and for the link. Could we go to a chat room maybe we can talk more about the strategy? – Khalil Khalaf Nov 03 '16 at 17:28
  • The contents of this page helped me a lot; the example there is totally practical: [android-streaming-live-camera-video-to-web](http://www.androidhive.info/2014/06/android-streaming-live-camera-video-to-web-page/). If this strategy fits your requirements and you have further questions about the implementation, we can continue to chat. – Tommy Nov 04 '16 at 06:52