
The situation

I need to show an animation of 200-350 frames in my application. The images have a resolution of roughly 500x300. If the user wants to share the animation, I have to convert it to a video. For the conversion I am using this ffmpeg command:

ffmpeg -y -r 1 -i /sdcard/videokit/pic00%d.jpg -i /sdcard/videokit/in.mp3 -strict experimental -ar 44100 -ac 2 -ab 256k -b 2097152 -ar 22050 -vcodec mpeg4 -b 2097152 -s 320x240 /sdcard/videokit/out.mp4

To convert images to video, ffmpeg wants actual files, not Bitmap or byte[] objects.

Problem

Compressing bitmaps to image files takes too much time. Converting 210 images takes about 1 minute on an average device (HTC One M7). Converting the image files to mp4 takes about 15 seconds on the same device. Altogether the user has to wait about 1.5 minutes.

What I have tried

  1. I changed the compression format from PNG to JPEG: the 1.5-minute result is achieved with JPEG compression at quality=80 (see the sketch below); with PNG it takes about 2-2.5 minutes - success.
  2. Tried to find a way to pass byte[] or Bitmap objects to ffmpeg - no success.
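
For reference, the per-frame save step behind item 1 boils down to something like this minimal sketch (the output directory and file-name pattern are assumed to match the ffmpeg command above):

import android.graphics.Bitmap;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class FrameSaver {
    // Compress one animation frame to JPEG at quality 80, named to match
    // the pic00%d.jpg pattern consumed by the ffmpeg command.
    public static void saveFrame(Bitmap frame, int index) throws IOException {
        File out = new File("/sdcard/videokit", String.format("pic00%d.jpg", index));
        try (FileOutputStream fos = new FileOutputStream(out)) {
            frame.compress(Bitmap.CompressFormat.JPEG, 80, fos);
        }
    }
}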

QUESTION

  1. Is there any way (a library, even a native one) to make the saving process faster?
  2. Is there any way to pass byte[] or Bitmap objects (i.e. a PNG file decompressed to an Android Bitmap class object) to the ffmpeg library's video-creating method?
  3. Is there any other working library which will create an mp4 (or any format supported by the main social networks) from byte[] or Bitmap objects in about 30 seconds (for 200 frames)?
Arsen Sench
  • You need to provide more info. What do you mean by "bitmap object"? What `ffmpeg` commands did you try? You should show the complete console output/log from one of these commands. – llogan Oct 11 '16 at 18:44
  • I updated my question. There is no need for a log because everything is working, but it's taking a lot of time. My question is: is there any way to make the conversion to JPEG faster, or is there any way to pass byte[] or Bitmap[] to the ffmpeg command? The command is mentioned in the question. – Arsen Sench Oct 13 '16 at 08:49
  • I'm appalled 5 years passed still no answer to a question with bounty. If you're still interested in this, I have some ideas to accelerate this, involving some c/cpp programing against ffmpeg's `libav*` libraries. – halfelf Oct 19 '16 at 11:09
  • That is "Oct 11 2016", NOT "Oct 2011" ;O) five years hehe – Jon Goodwin Oct 19 '16 at 22:32
  • @halfelf it's not 5 years. I will appreciate your help. – Arsen Sench Oct 19 '16 at 22:35
  • Jffmpeg is a plugin that allows the playback of a number of common audio and video formats. It is based around a Java port of parts of the FFMPEG project, supporting a number of codecs in pure Java code. Where codecs have not yet been ported, a JNI wrapper allows calls directly into the full FFMPEG code. This may offer a way to do it in memory rather than file io, but still use the speed of c++ FFMPEG. – Jon Goodwin Oct 19 '16 at 22:49
  • ouch... This is embarrassing. I'll post some codes later, have to try some first. – halfelf Oct 20 '16 at 03:24
  • @JonGoodwin I can't find proper documentation for jffmpeg. Do you know where to find it? – Arsen Sench Oct 20 '16 at 08:47
  • @halfelf I will wait, thank you. – Arsen Sench Oct 20 '16 at 08:48
  • Maybe I'm a bit late, but here is a very informative video on image compression: https://www.youtube.com/watch?v=r_LpCi6DQME – Andrey Solera Oct 21 '16 at 07:24
  • Andrey, I watched the video, thank you very much. `aaptOptions { cruncherEnabled = false }` saves me only about 5% of the time. But no real change... – Arsen Sench Oct 21 '16 at 09:06
  • What about [MediaMuxer](https://developer.android.com/reference/android/media/MediaMuxer.html)? It's an Android component and works with ByteBuffers. – Mimmo Grottoli Oct 22 '16 at 22:45
  • MediaMuxer is very, very slow compared to ffmpeg. – Arsen Sench Oct 22 '16 at 22:57

3 Answers


You can convert a Bitmap (or byte[]) to YUV format quickly using RenderScript (see https://stackoverflow.com/a/39877029/192373). You can pass these YUV frames to the ffmpeg library (as halfelf suggests), or use the built-in native MediaCodec, which uses dedicated hardware on most devices (but its compression options are less flexible than all-software ffmpeg's).
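
For illustration only, here is a minimal sketch (mine, not code from the linked answer) of configuring a hardware H.264 encoder with MediaCodec on API 21+; the resolution, bit rate and frame rate are assumptions, and queuing the YUV frames and muxing the encoder output into an mp4 are left to the caller:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.io.IOException;

public class HwEncoderSketch {
    // Configure a hardware H.264 encoder; 512x304 @ 25 fps and 2 Mbit/s are
    // placeholder values roughly matching the question's frame size.
    public static MediaCodec createEncoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 512, 304);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        // The caller then queues YUV frames into the encoder's input buffers
        // and drains the output buffers into a MediaMuxer to get the mp4.
        return encoder;
    }
}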

Alex Cohn

There are two steps slowing us down: compressing the images to PNG/JPG and writing them to disk. Both can be skipped if we code directly against the ffmpeg libs instead of calling the ffmpeg command. (There are other improvements too, such as GPU encoding and multithreading, but they are much more complicated.)

Some possible approaches:

  1. Use only C/C++ with the Android NDK. FFmpeg will happily work there, but I guess that's not an option here.
  2. Build it from scratch with Java JNI. I don't have much experience here; I only know that this can link Java to C/C++ libs.
  3. Use a Java wrapper. Luckily, I found javacpp-presets. (There are others too, but this one is good enough and up to date.)

This library includes a good example ported from dranger's famous ffmpeg tutorial, though it is a demuxing one.

We can try to write a muxing one, following ffmpeg's muxing.c example.

import java.io.*;
import org.bytedeco.javacpp.*;
import static org.bytedeco.javacpp.avcodec.*;
import static org.bytedeco.javacpp.avformat.*;
import static org.bytedeco.javacpp.avutil.*;
import static org.bytedeco.javacpp.swscale.*;

public class Muxer {

    public class OutputStream {
        public AVStream Stream;
        public AVCodecContext Ctx;

        public AVFrame Frame;

        public SwsContext SwsCtx;

        public void setStream(AVStream s) {
            this.Stream = s;
        }

        public AVStream getStream() {
            return this.Stream;
        }

        public void setCodecCtx(AVCodecContext c) {
            this.Ctx = c;
        }

        public AVCodecContext getCodecCtx() {
            return this.Ctx;
        }

        public void setFrame(AVFrame f) {
            this.Frame = f;
        }

        public AVFrame getFrame() {
            return this.Frame;
        }

        public OutputStream() {
            Stream = null;
            Ctx = null;
            Frame = null;
            SwsCtx = null;
        }

    }

    public static void main(String[] args) throws IOException {
        Muxer t = new Muxer();
        OutputStream VideoSt = t.new OutputStream();
        AVOutputFormat Fmt = null;
        AVFormatContext FmtCtx = new AVFormatContext(null);
        AVCodec VideoCodec = null;
        AVDictionary Opt = null;
        SwsContext SwsCtx = null;
        AVPacket Pkt = new AVPacket();

        int GotOutput;
        int InLineSize[] = new int[1];

        String FilePath = "/path/xxx.mp4";

        avformat_alloc_output_context2(FmtCtx, null, null, FilePath);
        Fmt = FmtCtx.oformat();

        AVCodec codec = avcodec_find_encoder_by_name("libx264");
        av_format_set_video_codec(FmtCtx, codec);

        VideoCodec = avcodec_find_encoder(Fmt.video_codec());
        VideoSt.setStream(avformat_new_stream(FmtCtx, null));
        AVStream stream = VideoSt.getStream();
        VideoSt.getStream().id(FmtCtx.nb_streams() - 1);
        VideoSt.setCodecCtx(avcodec_alloc_context3(VideoCodec));

        VideoSt.getCodecCtx().codec_id(Fmt.video_codec());

        VideoSt.getCodecCtx().bit_rate(5120000);

        VideoSt.getCodecCtx().width(1920);
        VideoSt.getCodecCtx().height(1080);
        AVRational fps = new AVRational();
        fps.den(25); fps.num(1);
        VideoSt.getStream().time_base(fps);
        VideoSt.getCodecCtx().time_base(fps);
        VideoSt.getCodecCtx().gop_size(10);
        VideoSt.getCodecCtx().max_b_frames(1); // call the setter, not the getter
        VideoSt.getCodecCtx().pix_fmt(AV_PIX_FMT_YUV420P);

        if ((FmtCtx.oformat().flags() & AVFMT_GLOBALHEADER) != 0)
            VideoSt.getCodecCtx().flags(VideoSt.getCodecCtx().flags() | AV_CODEC_FLAG_GLOBAL_HEADER);

        avcodec_open2(VideoSt.getCodecCtx(), VideoCodec, Opt);

        VideoSt.setFrame(av_frame_alloc());
        VideoSt.getFrame().format(VideoSt.getCodecCtx().pix_fmt());
        VideoSt.getFrame().width(1920);
        VideoSt.getFrame().height(1080);

        av_frame_get_buffer(VideoSt.getFrame(), 32);

        // pts is an int64_t in C, so use a Java long here
        // (an int would overflow for long videos)
        long nextpts = 0;

        av_dump_format(FmtCtx, 0, FilePath, 1);
        avio_open(FmtCtx.pb(), FilePath, AVIO_FLAG_WRITE);

        avformat_write_header(FmtCtx, Opt);

        int[] got_output = { 0 };
        boolean still_has_input = true; // placeholder: set to false after the last frame
        while (still_has_input) {

            // convert or directly copy your Bytes[] into VideoSt.Frame here
            // AVFrame structure has two important data fields: 
            // AVFrame.data (uint8_t*[]) and AVFrame.linesize (int[]) 
            // data includes pixel values in some formats and linesize is size of each picture line.
            // For example, for packed RGB24, data[0] points to the interleaved RGB bytes
            // and linesize[0] equals image_width * 3 (planar formats use one plane per channel).
            // But I guess we'll need sws_scale() to convert the pixel format here,
            // from RGB to yuv420p (or another yuv-family format); see the sketch below.
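
            // ---- Sketch (my assumption, not part of the original answer): one way
            // to do that RGB -> YUV420P conversion with swscale. `RgbaBytes` is a
            // placeholder for one 1920x1080 frame of packed RGBA pixels, e.g. copied
            // out of an Android Bitmap with copyPixelsToBuffer().
            byte[] RgbaBytes = new byte[1920 * 1080 * 4]; // fill with the frame's pixels
            if (SwsCtx == null)
                SwsCtx = sws_getContext(1920, 1080, AV_PIX_FMT_RGBA,
                        1920, 1080, AV_PIX_FMT_YUV420P,
                        SWS_BILINEAR, null, null, (DoublePointer) null);
            av_frame_make_writable(VideoSt.getFrame());
            sws_scale(SwsCtx, new PointerPointer(new BytePointer(RgbaBytes)),
                    new IntPointer(new int[] { 1920 * 4 }), 0, 1080,
                    VideoSt.getFrame().data(), VideoSt.getFrame().linesize());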

            Pkt = new AVPacket();
            av_init_packet(Pkt);

            VideoSt.getFrame().pts(nextpts++);
            avcodec_encode_video2(VideoSt.getCodecCtx(), Pkt, VideoSt.getFrame(), got_output);

            av_packet_rescale_ts(Pkt, VideoSt.getCodecCtx().time_base(), VideoSt.getStream().time_base());
            Pkt.stream_index(VideoSt.getStream().index());
            av_interleaved_write_frame(FmtCtx, Pkt);

            av_packet_unref(Pkt);
        }

        // get delayed frames
        for (got_output[0] = 1; got_output[0] != 0;) {
            Pkt = new AVPacket();
            av_init_packet(Pkt);

            avcodec_encode_video2(VideoSt.getCodecCtx(), Pkt, null, got_output);
            if (got_output[0] > 0) {
                av_packet_rescale_ts(Pkt, VideoSt.getCodecCtx().time_base(), VideoSt.getStream().time_base());
                Pkt.stream_index(VideoSt.getStream().index());
                av_interleaved_write_frame(FmtCtx, Pkt);
            }

            av_packet_unref(Pkt);
        }

        // free c structs
        avcodec_free_context(VideoSt.getCodecCtx());
        av_frame_free(VideoSt.getFrame());
        avio_closep(FmtCtx.pb());
        avformat_free_context(FmtCtx);
    }
}

When porting the C code, several kinds of changes are normally needed:

  • Most of the work is replacing every C struct member access (. and ->) with the corresponding Java getter/setter.
  • There are also many C address-of operators (&); just delete them.
  • Change the C NULL macro and C++ nullptr to the Java null object.
  • C code often checks the boolean result of an int directly in if, for and while; in Java it has to be compared with 0 explicitly.

There may be other API changes as well; as long as you refer to the javacpp-presets docs, it'll be fine.
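
As a rough illustration of the first two points, a typical muxing.c line maps to the javacpp style used above like this (paraphrased):

// C (muxing.c):            ost->st->time_base = (AVRational){ 1, 25 };
// Java (javacpp-presets):  VideoSt.getStream().time_base(fps);
//
// C:                       if (!frame) ...
// Java:                    if (frame == null) ...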

Note that I omitted all error handling here. It will be needed in real development/production.

halfelf

I really don't want to advertise, but using PKZIP and its SDK may be a good solution. PKZIP compresses files to 95%, as they say.

The Smartcrypt SDK is available in all major programming languages, including C++, Java, and C#, and can be used to encrypt both structured and unstructured data. Changes to existing applications typically consist of two or three lines of code.