
I've found multiple questions and tutorials regarding FFmpeg, but I don't seem to understand most of them. All of the guides I have read leave large gaps and don't tend to explain things.

I have an existing Android app that streams audio using a third-party library called AAC Decoder. For various reasons, I need to switch to FFmpeg, but cannot figure out how. I have managed to follow guides to build FFmpeg, but then I don't understand what I am supposed to do with the output.

My app needs to stream audio only, from a remote URL. The streams can be in a variety of formats.

If anyone could link me to some comprehensive, detailed guides, or provide me with instructions, it would be great.

Thanks.

SteveEdson
  • See http://stackoverflow.com/questions/12136078/ffmpeg-for-android-toolchains-arm-linux-armeabi-eabi-pkg-config-is-there-any for interesting information – SirDarius Dec 10 '12 at 10:02

2 Answers


I created scripts to build FFmpeg, see my answer here:

arm-linux-androideabi-gcc is unable to create an executable - compile ffmpeg for android armeabi devices

Once you have FFmpeg compiled, create a "jni" folder in the root of your project. In the jni folder create Android.mk with these contents:

include $(call all-subdir-makefiles)

Then create Application.mk with these contents:

APP_ABI := armeabi armeabi-v7a x86

Next, in the jni folder create the following folder structure:

ffmpeg/ffmpeg/

In the first ffmpeg folder create another Android.mk:

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES := ffmpeg/$(TARGET_ARCH_ABI)/lib/$(LOCAL_MODULE).so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/ffmpeg/$(TARGET_ARCH_ABI)/include
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavformat
LOCAL_SRC_FILES := ffmpeg/$(TARGET_ARCH_ABI)/lib/$(LOCAL_MODULE).so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/ffmpeg/$(TARGET_ARCH_ABI)/include
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavutil
LOCAL_SRC_FILES := ffmpeg/$(TARGET_ARCH_ABI)/lib/$(LOCAL_MODULE).so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/ffmpeg/$(TARGET_ARCH_ABI)/include
include $(PREBUILT_SHARED_LIBRARY)

LOCAL_PATH:= $(call my-dir)

Finally, move the contents of the build folder (from the build script) to /jni/ffmpeg/ffmpeg/
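To make the expected layout concrete, here is a sketch of that last step as shell commands. The first loop only simulates the build script's output so the snippet is self-contained; in a real project, replace `build` with the actual output directory of your build script.

```shell
#!/bin/sh
# Simulated FFmpeg build output (replace with your real build directory).
for abi in armeabi armeabi-v7a x86; do
  mkdir -p "build/$abi/lib" "build/$abi/include"
  touch "build/$abi/lib/libavcodec.so" \
        "build/$abi/lib/libavformat.so" \
        "build/$abi/lib/libavutil.so"
done

# Move the build output into the structure Android.mk expects:
# jni/ffmpeg/ffmpeg/<abi>/lib/*.so and jni/ffmpeg/ffmpeg/<abi>/include/
mkdir -p jni/ffmpeg/ffmpeg
cp -r build/. jni/ffmpeg/ffmpeg/

# LOCAL_SRC_FILES then resolves to e.g. ffmpeg/armeabi/lib/libavcodec.so
ls jni/ffmpeg/ffmpeg/armeabi/lib/libavcodec.so
```

If the `ls` at the end fails, the Android.mk above will fail the same way at build time with "LOCAL_SRC_FILES points to a missing file".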

From the project root run:

ndk-build clean

Then run:

ndk-build 
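After ndk-build finishes, the NDK copies each prebuilt .so into libs/&lt;abi&gt;/ in the project root. A quick sanity check could look like this (the first loop only simulates ndk-build's output so the snippet runs standalone; drop it and run just the check after a real build):

```shell
#!/bin/sh
# Simulated ndk-build output (remove these lines in a real project).
for abi in armeabi armeabi-v7a x86; do
  mkdir -p "libs/$abi"
  touch "libs/$abi/libavcodec.so" "libs/$abi/libavformat.so" "libs/$abi/libavutil.so"
done

# Check that every ABI from Application.mk received all three libraries.
for abi in armeabi armeabi-v7a x86; do
  for lib in libavcodec libavformat libavutil; do
    [ -f "libs/$abi/$lib.so" ] || { echo "missing libs/$abi/$lib.so"; exit 1; }
  done
done
echo "all prebuilt libraries in place"
```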

If you are feeling lazy, you can simply download the jni folder from my project here and delete the "metadata" and "player" folders:

http://svn.code.sf.net/p/servestream/code/trunk/jni/

Let me know if you have any additional questions.

William Seemann
    Thanks. I'll be able to build the library, but I don't understand how to proceed from there, do you have any guides about this? – SteveEdson Dec 11 '12 at 09:23
  • Thanks for the edit, I'm trying to use your build script, but right at the end, just before exit message, I get the error `i686-linux-android-gcc is unable to create an executable file. C compiler test failed.`. I'm using android-ndk-r8c. Any ideas? – SteveEdson Dec 12 '12 at 15:21
  • Nope, Mac OSX, I'll try on the Ubuntu VM now. Unless theres a fix? – SteveEdson Dec 12 '12 at 16:10
  • I did have the Mac OS X NDK on the Ubuntu VM, thanks. It compiled fine afterwards. Can I just copy the contents of final-build in the VM, to the JNI folder in the Project, on the Mac, and proceed from there? Thanks – SteveEdson Dec 13 '12 at 11:00
  • Yes, you should be able to copy the contents of that folder to your Mac. One other thing worth mentioning is that before running "build.ffmpeg.sh" you should change the "PACKAGE" variable to match your applications package. By default the value is set to mine and this will cause issues if left unchanged. – William Seemann Dec 13 '12 at 15:16
  • Sorry for the trouble. I changed the package. I've moved the files to the Mac folder, but when I run `ndk-build clean` I get the following errors: `/Users/Steve/Downloads/android-ndk-r8c/build/core/add-application.mk:128: Android NDK: WARNING: APP_PLATFORM android-14 is larger than android:minSdkVersion 8 in ./AndroidManifest.xml Android NDK: ERROR:jni/ffmpeg/Android.mk:avcodec: LOCAL_SRC_FILES points to a missing file Android NDK: Check that /Users/Steve/Downloads/android-ndk-r8c/sources/cxx-stl/system/ffmpeg/armeabi/lib/libavcodec.so exists or that its path is correct`. – SteveEdson Dec 13 '12 at 17:16
  • Ignore that, I missed the `LOCAL_PATH := $(call my-dir)` at the beginning on the Android.mk file. You hadn't included it in the code tags so I missed it – SteveEdson Dec 13 '12 at 17:18
  • Its ok. Can you please point me where to go now? I don't know how I can start to implement the FFMpeg functions into my app. Is there any documentation available anywhere? – SteveEdson Dec 14 '12 at 14:30
  • i have followed the steps . and everything works fine . but when i load the library "System.load("libavcodec");" it gives a exception. what will be the reason ?? – Mr.G Nov 27 '13 at 10:23
  • Did you change the soname property in the build script to match your application package? What error are you receiving? An UnsatisfiedLinkException? – William Seemann Nov 27 '13 at 15:48
  • @WilliamSeemann. yes. what i did, i put the "net.sourceforge.servestream" as my application package name. What i can t understand is that how to load ffmpeg in to my java code "activity" i recieve An UnsatisfiedLinkException. – Mr.G Nov 28 '13 at 05:46
  • @WilliamSeemann do i have to change "com.bambuser.broadcaster" to my application package name – Mr.G Nov 28 '13 at 05:59
  • @WilliamSeemann thanks . this is my build-ffmpeg.sh file looks like – Mr.G Nov 28 '13 at 08:51
  • PACKAGE="\/data\/data\/net.sourceforge.servestream\/lib\/" sed -i "s/\/data\/data\/net.sourceforge.servestream\/lib\//$PACKAGE/g" arm-build.sh sed -i "s/\/data\/data\/net.sourceforge.servestream\/lib\//$PACKAGE/g" x86-build.sh – Mr.G Nov 28 '13 at 08:52
  • 1
    @WilliamSeemann . hi i have managed to build it correctly , but wat my question is now, how to use it to compress videos . from here how should i continue – Mr.G Dec 03 '13 at 11:04

You need to cross-compile FFmpeg for Android support. Create a jni folder inside your project and put the FFmpeg folder inside jni, then set up the Android NDK.

Here is a copy of the Config.sh which I have used to cross-compile FFmpeg for Android.

Config.sh


#!/bin/sh
PLATFORM=/home/nishant/Desktop/android/android-ndk-r5b/platforms/android-8/arch-arm
PREBUILT=/home/nishant/Desktop/android/android-ndk-r5b/toolchains/arm-eabi-4.4.0/prebuilt/linux-x86
LIBX264=/home/nishant/Desktop/android/workspace/DemoProject/jni/x264
LIB=/home/nishant/Desktop/android/workspace/DemoProject/jni
EXTRA_LIBS="-lgcc -lm -ldl -lz -lc"
#EXTRA_EXE_LDFLAGS="$PLATFORM/usr/lib/crtbegin_dynamic.o $PLATFORM/usr/lib/crtend_android.o"

./configure --target-os=linux \
    --arch=arm \
    --enable-version3 \
    --enable-gpl \
    --enable-nonfree \
    --disable-stripping \
    --disable-ffmpeg \
    --disable-ffplay \
    --disable-ffserver \
    --disable-ffprobe \
    --enable-encoders \
    --enable-libfaac \
    --disable-muxers \
    --disable-devices \
    --disable-protocols \
    --enable-protocol=file \
    --enable-avfilter \
    --disable-network \
    --disable-mpegaudio-hp \
    --disable-avdevice \
    --enable-cross-compile \
    --cc=$PREBUILT/bin/arm-eabi-gcc \
    --nm=$PREBUILT/bin/arm-eabi-nm \
    --prefix=/home/nishant/Desktop/android/workspace/DemoProject/jni \
    --cross-prefix=$PREBUILT/bin/arm-eabi- \
    --enable-postproc \
    --extra-libs="$EXTRA_LIBS" \
    --extra-cflags="-I$PLATFORM/usr/include/ -I$LIB/include/ -I/home/admin1/x264 -std=gnu99 -fPIC -DANDROID -fpic -mthumb-interwork -ffunction-sections -funwind-tables -fstack-protector -fno-short-enums -D__ARM_ARCH_5__ -D__ARM_ARCH_5T__ -D__ARM_ARCH_5E__ -D__ARM_ARCH_5TE__  -Wno-psabi -march=armv5te -mtune=xscale -msoft-float -mthumb -Os -fomit-frame-pointer -fno-strict-aliasing -finline-limit=64 -DANDROID -Wa,--noexecstack -MMD -MP" \
    --disable-asm \
    --enable-neon \
    --enable-armv5te \
    --enable-static \
    --disable-shared \
    --extra-ldflags="-Wl,-rpath-link=$LIB/lib -L$LIB/lib -nostdlib -Bdynamic  -Wl,--no-undefined -Wl,-z,noexecstack  -Wl,-z,nocopyreloc -Wl,-soname,/system/lib/libz.so -Wl,-rpath-link=$PLATFORM/usr/lib,-dynamic-linker=/system/bin/linker -L/usr/lib -L$PLATFORM/usr/lib -nostdlib $PLATFORM/usr/lib/crtbegin_dynamic.o $PLATFORM/usr/lib/crtend_android.o"

You can use this config file, with some modifications for your own paths, to cross-compile FFmpeg.

Run Config.sh to build FFmpeg, then build your JNI wrapper with the ndk-build command.
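A sketch of the build sequence (this assumes the standard configure/make flow, run from the FFmpeg source directory; the paths are those used in Config.sh and may differ for your setup):

```shell
# Run from the FFmpeg source directory.
chmod +x Config.sh
./Config.sh      # invokes ./configure with the cross-compile flags shown above
make             # builds the static libraries (--disable-shared is set)
make install     # copies libs and headers into the --prefix jni directory
```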

EDIT:

FFmpeg is bundled with a wide range of audio encoders and decoders. For AAC encoding and decoding, use libfaac and libfaad. You can find an audio decoding example in libavcodec's api-example.c file. You need to create a JNI wrapper class to drive the codecs. Here is the decoding example from that file:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <libavcodec/avcodec.h>

/* Buffer sizes as defined in api-example.c. */
#define AUDIO_INBUF_SIZE 20480
#define AUDIO_REFILL_THRESH 4096

/* Note: this uses the old pre-1.0 FFmpeg API; call avcodec_register_all()
 * once before invoking this function. */
static void audio_decode_example(const char *outfilename, const char *filename)
{
    AVCodec *codec;
    AVCodecContext *c= NULL;
    int out_size, len;
    FILE *f, *outfile;
    uint8_t *outbuf;
    uint8_t inbuf[AUDIO_INBUF_SIZE + FF_INPUT_BUFFER_PADDING_SIZE];
    AVPacket avpkt;

    av_init_packet(&avpkt);

    printf("Audio decoding\n");

    /* find the mpeg audio decoder */
    codec = avcodec_find_decoder(CODEC_ID_MP2);
    if (!codec) {
        fprintf(stderr, "codec not found\n");
        exit(1);
    }

    c= avcodec_alloc_context();

    /* open it */
    if (avcodec_open(c, codec) < 0) {
        fprintf(stderr, "could not open codec\n");
        exit(1);
    }

    outbuf = malloc(AVCODEC_MAX_AUDIO_FRAME_SIZE);

    f = fopen(filename, "rb");
    if (!f) {
        fprintf(stderr, "could not open %s\n", filename);
        exit(1);
    }
    outfile = fopen(outfilename, "wb");
    if (!outfile) {
        av_free(c);
        exit(1);
    }

    /* decode until eof */
    avpkt.data = inbuf;
    avpkt.size = fread(inbuf, 1, AUDIO_INBUF_SIZE, f);

    while (avpkt.size > 0) {
        out_size = AVCODEC_MAX_AUDIO_FRAME_SIZE;
        len = avcodec_decode_audio3(c, (short *)outbuf, &out_size, &avpkt);
        if (len < 0) {
            fprintf(stderr, "Error while decoding\n");
            exit(1);
        }
        if (out_size > 0) {
            /* if a frame has been decoded, output it */
            fwrite(outbuf, 1, out_size, outfile);
        }
        avpkt.size -= len;
        avpkt.data += len;
        if (avpkt.size < AUDIO_REFILL_THRESH) {
            /* Refill the input buffer, to avoid trying to decode
             * incomplete frames. Instead of this, one could also use
             * a parser, or use a proper container format through
             * libavformat. */
            memmove(inbuf, avpkt.data, avpkt.size);
            avpkt.data = inbuf;
            len = fread(avpkt.data + avpkt.size, 1,
                        AUDIO_INBUF_SIZE - avpkt.size, f);
            if (len > 0)
                avpkt.size += len;
        }
    }

    fclose(outfile);
    fclose(f);
    free(outbuf);

    avcodec_close(c);
    av_free(c);
}

Hope it will help you.

Nishant Rajput
  • Thanks, once the FFMPEG lib has been built into the folder, where do I go from there? I think I understand that I have to create some sort of wrapper around the native functions. Where do I find the functions that I need? What do I do after this? Cheers – SteveEdson Dec 10 '12 at 16:53