
I have two audio files: the first is background music and the second is speech (each is about 4 minutes long).

Now I want to mix them so that I get a 4-minute speech track with the music playing in the background.

  • -1 You shouldn't come here for people to do your work for you. Post code that you have attempted and what you think is wrong with it, with helpful information like stacktraces and outputs – Stu Whyte Jul 11 '14 at 15:36
  • Dear Stu, this is just a very small part of my project. My project is almost complete and I am just stuck on this one part. If you search Stack Overflow for this topic you can find more than 20 questions and answers, and I have tried almost all of them, but none of them helped me. What I mean by a "complete" answer is one that is clear and correct, not just "use the FFMPEG library"; say how to use that library. – Sadegh Jul 11 '14 at 21:59
  • I put a bounty on this question, but you answered late and the bounty expired, and there is no way to award it to you now. Also, your answer works with 1 audio file, not 2, which is ambiguous to me. But I gave you the "correct answer" tick out of gratitude, despite your disrespect. Good luck – Sadegh Jul 22 '14 at 13:12
  • It is not my problem that you disappeared from the planet. I was one day before the deadline, so please contact the moderators to award that bounty. The answer I gave shows you how to load any audio that is supported by android codecs. The second part explains how you overlay two audiotracks. –  Jul 22 '14 at 15:40
  • This question appears to be off-topic because it is about asking for a free developer. – Sumurai8 Jul 27 '14 at 16:31

2 Answers


You don't need FFMPEG; you can use the standard codecs available in Android.

public void playFile(String fileToPlay)
{
    // see whether we can find a suitable audio track in the file
    MediaExtractor extractor = new MediaExtractor();
    try
    {
        extractor.setDataSource(fileToPlay);
    }
    catch (IOException e)
    {
        out.release(); // 'out' is an output object defined elsewhere in this class (not shown)
        return;
    }
    extractor.selectTrack(0);

    String fileType=typeForFile(fileToPlay);
    if (fileType==null)
    {
        out.release();
        extractor.release();
        return;
    }

    MediaCodec codec = MediaCodec.createDecoderByType(fileType);
    MediaFormat wantedFormat=extractor.getTrackFormat(0);
    codec.configure(wantedFormat,null,null,0);
    codec.start();

    ByteBuffer[] inputBuffers = codec.getInputBuffers();
    ByteBuffer[] outputBuffers = codec.getOutputBuffers();

    // Allocate our own buffer
    int maximumBufferSizeBytes = 0;
    for(ByteBuffer bb:outputBuffers)
    {
        int c=bb.capacity();
        if (c>maximumBufferSizeBytes) maximumBufferSizeBytes=c;
    }
    setupBufferSizes(maximumBufferSizeBytes/4);

    final MediaCodec.BufferInfo bufferInfo=new MediaCodec.BufferInfo();
    MediaFormat format=null;
    while(true)
    {
        long timeoutUs=1000000;
        int inputBufferIndex = codec.dequeueInputBuffer(timeoutUs);
        if (inputBufferIndex >= 0)
        {
            ByteBuffer targetBuffer = inputBuffers[inputBufferIndex];
            int read = extractor.readSampleData(targetBuffer, 0);
            int flags=extractor.getSampleFlags();
            if (read>0)
                codec.queueInputBuffer(inputBufferIndex, 0,read, 0, flags);
            else
                codec.queueInputBuffer(inputBufferIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
            extractor.advance();
        }

        int outputBufferIndex = codec.dequeueOutputBuffer(bufferInfo,timeoutUs);
        if (outputBufferIndex >= 0)
        {
            final boolean last = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;

            // bufferInfo.size is in bytes; for 16-bit stereo PCM that is size/2 shorts
            // (size/4 frames). 'shorts' is a short[] scratch buffer allocated elsewhere
            // (see setupBufferSizes).
            int s=bufferInfo.size/4;
            ByteBuffer bytes=outputBuffers[outputBufferIndex];
            ((ByteBuffer)bytes.position(bufferInfo.offset)).asShortBuffer().get(shorts,0,s*2);
            process(shorts,0,s*2);

            codec.releaseOutputBuffer(outputBufferIndex, false);
            if (last)
                break;
        }
        else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
        {
            outputBuffers = codec.getOutputBuffers();
        }
        else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
        {
            // the decoder reports its actual output format (sample rate, channel count) here
            format = codec.getOutputFormat();
        }
    }

    extractor.release();
    codec.stop();
    codec.release();
}
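
The snippet above also calls a typeForFile helper that the answer doesn't show. A minimal sketch (my assumption, not part of the original answer) could read the MIME type straight from the file's first track instead of guessing from the extension:

private String typeForFile(String fileToPlay)
{
    // Hypothetical helper: ask MediaExtractor for the track's MIME type,
    // e.g. "audio/mpeg" for MP3 or "audio/mp4a-latm" for AAC.
    MediaExtractor probe = new MediaExtractor();
    try
    {
        probe.setDataSource(fileToPlay);
        return probe.getTrackFormat(0).getString(MediaFormat.KEY_MIME);
    }
    catch (IOException e)
    {
        return null; // the caller treats null as "unsupported file"
    }
    finally
    {
        probe.release();
    }
}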

The playFile code deals with reading one file. The 'process' routine (which isn't defined yet) receives an interleaved array of samples (left/right/left/right/...). All you need to do now is add these samples into a target buffer, e.g.

short[] target = new short[LENGTH_OF_4_MINUTES]; // i.e. 4 * 60 * sampleRate * channelCount samples
int idx = 0;

void process(short[] audio, int offset, int length)
{
    for (int i = 0; i < length; i++)
        target[idx++] += audio[offset + i] / 2; // halve each source so the sum of two tracks doesn't clip
}

The resulting target array then contains your overlaid samples.
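
If you want to hear (or check) the result, one option is to push the mixed buffer into an AudioTrack. This is only a sketch: the 44100 Hz / 16-bit stereo values are assumptions and must match what the decoder actually produced (see the MediaFormat you get from INFO_OUTPUT_FORMAT_CHANGED).

int sampleRate = 44100; // assumption: must match the decoded audio
int minBufBytes = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
        minBufBytes, AudioTrack.MODE_STREAMING);
player.play();
int chunk = minBufBytes / 2; // bytes -> shorts
for (int pos = 0; pos < idx; pos += chunk)
{
    player.write(target, pos, Math.min(chunk, idx - pos)); // blocking write of the mixed samples
}
player.stop();
player.release();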


For merging two audio files, you will have to use an FFMPEG utility compiled for Android. Here are some links I found regarding compiling FFMPEG for Android:

  1. How to Build FFmpeg for Android
  2. Download FFMPEG library for Android (SourceForge)

Here is the Google groups thread regarding how to use FFMPEG library in android to merge 2 audio files.

Have a look at this SO post too. Hope it helps.
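
For what it's worth, once you have an FFMPEG binary on the device, the actual merging is usually done with its amix filter. A rough sketch of invoking it from Java (the paths, and the assumption that you can execute a bundled ffmpeg binary at all, are mine, not from the linked threads):

// Hypothetical: run a bundled ffmpeg binary to mix two inputs with the amix filter.
// ffmpegPath, musicPath, speechPath and outPath are placeholders you must supply.
void mixWithFfmpeg(String ffmpegPath, String musicPath, String speechPath, String outPath)
        throws IOException, InterruptedException
{
    String[] cmd = {
        ffmpegPath, "-i", musicPath, "-i", speechPath,
        "-filter_complex", "amix=inputs=2:duration=longest",
        "-y", outPath
    };
    Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
    p.waitFor(); // a non-zero exit code means ffmpeg failed; inspect its output to see why
}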

  • Give a sample or complete answer. I already tested it and couldn't get that lib to work. Also, the library you gave me is for C#, not Java (I do have the Java lib but can't use it). – Sadegh Jul 11 '14 at 15:26
  • @Sadegh nice to see a little appreciation! – Stu Whyte Jul 11 '14 at 15:37
  • I know it's frustrating when you've been stuck on a problem for several days. I just searched about it and gave you some pointers so that you could try them out. At least show what you have tried and what problems you came across! – akshay7692 Jul 11 '14 at 15:55