
I'm trying to develop an application like iRig for Android, so the first step is to capture the mic input and play it back at the same time.

I have that working, but the problem is that I get enough latency to make it unusable, and once I start processing the buffer I'm afraid it will get worse still.

I use AudioRecord and AudioTrack like this:

    new Thread(new Runnable() {
        public void run() {
            mRecorder.startRecording();
            mPlayer.play();
            while (mRunning) {
                int read = mRecorder.read(mBuffer, 0, mBufferSize);
                // TODO: apply filters to the buffer here, then play it modified
                mPlayer.write(mBuffer, 0, read);
            }
        }
    }).start();

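The "apply filters" TODO in the loop above could start with something as simple as a gain (volume) control. Here is a minimal sketch in plain Java, assuming the buffer holds 16-bit little-endian PCM as produced by ENCODING_PCM_16BIT; `applyGain` is a hypothetical helper for illustration, not an Android API:

```java
// Hypothetical helper: scale 16-bit little-endian PCM samples in place.
static void applyGain(byte[] buf, int len, float gain) {
    for (int i = 0; i + 1 < len; i += 2) {
        // Reassemble the signed 16-bit sample (low byte first).
        int sample = (short) ((buf[i] & 0xFF) | (buf[i + 1] << 8));
        // Scale, then clamp to the 16-bit range to avoid wrap-around distortion.
        int scaled = Math.round(sample * gain);
        scaled = Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, scaled));
        buf[i] = (byte) (scaled & 0xFF);
        buf[i + 1] = (byte) ((scaled >> 8) & 0xFF);
    }
}
```

Calling `applyGain(mBuffer, read, 0.5f)` between the read and the write would halve the volume; heavier effects (distortion, EQ) follow the same decode, process, re-encode pattern on the same buffer.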
And the initialization is done this way:

    // ==================== INITIALIZE ========================= //
    public void initialize(){

        mBufferSize = AudioRecord.getMinBufferSize(mHz,
                    AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT);

        mBufferSize2 = AudioTrack.getMinBufferSize(mHz,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT);

        mBuffer = new byte[mBufferSize];

        Log.v("MY AMP", "Buffer size:" + mBufferSize);

        mRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    mHz,
                    AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    mBufferSize);

        mPlayer = new AudioTrack(AudioManager.STREAM_MUSIC,
                    mHz,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    mBufferSize2,
                    AudioTrack.MODE_STREAM);

    }

Do you know how to get a faster response? Thanks!

slezadav
Jordi Puigdellívol
    How did you solve this problem? I am also having a similar problem, please see my question http://stackoverflow.com/questions/9413998/live-audio-recording-and-playing-in-android – Amit Feb 27 '12 at 12:36

4 Answers


Android's AudioTrack/AudioRecord classes have high latency due to their minimum buffer sizes. According to Google, those buffer sizes exist to minimize dropouts when GCs occur (which is a wrong decision in my opinion; you can optimize your own memory management).

What you want to do is use OpenSL ES, which is available from Android 2.3. It contains native APIs for streaming audio. Here's some documentation: http://mobilepearls.com/labs/native-android-api/opensles/index.html

SirKnigget
    Just as a side note, half of the Android market runs 2.2 or older unfortunately. Not trying to muddy the waters -- just frustrated, too, that the new "solution"s that the Google Android platform offers are repeatedly only useful for half of the available market, usually without cause other than greed of the phone manufacturers. – Kaganar Aug 03 '11 at 23:24
    This answer is wrong to suggest OpenSL for low latency. Don't waste your time on OpenSL for low latency audio. Android does not have that ability now and sometime soon. Refer to following issue for details: http://code.google.com/p/android/issues/detail?id=3434 – Tae-Sung Shin Dec 07 '11 at 22:37
  • Paul, did you see the date of the issue you linked to? This is obsolete. They treated that in Android 2.3. – SirKnigget Dec 28 '11 at 08:38
  • What is the status now ? How one can get a faster response using AudioRecord and AudioTrack ? – Amit Apr 12 '12 at 10:46

As mSparks pointed out, streaming should be done with a smaller read size: you don't need to read a full buffer to stream data!

    int read = mRecorder.read(mBuffer, 0, 256); /* Or any other magic number */
    if (read > 0) {
        mPlayer.write(mBuffer, 0, read);
    }

This will drastically reduce your latency. If mHz is 44100 and you are in mono 16-bit configuration, a 256-byte read holds 128 samples, so your latency will be no less than 1000 × 128/44100 ms ≈ 2.9 ms (128/44100 converts samples to seconds, and multiplying by 1000 gives milliseconds). The remaining problem is the internal implementation of the player, which you have no control over from Java. Hope this helps someone :)
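Counting bytes versus samples matters here: with ENCODING_PCM_16BIT, each mono frame occupies two bytes. A small plain-Java helper makes the arithmetic explicit (`chunkLatencyMs` is a hypothetical name for illustration, not an Android API):

```java
// Hypothetical helper: latency contributed by one chunk, in milliseconds.
// frames = bytes per chunk divided by bytes per frame (sample size x channels).
static double chunkLatencyMs(int chunkBytes, int sampleRateHz,
                             int bytesPerSample, int channels) {
    int frames = chunkBytes / (bytesPerSample * channels);
    return 1000.0 * frames / sampleRateHz;
}
```

For a 256-byte read of 16-bit mono audio at 44100 Hz, this gives 1000 × 128/44100 ≈ 2.9 ms per chunk.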

kernel78
    This was very helpful, thank you. Reading the max buffer size from the get-go introduced a long delay in my processing but your trick fixed it. – bobasaurus Feb 22 '22 at 23:00

Just a thought, but shouldn't you be reading fewer than mBufferSize bytes per call?

mSparks

My first instinct was to suggest initializing the AudioTrack in static mode rather than streaming mode, since static mode has notably lower latency. However, static mode is more appropriate for short sounds that fit entirely in memory than for audio you are capturing from elsewhere. But just as a wild guess: what if you set the AudioTrack to static mode and feed it discrete chunks of your input audio?

If you want tighter control over audio, I'd recommend taking a look at OpenSL ES for Android. The learning curve will be a bit steeper, but you get much more fine-grained control and lower latency.

Bruno Oliveira