
In my Android app, I would like to capture audio from the phone's microphone and play it back immediately, live, with no lag, like a microphone. From what I have read, the AudioRecord and AudioTrack classes seem right for this, but I'm not quite sure how to proceed.

I checked out some other questions on Stack Overflow, but they don't exactly answer what I want to do, and most are from 2012.

So how can I use these classes to input and output audio simultaneously?

ALSO: I had a look at the MediaRecorder API, but from what I read, that requires you to save the audio to a file, which I don't want to do. Can it be tweaked to meet my requirements? Or am I better off just using AudioRecord?

Thanks

EDIT:

Here is my updated code below as @Pradip Pramanick suggested:

final Thread record = new Thread(new Runnable() {
    @Override
    public void run() {
        while (!Thread.interrupted()) {
            MediaRecorder microphone = new MediaRecorder();
            microphone.setAudioSource(MediaRecorder.AudioSource.MIC);
            microphone.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            microphone.setOutputFile(filename);
            microphone.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            try {
                microphone.prepare();
            } catch (IOException e) {
                e.printStackTrace();
            }
            microphone.start();
        }
    }
});

final Thread play = new Thread(new Runnable() {
    @Override
    public void run() {
        while (!Thread.interrupted()) {
            player = new MediaPlayer();
            try {
                player.setDataSource(filename);
                player.prepare();
                player.start();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
});

I am getting an `IllegalStateException | start failed: -38`. But I am calling `microphone.start()` after `microphone.prepare()`... What seems to be the problem? I searched other threads, which said there might be other background apps using the microphone. I checked my device, a Moto X Play (3rd Gen), and found none. (I even turned off "Ok Google" voice recognition, but the error kept coming.)

ERROR LOG:

Here is the log-cat showing the most recent errors:

01-31 09:37:21.064 344-3339/? E/MediaPlayerService: offset error
01-31 09:37:21.065 1835-1922/com.synerflow.testapp E/MediaPlayer: Unable to create media player

01-31 09:37:21.065 1835-1922/com.synerflow.testapp I/Player: player.prepare() has failed
01-31 09:37:21.065 1835-1922/com.synerflow.testapp W/System.err: java.io.IOException: setDataSourceFD failed.: status=0x80000000

The IOException seems to occur at `player.setDataSource(filename)`; the `filename` variable is the string: `Environment.getExternalStorageDirectory().getAbsolutePath() + "\voice.3gp"`

Adifyr

  • Should just be able to transfer streams, something like this: http://stackoverflow.com/questions/5381969/android-how-to-record-mp3-radio-audio-stream/5384161#5384161. Maybe with a small buffer too. Just wondering why, really - and whether you would suffer audio feedback quite badly if you did. – ste-fu Jan 26 '16 at 09:00
  • Yes, the problem is due to the fact that the player and recorder are both trying to read and write the same file. Use a small buffer that can hold, say, 1 ms of audio. Then use two synchronized threads: the recorder thread puts data into the buffer and sets a flag; the player thread, on checking the flag, starts playing. Refer to the classic Producer-Consumer problem. – 0x5050 Feb 01 '16 at 04:26
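The buffered two-thread approach 0x5050 describes can be sketched in plain Java, with a bounded `ArrayBlockingQueue` standing in for the shared audio buffer. This is only a sketch of the Producer-Consumer pattern, not Android code: on a device the producer loop would call `AudioRecord.read()` and the consumer loop `AudioTrack.write()`; the class name, chunk size, and chunk count here are hypothetical.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Producer-Consumer sketch: the recorder thread pushes small audio chunks
// into a bounded queue and the player thread drains them, so the two
// threads never touch the same file.
public class AudioLoopSketch {
    static final int CHUNK_BYTES = 64;          // tiny chunk, roughly ~1 ms of audio (hypothetical)
    static final byte[] POISON = new byte[0];   // sentinel telling the player to stop

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<byte[]> buffer = new ArrayBlockingQueue<>(8);

        Thread recorder = new Thread(() -> {
            try {
                for (int i = 0; i < 100; i++) {
                    byte[] chunk = new byte[CHUNK_BYTES]; // stands in for AudioRecord.read()
                    buffer.put(chunk);                    // blocks when the player falls behind
                }
                buffer.put(POISON);
            } catch (InterruptedException ignored) { }
        });

        Thread player = new Thread(() -> {
            try {
                int played = 0;
                while (true) {
                    byte[] chunk = buffer.take();         // blocks until data is available
                    if (chunk == POISON) break;
                    played++;                             // stands in for AudioTrack.write()
                }
                System.out.println("played " + played + " chunks");
            } catch (InterruptedException ignored) { }
        });

        recorder.start();
        player.start();
        recorder.join();
        player.join();
    }
}
```

The bounded queue gives back-pressure for free: `put()` blocks when the player falls behind, so memory stays bounded and no explicit flag is needed.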

4 Answers


As far as I can tell, this can be done in a very simple way. I haven't tried it, but you can try it; I think it'll work:

Create two threads: one for recording, another for playing. Say the threads are TRecord and TPlay.

In TRecord's run method, do this:

public void run() {
    MediaRecorder mRecorder = new MediaRecorder();
    mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    mRecorder.setOutputFile(mFileName);
    mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);

    try {
        mRecorder.prepare();
    } catch (IOException e) {
        //todo
    }

    mRecorder.start();
}

And in TPlay's run method, do this:

public void run() {
    MediaPlayer mPlayer = new MediaPlayer();
    try {
        mPlayer.setDataSource(mFileName);
        mPlayer.prepare();
        mPlayer.start();
    } catch (IOException e) {
        //todo
    }
}

Now in your main activity, simply create the two threads. Start the TRecord thread first, then TPlay. Try it.

Here is the code for the file name and extension:

mFileName = Environment.getExternalStorageDirectory().getAbsolutePath();
mFileName += "/audiorecordtest.3gp";
0x5050
  • Hi, what is the mFileName in this case? Is it a predefined file? Also, I'd like to get rid of the file after I'm done speaking. How can this be done? – Adifyr Jan 27 '16 at 14:36
  • Also, what should the extension of the file be in this case? Could you include the code that sets up the output file / data source? – Adifyr Jan 27 '16 at 16:34
  • Hi, It is giving me an error: `Start Failed: -38 | IllegalStateException`. But I am calling start after prepare. What seems to be the problem? – Adifyr Jan 28 '16 at 15:58
  • Turns out `MediaRecorder` can't be used when some other service is using the mic. Check out [this](http://stackoverflow.com/questions/23971817/mediarecorder-start-failed-38/). Check if any similar thing is happening. Otherwise, use another class such as AudioRecord. – 0x5050 Jan 28 '16 at 16:49
  • You can also try doing either of the tasks (record/play) using AsyncTask and check if it meets your latency requirements. – 0x5050 Jan 28 '16 at 16:54
  • I put the code in AsyncTasks. It doesn't give any errors (initially, after I stop and start again, it gives an IOException) but it still does not work. – Adifyr Jan 28 '16 at 17:18
  • It would be better if you gave some details, like the logcat. I think the IOException is caused by synchronization problems, because ultimately you are reading and writing the same stream. Try using a buffer; there is a Producer-Consumer solution for threads. Check [this](http://crunchify.com/java-producer-consumer-example-handle-concurrent-read-write/). – 0x5050 Jan 29 '16 at 05:23

This is actually really tricky on Android. Google themselves have a very good (but slightly long) video explaining the issues.

They also have a page explaining their latency testing methods and benchmarks for various devices.

Essentially, stock Android can't do zero-latency audio, however there's nothing stopping hardware partners from adding the required hardware and platform extensions to do so.

Kas Hunt

You can try Google Oboe.

Oboe is a C++ library that makes it easy to build high-performance audio apps on Android.

Oboe already has an example of immediate audio input and output on Android:

LiveEffect Sample

This sample simply loops audio from the input stream to the output stream to demonstrate the usage of the 2 stream interfaces.

https://github.com/google/oboe/tree/master/samples/LiveEffect

Konrad Nowicki

Try this. I have not run it:

recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
        RECORDER_SAMPLERATE, RECORDER_CHANNELS,
        RECORDER_AUDIO_ENCODING, bufferSize * BytesPerElement);

recorder.startRecording();
isRecording = true;

recordingThread = new Thread(new Runnable() {
    public void run() {
        int intSize = android.media.AudioTrack.getMinBufferSize(RECORDER_SAMPLERATE,
                AudioFormat.CHANNEL_OUT_MONO, RECORDER_AUDIO_ENCODING);
        byte[] sData = new byte[bufferSize];
        AudioTrack at = new AudioTrack(AudioManager.STREAM_MUSIC, RECORDER_SAMPLERATE,
                AudioFormat.CHANNEL_OUT_MONO, RECORDER_AUDIO_ENCODING, intSize,
                AudioTrack.MODE_STREAM);
        at.play();
        while (isRecording) {               // isRecording = false; onStop button
            recorder.read(sData, 0, bufferSize);
            // Write the byte array to the track
            at.write(sData, 0, sData.length);
        }
        at.stop();
        at.release();
    }
}, "AudioRecorder Thread");
recordingThread.start();
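To make the constants above concrete, here is a plain-Java sketch of the buffer arithmetic, using the 44100 Hz mono 16-bit PCM values kAmol suggests in the comments. Note that the `bufferSize` of 1024 frames is a made-up illustrative value; the real one comes from `AudioRecord.getMinBufferSize()` on the device.

```java
// Buffer math for 16-bit PCM mono audio at 44100 Hz.
// BytesPerElement is 2 because each 16-bit PCM sample occupies 2 bytes.
public class BufferMath {
    static final int RECORDER_SAMPLERATE = 44100;  // Hz
    static final int BytesPerElement = 2;          // 16-bit PCM = 2 bytes per sample
    static final int CHANNELS = 1;                 // CHANNEL_IN_MONO

    public static void main(String[] args) {
        int bufferSize = 1024;  // pretend getMinBufferSize() returned 1024 frames (illustrative)
        int bytesPerSecond = RECORDER_SAMPLERATE * CHANNELS * BytesPerElement;
        double bufferMillis = 1000.0 * (bufferSize * BytesPerElement) / bytesPerSecond;
        System.out.println("bytes/sec = " + bytesPerSecond);
        System.out.printf("buffer of %d frames ~ %.1f ms of latency%n", bufferSize, bufferMillis);
    }
}
```

The buffer duration is what sets the floor on latency: the larger the buffer handed to `AudioTrack`, the longer each chunk of audio sits in it before playback.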
kAmol
  • It's part of one of my earlier projects, and it works fine. You need to assemble it properly and initialize variables like a sample rate of 44100, etc. :) – kAmol Jan 28 '16 at 07:02
  • So, it works. But the voice is very crackly and extremely odd. Any reason why this could be happening? Also, how much is `BytesPerElement`? – Adifyr Jan 28 '16 at 12:45
  • Try a sample rate of 44100 and `RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT`. I hope you have considered `bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE, RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING)`. If it works, don't forget to mark this as the answer. :) – kAmol Jan 28 '16 at 14:05
  • Hi, I have exactly those parameters as the sample rate and encoding. It's still giving crackly audio. What is `BytesPerElement` and what is its value? Also, my channel is `CHANNEL_IN_MONO`. Is that correct? – Adifyr Jan 28 '16 at 14:14
  • `BytesPerElement = 2`; the channel is correct. You may refer to the Android developer website for more options. – kAmol Jan 28 '16 at 14:20
  • The voice quality and latency are still terrible. – Adifyr Jan 28 '16 at 15:17
  • Then you need to try different sample rates and modes. – kAmol Jan 28 '16 at 18:59