I'm designing a simple proof of concept for a multitrack recorder.

The obvious starting point is to play from file A.caf to headphones while simultaneously recording microphone input into file B.caf.

This question -- Record and play audio Simultaneously -- points out that there are three levels at which I can work:

  • AVFoundation API (AVAudioPlayer + AVAudioRecorder)
  • Audio Queue API
  • Audio Unit API (RemoteIO)

What is the best level to work at? Obviously the generic answer is to work at the highest level that gets the job done, which would be AVFoundation.

But I'm taking this job over from someone who gave up due to latency issues (he was getting a 0.3 s delay between the files), so maybe I need to work at a lower level to avoid these issues?

Furthermore, what source code is available to springboard from? I have been looking at the SpeakHere sample ( http://developer.apple.com/library/ios/#samplecode/SpeakHere/Introduction/Intro.html ). If I can't find something simpler, I will use this.

But can anyone suggest something simpler, or something else entirely? I would rather not work with C++ code if I can avoid it.

Is anyone aware of some public code that uses AVFoundation to do this?

EDIT: AVFoundation example here: http://www.iphoneam.com/blog/index.php?title=using-the-iphone-to-record-audio-a-guide&more=1&c=1&tb=1&pb=1

EDIT(2): Much nicer looking one here: http://www.switchonthecode.com/tutorials/create-a-basic-iphone-audio-player-with-av-foundation-framework

EDIT(3): How do I record audio on iPhone with AVAudioRecorder?

P i

3 Answers


To avoid latency issues, you will have to work at a lower level than AVFoundation, alright. Check out this sample code from Apple: aurioTouch. It uses Remote I/O.

Viraj
  • Actually I did get it working spot on using AVFoundation. You just have to make sure everything is readied before you kick it off. In addition to that, you need to start the recorder 70 ms after the player and it lines up nicely (this may change depending on the device). – P i Aug 26 '11 at 16:14
  • Good to know. Maybe you should answer your question. – Viraj Aug 27 '11 at 12:00

As suggested by Viraj, here is the answer.

Yes, you can achieve very good results using AVFoundation. Firstly, note that for both the player and the recorder, activation is a two-step process.

First you prime it.

Then you play it.

So, prime everything. Then play everything.
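As a rough sketch in modern Swift (the answer itself predates Swift, so the function name and file names here are mine, and the session setup is an assumption), the two-step pattern looks something like this:

```swift
import AVFoundation

// Sketch: prime both objects first, then start them back to back.
// A.caf / B.caf are the hypothetical files from the question.
func startPlaybackAndRecording() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)

    let player = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: "A.caf"))
    let recorder = try AVAudioRecorder(url: URL(fileURLWithPath: "B.caf"),
                                       settings: [:])  // default settings

    // Step 1: prime. prepareToPlay preloads the player's buffers and
    // prepareToRecord creates the output file, so the start calls are cheap.
    player.prepareToPlay()
    recorder.prepareToRecord()

    // Step 2: start everything.
    player.play()
    recorder.record()
}
```

The point is that all the expensive setup happens in step 1, so the two start calls in step 2 land as close together as possible.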

This will get your latency down to about 70ms. I tested by recording a metronome tick, then playing it back through the speakers while holding the iPhone up to the speakers and simultaneously recording.

The second recording had a clear echo, which I found to be ~70ms. I could have analysed the signal in Audacity to get an exact offset.

So in order to line everything up, I just call `performSelector:withObject:afterDelay:` with a delay of 70.0/1000.0.
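In modern Swift, the same delayed start could be sketched with `asyncAfter` (the helper name and the queue parameter are mine; the 70 ms figure is the measured offset above and is device-dependent):

```swift
import Dispatch

// Sketch: start the player immediately, then start the recorder after the
// empirically measured offset (~70 ms above; varies per device).
func startAligned(offset: Double = 70.0 / 1000.0,
                  on queue: DispatchQueue = .main,
                  startPlayer: () -> Void,
                  startRecorder: @escaping () -> Void) {
    startPlayer()
    queue.asyncAfter(deadline: .now() + offset) {
        startRecorder()
    }
}
```

In real use, `startPlayer` would wrap `player.play()` and `startRecorder` would wrap `recorder.record()`.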

There may be hidden snags; for example, the delay may differ from device to device. It may even differ depending on device activity. It is even possible the thread could get interrupted or rescheduled between starting the player and starting the recorder.

But it works, and is a lot tidier than messing around with audio queues / units.

P i
  • How do you "prime it"? Can you explain what you mean by that, please? – Bjorn Roche Dec 10 '13 at 15:46
  • Sadly, there's still no way to do this right (at least that I can find). 70ms is simply too much off-sync for me, and with the range of different devices today, the difference in desync is too big. On iPhone 5s, I get about 66ms, but on iPhone 6, I get about 40ms. I noticed this post is from 2011. Did you ever find a better way of doing this? – Sti Oct 20 '14 at 15:17

I had this problem, and I solved it in my project simply by changing the PreferredHardwareIOBufferDuration parameter of the AudioSession. I think I have just 6 ms of latency now, which is good enough for my app.

Check this answer, which has a good explanation.
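In Swift, that setting could be sketched like this (the function name is mine, and the 5 ms request mirrors the Xamarin snippet in the comments below; treat it as a starting point, since the hardware decides what it actually grants):

```swift
import AVFoundation

// Sketch: request a short I/O buffer before activating the session.
func configureLowLatencySession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    // Ask for a ~5 ms buffer; this is a preference, not a guarantee.
    try session.setPreferredIOBufferDuration(0.005)
    try session.setActive(true)
    // Check what the hardware actually granted.
    print("Granted I/O buffer duration:", session.ioBufferDuration)
}
```

Call this once before the first recording (activating a session is expensive), not on every start.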

ernewston
  • Can you provide code, please? I've been struggling with this for the last 4 days. – Ramkumar chintala Jul 27 '16 at 07:26
  • Sure, the code I have is in Xamarin, but I guess it's not hard to write it in Swift or Objective-C. Look at the link I provide in the answer; there is an example. Also, be sure you prepare your recorder (I use AVAudioRecorder and prepareToRecord) and your player (I create an AVPlayer instance with the filename) beforehand so they start with low latency. Also, when you activate your AudioSession, be sure you set the category to playAndRecord. In Xamarin it looks like this: `AVAudioSession.SharedInstance().SetPreferredIOBufferDuration(0.005, out error);` – ernewston Jul 27 '16 at 18:00
  • I have an AudioSessionHelper class with a function called ActivateMyAudioSession, where I set the category, set the mode and the preferredIOBufferDuration, and then make the session active. I call this function before starting the first recording (do not call it every time, as it is expensive), or you can do it when your app starts. Hope it helps! – ernewston Jul 27 '16 at 18:10
  • Link: http://www.stefanpopp.de/2011/capture-iphone-microphone/. Using that code as a reference, I can't play through Bluetooth speakers; it plays through the in-ear speaker (top) while talking. I want to play audio through Bluetooth speakers while talking. Can you help me with this? I have tried so many links and different approaches, but I'm not getting it to work. – Ramkumar chintala Jul 28 '16 at 07:12
  • Are you trying to record and play at the same time with low latency using Bluetooth? I'm not sure how to do that, and not sure it is even possible with low latency, as some Bluetooth devices are not reliable. – ernewston Jul 29 '16 at 02:59
  • Is it possible with a delay? If yes, how much of a delay before it's played? – Ramkumar chintala Jul 29 '16 at 07:15
  • I'm sorry, but I have never worked with Bluetooth; I can't help you here. – ernewston Jul 29 '16 at 20:29