
I want to build a simple metronome app using AVAudioEngine with these features:

  • Solid timing (I know, I know, I should be using Audio Units, but I'm still struggling with Core Audio stuff / Obj-C wrappers etc.)
  • Two different sounds on the "1" and on beats "2"/"3"/"4" of the bar.
  • Some kind of visual feedback (at least a display of the current beat) which needs to be in sync with audio.

So I have created two short click sounds (26 ms / 1150 samples @ 16-bit / 44.1 kHz / stereo WAV files) and load them into two buffers. Each buffer's frame length is then set to span one full beat period.

My UI setup is simple: A button to toggle start / pause and a label to display the current beat (my "counter" variable).

When using scheduleBuffer's .loops option the timing is okay, but since I need two different sounds and a way to sync/update my UI while looping the clicks, I can't use it. I figured I'd use the completionHandler instead, which then restarts my playClickLoop() function – see my code attached below.

Unfortunately, while implementing this I didn't really measure the accuracy of the timing. As it now turns out, when setting bpm to 120 the loop plays at only about 117.5 bpm – quite steadily, but still way too slow. When bpm is set to 180, my app plays at about 172.3 bpm.
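Doing some back-of-the-envelope math on those measurements (my own rough numbers, so take them with a grain of salt), every beat seems to pick up an extra 10–15 ms:

let sampleRate = 44100.0
let nominalPeriod  = 60.0 / 120.0 * sampleRate   // 22050 samples per beat at 120 bpm
let measuredPeriod = 60.0 / 117.5 * sampleRate   // ≈ 22519 samples per beat measured
let extraMsAt120 = (measuredPeriod - nominalPeriod) / sampleRate * 1000.0   // ≈ 10.6 ms
// The same calculation for 180 vs. 172.3 bpm gives ≈ 14.9 ms of extra time per beat.

That fairly constant per-beat surplus looks like some fixed delay being added on every re-scheduling rather than clock drift.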

What's going on here? Is this delay introduced by using the completionHandler? Is there any way to improve the timing? Or is my whole approach wrong?

Thanks in advance! Alex

import UIKit
import AVFoundation

class ViewController: UIViewController {
    
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    
    private let fileName1 = "sound1.wav"
    private let fileName2 = "sound2.wav"
    private var file1: AVAudioFile! = nil
    private var file2: AVAudioFile! = nil
    private var buffer1: AVAudioPCMBuffer! = nil
    private var buffer2: AVAudioPCMBuffer! = nil
    
    private let sampleRate: Double = 44100
    
    private var bpm: Double = 180.0
    private var periodLengthInSamples: Double { 60.0 / bpm * sampleRate }
    private var counter: Int = 0
    
    private enum MetronomeState { case run, stop }
    private var state: MetronomeState = .stop
    
    @IBOutlet weak var label: UILabel!
    
    override func viewDidLoad() {
        
        super.viewDidLoad()
        
        //
        // MARK: Loading buffer1
        //
        let path1 = Bundle.main.path(forResource: fileName1, ofType: nil)!
        let url1 = URL(fileURLWithPath: path1)
        do {file1 = try AVAudioFile(forReading: url1)
            buffer1 = AVAudioPCMBuffer(
                pcmFormat: file1.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file1.read(into: buffer1!)
            buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer1 \(error)") }
        
        //
        // MARK: Loading buffer2
        //
        let path2 = Bundle.main.path(forResource: fileName2, ofType: nil)!
        let url2 = URL(fileURLWithPath: path2)
        do {file2 = try AVAudioFile(forReading: url2)
            buffer2 = AVAudioPCMBuffer(
                pcmFormat: file2.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file2.read(into: buffer2!)
            buffer2.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer2 \(error)") }
        
        //
        // MARK: Configure + start engine
        //
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: file1.processingFormat)
        engine.prepare()
        do { try engine.start() } catch { print(error) }
    }
    
    //
    // MARK: Play / Pause toggle action
    //
    @IBAction func buttonPressed(_ sender: UIButton) {
        
        sender.isSelected = !sender.isSelected
        
        if player.isPlaying {
            state = .stop
        } else {
            state = .run
            
            try! engine.start()
            player.play()
            
            playClickLoop()
        }
    }
    
    private func playClickLoop() {
        
        //
        //  MARK: Completion handler
        //
        let scheduleBufferCompletionHandler = { [unowned self] /*(_: AVAudioPlayerNodeCompletionCallbackType)*/ in
            
            DispatchQueue.main.async {
                
                switch state {
                
                case .run:
                    self.playClickLoop()
            
                case .stop:
                    engine.stop()
                    player.stop()
                    counter = 0
                }
            }
        }
        
        //
        // MARK: Schedule buffer + play
        //
        if engine.isRunning {
            
            counter += 1; if counter > 4 {counter = 1} // Counting from 1 to 4 only
            
            if counter == 1 {
                //
                // MARK: Playing sound1 on beat 1
                //
                player.scheduleBuffer(buffer1,
                                      at: nil,
                                      options: [.interruptsAtLoop],
                                      //completionCallbackType: .dataPlayedBack,
                                      completionHandler: scheduleBufferCompletionHandler)
            } else {
                //
                // MARK: Playing sound2 on beats 2, 3 & 4
                //
                player.scheduleBuffer(buffer2,
                                      at: nil,
                                      options: [.interruptsAtLoop],
                                      //completionCallbackType: .dataRendered,
                                      completionHandler: scheduleBufferCompletionHandler)
            }
            //
            // MARK: Display current beat on UILabel + to console
            //
            DispatchQueue.main.async {
                self.label.text = String(self.counter)
                print(self.counter)
            }
        }
    }
}
  • Any time you say `DispatchQueue.main.async` you are throwing away all possibility of accurate timing. You are now asynchronous, meaning "do this at some future time and I don't know or care when". – matt May 01 '21 at 00:50
  • Did you look at any of the many other metronome questions / answers? For instance the very first one, https://stackoverflow.com/questions/32641990/using-avaudioengine-to-schedule-sounds-for-low-latency-metronome?rq=1, looks helpful. – matt May 01 '21 at 00:52
  • Thanks for your comment. Removing the DispatchQueue.main.async call from my completionHandler didn't change the timing issue: it still plays steadily at 117.5 bpm instead of 120 bpm, or at 172.3 bpm instead of 180 bpm, as before. Could it be that something else is wrong here? Unfortunately I can't get rid of my second DispatchQueue.main.async call – it's needed to access the UI. – McNail May 01 '21 at 09:16
  • To answer your second comment: of course I did read plenty of other questions / answers here. Unfortunately your link doesn't help in my case: I can NOT use the .loops option of scheduleBuffer – I need a way to sync my UI to the audio, which is why I'm using the completionHandler to re-trigger the play function. – McNail May 01 '21 at 09:19
  • By the way, it's on GitHub now, in case anyone wants to try it on their machine: https://github.com/Alexander-Nagel/AVAudioEngine-Metronome-with-timing-issues – McNail May 01 '21 at 10:12

2 Answers


As Phil Freihofner suggested above, here's the solution to my own problem:

The most important lesson I learned: the completionHandler callback provided by the scheduleBuffer command is not called early enough to trigger the re-scheduling of another buffer while the current one is still playing. This results in small (inaudible) gaps between the sounds and messes up the timing. There must already be another buffer "in reserve", i.e. one that was scheduled before the current one finishes playing.
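To illustrate the point, here is a minimal sketch of that "one buffer in reserve" idea (it reuses the player / buffer / currentBeat properties from my code, it is not the solution I finally went with, and it ignores thread-safety of the counter):

private func startWithReserve() {
    scheduleNextBuffer()   // buffer that starts playing immediately
    scheduleNextBuffer()   // the "reserve" buffer, already queued behind it
    player.play()
}

private func scheduleNextBuffer() {
    let buffer: AVAudioPCMBuffer = (currentBeat == 1) ? buffer1 : buffer2
    player.scheduleBuffer(buffer, at: nil, options: []) { [weak self] in
        // Fires while the *reserve* buffer is already playing, so there is
        // roughly a full period of headroom to queue the next one.
        self?.scheduleNextBuffer()
    }
    currentBeat += 1; if currentBeat > 4 { currentBeat = 1 }
}

With the queue always one buffer deep, a late callback no longer causes gaps. I ended up using a timer instead (see below), but the principle is the same.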

Using the completionCallbackType parameter of scheduleBuffer didn't change much with regard to when the completion callback fires: whether set to .dataRendered or .dataConsumed, the callback still came too late to re-schedule another buffer. Using .dataPlayedBack only made things worse :-)
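For reference, this is roughly what I had tried (the scheduleBuffer overload with an explicit callback type, available since iOS 11):

player.scheduleBuffer(buffer1, at: nil, options: [],
                      completionCallbackType: .dataRendered) { _ in
    // In my tests this still fired too late to queue the next buffer gap-free.
}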

So, to achieve seamless playback (with correct timing!) I simply set up a timer that fires twice per period. All odd-numbered timer events re-schedule the next buffer.
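For concreteness, this is the same interval formula as in the code below, just evaluated for my default tempo:

let bpm = 133.33
let sampleRate = 44100.0
let periodLengthInSamples = 60.0 / bpm * sampleRate            // ≈ 19845 samples per beat
let timerInterval = 0.5 * periodLengthInSamples / sampleRate   // = 30 / bpm ≈ 0.225 s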

Sometimes the solution is so easy it's embarrassing... But sometimes you have to try almost every wrong approach first to find it ;-)

My complete working solution (including the two sound files and the UI) can be found here on GitHub:

https://github.com/Alexander-Nagel/Metronome-using-AVAudioEngine

import UIKit
import AVFoundation

private let DEBUGGING_OUTPUT = true

class ViewController: UIViewController{
    
    private var engine = AVAudioEngine()
    private var player = AVAudioPlayerNode()
    private var mixer = AVAudioMixerNode()
    
    private let fileName1 = "sound1.wav"
    private let fileName2 = "sound2.wav"
    private var file1: AVAudioFile! = nil
    private var file2: AVAudioFile! = nil
    private var buffer1: AVAudioPCMBuffer! = nil
    private var buffer2: AVAudioPCMBuffer! = nil
    
    private let sampleRate: Double = 44100
    
    private var bpm: Double = 133.33
    private var periodLengthInSamples: Double {
        60.0 / bpm * sampleRate
    }
    private var timerEventCounter: Int = 1
    private var currentBeat: Int = 1
    private var timer: Timer! = nil
    
    private enum MetronomeState { case running, stopped }
    private var state: MetronomeState = .stopped
        
    @IBOutlet weak var beatLabel: UILabel!
    @IBOutlet weak var bpmLabel: UILabel!
    @IBOutlet weak var playPauseButton: UIButton!
    
    override func viewDidLoad() {
        
        super.viewDidLoad()
        
        bpmLabel.text = "\(bpm) BPM"
        
        setupAudio()
    }
    
    private func setupAudio() {
        
        //
        // MARK: Loading buffer1
        //
        let path1 = Bundle.main.path(forResource: fileName1, ofType: nil)!
        let url1 = URL(fileURLWithPath: path1)
        do {file1 = try AVAudioFile(forReading: url1)
            buffer1 = AVAudioPCMBuffer(
                pcmFormat: file1.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file1.read(into: buffer1!)
            buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer1 \(error)") }
        
        //
        // MARK: Loading buffer2
        //
        let path2 = Bundle.main.path(forResource: fileName2, ofType: nil)!
        let url2 = URL(fileURLWithPath: path2)
        do {file2 = try AVAudioFile(forReading: url2)
            buffer2 = AVAudioPCMBuffer(
                pcmFormat: file2.processingFormat,
                frameCapacity: AVAudioFrameCount(periodLengthInSamples))
            try file2.read(into: buffer2!)
            buffer2.frameLength = AVAudioFrameCount(periodLengthInSamples)
        } catch { print("Error loading buffer2 \(error)") }
        
        //
        // MARK: Configure + start engine
        //
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: file1.processingFormat)
        engine.prepare()
        do { try engine.start() } catch { print(error) }
    }
    
    //
    // MARK: Play / Pause toggle action
    //
    @IBAction func buttonPressed(_ sender: UIButton) {
        
        sender.isSelected = !sender.isSelected
        
        if state == .running {
            
            //
            // PAUSE: Stop timer and reset counters
            //
            state = .stopped
            
            timer.invalidate()
            
            timerEventCounter = 1
            currentBeat = 1
            
        } else {
            
            //
            // START: Pre-load first sound and start timer
            //
            state = .running
            
            scheduleFirstBuffer()
            
            startTimer()
        }
    }
    
    private func startTimer() {
        
        if DEBUGGING_OUTPUT {
            print("# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #  ")
            print()
        }
        
        //
        // Compute interval (in seconds) for 2 timer events per period and set up timer
        //
        let timerIntervalInSeconds = 0.5 * self.periodLengthInSamples / sampleRate
        
        timer = Timer.scheduledTimer(withTimeInterval: timerIntervalInSeconds, repeats: true) { timer in
            
            //
            // Only for debugging: Print counter values at start of timer event
            //
            if DEBUGGING_OUTPUT {
                print("timerEvent #\(self.timerEventCounter) at \(self.bpm) BPM")
                print("Entering \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) ")
            }
            
            //
            // Schedule next buffer at 1st, 3rd, 5th & 7th timerEvent
            //
            var bufferScheduled: String = "" // only needed for debugging / console output
            switch self.timerEventCounter {
            case 7:
                
                //
                // Schedule main sound
                //
                self.player.scheduleBuffer(self.buffer1, at: nil, options: [], completionHandler: nil)
                bufferScheduled = "buffer1"
                
            case 1, 3, 5:
                
                //
                // Schedule subdivision sound
                //
                self.player.scheduleBuffer(self.buffer2, at: nil, options: [], completionHandler: nil)
                bufferScheduled = "buffer2"
                
            default:
                bufferScheduled = ""
            }
            
            //
            // Display current beat & increase currentBeat (1...4) at 2nd, 4th, 6th & 8th timerEvent
            //
            if self.timerEventCounter % 2 == 0 {
                DispatchQueue.main.async {
                    self.beatLabel.text = String(self.currentBeat)
                }
                self.currentBeat += 1; if self.currentBeat > 4 {self.currentBeat = 1}
            }
            
            //
            // Increase timerEventCounter, two events per beat.
            //
            self.timerEventCounter += 1; if self.timerEventCounter > 8 {self.timerEventCounter = 1}
            
            
            //
            // Only for debugging: Print counter values at end of timer event
            //
            if DEBUGGING_OUTPUT {
                print("Exiting \ttimerEventCounter: \(self.timerEventCounter) \tcurrentBeat: \(self.currentBeat) \tscheduling: \(bufferScheduled)")
                print()
            }
        }
    }
    
    private func scheduleFirstBuffer() {
        
        player.stop()
        
        //
        // pre-load accented main sound (for beat "1") before trigger starts
        //
        player.scheduleBuffer(buffer1, at: nil, options: [], completionHandler: nil)
        player.play()
        beatLabel.text = String(currentBeat)
    }
}

Thanks so much for your help everyone! This is a wonderful community.

Alex


How accurate is the tool or process that you are using to take your measurement?

I can't tell for sure whether your files have the correct number of PCM frames, as I am not a C programmer. It looks like data from the WAV header is included when you load the files. This makes me wonder whether some latency is incurred on playback while the header information is processed repeatedly at the start of each play or loop.

I had good luck building a metronome in Java by using a plan of continuously outputting an endless stream derived from reading PCM frames. Timing is achieved by counting PCM frames and routing in either silence (PCM datapoint = 0) or the click's PCM data, based on the period of the chosen metronome setting and the length of the click in PCM frames.
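(For readers coming from AVAudioEngine: below is a rough Swift sketch of that same frame-counting idea using an AVAudioSourceNode (iOS 13+). It is an illustration, not the Java implementation referred to above, and all names in it are made up.)

import AVFoundation

let sampleRate = 44100.0
let bpm = 120.0
let periodLengthInFrames = Int(60.0 / bpm * sampleRate)
let clickSamples = [Float](repeating: 0.5, count: 1150)   // placeholder for the decoded click
var frameInPeriod = 0

let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 2)!
let sourceNode = AVAudioSourceNode(format: format) { _, _, frameCount, audioBufferList -> OSStatus in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for frame in 0..<Int(frameCount) {
        // Within the click's length: emit click data; for the rest of the period: silence.
        let sample = frameInPeriod < clickSamples.count ? clickSamples[frameInPeriod] : Float(0)
        for channel in buffers {
            channel.mData!.assumingMemoryBound(to: Float.self)[frame] = sample
        }
        frameInPeriod += 1
        if frameInPeriod == periodLengthInFrames { frameInPeriod = 0 }   // start of next beat
    }
    return noErr
}

let engine = AVAudioEngine()
engine.attach(sourceNode)
engine.connect(sourceNode, to: engine.mainMixerNode, format: format)
try! engine.start()

Because the beat boundaries are counted in PCM frames inside the render callback, the period is sample-accurate by construction.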

Phil Freihofner
  • Thank you for your thoughts. I'm not a C programmer either – my code is in Swift :-) As to your suggestions: there's no real tool/process involved in computing my buffers. I just compute the periodLengthInSamples (60 / bpm * sampleRate), then after loading my (very short!) click files into the buffers, I use `buffer1.frameLength = AVAudioFrameCount(periodLengthInSamples)` to set the buffers to the needed length. This actually IS a PCM buffer, so there's no header data that could mess with my timing. – McNail May 02 '21 at 20:04
  • Unfortunately, using the `completionCallbackType:` parameter of `scheduleBuffer` doesn't help either: setting it to `.dataRendered` or `.dataConsumed` should in theory result in an earlier callback than `.dataPlayedBack`, but as it turns out it's still not early enough for seamless, timing-accurate playback. – McNail May 02 '21 at 20:04
  • My new (working!) approach: I simply set up a timer that fires twice per sound. Just before starting the timer I schedule the very first sound, then start the player; a few moments later the timer kicks in, and every odd timer event re-schedules a further buffer. The first timer event happens just a few ms after playback of the first sound has begun – way earlier than waiting for the first sound's completion callback as I did in my code above. – McNail May 02 '21 at 20:06
  • So there's always another buffer scheduled and waiting to be played, and playback is seamless and accurate. No more callbacks, no more need for `DispatchQueue.main.async{}`, hooray! ;) – McNail May 02 '21 at 20:06
  • What has helped me a lot in understanding and visualizing the whole AVAudioEngine / scheduling issue was [this slideshare introduction to AVAudioEngine](http://www.slideshare.net/bobmccune/building-modern-audio-apps-with-avaudioengine). – McNail May 02 '21 at 20:06
  • Glad to hear you were able to figure out a solution you are happy with. I recommend you write it up and post it as an answer, so people will more easily find and learn from it. I was a bit of a sleepyhead when I posted my suggestion this AM, and the use of buffers reminded me of C. – Phil Freihofner May 03 '21 at 01:50
  • Thanks for your suggestion! I just posted my answer. – McNail May 03 '21 at 15:31