I have the following code for generating an audio tone of a given frequency and duration. It's loosely based on this answer for doing the same thing on Android (thanks, @Steve Pomeroy):

https://stackoverflow.com/a/3731075/973364

import Foundation
import CoreAudio
import AVFoundation
import Darwin

class AudioUtil {

    class func play(frequency: Int, durationMs: Int) -> Void {
        let sampleRateHz: Double = 8000.0
        let numberOfSamples = Int((Double(durationMs) / 1000 * sampleRateHz))
        let factor: Double = 2 * M_PI / (sampleRateHz/Double(frequency))

        // Generate an array of Doubles.
        var samples = [Double](count: numberOfSamples, repeatedValue: 0.0)

        for i in 1..<numberOfSamples {
            let sample = sin(factor * Double(i))
            samples[i] = sample
        }

        // Convert to a 16 bit PCM sound array.
        var index = 0
        var sound = [Byte](count: 2 * numberOfSamples, repeatedValue: 0)

        for doubleValue in samples {
            // Scale to maximum amplitude. Int16.max is 32,767.
            var value = Int16(doubleValue * Double(Int16.max))

            // In a 16 bit wav PCM, first byte is the low order byte.
            var firstByte = Int16(value & 0x00ff)
            var secondByteHighOrderBits = Int32(value) & 0xff00
            var secondByte = Int16(secondByteHighOrderBits >> 8) // Right shift.

            // println("\(doubleValue) -> \(value) -> \(firstByte), \(secondByte)")

            sound[index++] = Byte(firstByte)
            sound[index++] = Byte(secondByte)
        }

        let format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatInt16, sampleRate: sampleRateHz, channels:AVAudioChannelCount(1), interleaved: false)
        let buffer = AudioBuffer(mNumberChannels: 1, mDataByteSize: UInt32(sound.count), mData: &sound)
        let pcmBuffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: AVAudioFrameCount(sound.count))
        let audioEngine = AVAudioEngine()
        let audioPlayer = AVAudioPlayerNode()

        audioEngine.attachNode(audioPlayer)
        // Runtime error occurs here:
        audioEngine.connect(audioPlayer, to: audioEngine.mainMixerNode, format: format)
        audioEngine.startAndReturnError(nil)

        audioPlayer.play()
        audioPlayer.scheduleBuffer(pcmBuffer, atTime: nil, options: nil, completionHandler: nil)
    }
}

The error I get at runtime when calling connect() on the AVAudioEngine is this:

ERROR:     [0x3bfcb9dc] AVAudioNode.mm:521: AUSetFormat: error -10868
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'

Is what I'm generating not really AVAudioCommonFormat.PCMFormatInt16?

[EDIT]

Here's another, simpler attempt using only one buffer as PCMFormatFloat32. There's no error, but no sound either.

import AVFoundation

class AudioManager:NSObject {

    let audioPlayer = AVAudioPlayerNode()

    lazy var audioEngine: AVAudioEngine = {
        let engine = AVAudioEngine()

        // Must happen only once.
        engine.attachNode(self.audioPlayer)

        return engine
    }()

    func play(frequency: Int, durationMs: Int, completionBlock:dispatch_block_t!) {
        var error: NSError?

        var mixer = audioEngine.mainMixerNode
        var sampleRateHz: Float = Float(mixer.outputFormatForBus(0).sampleRate)
        var numberOfSamples = AVAudioFrameCount((Float(durationMs) / 1000 * sampleRateHz))

        var format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: Double(sampleRateHz), channels: AVAudioChannelCount(1), interleaved: false)

        var buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: numberOfSamples)
        buffer.frameLength = numberOfSamples

        // Generate sine wave
        for var i = 0; i < Int(buffer.frameLength); i++ {
            var val = sinf(Float(frequency) * Float(i) * 2 * Float(M_PI) / sampleRateHz)

            // log.debug("val: \(val)")

            buffer.floatChannelData.memory[i] = val * 0.5
        }

        // Audio engine
        audioEngine.connect(audioPlayer, to: mixer, format: format)

        log.debug("Sample rate: \(sampleRateHz), samples: \(numberOfSamples), format: \(format)")

        if !audioEngine.startAndReturnError(&error) {
            log.debug("Error: \(error)")
        }

        // Play player and buffer
        audioPlayer.play()
        audioPlayer.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: completionBlock)
    }
}

Thanks: Thomas Royal (http://www.tmroyal.com/playing-sounds-in-swift-audioengine.html)

Eliot

5 Answers


The problem was that when falling out of the play() function, the player was getting cleaned up and never completed (or barely started) playing. Here's one fairly clumsy solution to that: sleep for as long as the sample before returning from play().

I'll accept a better answer that avoids having to do this by not having the player cleaned up if anyone wants to post one.

import AVFoundation

class AudioManager: NSObject, AVAudioPlayerDelegate {

    let audioPlayerNode = AVAudioPlayerNode()

    var waveAudioPlayer: AVAudioPlayer?

    var playing: Bool! = false

    lazy var audioEngine: AVAudioEngine = {
        let engine = AVAudioEngine()

        // Must happen only once.
        engine.attachNode(self.audioPlayerNode)

        return engine
    }()

    func playWaveFromBundle(filename: String, durationInSeconds: NSTimeInterval) -> Void {
        var error: NSError?
        var sound = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(filename, ofType: "wav")!)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        self.waveAudioPlayer = AVAudioPlayer(contentsOfURL: sound, error: &error)
        self.waveAudioPlayer!.delegate = self

        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        log.verbose("Playing \(sound)")

        self.waveAudioPlayer!.prepareToPlay()

        playing = true

        if !self.waveAudioPlayer!.play() {
            log.error("Failed to play")
        }

        // If we don't block here, the player stops as soon as this function returns. While we'd prefer to wait for audioPlayerDidFinishPlaying() to be called here, it's never called if we block here. Instead, pass in the duration of the wave file and simply sleep for that long.
        /*
        while (playing!) {
            NSThread.sleepForTimeInterval(0.1) // seconds
        }
        */

        NSThread.sleepForTimeInterval(durationInSeconds)

        log.verbose("Done")
    }

    func play(frequency: Int, durationInMillis: Int, completionBlock:dispatch_block_t!) -> Void {
        var session = AVAudioSession.sharedInstance()
        var error: NSError?

        if !session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error) {
            log.error("Error: \(error)")
            return
        }

        var mixer = audioEngine.mainMixerNode
        var sampleRateHz: Float = Float(mixer.outputFormatForBus(0).sampleRate)
        var numberOfSamples = AVAudioFrameCount((Float(durationInMillis) / 1000 * sampleRateHz))

        var format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: Double(sampleRateHz), channels: AVAudioChannelCount(1), interleaved: false)

        var buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: numberOfSamples)
        buffer.frameLength = numberOfSamples

        // Generate sine wave
        for var i = 0; i < Int(buffer.frameLength); i++ {
            var val = sinf(Float(frequency) * Float(i) * 2 * Float(M_PI) / sampleRateHz)

            // log.debug("val: \(val)")

            buffer.floatChannelData.memory[i] = val * 0.5
        }

        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        // Audio engine
        audioEngine.connect(audioPlayerNode, to: mixer, format: format)

        log.debug("Sample rate: \(sampleRateHz), samples: \(numberOfSamples), format: \(format)")

        if !audioEngine.startAndReturnError(&error) {
            log.error("Error: \(error)")
            return
        }

        // TODO: Check we're not in the background. Attempting to play audio while in the background throws:
        //   *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error 561015905'

        // Play player and schedule buffer
        audioPlayerNode.play()
        audioPlayerNode.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: completionBlock)

        // If we don't block here, the player stops as soon as this function returns.
        NSThread.sleepForTimeInterval(Double(durationInMillis) / 1000.0) // sleepForTimeInterval takes seconds
    }

    // MARK: AVAudioPlayerDelegate

    func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
        log.verbose("Success: \(flag)")

        playing = false
    }

    func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer!, error: NSError!) {
        log.verbose("Error: \(error)")

        playing = false
    }

    // MARK: NSObject overrides

    deinit {
        log.verbose("deinit")
    }

}

For context, this AudioManager is a lazy loaded property on my AppDelegate:

lazy var audioManager: AudioManager = {
        return AudioManager()
    }()
Eliot
  • Your code works very well, but when I set the frequency above 20 kHz, which exceeds the human hearing range, I can still hear the tone. – Saorikido Oct 29 '15 at 05:00
  • I've also noticed that with a high frequency (say 22000 Hz) there is a lot of noise. I can capture that noise with an app that displays frequency lines (Y axis is frequency, X axis is time): the 22000 Hz line is clearly at the top, but the noise lines appear between 0 and 22000 Hz. I think I can hear the 20 kHz tone because of that noise. Any idea why there would be so much noise? Thanks. – Saorikido Oct 29 '15 at 07:40
  • @Eliot can you please help me? I have the same issue, but I was playing it directly from a buffer. – Dipen Chudasama Aug 30 '17 at 06:25
  • @DipenChudasama You're going to need to post some code, possibly in a new question with a link to this one. – Eliot Sep 06 '17 at 12:34
  • @Eliot What is the class of that object referred to as "log", where you check for an error, at the line that says "log.error(...)"? I've been looking for code used for logging. Can I get that from you? – daniel Feb 08 '19 at 19:08
  • @Eliot Would you be willing to show what the updated code would look like for the line that says "buffer.floatChannelData.memory[i] = val * 0.5"? The "memory" property is no longer available in current Swift. – daniel Feb 08 '19 at 20:01

Try setting your session category to AVAudioSessionCategoryPlayback or AVAudioSessionCategoryPlayAndRecord. I'm using record and playback, and calling it before recording seems to work fine. I'm guessing it has to go before you start connecting nodes.

var session = AVAudioSession.sharedInstance()
session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error)
Pescolly
  • Thanks. I've tried this with both categories and I still get the same exception. – Eliot Feb 09 '15 at 11:29
  • @Eliot Have you tried using a standard 32-bit float / 2-channel output? I've noticed that the output node is set to 32-bit float by default and it starts acting crazy if you want 16-bit integers. – Pescolly Feb 12 '15 at 06:56
  • That's interesting. Using PCMFormatFloat32 fixes the crash, but I still don't get any sound. I tried playing around with the number of channels and the interleaved flag and got this: 1 channel not interleaved: no crash, 1 channel interleaved: no crash, 2 channels not interleaved: no crash, 2 channels interleaved: crash. I've checked the sound works on the hardware. Have also tested this with and without your session category suggestion. – Eliot Feb 12 '15 at 08:45
  • 1
    Just to give you full credit, the setCategory() call was necessary for this to work but not sufficient. Thanks. – Eliot Jul 03 '15 at 12:20

Regarding the issue of not getting sound, even when using PCMFormatFloat32:

I've wrestled with the same issue for a few days now and finally found the (or at least one) problem: you need to manually set the frameLength of the PCM Buffer:

pcmBuffer.frameLength = AVAudioFrameCount(sound.count/2)

The division by two accounts for the two bytes per frame (16 bits encoded in two bytes).

Besides that, another change I made, though I don't yet know whether it matters, was to make the AVAudioEngine and the AVAudioPlayerNode members of the class, so that they aren't destroyed before playback ends.
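
For illustration, here's a minimal sketch of the buffer setup with both changes applied, using the same Swift 1.x-era API names as the question (the ToneGenerator class and makeBuffer() function are hypothetical; filling the buffer and driving the engine are unchanged from the question):

import AVFoundation

class ToneGenerator {

    // Kept as class members so they aren't destroyed before playback ends.
    let audioEngine = AVAudioEngine()
    let audioPlayer = AVAudioPlayerNode()

    func makeBuffer(sound: [Byte], sampleRateHz: Double) -> AVAudioPCMBuffer {
        let format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatInt16, sampleRate: sampleRateHz, channels: AVAudioChannelCount(1), interleaved: false)

        // frameCapacity is counted in frames, not bytes: one 16-bit mono frame is two bytes.
        let frameCount = AVAudioFrameCount(sound.count / 2)
        let pcmBuffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: frameCount)

        // Without this, frameLength stays 0 and the buffer is scheduled as silence.
        pcmBuffer.frameLength = frameCount

        // (Copying the bytes into pcmBuffer.int16ChannelData, then attaching,
        // connecting and starting the engine, is the same as in the question.)
        return pcmBuffer
    }
}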

Christian Fritz
  • Thanks. Have made both these changes and I still hear no sound! A completionHandler is called a second after I call scheduleBuffer() as expected, but I hear nothing. – Eliot Apr 02 '15 at 10:04
  • is your app in the foreground when you call this, or in the background? – Christian Fritz Apr 02 '15 at 15:15
  • Foreground, for sure. I'm calling it from the AppDelegate on applicationDidBecomeActive(). Even tried moving the method into the AppDelegate itself and making the audio engine and player properties on that class. – Eliot Apr 02 '15 at 19:14

I was encountering the same behaviour as you, which is why I was also helping myself with NSThread.sleepForTimeInterval(). I have now figured out a solution that works for me: the AVAudioEngine object needs to be initialised outside the play() function, at the class level, so that the engine can keep working and play the sound even after the function returns (which happens immediately). As soon as I moved the line that initialises the AVAudioEngine, the sound could be heard without the sleeping "helper". Hope it helps you.
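
A minimal sketch of that restructuring (a hypothetical TonePlayer class, using the same Swift 1.x-era API as the other answers); the only point here is where the engine is created, not the buffer maths:

import AVFoundation

class TonePlayer {

    // Created once at class level rather than inside play(); as a local variable
    // it would be deallocated when play() returns and playback would stop.
    let audioEngine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()

    init() {
        // Attaching must only happen once, so do it when the object is created.
        audioEngine.attachNode(playerNode)
    }

    func play(buffer: AVAudioPCMBuffer, format: AVAudioFormat) {
        audioEngine.connect(playerNode, to: audioEngine.mainMixerNode, format: format)
        audioEngine.startAndReturnError(nil)

        playerNode.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: nil)
        playerNode.play()

        // No NSThread.sleepForTimeInterval() "helper" is needed: the engine outlives this call.
    }
}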

Antonin Charvat

To get the right number of samples (numberOfSamples): mixer.outputFormatForBus(0).sampleRate gives back 44100.0. Multiplying by 1000 is not necessary in the second example.

To me, calling play() first and then scheduleBuffer on the player node afterwards seems illogical; I would do the reverse.
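
For clarity, the ordering suggested here would look like this (a sketch only, reusing the player node and buffer names from the earlier answers):

// Enqueue the buffer first, then start the player node.
audioPlayerNode.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: completionBlock)
audioPlayerNode.play()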

Joe