2

I've jumped off the deep end, and have decided to figure out low-latency audio on iOS using Audio Units. I've read as much documentation (from Apple and forums galore) as I can find, and the overall concepts make sense, but I'm still scratching my head on some concepts that I need help with:

  1. I saw somewhere that AU Graphs are deprecated and that I should instead connect Audio Units directly. I'm cool with that... but how? Do I just need to use the Connection property of an Audio Unit to connect it to a source AU, and off I go? Initialize and Start the Units, and watch the magic happen? (cause it doesn't for me...)

  2. What's the best Audio Unit setup to use if I simply want to grab audio from my mic, do some processing to the audio data, and then store that audio data without sending it out to the RemoteIO speaker, bus 0 output? I tried hooking up a GenericOutput AudioUnit to catch the data in a callback without any luck...

That's it. I can provide code when requested, but it's way too late, and this has wiped me out. If there's no easy answer, that's cool; I'll send any code snippets at will. Suffice it to say, I can easily get a simple RemoteIO, mic in, speaker out setup working great. Latency seems non-existent (at least to my ears). I just want to do something with the mic data and store it in memory without it going out to the speaker. Eventually hooking in the EQ and mixer would be hip, but one step at a time.

FWIW, I'm coding in Xamarin Forms/C# land, but code examples in Objective C, Swift or whatever is fine. I'm stuck on the concepts, not necessarily the exact code.

THANKS!

E Ludema
  • Low latency really only means that your buffer size is small, so that your CPU is not working too hot and does not become glitchy because `it's eating the soup with a very small spoon`, sweating like hell, or, even worse in latency, `eating with a spoon that is too large`, so it would eat slowly by definition but not sweat while doing so (typical on slow CPUs and bad processing). – Ol Sen Aug 02 '20 at 12:07
  • i suggest you look into [TheAmazingAudioEngine2](https://github.com/TheAmazingAudioEngine/TheAmazingAudioEngine2) written by the also amazing [Michael Tyson](https://stackoverflow.com/users/196159/michael-tyson) – Ol Sen Aug 02 '20 at 12:14
  • Apart from that, audio programming is kind of pointer-to-buffer ping pong. :-P So try to keep your buffer block size (small spoon) as small as needed and as large as wanted (big spoon). And try to avoid Objective-C... Michael has written nice stuff in his blog that you should read before you start to spend days on the basics. – Ol Sen Aug 02 '20 at 12:18
  • Thanks, Ol Sen for the metaphor... it's a great one! I'll use that with my music tech students when they're first running into buffers and latency in the DAWs they're getting into for the first time :) It looks like TheAmazingAudioEngine2 is unfortunately out of commission, or else I'd go for it. Then again, this is good for me to figure this stuff out! I think after reading another 200 forum posts today, and going over the WWDC notes from 2017, I'm going to try the AVAudioEngine route, which looks closely tied to the AudioUnit/AUGraph world... – E Ludema Aug 03 '20 at 07:17
  • `AUGraph` is deprecated but its replacement is `AVAudioEngine`. `AVAudioEngine` makes it much easier to deal with audio units. I would avoid trying to connect them directly; it is not always an easy task depending on the graph. https://developer.apple.com/documentation/avfoundation/avaudioengine – sbooth Aug 03 '20 at 15:36

2 Answers

3

Working with audio units without a graph is pretty simple and very flexible. To connect two units, you call AudioUnitSetProperty this way:

AudioUnitConnection connection;
connection.sourceAudioUnit    = sourceUnit;             // the unit supplying audio
connection.sourceOutputNumber = sourceOutputIndex;      // output bus on the source
connection.destInputNumber    = destinationInputIndex;  // input bus on the destination

// Install the connection on the destination unit's input scope.
// Check the returned OSStatus in real code.
AudioUnitSetProperty(
    destinationUnit,
    kAudioUnitProperty_MakeConnection,
    kAudioUnitScope_Input,
    destinationInputIndex,
    &connection,
    sizeof(connection)
);

Note that units connected this way must have their stream formats set uniformly on the connected scopes, and that this must be done before the units are initialized.
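A minimal Swift sketch of that requirement, assuming 32-bit float, mono, non-interleaved PCM at 44.1 kHz; `sourceUnit`, `destinationUnit`, `sourceOutputIndex` and `destinationInputIndex` are the same placeholders as in the snippet above:

import AudioToolbox

// One ASBD shared by both sides of the connection.
var format = AudioStreamBasicDescription(
    mSampleRate: 44_100,
    mFormatID: kAudioFormatLinearPCM,
    mFormatFlags: kAudioFormatFlagsNativeFloatPacked | kAudioFormatFlagIsNonInterleaved,
    mBytesPerPacket: 4,
    mFramesPerPacket: 1,
    mBytesPerFrame: 4,
    mChannelsPerFrame: 1,
    mBitsPerChannel: 32,
    mReserved: 0
)
let size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)

// Same format on the source's output scope and the destination's input scope.
AudioUnitSetProperty(sourceUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output, sourceOutputIndex, &format, size)
AudioUnitSetProperty(destinationUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Input, destinationInputIndex, &format, size)

// Only initialize after the formats (and the connection) are in place.
AudioUnitInitialize(sourceUnit)
AudioUnitInitialize(destinationUnit)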

dspr
1

Your question mentions Audio Units and graphs. As said in the comments, the graph concept has been replaced with the idea of attaching "nodes" to an AVAudioEngine. These nodes then "connect" to other nodes. Connecting nodes creates signal paths, and starting the engine makes it all happen. This may be obvious, but I am trying to respond generally here. You can do this all in Swift or in Objective-C.
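For instance, a bare-bones sketch of that flow (the player node here is just a stand-in for whatever node you end up attaching):

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

engine.attach(player)                                          // make the node part of the engine
engine.connect(player, to: engine.mainMixerNode, format: nil)  // create a signal path
do {
    try engine.start()                                         // make it all happen
} catch {
    print("Engine failed to start: \(error)")
}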

Two high-level perspectives to consider with iOS audio are the idea of a "host" and that of a "plugin". The host is an app that hosts plugins. A plugin is usually created as an "app extension", and you can look up audio unit extensions for more about that as needed. You said you have one doing what you want, so this is all explaining the code used in a host.

Attach an AudioUnit to an AVAudioEngine

import AVFoundation

var components = [AVAudioUnitComponent]()

// An all-zero description acts as a wildcard and matches every installed component.
let searchDescription =
    AudioComponentDescription(
        componentType: 0,
        componentSubType: 0,
        componentManufacturer: 0,
        componentFlags: 0,
        componentFlagsMask: 0
    )

// AudioUnitTypes is a helper from my own project, not a system API;
// filter the component list however suits your app.
components = AVAudioUnitComponentManager.shared().components(matching: searchDescription)
    .compactMap({ au -> AVAudioUnitComponent? in
        if AudioUnitTypes.codeInTypes(
            au.audioComponentDescription.componentType,
            AudioUnitTypes.instrumentAudioUnitTypes,
            AudioUnitTypes.fxAudioUnitTypes,
            AudioUnitTypes.midiAudioUnitTypes
            ) && !AudioUnitTypes.isApplePlugin(au.manufacturerName) {
            return au
        }
        return nil
    })

guard let component = components.first else { fatalError("bugs") }

let description = component.audioComponentDescription

AVAudioUnit.instantiate(with: description) { (audioUnit: AVAudioUnit?, error: Error?) in

    if let e = error {
        return print("\(e)")
    }
    // save and connect
    guard let audioUnit = audioUnit else {
        print("Audio Unit was Nil")
        return
    }
    // `engine` is an AVAudioEngine owned by the surrounding class.
    let hardwareFormat = self.engine.outputNode.outputFormat(forBus: 0)

    self.engine.attach(audioUnit)
    self.engine.connect(audioUnit, to: self.engine.mainMixerNode, format: hardwareFormat)
}

Once you have your AudioUnit loaded, you can connect it and tap it with the AVAudioNodeTapBlock described below. The plugin side has more to it, since it needs to be built as a binary that host apps other than yours can load.

Recording an AVAudioInputNode

(You can replace the audio unit with the input node.)

In an app, you can record audio by using an AVAudioInputNode: just reference the inputNode property of the AVAudioEngine, which by default is connected to the system's currently selected input device (mic, line in, etc.).

Once you have the input node whose audio you want to process, "install a tap" on it. You can also connect your input node to a mixer node and install the tap there.

https://developer.apple.com/documentation/avfoundation/avaudionode/1387122-installtap

func installTap(onBus bus: AVAudioNodeBus, 
                bufferSize: AVAudioFrameCount, 
                format: AVAudioFormat?, 
                block tapBlock: @escaping AVAudioNodeTapBlock)

The installed tap basically splits your audio stream into two signal paths. It keeps sending the audio toward the AVAudioEngine's output device and also sends it to a function that you define. This function (an AVAudioNodeTapBlock) is what you pass to AVAudioNode's installTap. The AVFoundation subsystem calls the AVAudioNodeTapBlock and passes you the input data one buffer at a time, along with the time at which the data arrived.

https://developer.apple.com/documentation/avfoundation/avaudionodetapblock

typealias AVAudioNodeTapBlock = (AVAudioPCMBuffer, AVAudioTime) -> Void

Now the system is sending the audio data to a programmable context, and you can do what you want with it. To use it elsewhere, you can create a separate AVAudioPCMBuffer and write each of the passed-in buffers to it inside the AVAudioNodeTapBlock.
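To tie this back to the question, here is a rough sketch of that last step, assuming all you want is the mic data in memory. The `recorded` buffer and its 10-second capacity are arbitrary, and the AVAudioSession is assumed to already be configured for recording:

import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// Pre-allocate storage for roughly 10 seconds of audio at the input's sample rate.
let capacity = AVAudioFrameCount(format.sampleRate * 10)
guard let recorded = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: capacity) else {
    fatalError("Could not allocate the recording buffer")
}

// The tap block hands you the mic data one buffer at a time; process it,
// then copy it into `recorded`.
input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    let framesToCopy = min(buffer.frameLength, recorded.frameCapacity - recorded.frameLength)
    for channel in 0..<Int(format.channelCount) {
        if let src = buffer.floatChannelData?[channel],
           let dst = recorded.floatChannelData?[channel] {
            for frame in 0..<Int(framesToCopy) {
                dst[Int(recorded.frameLength) + frame] = src[frame]
            }
        }
    }
    recorded.frameLength += framesToCopy
}

do {
    try engine.start()
} catch {
    print("Engine failed to start: \(error)")
}

Because nothing here is connected to the engine's output node, the mic signal never reaches the speaker; it only ends up in `recorded` (or in whatever processing you do inside the block).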

Matty H
  • This is great. Thanks for the awesome detail. I had to drop the project once the school year started since we were just in survival mode, but this seems to also dodge the issue I ended up running into with Xamarin and the SinkNode... I'll have to give your solution a try when I can get back to the project. Sorry for the slow [thumbs up] rate! – E Ludema May 12 '21 at 03:39