
I've got a MOTU UltraLite mk4 USB audio interface attached to a Mac running macOS 10.13.3.

I'm trying to play three stereo sound files, each to its own stereo speaker.

To start with less complexity, I'm just trying to get the stereo sound of one AVAudioPlayerNode to come out of something other than the default output channels 0 and 1.

USB audio interface front

USB audio interface back

I'm hoping to accomplish this with AVAudioEngine + a bit of low level Core Audio.

So far, I've successfully managed to play audio to my USB audio interface, by configuring AVAudioEngine's output node:

AudioUnit audioUnit = [[self.avAudioEngine outputNode] audioUnit];
OSStatus error = AudioUnitSetProperty(audioUnit,
    kAudioOutputUnitProperty_CurrentDevice,
    kAudioUnitScope_Global,
    0,
    &deviceID,
    sizeof(deviceID));
if (error) {
    NSLog(@"Failed to set desired output audio device: %d", (int)error);
}

I've also successfully managed to set up a channel map which lets one player play a stereo file to channels 4+5 of my USB audio interface, as described by theanalogkid in Apple's dev forums.

The trouble with the channel map, though, is that it applies globally to the output of AVAudioEngine and affects all AVAudioPlayerNodes. So this is not the solution.
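For reference, that global channel map looks roughly like this (sketched in Swift here for brevity, and assuming `engine` is the AVAudioEngine instance from above; the map has one entry per hardware output channel, and -1 means "no source", i.e. silence):

```swift
import AVFoundation

// Sketch: route the engine's stereo output to hardware channels 4+5.
// One map entry per hardware output channel; -1 means silence.
let outChannelCount = Int(engine.outputNode.outputFormat(forBus: 0).channelCount)
var channelMap = [Int32](repeating: -1, count: outChannelCount)
channelMap[4] = 0  // left  -> hardware output 4
channelMap[5] = 1  // right -> hardware output 5

if let audioUnit = engine.outputNode.audioUnit {
    AudioUnitSetProperty(audioUnit,
                         kAudioOutputUnitProperty_ChannelMap,
                         kAudioUnitScope_Global,
                         0,
                         &channelMap,
                         UInt32(MemoryLayout<Int32>.size * channelMap.count))
}
```

This works, but because it is set on the engine's single output AudioUnit, it moves everything the engine produces.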

So instead of using that channel map on AVAudioEngine's outputNode's AudioUnit, I tried to create an AVAudioFormat with a custom channel layout that specifies the discrete channel indices 4 and 5:

// Create audio channel layout struct with two channels
int numChannels = 2;
AudioChannelLayout *audioChannelLayout = calloc(1, sizeof(AudioChannelLayout) + (numChannels - 1) * sizeof(AudioChannelDescription));
audioChannelLayout->mNumberChannelDescriptions = numChannels;
audioChannelLayout->mChannelLayoutTag = kAudioChannelLayoutTag_UseChannelDescriptions;
audioChannelLayout->mChannelBitmap = 0;

// Configure channel descriptions
audioChannelLayout->mChannelDescriptions[0].mChannelFlags = kAudioChannelFlags_AllOff;
audioChannelLayout->mChannelDescriptions[0].mChannelLabel = kAudioChannelLabel_Discrete_4;
audioChannelLayout->mChannelDescriptions[0].mCoordinates[0] = 0;
audioChannelLayout->mChannelDescriptions[0].mCoordinates[1] = 0;
audioChannelLayout->mChannelDescriptions[0].mCoordinates[2] = 0;

audioChannelLayout->mChannelDescriptions[1].mChannelFlags = kAudioChannelFlags_AllOff;
audioChannelLayout->mChannelDescriptions[1].mChannelLabel = kAudioChannelLabel_Discrete_5;
audioChannelLayout->mChannelDescriptions[1].mCoordinates[0] = 0;
audioChannelLayout->mChannelDescriptions[1].mCoordinates[1] = 0;
audioChannelLayout->mChannelDescriptions[1].mCoordinates[2] = 0;

// Create AVAudioChannelLayout (the struct is copied, so we can free it afterwards)
AVAudioChannelLayout *avAudioChannelLayout = [[AVAudioChannelLayout alloc] initWithLayout:audioChannelLayout];
free(audioChannelLayout);

// Create AVAudioFormat
AVAudioOutputNode *outputNode = engine.avAudioEngine.outputNode;
AVAudioFormat *outputHWFormat = [outputNode outputFormatForBus:0];
AVAudioFormat *targetFormat = [[AVAudioFormat alloc] initWithStreamDescription:player.avAudioFile.processingFormat.streamDescription
                                                                 channelLayout:avAudioChannelLayout];

Then I connected the player's mixer to the engine's main mixer, using the output hardware format:

[engine connect:speakerMixer to:[engine mainMixerNode] format:outputHWFormat];

And the player to its mixer, using the custom AVAudioFormat with the tweaked channel layout:

[engine connect:player to:speakerMixer format:targetFormat];

Result? Sound plays, but still only out of the default 0 and 1 channels of the USB audio interface.

If I apply the targetFormat to the mixer-to-mainMixer connection, the USB audio interface receives no sound at all.

I've also attempted to apply a channel map to the AVAudioMixerNode's underlying AudioUnit; however, I can't seem to obtain the bare-metal AudioUnit from the mixer node in order to apply it. Perhaps this is not possible with AVAudioEngine. If so, are you aware of any other way to do this?

Probably I've overlooked something critical. I've spent many days researching this, and feel stuck. Appreciating any help I can get.

codingChicken
  • I think you need to set the bus 0 input scope `kAudioUnitProperty_StreamFormat` on the output node `AudioUnit` to describe your 6 (right?) channels. Not sure if `AVAudioEngine` will be happy with that. If not you could switch to Core Audio. This seems to be at odds with the linked post, although that refers to iOS's multiroute AudioSession category, which probably does this step for you. – Rhythmic Fistman Apr 09 '18 at 09:54
  • @RhythmicFistman The output channels are indexed from 0 to 19, and the first 8 are routed to the analog outputs of the USB audio interface in discrete order. When I apply a channel map to the outputNode's underlying AudioUnit, it works. How would you be able to let individual player nodes or mixer nodes route their audio to specific channels if you apply the format globally? – codingChicken Apr 09 '18 at 10:56
  • This I don’t know. Could the bus argument be used for that when connecting nodes? – Rhythmic Fistman Apr 09 '18 at 13:30

2 Answers


Many years ago, the AVAudioEngine documentation warned that an app should create only one instance.

Apparently, this limitation has disappeared: I've successfully created multiple instances, each with its own channel map applied to its outputNode's AudioUnit.

The documentation still claims that the engine's outputNode is a "singleton", though my tests on the Mac revealed that every AVAudioEngine instance has its own output node instance and that it is possible to set a channel map on one instance without affecting the output of another.

This isn't exactly what I had been looking for, though it does seem to work. I'd still prefer a solution where it is possible to route channels from player or mixer nodes to specific output channels of the audio hardware and do it all with one single AVAudioEngine instance. On the other hand, right now I struggle to come up with a good reason why it would be terrible to have multiple AVAudioEngine instances running.
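A rough sketch of the multi-engine approach (in Swift; the device ID lookup is omitted, `motuID` is a placeholder, and the channel map follows the same one-entry-per-hardware-output convention as the global map in the question):

```swift
import AVFoundation

// Sketch: one AVAudioEngine per stereo pair, each with its own
// output device and channel map.
func makeEngine(deviceID: AudioDeviceID, leftOut: Int, rightOut: Int) -> AVAudioEngine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: nil)

    if let au = engine.outputNode.audioUnit {
        // Point this engine at the USB interface.
        var device = deviceID
        AudioUnitSetProperty(au, kAudioOutputUnitProperty_CurrentDevice,
                             kAudioUnitScope_Global, 0,
                             &device, UInt32(MemoryLayout<AudioDeviceID>.size))

        // Map this engine's stereo stream onto one hardware output pair.
        let outChannelCount = Int(engine.outputNode.outputFormat(forBus: 0).channelCount)
        var map = [Int32](repeating: -1, count: outChannelCount)
        map[leftOut] = 0
        map[rightOut] = 1
        AudioUnitSetProperty(au, kAudioOutputUnitProperty_ChannelMap,
                             kAudioUnitScope_Global, 0,
                             &map, UInt32(MemoryLayout<Int32>.size * map.count))
    }
    return engine
}

// e.g. three engines for three stereo pairs on the same device:
// let engines = [(0, 1), (2, 3), (4, 5)].map {
//     makeEngine(deviceID: motuID, leftOut: $0.0, rightOut: $0.1)
// }
```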

codingChicken

@codingChicken I don't know if this will be helpful, but I stumbled across your answer and it pointed me in the right direction, so I'll post my solution here. I was able to do what you're aiming for with only one AVAudioEngine instance.

Goal

To recap: I'm trying to read several audio files and output them to different channels. In my setup, I have 2 devices with two channels each, so 4 channels in total. I use an aggregate device and run the iOS app with Catalyst (I couldn't get the aggregate device to work with AVAudioEngine on macOS).

I use AVAudioSourceNode as input but it should also work with AVAudioPlayerNode since both have an auAudioUnit property.

Setup formats

When setting up the engine, I make sure that the channel count of the output node is 4:

let outputFormat = audioEngine.outputNode.outputFormat(forBus: 0)
print("Channels count:", outputFormat.channelCount) // 4

If it isn't, that might be because the AVAudioSession is not configured with the multiRoute category.
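For completeness, the session setup looks something like this (a sketch, Catalyst/iOS only; do this before starting the engine):

```swift
import AVFoundation

// Sketch: request the multiRoute category so all hardware
// output channels are exposed to the engine.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.multiRoute, mode: .default, options: [])
try session.setActive(true)
```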

Mixer -> Output

Then, I had to connect the engine's main mixer node to the output node using the output node's format.

audioEngine.connect(audioEngine.mainMixerNode, to: audioEngine.outputNode, format: outputFormat)

I don't know why this is necessary. The documentation suggests it's already done by default:

If the client never sets the connection format between the mainMixerNode and the outputNode, the engine always updates the format to track the format of the outputNode on startup or restart, even after an AVAudioEngineConfigurationChange.

Source -> Mixer

I also had to connect each source node to the main mixer node using the output format; otherwise the channel mapping is not effective.

engine.connect(sourceNode, to: engine.mainMixerNode, format: outputFormat)

Source channels mapping

In the end, I could set the auAudioUnit.channelMap of each source node to output to the desired channels. For instance:

sourceNode.auAudioUnit.channelMap = [0, 1, -1, -1]

to output to the first pair of speakers (the default), and

sourceNode.auAudioUnit.channelMap = [-1, -1, 0, 1]

to output to the second pair of speakers.
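Putting the pieces together, the whole setup is roughly this (a sketch; `sourceA` and `sourceB` are placeholders for your own AVAudioSourceNodes):

```swift
import AVFoundation

// Sketch: two stereo sources on one engine, routed to different
// output pairs of a 4-channel (aggregate) device.
let engine = AVAudioEngine()
let outputFormat = engine.outputNode.outputFormat(forBus: 0)  // expect 4 channels

engine.connect(engine.mainMixerNode, to: engine.outputNode, format: outputFormat)

for (index, sourceNode) in [sourceA, sourceB].enumerated() {
    engine.attach(sourceNode)
    engine.connect(sourceNode, to: engine.mainMixerNode, format: outputFormat)
    // Source 0 -> hardware outputs 0+1, source 1 -> outputs 2+3.
    sourceNode.auAudioUnit.channelMap = (index == 0) ? [0, 1, -1, -1] : [-1, -1, 0, 1]
}

try engine.start()
```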

Woody