
How do I access audio hardware outputs other than 1-2 using AVFoundation? I'm writing Swift code for an OS X app which plays MP3 files through various output devices (USB interface, Dante, Soundflower). The playback code looks like the following:

```swift
myPlayer = AVPlayer(URL: myFilePathURL)
myPlayer.audioOutputDeviceUniqueID = myAudioOutputDevices[1].deviceUID()
myPlayer.play()
```

But, I'm not sure how to play the audio file to channels other than just 1-2. For instance I'd like to play an mp3 to outputs 3-4.

Can I do this through AVPlayer? Or do I need to look elsewhere? Maybe AVAudioEngine along with mixer nodes? I looked through the AVAudioEngine examples, and couldn't find hardware channels referenced anywhere. Thanks for any help!

Michael Sweet
  • It's my understanding that AVAudioEngine is only an iOS framework - so that won't work. The Apple documentation for iOS and OS-X frameworks is very intertwined - sadly without much explanation about what's available for each platform separately. – Michael Sweet Jul 27 '16 at 01:42
  • I'm trying another path which uses AVAudioEngine and AVAudioPlayerNode, along with the mainMixerNode. I haven't figured out whether I can assign the mainMixerNode to a different hardware channel other than the default. Maybe this is a better path than using AVPlayer? – Michael Sweet Jul 27 '16 at 12:14
  • Started working with channel maps as described [here](https://developer.apple.com/library/prerelease/content/technotes/tn2091/_index.html#//apple_ref/doc/uid/DTS10003118-CH1-CHANNELMAPPING). Not working yet, but hopefully soon. – Michael Sweet Jul 27 '16 at 13:54
  • [This post](http://stackoverflow.com/questions/21832733/how-to-use-avaudiosessioncategorymultiroute-on-iphone-device/35009801#35009801) is also helping me a lot for audio channel routing using channel maps in objective-c. Trying to convert it to Swift. – Michael Sweet Jul 27 '16 at 14:52
  • There are several more resources on routing channels, including a [WWDC session](https://developer.apple.com/videos/play/wwdc2012/505/) and [this discussion](https://forums.developer.apple.com/thread/15416) on the Apple developer forums. None of it has Swift code, but they explain the process well. – Michael Sweet Jul 28 '16 at 00:56
  • Still working on this. Here is a good link on [which Apple audio frameworks are supported by iOS and OS-X](https://developer.apple.com/library/ios/documentation/MusicAudio/Conceptual/CoreAudioOverview/WhatsinCoreAudio/WhatsinCoreAudio.html#//apple_ref/doc/uid/TP40003577-CH4-SW4). – Michael Sweet Jul 31 '16 at 16:10
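The channel-mapping approach referenced in the comments boils down to one array: it has one entry per hardware output channel, entry `i` holds the index of the source channel that should feed device channel `i`, and `-1` means silence. A minimal sketch of building one for the "play stereo to outputs 3-4" case from the question (the helper name is mine, not from any framework):

```swift
// Build a channel map that routes a stereo source to one pair of device
// output channels. `leftDeviceChannel` is 1-based, so "outputs 3-4"
// corresponds to leftDeviceChannel == 3.
func makeStereoChannelMap(deviceChannelCount: Int, leftDeviceChannel: Int) -> [Int32] {
    // Start with every device channel silent (-1 = no source feeds it).
    var map = [Int32](repeating: -1, count: deviceChannelCount)
    map[leftDeviceChannel - 1] = 0  // source left  -> device channel 3
    map[leftDeviceChannel]     = 1  // source right -> device channel 4
    return map
}
```

Both answers below hand an array like this to `AudioUnitSetProperty` with `kAudioOutputUnitProperty_ChannelMap`.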

2 Answers


I've iterated this code over time, but the basic outline will work. Here is my current setup code for sending audio to multi-channel outputs. I'm currently using it with Dante Virtual Soundcard, driving 16 stereo streams with the code below:

```swift
// engine, player, mixer, outputHWFormat, numberOfStreams, and leftChannel
// are instance properties of the enclosing class.
func setupAudioPath() {
    // Get the hardware output format.
    let output = engine.outputNode
    outputHWFormat = output.outputFormat(forBus: 0)

    // Connect the main mixer to the output.
    mixer = engine.mainMixerNode
    engine.connect(mixer, to: output, format: outputHWFormat)

    // Attach the player to the engine (it gets connected to the mixer
    // later, once the file's processing format is known).
    engine.attach(player)

    // Build the channel map: one entry per device output channel,
    // -1 meaning "no source feeds this channel".
    let numOfChannels = UInt32(numberOfStreams) * 2  // number of output device channels
    let mapSize = numOfChannels * UInt32(MemoryLayout<Int32>.size)
    var channelMap = [Int32](repeating: -1, count: Int(numOfChannels))

    // channelMap[deviceOutputChannel] = sourceChannel
    // leftChannel is 1-based, so this sends the stereo source to device
    // channels leftChannel and leftChannel + 1.
    channelMap[leftChannel - 1] = 0
    channelMap[leftChannel]     = 1

    let code: OSStatus = AudioUnitSetProperty(engine.outputNode.audioUnit!,
                                              kAudioOutputUnitProperty_ChannelMap,
                                              kAudioUnitScope_Global,
                                              1,
                                              channelMap,
                                              mapSize)
    print("osstatus = \(code)")  // 0 (noErr) on success
}
```
Michael Sweet

I have a Swift version that is working with 2 channels, setting the channel map property. I haven't tested it with a full multichannel system, but the principle should be the same.

```swift
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

func testCode() {
    // Get the hardware output format.
    let output = engine.outputNode
    let outputHWFormat = output.outputFormat(forBus: 0)

    // Connect the main mixer to the output.
    let mixer = engine.mainMixerNode
    engine.connect(mixer, to: output, format: outputHWFormat)

    // Attach the player to the engine.
    engine.attach(player)

    // Find the audio file.
    guard let audioFileURL = Bundle.main.url(forResource: "tones", withExtension: "wav") else {
        fatalError("audio file is not in bundle.")
    }

    var songFile: AVAudioFile?
    do {
        songFile = try AVAudioFile(forReading: audioFileURL)
        print(songFile!.processingFormat)

        // Connect the player to the mixer.
        engine.connect(player, to: mixer, format: songFile!.processingFormat)
    } catch {
        fatalError("cannot create AVAudioFile \(error)")
    }

    let channelMap: [Int32] = [0, 1]  // left out left, right out right
    //let channelMap: [Int32] = [1, 0]  // right out left, left out right

    let propSize = UInt32(channelMap.count) * UInt32(MemoryLayout<Int32>.size)

    // Set the map on the output node's underlying audio unit.
    let code: OSStatus = AudioUnitSetProperty(engine.outputNode.audioUnit!,
                                              kAudioOutputUnitProperty_ChannelMap,
                                              kAudioUnitScope_Global,
                                              1,
                                              channelMap,
                                              propSize)
    print(code)

    do {
        try engine.start()
    } catch {
        fatalError("Could not start engine. error: \(error).")
    }

    // Schedule the file and restart it when it finishes.
    player.scheduleFile(songFile!, at: nil) {
        print("done")
        self.player.play()
    }
    player.play()
}
```
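To make the channel-map semantics concrete: device output channel `i` plays source channel `channelMap[i]`, and `-1` means silence. A pure-Swift simulation over interleaved frames (illustration only, no audio API involved; the function name is mine):

```swift
// Apply a channel map to per-frame sample arrays the way the output unit
// does: output channel i <- source channel map[i], or 0.0 when map[i] == -1.
func applyChannelMap(_ map: [Int32], toFrames frames: [[Float]]) -> [[Float]] {
    return frames.map { frame in
        map.map { src in src >= 0 ? frame[Int(src)] : 0.0 }
    }
}
```

With the map `[-1, -1, 0, 1]` from the question's "outputs 3-4" scenario, a stereo frame lands on device channels 3 and 4 while channels 1 and 2 stay silent; with `[1, 0]`, left and right are swapped.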
Michael Sweet
  • Do you know how to do this with the inputs? I have an 8ch interface but only want my mixer node to listen to a specific channel on the interface. – JoeBayLD Dec 12 '16 at 18:58
  • I tried this example and it still plays the file out of both left and right channels... – tsugua Aug 25 '17 at 19:38
  • I haven't done this with the input side. But I've used it with the output side to send feeds to 16 stereo outputs (hardware or via sound flower/virtual dante). – Michael Sweet Aug 26 '17 at 20:29
  • @MichaelSweet I'm trying to do something similar: take 15 mono files and route them to 15 discrete mono outputs, but I'm not having much luck. I've used your example as a basis, but I'm not sure how the routing works. Did you get something like this working? https://stackoverflow.com/questions/46041563/playing-multiple-wav-out-multiple-channels-avaudioengine – tsugua Sep 05 '17 at 05:04