I have a fairly complex app that has been working with AKAppleSequencer up until now, but because of some strange behavior and bugs that pop up now and then with that sequencer, I've been hoping to move to the newer AKSequencer. Unfortunately, the new sequencer doesn't seem to be covered in the Playgrounds or in much of the documentation, so I have been doing some guesswork. I have everything wired up in a way that seems to make sense (to me), and as I mentioned it was all working fine with AKAppleSequencer, but with AKSequencer it runs without producing any output.
My code is broken out into multiple pieces, so the node graph gets built up in disparate locations. I'll have to show it here in chunks, with irrelevant lines removed.
// This happens during setup
mainMixer = AKMixer()
mainMixer.volume = volume
AudioKit.output = mainMixer

// In later code, the sequencer is constructed
sequencer = AKSequencer()
sequencer!.tempo = tempo

// After the sequencer is created, I create various nodes and tracks, like this
let trackNode = trackDefinition.createNode()
let track = sequencer!.addTrack(for: trackNode)
track >>> mainMixer
There's a line above where I'm calling createNode() on a thing called trackDefinition. I don't think the details of that class are relevant here, but here's an example of the body of that method. It's pretty straightforward.
func createNode() -> AKNode {
    let pad = AKMIDISampler()
    do {
        try pad.loadSoundFont(partConfiguration.settings["soundFontName"]!,
                              preset: Int(partConfiguration.settings["preset"]!)!,
                              bank: Int(partConfiguration.settings["bank"]!)!)
    } catch {
        print("Error while loading Sound Font in PadTrackDefinition: \(error)")
    }
    return pad
}
That code seems to be working fine. I just wanted to illustrate that I'm creating an AKMIDISampler node, loading a soundfont, and then using that node to create a track in the AKSequencer. Then I attach the track to the main mixer for output.
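To make the overall flow easier to follow, here's the whole chain collapsed into one place. This is a simplified sketch rather than my literal code (the pieces really live in the separate chunks shown above, and error handling is stripped out), but it reflects the order things happen in:

// Simplified, consolidated sketch of the setup (not my literal code)
mainMixer = AKMixer()
mainMixer.volume = volume
AudioKit.output = mainMixer
try AudioKit.start()  // the engine is definitely running (see the UPDATE below)

sequencer = AKSequencer()
sequencer!.tempo = tempo

let trackNode = trackDefinition.createNode()     // AKMIDISampler with the soundfont loaded
let track = sequencer!.addTrack(for: trackNode)  // the track targets the sampler node
track >>> mainMixer                              // the track feeds the main mixer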
I used AudioKit.printConnections() to get some confirmation, and here's what that looks like.
(1]AUMultiChannelMixer <2 ch, 44100 Hz, Float32, non-inter> -> (0]AudioDeviceOutput) bus: 0
(2]Local AKSequencerTrack <2 ch, 44100 Hz, Float32, non-inter> -> (1]AUMultiChannelMixer) bus: 0
Pretty simple: Track >>> Mixer >>> Output. It doesn't make any sound when playing.
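For completeness, when I say "playing" I mean starting the new sequencer's transport, roughly like this (simplified; the code that adds note events to the tracks isn't shown here):

// Simplified: note events have already been added to the tracks by this point
sequencer!.play()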
I also tried it this way:
(0]AUSampler <2 ch, 44100 Hz, Float32, non-inter> -> (2]AUMultiChannelMixer) bus: 0
(2]AUMultiChannelMixer <2 ch, 44100 Hz, Float32, non-inter> -> (1]AudioDeviceOutput) bus: 0
So that's AKMIDISampler >>> Mixer >>> Output (and the sampler was used to create a track). That also doesn't make any sound.
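In code, that attempt looked roughly like this (simplified):

// Variant 2 (simplified): the sampler itself is connected to the mixer;
// the track is still created from it, but the track isn't connected
let track = sequencer!.addTrack(for: trackNode)
trackNode >>> mainMixer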
I also saw this answer to a similar question on StackOverflow, so I tried that approach. That gave me this connection graph:
(0]AUMultiChannelMixer <2 ch, 44100 Hz, Float32, non-inter> -> (1]AudioDeviceOutput) bus: 0
(2]Local AKSequencerTrack <2 ch, 44100 Hz, Float32, non-inter> -> (0]AUMultiChannelMixer) bus: 0
(3]AUSampler <2 ch, 44100 Hz, Float32, non-inter> -> (0]AUMultiChannelMixer) bus: 1
That would be [AKMIDISampler, Track] >>> Mixer >>> Output. Still...no sound.
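Roughly, the wiring for that attempt was (simplified):

// Variant 3 (simplified): both the track and its sampler node go into the mixer
track >>> mainMixer      // bus 0
trackNode >>> mainMixer  // bus 1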
What am I doing wrong here? Is there some more specific way that the new sequencer tracks have to be connected into the signal graph that I'm not understanding?
UPDATE: Weird/fun/interesting addendum: if I add this code immediately after the node-construction code, it produces the expected note, so I know that at least the audio engine itself is hooked up:
let midiNode = trackNode as! AKMIDISampler
try! midiNode.play(noteNumber: 60,
                   velocity: MIDIVelocity(127),
                   channel: MIDIChannel(8))