
I've been poking around with AVAudioEngine and I'm having trouble integrating AVAudioUnitEffect classes. For example, with AVAudioUnitDelay...

@implementation ViewController {
    AVAudioEngine *engine;
    AVAudioPlayerNode *player;
}

...

- (IBAction)playButtonHit:(id)sender {
    if (!player) {
        NSURL *bandsURL = [[NSBundle mainBundle] URLForResource:@"Bands With Managers" withExtension:@"mp3"];
        AVAudioFile *file = [[AVAudioFile alloc] initForReading:bandsURL error:nil];

        engine = [[AVAudioEngine alloc] init];
        player = [[AVAudioPlayerNode alloc] init];
        [engine attachNode:player];

        AVAudioUnitDelay *delay = [[AVAudioUnitDelay alloc] init];
        delay.wetDryMix = 50;

        [engine connect:player to:delay format:file.processingFormat];
        [engine connect:delay to:[engine outputNode] format:file.processingFormat];

        [player scheduleFile:file atTime:nil completionHandler:nil];
        [engine prepare];
        [engine startAndReturnError:nil];
    }
    [player play];
}

When the method is called, the app crashes with this error: "*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: [_nodes containsObject: node1] && [_nodes containsObject: node2]'"

I'm modeling this on the examples from the "AVAudioEngine in Practice" session at WWDC. I know there's probably something obvious I'm missing, but I can't figure it out.

JDMS

2 Answers


You forgot to attach your AVAudioUnitDelay object to your AVAudioEngine before connecting the nodes ;)

Here is the working code:

- (IBAction)playMusic:(id)sender {
    if (!player){
        NSURL *bandsURL = [[NSBundle mainBundle] URLForResource:@"Bands With Managers" withExtension:@"mp3"];
        AVAudioFile *file = [[AVAudioFile alloc] initForReading:bandsURL error:nil];

        engine = [[AVAudioEngine alloc] init];
        player = [[AVAudioPlayerNode alloc] init];
        [engine attachNode:player];

        AVAudioUnitDelay *delay = [[AVAudioUnitDelay alloc] init];
        delay.wetDryMix = 50;
        [engine attachNode:delay]; // the effect node must be attached to the engine before it can be connected

        [engine connect:player to:delay format:file.processingFormat];
        [engine connect:delay to:[engine outputNode] format:file.processingFormat];

        [player scheduleFile:file atTime:nil completionHandler:nil];
        [engine prepare];
        [engine startAndReturnError:nil];
    }
    [player play]; 
}
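
As a side note, both snippets pass nil for the NSError parameters, so any other failure (a missing file, an audio session problem) stays silent. Here is a minimal sketch of the same start-up path with basic error checking; bandsURL and engine are the variables from the code above, and the log messages are just illustrative:

    NSError *error = nil;

    // Opening the file can fail; AVAudioFile returns nil and fills in the error.
    AVAudioFile *file = [[AVAudioFile alloc] initForReading:bandsURL error:&error];
    if (!file) {
        NSLog(@"Could not open audio file: %@", error);
        return;
    }

    // startAndReturnError: returns NO and fills in the error if the engine fails to start.
    if (![engine startAndReturnError:&error]) {
        NSLog(@"Could not start AVAudioEngine: %@", error);
        return;
    }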
Mailoman

This is not a problem with the AVAudioUnitEffect! I tried it with this code:

NSError *err = nil;

self.engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[self.engine attachNode:player];

NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"sound" withExtension:@"m4a"];
AVAudioFile *file = [[AVAudioFile alloc] initForReading:fileURL error:&err];

AVAudioMixerNode *mainMixer = [self.engine mainMixerNode];
[self.engine connect:player to:mainMixer format:file.processingFormat];

[player scheduleFile:file atTime:nil completionHandler:nil];

[self.engine startAndReturnError:&err];

if (err != nil) {
    NSLog(@"An error occurred: %@", err);
}

[player play];

where self.engine is defined as

@property (nonatomic, strong) AVAudioEngine *engine;
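
For context, such a property would typically sit in the view controller's interface or class extension; the class name here is just illustrative:

    @interface ViewController ()
    // A strong reference keeps the engine alive for as long as the controller exists.
    @property (nonatomic, strong) AVAudioEngine *engine;
    @end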

I think this is a bug in AVAudioEngine, because it causes a memory leak: it starts playing the first samples and then crashes because of heavy memory usage (more than 300 MB in my case, for a 16 kB m4a file).


Update 12/07/2014: Apple fixed this issue with iOS 8 Seed 3 (Build 12A4318c)!

Michael Dorner