
I have an application that should also work as an AudioUnit plugin. In Xcode, I go to File > New > Target, select AudioUnit, and make sure that the selected language is "Swift". In the generated code, though, the actual plugin code lives in .h/.m Objective-C files:

#import "ChordezAuAudioUnit.h"

#import <AVFoundation/AVFoundation.h>

// Define parameter addresses.
const AudioUnitParameterID myParam1 = 0;

@interface ChordezAuAudioUnit ()

@property (nonatomic, readwrite) AUParameterTree *parameterTree;
@property AUAudioUnitBusArray *inputBusArray;
@property AUAudioUnitBusArray *outputBusArray;
@end


@implementation ChordezAuAudioUnit
@synthesize parameterTree = _parameterTree;

- (instancetype)initWithComponentDescription:(AudioComponentDescription)componentDescription

[...]

How do I develop the plugin in Swift? In this GitHub project the author seems to be doing exactly that, but I don't know how to replace the generated code with Swift: https://github.com/inquisitiveSoft/MIDIAudioUnitExample/blob/main/TransposeOctaveAudioUnit/MIDIAudioUnit/MIDIAudioUnit.swift

pistacchio

2 Answers


In a 2017 WWDC session on Core Audio, Apple specifically recommended against using Swift inside the real-time audio context, due to the small but real possibility that the Swift runtime might allocate memory or take locks there. AFAIK, that recommendation has not been rescinded by Apple (yet?).

So, if you want your Audio Unit plug-in to be reliable, the answer is to NOT develop that portion of your plug-in in Swift. Stick to the C subset of Objective-C (no object messaging, no instance variables) for any critical real-time code.
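One common way to follow that advice while still writing most of the plug-in in Swift (a hedged sketch; ChordezRender and its bridging-header declaration are my own illustrative names, not part of the Xcode template): the Swift subclass overrides internalRenderBlock, and the block does nothing but forward to a plain C function, so no Swift runtime machinery runs on the audio thread.

import AVFoundation

// Assumed to be declared in the bridging header and implemented in a
// plain C file, e.g.:
//   void ChordezRender(uint32_t frameCount, AudioBufferList *outBufferList);

public class ChordezAuAudioUnit: AUAudioUnit {
    private var outputBusArray: AUAudioUnitBusArray!

    public override init(componentDescription: AudioComponentDescription,
                         options: AudioComponentInstantiationOptions = []) throws {
        try super.init(componentDescription: componentDescription, options: options)
        let format = AVAudioFormat(standardFormatWithSampleRate: 44_100.0, channels: 2)!
        outputBusArray = AUAudioUnitBusArray(audioUnit: self,
                                             busType: .output,
                                             busses: [try AUAudioUnitBus(format: format)])
    }

    public override var outputBusses: AUAudioUnitBusArray { outputBusArray }

    public override var internalRenderBlock: AUInternalRenderBlock {
        // Deliberately capture nothing that requires the Swift runtime;
        // the block only forwards to the C render function.
        return { _, _, frameCount, _, outputData, _, _ in
            ChordezRender(frameCount, outputData)
            return noErr
        }
    }
}

The C file containing ChordezRender is compiled alongside the Swift target; only its prototype crosses the bridging header.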

hotpaw2

No Swift "inside the real-time audio context".

Absolutely. You will commonly see C and Objective-C++ in the render block.

You can use Swift in the rest of the audio unit and its UI. You can even use SwiftUI.
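As a rough illustration (names are made up, not from the linked project): the parameter tree can be built in Swift, since that runs at setup time rather than on the audio thread, and the plug-in's view can be ordinary SwiftUI.

import AVFoundation
import SwiftUI

// Building the parameter tree in Swift; this runs at initialization
// time, not on the audio thread, so Swift is fine here.
func makeParameterTree() -> AUParameterTree {
    let gain = AUParameterTree.createParameter(
        withIdentifier: "gain",
        name: "Gain",
        address: 0,
        min: 0.0, max: 1.0,
        unit: .linearGain,
        unitName: nil,
        flags: [.flag_IsReadable, .flag_IsWritable],
        valueStrings: nil,
        dependentParameters: nil)
    return AUParameterTree.createTree(withChildren: [gain])
}

// A SwiftUI view for the plug-in's UI, hosted from the AU's
// view controller (e.g. via UIHostingController / NSHostingView).
struct ChordezView: View {
    @State private var gain: Double = 0.5

    var body: some View {
        VStack {
            Text("Gain: \(gain, specifier: "%.2f")")
            Slider(value: $gain, in: 0...1)
        }
        .padding()
    }
}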

Gene De Lisa