I am using a RemoteIO Audio Unit for audio playback in my app with kAudioUnitProperty_ScheduledFileIDs. The audio files are in PCM format. How can I implement a render callback function for this case, so that I can manually modify the buffer samples?
Here is my code:

static AudioComponentInstance audioUnit;

AudioComponentDescription desc;

desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_RemoteIO;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;

AudioComponent comp = AudioComponentFindNext(NULL, &desc);

CheckError(AudioComponentInstanceNew(comp, &audioUnit), "error AudioComponentInstanceNew");


NSURL *playerFile = [[NSBundle mainBundle] URLForResource:@"short" withExtension:@"wav"];

AudioFileID audioFileID;

CheckError(AudioFileOpenURL((__bridge CFURLRef)playerFile, kAudioFileReadPermission, 0, &audioFileID), "error AudioFileOpenURL");

// Determine file properties
UInt64 packetCount;
UInt32 size = sizeof(packetCount);
CheckError(AudioFileGetProperty(audioFileID, kAudioFilePropertyAudioDataPacketCount, &size, &packetCount),
                "AudioFileGetProperty(kAudioFilePropertyAudioDataPacketCount)");

AudioStreamBasicDescription dataFormat;
size = sizeof(dataFormat);
CheckError(AudioFileGetProperty(audioFileID, kAudioFilePropertyDataFormat, &size, &dataFormat),
                "AudioFileGetProperty(kAudioFilePropertyDataFormat)");

// Assign the region to play
ScheduledAudioFileRegion region;
memset (&region.mTimeStamp, 0, sizeof(region.mTimeStamp));
region.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
region.mTimeStamp.mSampleTime = 0;
region.mCompletionProc = NULL;
region.mCompletionProcUserData = NULL;
region.mAudioFile = audioFileID;
region.mLoopCount = 0;
region.mStartFrame = 0;
region.mFramesToPlay = (UInt32)packetCount * dataFormat.mFramesPerPacket;
CheckError(AudioUnitSetProperty(audioUnit, kAudioUnitProperty_ScheduledFileRegion, kAudioUnitScope_Global, 0, &region, sizeof(region)),
                "AudioUnitSetProperty(kAudioUnitProperty_ScheduledFileRegion)");

// Prime the player by reading some frames from disk
UInt32 defaultNumberOfFrames = 0;
CheckError(AudioUnitSetProperty(audioUnit, kAudioUnitProperty_ScheduledFilePrime, kAudioUnitScope_Global, 0, &defaultNumberOfFrames, sizeof(defaultNumberOfFrames)),
                "AudioUnitSetProperty(kAudioUnitProperty_ScheduledFilePrime)");

AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc = MyCallback;
callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);

CheckError(AudioUnitSetProperty(audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Input, 0, &callbackStruct, sizeof(callbackStruct)), "error AudioUnitSetProperty[kAudioUnitProperty_setRenderCallback]");

CheckError(AudioUnitInitialize(audioUnit), "error AudioUnitInitialize");

Callback function:

static OSStatus MyCallback(void *inRefCon,
                             AudioUnitRenderActionFlags *ioFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData){

    printf("my callback");  
    return noErr;
}

The audio unit starts playback on a button press:

- (IBAction)playSound:(id)sender {

 CheckError(AudioOutputUnitStart(audioUnit), "error AudioOutputUnitStart");

}

This code fails at runtime with a kAudioUnitErr_InvalidProperty (-10879) error. The goal is to modify the buffer samples that have been read from the AudioFileID and send the result to the speakers.

jangofett
  • Where is `MyCallback()` ? – user3078414 Jun 12 '16 at 20:48
  • @user3078414, added to `UPD:` section – jangofett Jun 13 '16 at 05:03
  • `kAudioUnitErr_InvalidElement` aka `-10877` seems to be an initialization error, _not_ a runtime error, AFAIK. I don't understand what you mean by stating that "the rest of the code works fine". What's the "structure" of your program? You state that you are "using AudioUnits". Is it a single-AU program, or do you use _multiple AUs_? What for? If so, do you connect them explicitly or using the `AUGraph` API? – user3078414 Jun 13 '16 at 10:28
  • Which audio unit are you setting the render callback on? – dave234 Jun 13 '16 at 17:11
  • Looks from the code OP is setting the callback on the audio unit named `audioUnit`, @Dave – user3078414 Jun 13 '16 at 17:26
  • I meant to ask what subtype it is. – dave234 Jun 13 '16 at 17:29
  • There's quite a few things unclear in this question (please see my previous comment). There are setups in which registering a callback in this way wouldn't be possible, or possible at cost of breaking the API, @Dave – user3078414 Jun 13 '16 at 17:36
  • @Dave, @user3078414 the subtype is `kAudioUnitSubType_RemoteIO`, no AUGraphs, just trying to send sound directly to the speakers using one RemoteIO audio unit. I modified the question with more source code. I created a brand-new project with similar code to what I had, and got a new error (-10879). Any thoughts on how to fix it? Thanks in advance – jangofett Jun 14 '16 at 17:42
  • Does the new code compile and work if you comment out setting render callback line ( `AudioUnitSetProperty(audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Input, 0, &callbackStruct, sizeof(callbackStruct))` ) ? – user3078414 Jun 14 '16 at 19:12
  • Nope, it breaks when I try to set _kAudioUnitProperty_ScheduledFileRegion_ property. As Dave pointed out in the answer, it is because of wrong AudioUnit type. – jangofett Jun 14 '16 at 20:08

1 Answer

Seeing as you are just getting familiar with Core Audio, I suggest you first get your RemoteIO callback working independently of your file player. Remove all of the file-player-related code and try to get that working first.
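To illustrate, a minimal RemoteIO-only render callback might look like the sketch below. This is hypothetical glue, not your exact setup: it assumes you have already created the RemoteIO unit and set a linear PCM stream format on its output scope; the callback simply writes silence into `ioData` to prove the plumbing works, and the marked line is where you would write your own samples instead.

```c
#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

// Render callback for a standalone RemoteIO unit. Core Audio calls this
// on the real-time audio thread and expects ioData to be filled with
// inNumberFrames frames of audio before the function returns.
static OSStatus MyRenderCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData) {
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        // Fill with silence; replace this memset with your own
        // sample-generating or sample-modifying code.
        memset(ioData->mBuffers[i].mData, 0,
               ioData->mBuffers[i].mDataByteSize);
    }
    return noErr;
}
```

Once you hear clean silence (no crackling or errors) from this callback, you know the RemoteIO side of your chain is configured correctly.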

Then, once you have that working, move on to incorporating your file player.

As for what I can see that's wrong: I think you are confusing the Audio File Services API with an audio unit. That API is used to read a file into a buffer which you would then manually feed to RemoteIO; if you do want to go this route, use the Extended Audio File Services API instead, as it's a LOT easier. The kAudioUnitProperty_ScheduledFileRegion property is meant to be set on a file player audio unit. To get one of those, create it the same way as your RemoteIO, except that the AudioComponentDescription's componentSubType and componentType are kAudioUnitSubType_AudioFilePlayer and kAudioUnitType_Generator respectively. Then, once you have that unit, connect it to the RemoteIO using the kAudioUnitProperty_MakeConnection property.
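If you do take the manual Extended Audio File Services route, a rough sketch of reading a file into client-side PCM could look like the following. This is an assumption-laden example, not a drop-in solution: `url` stands for a CFURLRef you already have, and the client format here is arbitrarily chosen as stereo interleaved Float32 at 44.1 kHz.

```c
#include <AudioToolbox/AudioToolbox.h>

// Open the file and ask ExtAudioFile to convert to a PCM "client format"
// of our choosing as we read; error checking omitted for brevity.
ExtAudioFileRef extFile;
ExtAudioFileOpenURL(url, &extFile);

AudioStreamBasicDescription clientFormat = {0};
clientFormat.mSampleRate       = 44100.0;
clientFormat.mFormatID         = kAudioFormatLinearPCM;
clientFormat.mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked;
clientFormat.mChannelsPerFrame = 2;                               // stereo, interleaved
clientFormat.mBitsPerChannel   = 32;
clientFormat.mBytesPerFrame    = clientFormat.mChannelsPerFrame * sizeof(Float32);
clientFormat.mFramesPerPacket  = 1;                               // uncompressed PCM
clientFormat.mBytesPerPacket   = clientFormat.mBytesPerFrame;

ExtAudioFileSetProperty(extFile, kExtAudioFileProperty_ClientDataFormat,
                        sizeof(clientFormat), &clientFormat);

// From here, call ExtAudioFileRead() with an AudioBufferList to pull
// converted frames into memory, which your render callback can then
// copy (and modify) into ioData.
```

The advantage over raw Audio File Services is that ExtAudioFile performs the format conversion for you, so your callback only ever deals with one known PCM layout.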

But seriously: start by just getting your RemoteIO callback working, then try making a file player audio unit and connecting it to the RemoteIO (without the callback), and go from there.
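A hypothetical sketch of that second step, assuming `remoteIOUnit` is your existing RemoteIO instance (names are illustrative and error checking is omitted):

```c
#include <AudioToolbox/AudioToolbox.h>

// 1. Create an AUAudioFilePlayer (generator) unit.
AudioComponentDescription fpDesc = {0};
fpDesc.componentType         = kAudioUnitType_Generator;
fpDesc.componentSubType      = kAudioUnitSubType_AudioFilePlayer;
fpDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

AudioComponent fpComp = AudioComponentFindNext(NULL, &fpDesc);
AudioUnit filePlayerUnit;
AudioComponentInstanceNew(fpComp, &filePlayerUnit);

// 2. Wire the file player's output bus 0 into RemoteIO's input bus 0.
//    This is the property the scheduled-file properties (ScheduledFileIDs,
//    ScheduledFileRegion, ScheduledFilePrime) expect to sit behind.
AudioUnitConnection connection;
connection.sourceAudioUnit    = filePlayerUnit;
connection.sourceOutputNumber = 0;
connection.destInputNumber    = 0;

AudioUnitSetProperty(remoteIOUnit, kAudioUnitProperty_MakeConnection,
                     kAudioUnitScope_Input, 0,
                     &connection, sizeof(connection));
```

With the connection in place, the scheduled-file properties from your question would be set on `filePlayerUnit` rather than on the RemoteIO unit, which is why setting them on RemoteIO returns kAudioUnitErr_InvalidProperty.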

Ask very specific questions about each of these steps independently, posting the code you have tried that isn't working, and you'll get a ton of help.

dave234
  • thanks a lot for pointing me in the right direction. Can you please look through this question? http://stackoverflow.com/questions/37870051/ios-core-audio-audiofileplayer-unit-render-callback. I got my file player and RemoteIO units connected and I hear the sound from the speakers, but I'm stuck again on the callback function. Thanks in advance. – jangofett Jun 16 '16 at 22:06