
I've cobbled together some routines for recording audio, based on a couple of posts here (here and here) along with the sites they reference.

My setup: I have an existing AUGraph (several AUSamplers -> Mixer -> RemoteIO). The AUSamplers are connected to tracks in a MusicPlayer instance. That all works fine, but I want to add recording to it.
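For context, the graph is wired up roughly like this (a simplified sketch; variable names are illustrative and error handling is omitted):

// AUSampler -> MultiChannelMixer -> RemoteIO
AudioComponentDescription samplerDesc = { .componentType = kAudioUnitType_MusicDevice,
                                          .componentSubType = kAudioUnitSubType_Sampler,
                                          .componentManufacturer = kAudioUnitManufacturer_Apple };
AudioComponentDescription mixerDesc   = { .componentType = kAudioUnitType_Mixer,
                                          .componentSubType = kAudioUnitSubType_MultiChannelMixer,
                                          .componentManufacturer = kAudioUnitManufacturer_Apple };
AudioComponentDescription ioDesc      = { .componentType = kAudioUnitType_Output,
                                          .componentSubType = kAudioUnitSubType_RemoteIO,
                                          .componentManufacturer = kAudioUnitManufacturer_Apple };

AUGraph graph;
AUNode  samplerNode, mixerNode, ioNode;
NewAUGraph(&graph);
AUGraphAddNode(graph, &samplerDesc, &samplerNode);   // one of several samplers
AUGraphAddNode(graph, &mixerDesc,   &mixerNode);
AUGraphAddNode(graph, &ioDesc,      &ioNode);
AUGraphOpen(graph);

AUGraphConnectNodeInput(graph, samplerNode, 0, mixerNode, 0);  // sampler -> mixer input bus 0
AUGraphConnectNodeInput(graph, mixerNode,   0, ioNode,    0);  // mixer -> RemoteIO
AUGraphNodeInfo(graph, ioNode, NULL, &ioUnit);                 // ioUnit is used below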

Recording is working, but the resulting .caf is pitch/tempo-shifted slower and has bad sound quality. There must be something wrong with the format I'm specifying.

Can someone eyeball this and tell me where I am setting the format incorrectly?

EDIT: Could this be a stereo/mono issue? I mean to record in mono.

I set the stream format on the RemoteIO instance to this:

AudioStreamBasicDescription audioFormat;

audioFormat.mSampleRate       = 44100.00;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mFramesPerPacket  = 1;
audioFormat.mChannelsPerFrame = 1;   // mono
audioFormat.mBitsPerChannel   = 16;
audioFormat.mBytesPerPacket   = 2;
audioFormat.mBytesPerFrame    = 2;
audioFormat.mReserved         = 0;   // the struct isn't zeroed, so clear this explicitly

// Apply format
result = AudioUnitSetProperty(ioUnit, 
                              kAudioUnitProperty_StreamFormat, 
                              kAudioUnitScope_Output, 
                              kInputBus, 
                              &audioFormat, 
                              sizeof(audioFormat));
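(kInputBus above, and the kSampleRate used later in the render callback, are constants defined elsewhere in my code; hypothetical definitions shown here just for reference:)

#define kInputBus    1         // RemoteIO input element (illustrative value)
#define kSampleRate  44100.0   // matches mSampleRate above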

Then, from a button action, I create the ExtAudioFileRef and attach a render callback to the RemoteIO instance:

- (void)startRecording
{
    OSStatus result;

    AudioStreamBasicDescription audioFormat;

    audioFormat.mSampleRate       = 44100.00;
    audioFormat.mFormatID         = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket  = 1;
    audioFormat.mChannelsPerFrame = 1;   // mono
    audioFormat.mBitsPerChannel   = 16;
    audioFormat.mBytesPerPacket   = 2;
    audioFormat.mBytesPerFrame    = 2;
    audioFormat.mReserved         = 0;

    NSArray  *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *destinationFilePath = [[NSString alloc] initWithFormat:@"%@/output.caf", documentsDirectory];
    CFURLRef destinationURL = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
                                                            (__bridge CFStringRef)destinationFilePath,
                                                            kCFURLPOSIXPathStyle,
                                                            false);

    // Create the file as a CAF, matching the .caf path above.
    result = ExtAudioFileCreateWithURL(destinationURL,
                                       kAudioFileCAFType,
                                       &audioFormat,
                                       NULL,
                                       kAudioFileFlags_EraseFile,
                                       &extAudioFileRef);

    CFRelease(destinationURL);
    NSAssert(result == noErr, @"Couldn't create file for writing");

    result = ExtAudioFileSetProperty(extAudioFileRef,
                                     kExtAudioFileProperty_ClientDataFormat,
                                     sizeof(AudioStreamBasicDescription),
                                     &audioFormat);
    NSAssert(result == noErr, @"Couldn't set the file's client data format");

    // Priming call: 0 frames / NULL data tells ExtAudioFile to allocate its async
    // write buffers so later calls from the render thread don't block.
    result = ExtAudioFileWriteAsync(extAudioFileRef, 0, NULL);
    NSAssert(result == noErr, @"Couldn't initialize write buffers for audio file");

    printf("Adding render notify to RemoteIO\n");
    result = AudioUnitAddRenderNotify(ioUnit, renderCallback, (__bridge void*)self);
    if (result) { [self printErrorMessage:@"AudioUnitAddRenderNotify" withStatus:result]; return; }
}

Finally, in my render callback I write out the data in the post-render phase:

static OSStatus renderCallback (void *                       inRefCon,
                                AudioUnitRenderActionFlags * ioActionFlags,
                                const AudioTimeStamp *       inTimeStamp,
                                UInt32                       inBusNumber,
                                UInt32                       inNumberFrames,
                                AudioBufferList *            ioData)
{
    OSStatus result;

    // ioActionFlags is a bitmask, so test the post-render bit rather than comparing for equality.
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
        double timeInSeconds = inTimeStamp->mSampleTime / kSampleRate;
        printf("%fs inBusNumber: %u inNumberFrames: %u \n",
               timeInSeconds, (unsigned int)inBusNumber, (unsigned int)inNumberFrames);

        MusicPlayerController *THIS = (__bridge MusicPlayerController *)inRefCon;

        result = ExtAudioFileWriteAsync(THIS->extAudioFileRef, inNumberFrames, ioData);
        if (result) printf("ExtAudioFileWriteAsync %d \n", (int)result);
    }

    return noErr;
}
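For completeness, stopping looks roughly like this (a sketch using the same ioUnit and extAudioFileRef as above):

- (void)stopRecording
{
    // Stop receiving post-render buffers first...
    OSStatus result = AudioUnitRemoveRenderNotify(ioUnit, renderCallback, (__bridge void*)self);
    if (result) { [self printErrorMessage:@"AudioUnitRemoveRenderNotify" withStatus:result]; }

    // ...then close the file, which also flushes any pending async writes.
    result = ExtAudioFileDispose(extAudioFileRef);
    if (result) { [self printErrorMessage:@"ExtAudioFileDispose" withStatus:result]; }
    extAudioFileRef = NULL;
}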
  • Does calling ExtAudioFileWriteAsync from renderCallback work for you? For me it crashes with AURemoteIO::IOThread EXC_BAD_ACCESS for some reason. renderCallback runs on a background thread, doesn't it? – Martin Konicek Oct 21 '12 at 15:45
  • @MartinKonicek - I didn't have any crash problems with ExtAudioFileWriteAsync. Maybe something is getting released which shouldn't be? – spring Oct 21 '12 at 23:26
  • Hi, I realized you're writing to the file in renderCallback while I'm doing it in render notification callback, set up using AUGraphAddRenderNotify. I'm not getting EXC_BAD_ACCESS if I create a copy of the ioData. I will let you know what the problem was when I have fixed this. Thanks a lot! – Martin Konicek Oct 22 '12 at 14:27
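Copying ioData before handing it off, as the last comment describes, might look like this (an illustrative sketch for the mono case; malloc in a render callback isn't realtime-safe, so a preallocated buffer would be better in practice):

// Hypothetical helper: deep-copy the single mono buffer before writing.
static OSStatus writeCopyOfBufferList(ExtAudioFileRef file,
                                      UInt32 inNumberFrames,
                                      const AudioBufferList *ioData)
{
    AudioBufferList copy;
    copy.mNumberBuffers = 1;
    copy.mBuffers[0].mNumberChannels = ioData->mBuffers[0].mNumberChannels;
    copy.mBuffers[0].mDataByteSize   = ioData->mBuffers[0].mDataByteSize;
    copy.mBuffers[0].mData           = malloc(ioData->mBuffers[0].mDataByteSize);
    memcpy(copy.mBuffers[0].mData, ioData->mBuffers[0].mData,
           copy.mBuffers[0].mDataByteSize);

    OSStatus err = ExtAudioFileWriteAsync(file, inNumberFrames, &copy);
    free(copy.mBuffers[0].mData);   // safe: ExtAudioFileWriteAsync copies into its own buffers
    return err;
}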

1 Answer


OK, I found some code that solves this, though I don't fully understand why.

I had been setting mBitsPerChannel to 16 for both the RemoteIO output stream and the ExtAudioFile. The result was slowed-down, scratchy audio. Setting the ExtAudioFile's mBitsPerChannel to 32 and adding the kAudioFormatFlagsNativeEndian flag solves the problem: the .caf audio is perfect (while leaving the RemoteIO output stream settings as they were).

But then setting the RemoteIO output stream settings to match the new settings also works, so I'm confused. Shouldn't this work as long as the AudioStreamBasicDescription settings are symmetrical between the RemoteIO instance and the ExtAudioFile?

Anyway... the working settings are below.

size_t bytesPerSample = sizeof(AudioUnitSampleType);   // 4 bytes (SInt32) on iOS

AudioStreamBasicDescription audioFormat;
audioFormat.mSampleRate       = graphSampleRate;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioFormat.mBytesPerPacket   = bytesPerSample;
audioFormat.mBytesPerFrame    = bytesPerSample;
audioFormat.mFramesPerPacket  = 1;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel   = 8 * bytesPerSample;
audioFormat.mReserved         = 0;
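Applied in both places, using the same names as in the question (a sketch):

// Same ASBD for the RemoteIO stream format and the file's client data format.
result = AudioUnitSetProperty(ioUnit,
                              kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Output,
                              kInputBus,
                              &audioFormat,
                              sizeof(audioFormat));

result = ExtAudioFileSetProperty(extAudioFileRef,
                                 kExtAudioFileProperty_ClientDataFormat,
                                 sizeof(AudioStreamBasicDescription),
                                 &audioFormat);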
  • What frameworks are you adding to your project? I cannot use these data types even with AudioUnit.framework added. – owen gerig May 10 '12 at 14:04
  • @skinnyTOD nice post. Were you ever able to establish the logic behind the solve? – Orpheus Mercury May 12 '12 at 19:44
  • @Orpheus - thanks. No, I never sorted it. Posted on the Apple dev forums and the Core Audio mailing list too. I don't like accepting "cut n' paste" solutions without understanding them, but in this case, with audio, there just isn't much foundational info around. Oh well... it works. – spring May 12 '12 at 23:41
  • @owengerig you should import AudioToolbox.h instead. – Tom Mar 12 '13 at 05:36
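Per that comment thread, a minimal set of imports for the code above would look like this (a sketch; both the AudioToolbox and AudioUnit frameworks are assumed to be linked):

#import <AudioToolbox/AudioToolbox.h>   // ExtAudioFile, AUGraph, AudioFile types
#import <AudioUnit/AudioUnit.h>         // AudioUnit render callback types and flags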