I'm trying to convert an AudioBufferList that I get from an Audio Unit into a CMSampleBuffer that I can pass into an AVAssetWriter to save audio from the microphone. This conversion works, in that the calls I'm making to perform the transformation don't fail, but recording ultimately does fail, and I'm seeing some output in the logs that seems to be cause for concern.

The code I'm using looks like this:

- (void)handleAudioSamples:(AudioBufferList*)samples numSamples:(UInt32)numSamples hostTime:(UInt64)hostTime {
    // Create a CMSampleBufferRef from the list of samples, which we'll own
    AudioStreamBasicDescription monoStreamFormat;
    memset(&monoStreamFormat, 0, sizeof(monoStreamFormat));
    monoStreamFormat.mSampleRate = 48000;
    monoStreamFormat.mFormatID = kAudioFormatLinearPCM;
    monoStreamFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved;
    monoStreamFormat.mBytesPerPacket = 2;
    monoStreamFormat.mFramesPerPacket = 1;
    monoStreamFormat.mBytesPerFrame = 2;
    monoStreamFormat.mChannelsPerFrame = 1;
    monoStreamFormat.mBitsPerChannel = 16;

    CMFormatDescriptionRef format = NULL;
    OSStatus status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &monoStreamFormat, 0, NULL, 0, NULL, NULL, &format);
    if (status != noErr) {
        // really shouldn't happen
        return;
    }

    CMSampleTimingInfo timing = { CMTimeMake(1, 48000), kCMTimeZero, kCMTimeInvalid };

    CMSampleBufferRef sampleBuffer = NULL;
    status = CMSampleBufferCreate(kCFAllocatorDefault, NULL, false, NULL, NULL, format, numSamples, 1, &timing, 0, NULL, &sampleBuffer);
    if (status != noErr) {
        // couldn't create the sample buffer
        PTKLogError(@"Failed to create sample buffer");
        CFRelease(format);
        return;
    }

    // add the samples to the buffer
    status = CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer,
                                                            kCFAllocatorDefault,
                                                            kCFAllocatorDefault,
                                                            0,
                                                            samples);
    if (status != noErr) {
        PTKLogError(@"Failed to add samples to sample buffer");
        CFRelease(sampleBuffer);
        CFRelease(format);
        return;
    }

    NSLog(@"Original sample buf size: %ld for %d samples from %d buffers, first buffer has size %d", CMSampleBufferGetTotalSampleSize(sampleBuffer), numSamples, samples->mNumberBuffers, samples->mBuffers[0].mDataByteSize);
    NSLog(@"Original sample buf has %ld samples", CMSampleBufferGetNumSamples(sampleBuffer));

As I mentioned, the code doesn't seem to be failing, per se, but AVAssetWriter doesn't like it, and the CMSampleBuffer that I'm creating seems to have size 0, based on the fact that the following log entries are being logged:

2015-07-09 19:34:00.710 xxxx[1481:271334] Original sample buf size: 0 for 1024 samples from 1 buffers, first buffer has size 2048
2015-07-09 19:34:00.710 xxxx[1481:271334] Original sample buf has 1024 samples

Oddly, the sample buffer reports that it has 1024 samples, but size 0. The original AudioBufferList has 2048 bytes of data, which is what I'd expect for 1024 2-byte samples.

Am I doing something wrong in terms of the way I'm initializing and populating the CMSampleBuffer?

1 Answer

It turns out that the sample size coming back as 0 was a red herring. Once I cleaned up a few things, notably setting the timestamp correctly, like so:

uint64_t timeNS = (uint64_t)(hostTime * _hostTimeToNSFactor);
CMTime presentationTime = CMTimeMake(timeNS, 1000000000);
CMSampleTimingInfo timing = { CMTimeMake(1, 48000), presentationTime, kCMTimeInvalid };

recording started working.

So, in the event that someone else is thrown off by the reported sample buffer size of 0, be aware that this is OK, at least when you're feeding the data into an AVAssetWriter. (Presumably CMSampleBufferGetTotalSampleSize returns 0 here because no sample-size array was passed to CMSampleBufferCreate, so there are no per-sample sizes for it to sum.)
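
For reference, the conversion factor used above comes from `mach_timebase_info`, as discussed in the comments below. A minimal sketch of computing it once, up front:

#import <mach/mach_time.h>

// Ratio of Mach host-time ticks to nanoseconds for the current hardware,
// computed once (e.g. during setup) and cached in an instance variable.
mach_timebase_info_data_t tinfo;
mach_timebase_info(&tinfo);
_hostTimeToNSFactor = (double)tinfo.numer / tinfo.denom;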

  • @PabloMartinez We computed it by calling `mach_timebase_info` and this math: `_hostTimeToNSFactor = (double)tinfo.numer / tinfo.denom`. However, there may be a better approach: https://developer.apple.com/library/ios/qa/qa1643/_index.html – Jim Wong Jan 09 '16 at 18:05
  • Thanks Jim, and what value are you passing to the function for hostTime? – Pablo Martinez Jan 18 '16 at 22:50
  • Hello Jim, can you please tell me what hostTime is? I don't know where to get that parameter; I only have an AudioBufferList. – Pablo Martinez Jan 28 '16 at 12:01
  • Pablo, sorry for the late response: I thought I'd answered your question, but obviously I hadn't. In this code, `hostTime` is the timestamp we get in the audio unit callback: `[sampleDelegate handleAudioSamples:&list numSamples:inNumberFrames hostTime:inTimeStamp->mHostTime];` (see the sketch after these comments) – Jim Wong May 08 '16 at 10:11
  • Hi, do you maybe have a complete code sample to show? I'm having the same issues but don't really understand how the different parts fit together. Thanks – YYfim Jun 21 '21 at 17:42
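
For context, here's a minimal sketch of the kind of audio unit input callback the comments above describe. The names (`Recorder`, the `audioUnit` property) and the scratch-buffer setup are assumptions for illustration, not the original code:

#import <AudioToolbox/AudioToolbox.h>

static OSStatus InputCallback(void *inRefCon,
                              AudioUnitRenderActionFlags *ioActionFlags,
                              const AudioTimeStamp *inTimeStamp,
                              UInt32 inBusNumber,
                              UInt32 inNumberFrames,
                              AudioBufferList *ioData) {
    Recorder *recorder = (__bridge Recorder *)inRefCon;

    // Scratch buffer for 16-bit mono samples (a C variable-length array).
    SInt16 sampleData[inNumberFrames];

    AudioBufferList list;
    list.mNumberBuffers = 1;
    list.mBuffers[0].mNumberChannels = 1;
    list.mBuffers[0].mDataByteSize = (UInt32)(inNumberFrames * sizeof(SInt16));
    list.mBuffers[0].mData = sampleData;

    // Pull the captured samples out of the audio unit, then hand them off
    // along with the host-time timestamp from the callback.
    OSStatus status = AudioUnitRender(recorder.audioUnit, ioActionFlags,
                                      inTimeStamp, inBusNumber,
                                      inNumberFrames, &list);
    if (status == noErr) {
        [recorder handleAudioSamples:&list
                          numSamples:inNumberFrames
                            hostTime:inTimeStamp->mHostTime];
    }
    return status;
}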