
I'm capturing audio using an AVCaptureAudioDataOutputSampleBufferDelegate:

  NSError *error = nil;

  _captureSession = [[AVCaptureSession alloc] init];
  [self.captureSession setSessionPreset:AVCaptureSessionPresetLow];

  // Set up the audio input
  AVCaptureDevice *audioDevice = [AVCaptureDevice
                                defaultDeviceWithMediaType:AVMediaTypeAudio];
  AVCaptureDeviceInput *captureAudioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
  if (error) {
      NSLog(@"Error starting audio capture: %@", error);
  } else {
      if ([self.captureSession canAddInput:captureAudioInput]) {
          [self.captureSession addInput:captureAudioInput];
      }
  }

  // Set up the audio output
  AVCaptureAudioDataOutput *audioCaptureOutput = [[AVCaptureAudioDataOutput alloc] init];
  if ([self.captureSession canAddOutput:audioCaptureOutput]) {
      [self.captureSession addOutput:audioCaptureOutput];
  }

  // Deliver sample buffers to the delegate on a serial queue
  dispatch_queue_t audioQueue = dispatch_queue_create("audioQueue", NULL);
  [audioCaptureOutput setSampleBufferDelegate:self queue:audioQueue];
  dispatch_release(audioQueue);

  // The session retains the output once it is added, so releasing here is
  // safe (released after the delegate is set, not before)
  [audioCaptureOutput release];

  // Start the capture
  [self.captureSession startRunning];

Delegate:

  - (void)captureOutput:(AVCaptureOutput *)captureOutput  didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

  // do something with sampleBuffer
  }

The question is: how can I play audio from the sampleBuffer?


1 Answer


You can create an NSData from the CMSampleBufferRef using the following code and then play it with an AVAudioPlayer.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    AudioBufferList audioBufferList;
    NSMutableData *data = [NSMutableData data];
    CMBlockBufferRef blockBuffer;

    // Get the sample buffer's audio data as an AudioBufferList backed by a block buffer
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);

    // Append the raw PCM bytes of each buffer
    for (UInt32 y = 0; y < audioBufferList.mNumberBuffers; y++) {
        AudioBuffer audioBuffer = audioBufferList.mBuffers[y];
        [data appendBytes:audioBuffer.mData length:audioBuffer.mDataByteSize];
    }

    CFRelease(blockBuffer);

    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithData:data error:nil];
    [player play];
}

I'm worried about how this will do performance-wise, though. There is probably a better way to do what you are trying to accomplish.
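For example, instead of constructing a new AVAudioPlayer on every callback, the incoming PCM could be scheduled onto an AVAudioPlayerNode as it arrives. The following is only a minimal sketch, not part of the original answer: it assumes iOS 8+ (for AVAudioEngine), ARC, and hypothetical `audioEngine`/`playerNode` properties, and a real implementation may need an AVAudioConverter if the capture format is not one the mixer accepts.

    // One-time setup, e.g. next to the capture-session setup
    // (self.audioEngine and self.playerNode are assumed properties)
    self.audioEngine = [[AVAudioEngine alloc] init];
    self.playerNode = [[AVAudioPlayerNode alloc] init];
    [self.audioEngine attachNode:self.playerNode];

    // In the delegate callback:
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

        // Read the PCM format the microphone is actually delivering
        CMAudioFormatDescriptionRef desc = (CMAudioFormatDescriptionRef)CMSampleBufferGetFormatDescription(sampleBuffer);
        const AudioStreamBasicDescription *asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc);
        AVAudioFormat *format = [[AVAudioFormat alloc] initWithStreamDescription:asbd];

        // Lazily connect and start the engine once the format is known
        if (!self.audioEngine.isRunning) {
            [self.audioEngine connect:self.playerNode to:self.audioEngine.mainMixerNode format:format];
            [self.audioEngine startAndReturnError:nil];
            [self.playerNode play];
        }

        // Copy the sample buffer's frames into an AVAudioPCMBuffer and enqueue it
        AVAudioFrameCount frames = (AVAudioFrameCount)CMSampleBufferGetNumSamples(sampleBuffer);
        AVAudioPCMBuffer *pcm = [[AVAudioPCMBuffer alloc] initWithPCMFormat:format frameCapacity:frames];
        pcm.frameLength = frames;
        CMSampleBufferCopyPCMDataIntoAudioBufferList(sampleBuffer, 0, (int32_t)frames, pcm.mutableAudioBufferList);
        [self.playerNode scheduleBuffer:pcm completionHandler:nil];
    }

This avoids re-parsing a container header on every callback; the player node simply consumes a stream of headerless PCM buffers.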

  • There is one problem - 'player' is always nil. Do you have any idea? – Rubycon Oct 31 '12 at 08:22
  • Pass in an error and see if there is one. Also make sure the data is not nil. – brynbodayle Oct 31 '12 at 14:17
  • Error: Error Domain=NSOSStatusErrorDomain Code=1954115647 "The operation couldn’t be completed. (OSStatus error 1954115647.)" What's wrong? – Rubycon Nov 01 '12 at 09:33
  • Did you check if the data was nil? – brynbodayle Nov 01 '12 at 13:36
  • You must add a header, e.g. a WAV header, for AVAudioPlayer to understand that it is a sound file. See Selwyn's answer at http://stackoverflow.com/questions/8504620/combine-two-wav-files-in-iphone-using-objective-c for a functioning header. [See the header sketch below.] – Sten May 15 '13 at 10:28
  • `CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer` returns an `OSStatus` of -12737 for me, which according to the documentation is `kCMSampleBufferError_ArrayTooSmall`. Then the code crashes on `appendBytes:length:`. Any ideas? [See the sizing sketch below.] – MaxGabriel Jul 30 '13 at 21:40
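To expand on Sten's comment: AVAudioPlayer only understands container formats, not headerless PCM, so the captured bytes need e.g. a WAV header in front of them. Here is a minimal sketch of such a header (the canonical 44-byte PCM WAV header, written little-endian as on iOS). The method name and parameters are mine, and the real sample rate, channel count, and bit depth should be read from the sample buffer's AudioStreamBasicDescription rather than hard-coded.

    // Prepend a minimal PCM WAV header so AVAudioPlayer can parse the data
    - (NSData *)wavDataForPCMData:(NSData *)pcmData
                       sampleRate:(uint32_t)sampleRate
                         channels:(uint16_t)channels
                    bitsPerSample:(uint16_t)bitsPerSample {
        uint32_t dataSize   = (uint32_t)pcmData.length;
        uint32_t byteRate   = sampleRate * channels * (bitsPerSample / 8);
        uint16_t blockAlign = channels * (bitsPerSample / 8);
        uint32_t chunkSize  = 36 + dataSize;   // RIFF chunk size = 36 + data bytes
        uint32_t fmtSize    = 16;              // size of the fmt chunk for PCM
        uint16_t audioFormat = 1;              // 1 = linear PCM

        NSMutableData *wav = [NSMutableData dataWithCapacity:44 + dataSize];
        [wav appendBytes:"RIFF" length:4];
        [wav appendBytes:&chunkSize length:4];
        [wav appendBytes:"WAVE" length:4];
        [wav appendBytes:"fmt " length:4];
        [wav appendBytes:&fmtSize length:4];
        [wav appendBytes:&audioFormat length:2];
        [wav appendBytes:&channels length:2];
        [wav appendBytes:&sampleRate length:4];
        [wav appendBytes:&byteRate length:4];
        [wav appendBytes:&blockAlign length:2];
        [wav appendBytes:&bitsPerSample length:2];
        [wav appendBytes:"data" length:4];
        [wav appendBytes:&dataSize length:4];
        [wav appendData:pcmData];
        return wav;
    }

Usage would be something like `[self wavDataForPCMData:data sampleRate:44100 channels:1 bitsPerSample:16]` before handing the result to AVAudioPlayer.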
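On the `kCMSampleBufferError_ArrayTooSmall` comment: a single stack-allocated AudioBufferList only has room for one AudioBuffer, so the call fails when the sample buffer carries more than one (e.g. non-interleaved stereo). A sketch of the usual two-call sizing pattern follows; this is my assumption about the fix, not verified against the commenter's setup.

    // Inside the delegate callback: first ask CoreMedia how large the
    // AudioBufferList must be, then allocate it on the heap.
    size_t bufferListSizeNeeded = 0;
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
        &bufferListSizeNeeded, NULL, 0, NULL, NULL,
        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, NULL);

    AudioBufferList *audioBufferList = (AudioBufferList *)malloc(bufferListSizeNeeded);
    CMBlockBufferRef blockBuffer = NULL;
    OSStatus status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
        NULL, audioBufferList, bufferListSizeNeeded, NULL, NULL,
        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, &blockBuffer);

    NSMutableData *data = [NSMutableData data];
    if (status == noErr) {
        // mNumberBuffers may legitimately be > 1 here (one per channel
        // when the audio is non-interleaved)
        for (UInt32 i = 0; i < audioBufferList->mNumberBuffers; i++) {
            AudioBuffer buffer = audioBufferList->mBuffers[i];
            [data appendBytes:buffer.mData length:buffer.mDataByteSize];
        }
        CFRelease(blockBuffer);
    }
    free(audioBufferList);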