
I need to hold some video frames from a captureSession in memory and write them to a file when 'something' happens.

Similar to this solution, I use this code to put each frame into an NSMutableArray:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{       
    //...
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    NSData *rawFrame = [[NSData alloc] initWithBytes:(void *)baseAddress length:(height * bytesPerRow)];
    [m_frameDataArray addObject:rawFrame];
    [rawFrame release];
    //...
}

And this to write the video file:

-(void)writeFramesToFile
{
    //...
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                    [NSNumber numberWithInt:640], AVVideoWidthKey,
                                    [NSNumber numberWithInt:480], AVVideoHeightKey,
                                    AVVideoCodecH264, AVVideoCodecKey,
                                    nil ];
    AVAssetWriterInput *bufferAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
    AVAssetWriter *bufferAssetWriter = [[AVAssetWriter alloc] initWithURL:pathURL fileType:AVFileTypeQuickTimeMovie error:&error];
    AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:bufferAssetWriterInput sourcePixelBufferAttributes:nil];
    [bufferAssetWriter addInput:bufferAssetWriterInput];

    [bufferAssetWriter startWriting];
    [bufferAssetWriter startSessionAtSourceTime:startTime];
    for (NSInteger i = 1; i < m_frameDataArray.count; i++){
        NSData *rawFrame = [m_frameDataArray objectAtIndex:i];
        CVImageBufferRef imgBuf = [rawFrame bytes];
        [pixelBufferAdaptor appendPixelBuffer:imgBuf withPresentationTime:CMTimeMake(1,10)]; //<-- EXC_BAD_ACCESS
        [rawFrame release];
    }
    //... (finishing video file)
}

But something is wrong with the imgBuf reference. Any suggestions? Thanks in advance.

jsB
  • You are missing the first frame in your code; NSArray is indexed starting at 0. Also, a CVImageBuffer is not just a collection of raw bytes; it is a structure. You should create a CVPixelBuffer. Take a look at CVPixelBuffer.h in the Core Video framework. Basically: create a new pixel buffer and copy the bytes over. – Steve McFarlin Jun 08 '11 at 08:19

2 Answers


You're supposed to lock the base address before accessing the imageBuffer's pixel data; CVPixelBufferGetBaseAddress only returns a valid pointer while the buffer is locked.

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// Lock the buffer so the base address stays valid while the CPU reads it.
CVPixelBufferLockBaseAddress(imageBuffer, 0);
size_t height = CVPixelBufferGetHeight(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
// Copy the pixels out; the capture session will reuse this buffer.
NSData *rawFrame = [[NSData alloc] initWithBytes:(void *)baseAddress length:(height * bytesPerRow)];
[m_frameDataArray addObject:rawFrame];
[rawFrame release];
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
Alex Chugunov
  • Thanks for your comment. I do have these lines in my code, but forgot to include them here, sorry. I also tried locking the buffer without unlocking (for about 300 samples), but that didn't work either. The capture session seems to have a limited queue size of around 5 frames. – jsB Jun 06 '11 at 07:57

This is pretty old, but to help those who come after, there are a few issues to fix:

  1. Lock/unlock the base address when you copy the pixels out, as suggested in Alex's answer.
  2. CVImageBufferRef is an abstract base type. You want to use CVPixelBufferCreateWithBytes to make an instance, not just typecast the raw pixel bytes. (The system needs to know the size and format of those pixels.)
  3. You should create and store the new CVPixelBuffer directly from the original one's data instead of using an intermediary NSData for storage. That way you only have to do one copy instead of two. A rough sketch of 2 and 3 follows.
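
Here is a minimal sketch of what 2 and 3 might look like, assuming 32BGRA frames (check CVPixelBufferGetPixelFormatType on your buffers). CreatePixelBufferCopy and m_frameBufferArray are made-up names for illustration, and it uses CVPixelBufferCreate rather than CVPixelBufferCreateWithBytes so the copy owns its own storage:

static CVPixelBufferRef CreatePixelBufferCopy(CVImageBufferRef imageBuffer)
{
    // Deep-copy the frame so the original buffer can go straight back
    // to the capture session's small reuse pool.
    CVPixelBufferRef copy = NULL;
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    if (CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                            kCVPixelFormatType_32BGRA, NULL, &copy) != kCVReturnSuccess) {
        return NULL;
    }

    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    CVPixelBufferLockBaseAddress(copy, 0);
    // The two buffers may have different row strides, so copy row by row.
    size_t srcBytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(copy);
    uint8_t *src = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddress(copy);
    for (size_t row = 0; row < height; row++) {
        memcpy(dst + row * dstBytesPerRow, src + row * srcBytesPerRow,
               MIN(srcBytesPerRow, dstBytesPerRow));
    }
    CVPixelBufferUnlockBaseAddress(copy, 0);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return copy;
}

In captureOutput:didOutputSampleBuffer:fromConnection: you'd store the copy (CF objects can live in an NSMutableArray under manual retain/release):

CVPixelBufferRef frameCopy = CreatePixelBufferCopy(imageBuffer);
if (frameCopy) {
    [m_frameBufferArray addObject:(id)frameCopy]; // the array retains it
    CVPixelBufferRelease(frameCopy);
}

And in writeFramesToFile each buffer gets its own presentation time; the constant CMTimeMake(1,10) in the question stamps every frame at the same instant:

for (NSUInteger i = 0; i < m_frameBufferArray.count; i++) {
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)[m_frameBufferArray objectAtIndex:i];
    // Frame i plays at i/10 s, i.e. 10 frames per second.
    [pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMake(i, 10)];
}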
Ethan