
I have built some code to process video files on OS X, frame by frame. The following is an extract from the code, which builds OK, opens the file, locates the video track (the only track) and starts reading CMSampleBuffers without problem. However, each CMSampleBufferRef I obtain returns NULL when I try to extract the pixel buffer frame. There's no indication in Apple's documentation as to why I should expect a NULL return value or how I could fix the issue. It happens with all the videos on which I've tested it, regardless of capture source or codec.

Any help greatly appreciated.

NSString *assetInPath = @"/Users/Dave/Movies/movie.mp4";
NSURL *assetInUrl = [NSURL fileURLWithPath:assetInPath];
AVAsset *assetIn = [AVAsset assetWithURL:assetInUrl];

NSError *error;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:assetIn error:&error];
AVAssetTrack *track = [assetIn.tracks objectAtIndex:0];
AVAssetReaderOutput *assetReaderOutput = [[AVAssetReaderTrackOutput alloc]
                                              initWithTrack:track
                                              outputSettings:nil];
[assetReader addOutput:assetReaderOutput];

// Start reading
[assetReader startReading];

CMSampleBufferRef sampleBuffer;
do {
       sampleBuffer = [assetReaderOutput copyNextSampleBuffer];

       /**
        ** At this point, sampleBuffer is non-null, has all appropriate attributes to indicate that
        ** it's a video frame, 320x240 or whatever and looks perfectly fine. But the next
        ** line always returns NULL without logging any obvious error message
        **/

       CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

       if( pixelBuffer != NULL ) {
           size_t width = CVPixelBufferGetWidth(pixelBuffer);
           size_t height = CVPixelBufferGetHeight(pixelBuffer);
           CVPixelBufferLockBaseAddress(pixelBuffer, 0);
           ...
           other processing removed here for clarity
        }
} while( ... );

To be clear, I've stripped out all the error-checking code, but no problems were indicated by it; i.e. the AVAssetReader is reading, the CMSampleBufferRef looks fine, etc.

Dave Durbin
  • If you managed to overcome this issue - and its date contributes to my assumption - could you please be kind and spare a few lines of code to demonstrate how? I'm struggling with a similar issue (look for my question relating to the same function call) and simply can't solve it. All the answers below suggest what is wrong - but not what is RIGHT to do, and I've already tried almost everything I can, and still fail. – Motti Shneor Apr 30 '19 at 06:08

3 Answers


You haven't specified any outputSettings when creating your AVAssetReaderTrackOutput. I've run into your issue when passing nil in order to receive the video track's original pixel format from copyNextSampleBuffer. In my app I wanted to ensure no conversion was happening, for the sake of performance; if that isn't a big concern for you, just specify a pixel format in the output settings.

The following are Apple's recommended pixel formats, based on the hardware's capabilities:

kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
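
For example, a minimal sketch of that fix (the full-range BiPlanar format here is just one reasonable choice, not something your asset requires):

NSDictionary *outputSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey :
        @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
};
// With explicit settings the reader vends decoded, uncompressed frames,
// so CMSampleBufferGetImageBuffer returns a valid CVPixelBuffer.
AVAssetReaderTrackOutput *assetReaderOutput = [[AVAssetReaderTrackOutput alloc]
                                              initWithTrack:track
                                              outputSettings:outputSettings];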

nenchev
  • This should be the accepted answer, as the currently accepted post doesn't solve anything. – Tomasz Bąk Feb 09 '15 at 09:56
  • @nenchev how did you work around the issue in order to ensure no conversion was happening? – Andy Hin May 15 '15 at 17:19
  • @nenchev how did you work around the issue in order to ensure no conversion was happening? – Gwendal Roué Jun 01 '16 at 09:56
  • Where in the docs (or elsewhere) are these pixel formats recommended? They seem a little odd, because most cameras won't create planar images. However, I'm trying to do this with the CMSampleBufferRef I receive in my captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler - and always receive 'nil'. – Motti Shneor Apr 16 '19 at 21:41
  • @nenchev how did you work around the issue in order to ensure no conversion was happening? – Roi Mulia Jun 22 '19 at 18:09

Because you haven't supplied any outputSettings, you're forced to use the raw data contained within the frame.

You have to get the block buffer from the sample buffer using CMSampleBufferGetDataBuffer(sampleBuffer). Once you have that, you need to get the actual location of the data with:

size_t blockBufferLength;
char *blockBufferPointer;
CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &blockBufferLength, &blockBufferPointer);

Look at *blockBufferPointer and decode the bytes using the frame header information for your required codec.
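
Putting those steps together, a minimal sketch (error handling omitted; the decoding itself is codec-specific):

// The block buffer holds the still-encoded bytes of the frame.
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
if (blockBuffer != NULL) {
    size_t blockBufferLength;
    char *blockBufferPointer;
    if (CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &blockBufferLength,
                                    &blockBufferPointer) == kCMBlockBufferNoErr) {
        // blockBufferPointer now addresses blockBufferLength bytes of raw
        // frame data; parse them according to your codec's bitstream format.
    }
}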

Pescolly
  • How can one "decode the bytes" using "your required codec"? Where was this codec required? If I decide I want to extract a TIFF image from the CMSampleBufferRef I received from captureStillImageAsynchronouslyFromConnection - how will the output component know that, and provide bytes accordingly? I don't get it. – Motti Shneor Apr 16 '19 at 21:44
  • This was written for a video analyzer where each frame has its own header information. For a TIFF file, by the time you get the sample buffer the system will have already decoded the TIFF for you. – Pescolly Apr 17 '19 at 00:31
  • Please bear with me - I still do not understand. WHERE do I say "I want TIFF"? And to WHOM do I state this wish? As I see it, the CMSampleBufferRef contains some obscure data that may be all kinds of things, depending on the actual camera, its driver, and whatever - and I have no clue what to expect there. I can't even find a method to tell me what kind of CMSampleBuffer this is. I've already spent lots of time trying to get a still image as TIFF onto disk - to no avail. – Motti Shneor Apr 17 '19 at 13:40
  • Try looking into the CoreImage Framework. https://developer.apple.com/documentation/coreimage/ciimage?language=objc – Pescolly Apr 17 '19 at 23:33
  • I tried, again to no avail. A CIImage cannot be created from a CMSampleBufferRef, only from a CVImageBuffer or CVPixelBuffer - both of which I fail to extract. By the way, I DO provide output settings to my AVCaptureStillImageOutput before calling it to capture an image, so I expected to receive something more than raw data there. Maybe I need to post a different question for this. I'm close to giving up. – Motti Shneor Apr 29 '19 at 19:24
  • I've now set up my own question - similar to this one, but for my different context of camera capture. Would you mind having a look at it? – Motti Shneor Apr 30 '19 at 06:05

FWIW, here is what the official docs say about the return value of CMSampleBufferGetImageBuffer:

"Result is a CVImageBuffer of media data. The result will be NULL if the CMSampleBuffer does not contain a CVImageBuffer, or if the CMSampleBuffer contains a CMBlockBuffer, or if there is some other error."

Also note that the caller does not own the buffer returned by CMSampleBufferGetImageBuffer, and must retain it explicitly if the caller needs to maintain a reference to it.
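
For example, a sketch of that pattern (CVPixelBufferRetain/CVPixelBufferRelease are plain CoreVideo calls, needed even under ARC because CoreVideo objects are not ARC-managed):

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
if (pixelBuffer != NULL) {
    CVPixelBufferRetain(pixelBuffer);   // take ownership before the sample buffer goes away
    // ... use pixelBuffer, possibly after releasing sampleBuffer ...
    CVPixelBufferRelease(pixelBuffer);  // balance the retain when done
}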

Hopefully this info helps.

Aki
  • Thanks! Out of interest, where did you find that quote? Because here: https://developer.apple.com/library/mac/#documentation/CoreMedia/Reference/CMSampleBuffer/Reference/Reference.html and in the docs downloaded with my Xcode version there's absolutely no commentary at all! – Dave Durbin Jun 01 '13 at 18:10
  • In the header files. Apple has very detailed docs for each method. In Xcode, right-click the method name in your code and click 'Go to Definition' (the wording might not be exact), and it will take you to the correct .h file and method definition with documentation. – Aki Jun 01 '13 at 23:22
  • Doh! I guess I assumed that since the docs weren't in the generated documentation they weren't in the source. Thanks again Aki. – Dave Durbin Jun 02 '13 at 08:20
  • Am I correct in assuming that with Swift and ARC, I no longer need to manually retain? – Kartick Vaddadi Nov 09 '17 at 01:30