
To extract the raw image buffer from an uncompressed v210 movie (pixel format kCVPixelFormatType_422YpCbCr10), I tried to follow this great post: Reading samples via AVAssetReader.

The problem is that when I call startReading on my assetReader, I get AVAssetReaderStatusFailed (with the kCVPixelBufferPixelFormatTypeKey entry of the output settings dictionary set to kCVPixelFormatType_422YpCbCr10). If I leave the outputSettings nil, it parses each frame, but the CMSampleBufferRef buffers are empty.
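In case it helps, this is roughly how I check what went wrong after the failure (just a diagnostic sketch using the reader's status and error properties):

/* sketch: log the underlying error when startReading fails */
if ([assetReader startReading] == NO) {
    NSLog(@"startReading failed, status=%ld error=%@",
          (long)[assetReader status], [assetReader error]);
}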

I tried lots of pixel formats, such as

  • kCVPixelFormatType_422YpCbCr10
  • kCVPixelFormatType_422YpCbCr8
  • kCVPixelFormatType_422YpCbCr16
  • kCVPixelFormatType_422YpCbCr8_yuvs
  • kCVPixelFormatType_422YpCbCr8FullRange
  • kCVPixelFormatType_422YpCbCr_4A_8BiPlanar

but none of them work.
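To double-check that the source track really is v210, I also dump the track's first format description (just a sketch; videoTrack0 comes from the code further down):

/* sketch: print the four-character code of the track's media subtype */
CMFormatDescriptionRef formatDesc =
    (CMFormatDescriptionRef)[[videoTrack0 formatDescriptions] objectAtIndex:0];
FourCharCode subType = CMFormatDescriptionGetMediaSubType(formatDesc);
NSLog(@"track media subtype: %c%c%c%c",
      (char)(subType >> 24), (char)(subType >> 16),
      (char)(subType >> 8),  (char)(subType));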

What am I doing wrong?

Any comments welcome...

Here is my code:

#import <AVFoundation/AVFoundation.h>

/* construct an AVAssetReader based on a file-based URL */
NSError *error=nil;
NSString *filePathString=[[NSString alloc]
                          initWithString:@"/Users/johann/Raw12bit.mov"];

NSURL *movieUrl=[[NSURL alloc] initFileURLWithPath:filePathString];
AVURLAsset *movieAsset=[[AVURLAsset alloc] initWithURL:movieUrl options:nil]; 

/* determine image dimensions of images stored in movie asset */
CGSize size=[movieAsset naturalSize];  
NSLog(@"movie asset natual size: size.width=%f size.height=%f", 
      size.width, size.height);

/* allocate assetReader */
AVAssetReader *assetReader=[[AVAssetReader alloc] initWithAsset:movieAsset
                                                          error:&error];

/* get video track(s) from movie asset */
NSArray *videoTracks=[movieAsset tracksWithMediaType:AVMediaTypeVideo];

/* get first video track, if there is any */
AVAssetTrack *videoTrack0=[videoTracks objectAtIndex:0];

/* set the desired video frame format into attribute dictionary */
NSDictionary* dictionary=[NSDictionary dictionaryWithObjectsAndKeys:
  [NSNumber numberWithInt:kCVPixelFormatType_422YpCbCr10], 
  (NSString*)kCVPixelBufferPixelFormatTypeKey,
  nil];

/* construct the actual track output and add it to the asset reader */
AVAssetReaderTrackOutput* assetReaderOutput=[[AVAssetReaderTrackOutput alloc] 
                                             initWithTrack:videoTrack0 
                                             outputSettings:dictionary]; //nil or dictionary
/* main parser loop */
NSInteger i=0;
if([assetReader canAddOutput:assetReaderOutput]){
  [assetReader addOutput:assetReaderOutput];

  NSLog(@"asset added to output.");

  /* start asset reader */
  if([assetReader startReading]==YES){
    /* read off the samples */
    CMSampleBufferRef buffer=NULL;
    while([assetReader status]==AVAssetReaderStatusReading){
      buffer=[assetReaderOutput copyNextSampleBuffer];
      if(buffer==NULL) continue; /* returns NULL once all samples have been read */
      i++;
      NSLog(@"decoding frame #%ld done.", (long)i);
      CFRelease(buffer); /* copyNextSampleBuffer returns a retained buffer */
    }
  }
  else {
    NSLog(@"could not start reading asset.");
    NSLog(@"reader status: %ld", [assetReader status]);
  }
}
else {
  NSLog(@"could not add asset to output.");
}
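Once a non-empty sample buffer comes back, my plan for getting at the raw bytes looks roughly like this (only a sketch, untested, since right now the buffers are empty):

/* sketch: access the pixel data of a returned sample buffer */
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(buffer);
if (imageBuffer != NULL) {
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    /* for planar/bi-planar formats use CVPixelBufferGetBaseAddressOfPlane() instead */
    void   *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t  bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t  height      = CVPixelBufferGetHeight(imageBuffer);
    NSLog(@"frame buffer: %p (%zu bytes)", baseAddress, bytesPerRow * height);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}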

Best Regards, Johann


2 Answers


Try kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange; it worked fine for me.
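For example, the output settings would then look something like this (just a sketch; the rest of the reader setup stays as in the question, and a bi-planar buffer has to be read per plane with CVPixelBufferGetBaseAddressOfPlane):

/* sketch: request bi-planar 4:2:0 output instead of v210 */
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],
    (NSString *)kCVPixelBufferPixelFormatTypeKey,
    nil];

AVAssetReaderTrackOutput *assetReaderOutput = [[AVAssetReaderTrackOutput alloc]
                                               initWithTrack:videoTrack0
                                               outputSettings:outputSettings];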

The system only supports these:

  • kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
  • kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
  • kCVPixelFormatType_32BGRA
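For instance, a quick way to find out which of these works for a given movie is to probe them one by one (sketch; it reuses movieAsset and videoTrack0 from the question, and because a failed AVAssetReader cannot be restarted, each attempt gets a fresh reader):

/* sketch: try each supported pixel format until one starts reading */
uint32_t candidates[] = { kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                          kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                          kCVPixelFormatType_32BGRA };

for (size_t n = 0; n < sizeof(candidates) / sizeof(candidates[0]); n++) {
    AVAssetReader *probeReader = [[AVAssetReader alloc] initWithAsset:movieAsset error:NULL];
    NSDictionary *probeSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithUnsignedInt:candidates[n]],
        (NSString *)kCVPixelBufferPixelFormatTypeKey,
        nil];
    AVAssetReaderTrackOutput *probeOutput = [[AVAssetReaderTrackOutput alloc]
                                             initWithTrack:videoTrack0
                                             outputSettings:probeSettings];
    if ([probeReader canAddOutput:probeOutput]) {
        [probeReader addOutput:probeOutput];
        if ([probeReader startReading]) {
            NSLog(@"pixel format 0x%08x works", (unsigned int)candidates[n]);
            [probeReader cancelReading];
            break;
        }
    }
}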

    Welcome to Stack Overflow. The above post might answer the question, but a little more explanation would help fellow programmers understand how it works. It is also recommended to use comments for suggestions and questions once you have earned enough reputation. – Nagama Inamdar Nov 21 '14 at 09:02