
I'm trying to play a video (MP4/H.263) on iOS, but getting really fuzzy results. Here's the code to initialize the asset reading:

mTextureHandle = [self createTexture:CGSizeMake(400,400)];

NSURL * url = [NSURL fileURLWithPath:file];    
mAsset = [[AVURLAsset alloc] initWithURL:url options:nil];

NSArray * tracks = [mAsset tracksWithMediaType:AVMediaTypeVideo];

mTrack = [tracks objectAtIndex:0];

NSLog(@"Tracks: %i", [tracks count]);

NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary * settings = [[NSDictionary alloc] initWithObjectsAndKeys:value, key, nil];

mOutput = [[AVAssetReaderTrackOutput alloc] 
            initWithTrack:mTrack outputSettings:settings];


mReader = [[AVAssetReader alloc] initWithAsset:mAsset error:nil];
[mReader addOutput:mOutput];
[mReader startReading]; // required before any copyNextSampleBuffer call

So much for the reader init, now the actual texturing:

CMSampleBufferRef sampleBuffer = [mOutput copyNextSampleBuffer];    
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress( pixelBuffer, 0 );

glBindTexture(GL_TEXTURE_2D, mTextureHandle);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 600, 400, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress( pixelBuffer ));    
CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );    
CFRelease(sampleBuffer);

Everything works well ... except the rendered image comes out sliced and skewed, like this:

[screenshot: the rendered frame, sliced and skewed]

I've even tried looking into AVAssetTrack's preferredTransform matrix, to no avail, since it always returns CGAffineTransformIdentity.
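For reference, a minimal sketch of that check, assuming mTrack is the video track obtained above (NSStringFromCGAffineTransform comes from UIKit):

CGAffineTransform transform = mTrack.preferredTransform;
// Always logs the identity transform for this asset
NSLog(@"preferredTransform: %@", NSStringFromCGAffineTransform(transform));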

Side-note: if I switch the source to the camera, the image renders fine. Am I missing some decompression step? Shouldn't that be handled by the asset reader?

Thanks!

Code: https://github.com/shaded-enmity/objcpp-opengl-video

arul
  • Would you mind sharing the code you wrote for this project? I'm trying to do something similar and am having a hard time getting started. – Michael Nguyen May 30 '14 at 03:47
  • I'm also interested in using OpenGL ES for video playback. Did you follow any tutorial, or can you suggest a blog for this? – Sk Borhan Uddin Jan 05 '17 at 11:04
  • If I manage to recover the code I was using for this, I'll share it on GitHub with some descriptive README steps. I guess some things may have changed in terms of APIs, so the code may be a little outdated (I don't do iOS dev anymore so I can't really say, but we'll see). – arul Jan 05 '17 at 14:40
  • @arul ... did you manage to share the code somewhere? I'm interested to see it :) Thanks in advance! – zeus Feb 06 '17 at 22:14
  • Updated the question with the code I managed to salvage from a corrupted HD backup. – arul Feb 07 '17 at 16:31

1 Answer


I think the CVPixelBuffer inside the CMSampleBuffer pads its rows for performance reasons, so you need to use the right width for the texture.

Try setting the texture width to CVPixelBufferGetBytesPerRow(pixelBuffer) / 4 (assuming your video format uses 4 bytes per pixel; adjust if it's different).
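A minimal sketch of that fix, reusing mOutput and mTextureHandle from the question and assuming the 32BGRA output settings configured there:

CMSampleBufferRef sampleBuffer = [mOutput copyNextSampleBuffer];
if (sampleBuffer) {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Derive the upload width from the actual row stride instead of
    // hard-coding it; rows may be padded beyond the visible frame width.
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    GLsizei texWidth  = (GLsizei)(bytesPerRow / 4);  // 4 bytes per pixel for 32BGRA
    GLsizei texHeight = (GLsizei)CVPixelBufferGetHeight(pixelBuffer);

    glBindTexture(GL_TEXTURE_2D, mTextureHandle);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texWidth, texHeight, 0,
                 GL_BGRA_EXT, GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddress(pixelBuffer));

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    CFRelease(sampleBuffer);
}

The texture then ends up slightly wider than the visible frame, so scale your texture coordinates by CVPixelBufferGetWidth(pixelBuffer) / (float)texWidth to crop away the padding.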

Johnmph
  • You're right, wow, I didn't realize that wrong texture width would distort the image in such a peculiar way. Thank you very much! – arul Oct 01 '11 at 14:43