I have an AVPlayerLayer (a subclass of CALayer) and I need to get it into an image type that can be passed to a QCRenderer (QCRenderer accepts NSImages and CIImages). I can convert the CALayer to a CGImageRef, and that to an NSImage, but the contents are always clear.
I've narrowed it down to one of two reasons:
- I am not creating the NSImage correctly.
- The AVPlayer is not rendering to the AVPlayerLayer.
I am not receiving any errors, and I have found some documentation on converting CALayers. Also, I added the AVPlayerLayer to an NSView, which remains empty, so I believe the second reason is the problem.
I'm using a modified version of the AVPlayerDemoPlaybackViewController from Apple's AVPlayerDemo sample. I turned it into an NSObject subclass since I stripped all of the interface code out of it.
I create the AVPlayerLayer in the -(void)prepareToPlayAsset:withKeys: method when I create the AVPlayer (I'm only adding the layer to an NSView to test whether it is working):
if (![self player])
{
    /* Get a new AVPlayer initialized to play the specified player item. */
    [self setPlayer:[AVPlayer playerWithPlayerItem:self.mPlayerItem]];

    /* Observe the AVPlayer "currentItem" property to find out when any
       AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did occur. */
    [self.player addObserver:self
                  forKeyPath:kCurrentItemKey
                     options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                     context:AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext];

    mPlaybackView = [AVPlayerLayer playerLayerWithPlayer:self.player];
    [self.theView setWantsLayer:YES];
    [mPlaybackView setFrame:self.theView.layer.bounds];
    [self.theView.layer addSublayer:mPlaybackView];
}
I then schedule an NSTimer on the current run loop to grab a frame of the AVPlayerLayer 30 times per second:
// The interval must be a floating-point value; (1/30) is integer division and evaluates to 0.
framegrabTimer = [NSTimer timerWithTimeInterval:(1.0 / 30.0) target:self selector:@selector(grabFrameFromMovie) userInfo:nil repeats:YES];
[[NSRunLoop currentRunLoop] addTimer:framegrabTimer forMode:NSDefaultRunLoopMode];
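As an aside, those two lines can be collapsed into a single call, since scheduledTimerWithTimeInterval:target:selector:userInfo:repeats: adds the timer to the current run loop in the default mode automatically:

framegrabTimer = [NSTimer scheduledTimerWithTimeInterval:(1.0 / 30.0) target:self selector:@selector(grabFrameFromMovie) userInfo:nil repeats:YES];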
Here is the code I use to grab the frame and pass it to the class that handles the QCRenderer:
- (void)grabFrameFromMovie
{
    CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    CGContextRef theContext = CGBitmapContextCreate(NULL,
                                                    mPlaybackView.frame.size.width,
                                                    mPlaybackView.frame.size.height,
                                                    8,
                                                    4 * mPlaybackView.frame.size.width,
                                                    colorSpace,
                                                    kCGImageAlphaPremultipliedLast);
    [mPlaybackView renderInContext:theContext];
    CGImageRef cgImage = CGBitmapContextCreateImage(theContext);
    NSImage *image = [[NSImage alloc] initWithCGImage:cgImage
                                                 size:NSMakeSize(mPlaybackView.frame.size.width, mPlaybackView.frame.size.height)];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"AVPlayerLoadedNewFrame" object:[image copy]];
    CGContextRelease(theContext);
    CGColorSpaceRelease(colorSpace);
    CGImageRelease(cgImage);
}
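For reference, the class that owns the QCRenderer picks the frame up from that notification and pushes it into the composition along these lines (a rough sketch: handleNewFrame:, theRenderer, startTime, and the @"inputImage" key are placeholders, and the input key has to match whatever image input the loaded composition actually exposes):

// Registered once, e.g. in the QCRenderer-handling class's init:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleNewFrame:)
                                             name:@"AVPlayerLoadedNewFrame"
                                           object:nil];

// Notification handler: hand the NSImage to the renderer and draw a frame.
- (void)handleNewFrame:(NSNotification *)note
{
    NSImage *frame = (NSImage *)[note object];
    [theRenderer setValue:frame forInputKey:@"inputImage"];

    NSTimeInterval time = [NSDate timeIntervalSinceReferenceDate] - startTime; // startTime recorded when rendering began
    [theRenderer renderAtTime:time arguments:nil];
}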
I can't figure out why I'm only getting a clear image. Any help with this is greatly appreciated, as there is not enough AVFoundation documentation for OS X.