I'm having trouble with screen recording. Right now I'm using "drawViewHierarchyInRect:afterScreenUpdates:" and feeding the pixel buffer to an AVAssetWriterInputPixelBufferAdaptor. This works fine, but only on an iPhone 5s/5; on the iPad and iPhone 4s this method performs far too poorly, at 10-15 fps, and I need at least 25-30.
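For reference, here's a minimal sketch of the approach I'm using now (names are placeholders: `view` is the view being recorded, `adaptor` and `presentationTime` come from my writer setup; retina scale handling omitted):

CVPixelBufferRef buffer = NULL;
NSDictionary *pbAttrs = @{ (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES };
CVPixelBufferCreate(kCFAllocatorDefault,
                    (size_t)view.bounds.size.width,
                    (size_t)view.bounds.size.height,
                    kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)pbAttrs,
                    &buffer);
CVPixelBufferLockBaseAddress(buffer, 0);

// Draw the view hierarchy directly into the pixel buffer's memory.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                         CVPixelBufferGetWidth(buffer),
                                         CVPixelBufferGetHeight(buffer),
                                         8,
                                         CVPixelBufferGetBytesPerRow(buffer),
                                         colorSpace,
                                         kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
// Flip to UIKit's top-left origin before drawing.
CGContextTranslateCTM(ctx, 0, CVPixelBufferGetHeight(buffer));
CGContextScaleCTM(ctx, 1.0, -1.0);
UIGraphicsPushContext(ctx);
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO]; // the slow call on older devices
UIGraphicsPopContext();
CVPixelBufferUnlockBaseAddress(buffer, 0);

[adaptor appendPixelBuffer:buffer withPresentationTime:presentationTime];

CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);
CVPixelBufferRelease(buffer);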
My current method is the best I've found so far. I've also tried glReadPixels and renderInContext: (the latter doesn't work with a live camera feed).
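The glReadPixels variant I tried looked roughly like this (a sketch; `esize` is the same size used in the code further down, and the row loop flips and swizzles because glReadPixels returns bottom-up RGBA):

GLint w = (GLint)esize.width, h = (GLint)esize.height;
GLubyte *raw = (GLubyte *)malloc((size_t)w * h * 4);
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, raw); // synchronous: stalls the GPU

CVPixelBufferRef pb = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, w, h, kCVPixelFormatType_32BGRA, NULL, &pb);
CVPixelBufferLockBaseAddress(pb, 0);
uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddress(pb);
size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(pb); // may include row padding
for (GLint y = 0; y < h; y++) {
    uint8_t *srcRow = raw + (size_t)(h - 1 - y) * w * 4; // flip vertically
    uint8_t *dstRow = dst + (size_t)y * dstBytesPerRow;
    for (GLint x = 0; x < w; x++) {
        dstRow[x * 4 + 0] = srcRow[x * 4 + 2]; // B
        dstRow[x * 4 + 1] = srcRow[x * 4 + 1]; // G
        dstRow[x * 4 + 2] = srcRow[x * 4 + 0]; // R
        dstRow[x * 4 + 3] = srcRow[x * 4 + 3]; // A
    }
}
CVPixelBufferUnlockBaseAddress(pb, 0);
// ...append pb to the adaptor with a presentation time, then CVPixelBufferRelease(pb).
free(raw);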
So I've searched around on Stack Overflow, found a couple of alternatives, and tried most of them. The last one I found, OpenGL ES 2d rendering into image, I can't get to work, and I don't know if it's worth the time.
if ([[CCDirector sharedDirector] isPaused] || !writerInput || !writerInput.readyForMoreMediaData || !VIDEO_WRITER_IS_READY) {
    return;
}

// Create a texture cache tied to cocos2d's EAGLContext.
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)[[[CCDirector sharedDirector] openGLView] context], NULL, &rawDataTextureCache);
if (err) {
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}

// Empty IOSurface properties dictionary so the pixel buffer is
// IOSurface-backed, which the texture cache requires.
CFDictionaryRef empty = CFDictionaryCreate(kCFAllocatorDefault,
                                           NULL,
                                           NULL,
                                           0,
                                           &kCFTypeDictionaryKeyCallBacks,
                                           &kCFTypeDictionaryValueCallBacks);
CFMutableDictionaryRef attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                                         1,
                                                         &kCFTypeDictionaryKeyCallBacks,
                                                         &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs,
                     kCVPixelBufferIOSurfacePropertiesKey,
                     empty);

CVPixelBufferCreate(kCFAllocatorDefault,
                    (int)esize.width,
                    (int)esize.height,
                    kCVPixelFormatType_32BGRA,
                    attrs,
                    &renderTarget);

// Wrap the pixel buffer in an OpenGL ES texture so the GPU can render
// straight into it.
CVOpenGLESTextureRef renderTexture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             rawDataTextureCache,
                                             renderTarget,
                                             NULL, // texture attributes
                                             GL_TEXTURE_2D,
                                             GL_RGBA, // opengl format
                                             (int)esize.width,
                                             (int)esize.height,
                                             GL_BGRA, // native iOS format
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &renderTexture);
CFRelease(attrs);
CFRelease(empty);

// Attach the texture as the framebuffer's color attachment.
glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);

// Timestamp the frame with wall-clock time and hand it to the adaptor.
CVPixelBufferLockBaseAddress(renderTarget, 0);
CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
CMTime presentationTime = CMTimeMake(elapsedTime * 30, 30);
if (![adaptor appendPixelBuffer:renderTarget withPresentationTime:presentationTime]) {
    NSLog(@"Adaptor FAIL");
}
CVPixelBufferUnlockBaseAddress(renderTarget, 0);
CFRelease(renderTexture); // the texture handed out by the cache must be released too
CVPixelBufferRelease(renderTarget);
Above is the relevant code: I feed a pixel buffer to my adaptor, and this has worked fine up until now.
The adaptor is simply failing and logging "Adaptor FAIL"; I don't get any error.
I don't know if I'm completely off track trying to do this with the EAGLContext of a cocos2d app.
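One thing I'm wondering: should the texture cache be created once when recording starts instead of every frame? Something like this (a sketch; `setupTextureCache` is a hypothetical method, `rawDataTextureCache` is the same ivar as above):

- (void)setupTextureCache {
    if (rawDataTextureCache != NULL) {
        return; // already created
    }
    // Same cocos2d context as in the per-frame code above.
    EAGLContext *context = [[[CCDirector sharedDirector] openGLView] context];
    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault,
                                                NULL,
                                                (__bridge void *)context,
                                                NULL,
                                                &rawDataTextureCache);
    NSAssert(err == kCVReturnSuccess, @"CVOpenGLESTextureCacheCreate failed: %d", err);
}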
Thanks in advance.
* UPDATE *
I changed
CMTime presentationTime = CMTimeMake(elapsedTime * 30, 30);
to
CMTime presentationTime = CMTimeMake(elapsedTime * 120, 120);
I believe a timescale of 30 is not enough, since the app runs faster than 30 fps; at the higher frame rate I was probably appending multiple frames with the same presentation time, which made the adaptor fail. The adaptor has stopped failing now, but the screen still freezes. I still know where the buttons are, though, so I managed to stop the recording and play the video. It works, but with a black screen flashing every other frame.
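To rule out duplicate timestamps entirely, I'm also considering a guard like this before appending (a sketch; `lastPresentationTime` is a hypothetical CMTime ivar initialized to kCMTimeInvalid):

CMTime presentationTime = CMTimeMake(elapsedTime * 120, 120);
if (CMTIME_IS_VALID(lastPresentationTime) &&
    CMTimeCompare(presentationTime, lastPresentationTime) <= 0) {
    return; // appending a non-increasing timestamp makes the adaptor fail
}
lastPresentationTime = presentationTime;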