
I am creating an application in which the user can draw with a finger on a UIImageView while the screen is being recorded at the same time. I have implemented both features, but the problem is that once the recording is complete and I play back the recorded video, the finger drawing in the video is not smooth.

I am not using OpenGL; the drawing is done on a UIImageView, and every 0.01 sec I capture the image from the UIImageView and append the pixel buffer to the AVAssetWriterInputPixelBufferAdaptor object.

Here is the code I use to convert the UIImage into a pixel buffer:

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
    CGSize frameSize = CGSizeMake(976, 667);
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width, frameSize.height,
                        kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options,
                        &pxbuffer);
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGImageGetColorSpace(image);
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width, frameSize.height,
                                                 8, 4 * frameSize.width, rgbColorSpace,
                                                 kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)),
                       image);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}
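As a side note, when an AVAssetWriterInputPixelBufferAdaptor is involved, buffers can also be drawn from the adaptor's own pool instead of allocating a fresh one with CVPixelBufferCreate() for every frame. A minimal sketch, assuming `self.avAdaptor` is the adaptor from the code above:

```objc
// Sketch: reuse pixel buffers from the adaptor's pool.
// The pool becomes available after the asset writer has started writing.
CVPixelBufferRef pxbuffer = NULL;
CVPixelBufferPoolRef pool = self.avAdaptor.pixelBufferPool;
if (pool) {
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pxbuffer);
}
// ...lock, draw into, unlock, append, then release pxbuffer as usual.
```

Pooled buffers avoid the per-frame allocation cost and are the format the writer expects.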

The code below is called on a 0.01 sec timer interval:

CVPixelBufferRef pixelBufferX  = (CVPixelBufferRef)[self pixelBufferFromCGImage:theIM];
bValue = [self.avAdaptor appendPixelBuffer:pixelBufferX withPresentationTime:presentTime];

Can anyone suggest how to improve the video capture?

Thanks in advance

Raj

1 Answer


You shouldn't display things by calling them every 0.01 seconds. If you want to stay in sync with video, see AVSynchronizedLayer, which is explicitly for this. Alternately, see CADisplayLink, which is for staying in sync with screen refreshes. 0.01 seconds doesn't line up with anything in particular, and you're probably getting beats where you're out of sync with the video and with the display. In any case, you should be doing your drawing in some callback from your player, not with a timer.
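As a rough sketch of the CADisplayLink approach (the `captureFrame:` method name is a placeholder standing in for your capture-and-append code):

```objc
// Sketch: drive frame capture from screen refreshes instead of an NSTimer.
CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                  selector:@selector(captureFrame:)];
[link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];

// Called once per display refresh.
- (void)captureFrame:(CADisplayLink *)link {
    // link.timestamp is a host-time value you can use to derive a
    // monotonically increasing presentation time for the appended frame.
}
```

The key point is that the presentation times you pass to the adaptor should come from the callback's timestamps, not from counting timer ticks.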

You are also leaking your pixel buffer on every iteration. Since you called CVPixelBufferCreate(), you're responsible for eventually calling CFRelease() on the resulting pixel buffer. I would expect your program to eventually crash by running out of memory if this ran for a while.
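A sketch of the append step with that release in place, using the names from your question:

```objc
// Create, append, then release: the adaptor retains the buffer itself,
// so the caller must balance the CVPixelBufferCreate() it owns.
CVPixelBufferRef pixelBufferX = [self pixelBufferFromCGImage:theIM];
if (pixelBufferX) {
    bValue = [self.avAdaptor appendPixelBuffer:pixelBufferX
                          withPresentationTime:presentTime];
    CVPixelBufferRelease(pixelBufferX);
}
```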

Make sure you've studied the AV Foundation Programming Guide so you know how all the pieces fit together in media playback.

Rob Napier
  • Thanks for your reply. I tried CADisplayLink instead of NSTimer, but no luck; the problem is still there. I already handle the release for CVPixelBufferCreate; I didn't include that piece of code in the question. – Raj Mar 20 '13 at 13:44
  • `CADisplayLink` will only help you if your frames are coming in at exactly the screen-refresh rate (i.e. they're also being produced as part of a `CADisplayLink` callback). – Rob Napier Mar 20 '13 at 15:21