I have several UIImages and I want to create a video from them.
I am using a solution based on this to create a video from UIImages. In my case, I would like to create a 30 fps video, so every image lasts 1/30 of a second.
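To make the timing concrete, here is a minimal sketch of the presentation-time arithmetic I have in mind (frameIndex is just a placeholder for the index of the image in my array; a timescale of 30 means frame i starts at i/30 s):

#import <CoreMedia/CoreMedia.h>

// each frame lasts 1/30 s
CMTime frameDuration = CMTimeMake(1, 30);
// frame number frameIndex should start at frameIndex/30 s
CMTime presentationTime = CMTimeMake(frameIndex, 30);
// equivalently: CMTimeMultiply(frameDuration, frameIndex)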
After setting everything up to start saving the video, as described on that page, I created a method that saves one image to the movie, and this method is called in a loop. Something like:
for (int i = 0; i < [self.arrayOfFrames count]; i++) {
    UIImage *oneImage = [self.arrayOfFrames objectAtIndex:i];
    [self saveOneFrame:oneImage atTime:i];
}
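For reference, the writer and adaptor setup I am assuming from that page looks roughly like this (the output URL, the video settings, and the adaptor ivar are placeholders, not my exact code):

NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                       fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:640], AVVideoWidthKey,
                               [NSNumber numberWithInt:480], AVVideoHeightKey,
                               nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:videoSettings];

// 'adaptor' is the same AVAssetWriterInputPixelBufferAdaptor ivar used in saveOneFrame:atTime: below
adaptor = [AVAssetWriterInputPixelBufferAdaptor
           assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
           sourcePixelBufferAttributes:nil];

[videoWriter addInput:writerInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];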
The method that saves one frame is:
-(void)saveOneFrame:(UIImage *)imagem atTime:(NSInteger)time {
    // I have tried this autorelease pool to drain memory after the method is finished
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CVPixelBufferRef buffer = NULL;
    buffer = [self pixelBufferFromCGImage:imagem.CGImage size:imagem.size];

    BOOL append_ok = NO;
    int j = 0;
    while (!append_ok && j < 30)
    {
        if (adaptor.assetWriterInput.readyForMoreMediaData)
        {
printf("appending %d attemp %d\n", time, j);
CMTime oneFrameLength = CMTimeMake(1, 30.0f ); // one frame = 1/30 s
CMTime lastTime;
CMTime presentTime;
if (time == 0) {
presentTime = CMTimeMake(0, self.FPS);
} else {
lastTime = CMTimeMake(tempo-1, self.FPS);
presentTime = CMTimeAdd(lastTime, duracaoUmFrame);
}
// this will always add 1/30 to every new keyframe
CMTime presentTime = CMTimeAdd(lastTime, oneFrameLength);
            append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];

            CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
            NSParameterAssert(bufferPool != NULL);

            [NSThread sleepForTimeInterval:0.05];
        }
        else
        {
            printf("adaptor not ready %d, %d\n", time, j);
            [NSThread sleepForTimeInterval:0.1];
        }
        j++;
    }
    if (!append_ok) {
        printf("error appending image %d times %d\n", time, j);
    }
    CVBufferRelease(buffer);

    [pool drain]; // I have tried with and without this autorelease pool in place... no difference
}
The application simply quits, without any warning, after saving 50 frames to the movie...
This is the other method:
-(CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                          size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef)options,
                                          &pxbuffer);
    status = status; // added to silence an unused-variable compiler warning
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                                 size.height, 8, 4 * size.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);

    //CGContextTranslateCTM(context, 0, CGImageGetHeight(image));
    //CGContextScaleCTM(context, 1.0, -1.0); // flip vertically to account for the different origin
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
I ran Instruments and did not detect any leaks or excessive memory usage; memory stays about the same as it was before the movie starts being saved.
Any clues?
NOTE:
After looking at the device logs, I found this:
<Notice>: (UIKitApplication:com.myID.myApp[0xc304]) Bug: launchd_core_logic.c:3732 (25562):3
<Notice>: (UIKitApplication:com.myID.myApp[0xc304]) Assuming job exited: <rdar://problem/5020256>: 10: No child processes
<Warning>: (UIKitApplication:com.myID.myApp[0xc304]) Job appears to have crashed: Segmentation fault: 11
<Warning>: Application 'myApp' exited abnormally with signal 11: Segmentation fault: 11