
Reference

I have gone through many good links on SO about creating a video file from an NSArray of images. One of the most useful was this SO question: Links

ISSUE

  • There is a memory leak in VideoToolBox. (I have attached a screenshot of Instruments taken while running the app in the iOS 5.1 Simulator.) Instrument Screen Shot
  • My application uses up to 346 MB of memory while creating the video, mainly because of this method:
 
    - (BOOL)appendPixelBuffer:(CVPixelBufferRef)pixelBuffer withPresentationTime:(CMTime)presentationTime
 

This method of the AVAssetWriterInputPixelBufferAdaptor class retains every CVPixelBufferRef until the video is finished.

CODE

I have created an ImageToVideo.m NSOperation subclass to create a video from images.

 
#import "ImageToVideo.h"

@implementation ImageToVideo

//Global pixel buffer, reused across frames.
CVPixelBufferRef pxbuffer = NULL;

- (void) pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

    CVBufferRelease(pxbuffer);
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                          size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, 
                                          &pxbuffer);
    //`options` is autoreleased, so it must not be released here.

    (void)status; //silence the unused-variable warning when asserts are compiled out
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                                 size.height, 8, 4*size.width, rgbColorSpace, 
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), 
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);   

}


- (void)writeImageAsMovietoPath:(NSString*)path size:(CGSize)size 
{
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];
    AVAssetWriterInput* writerInput = [[AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeVideo
                                        outputSettings:videoSettings] retain];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];
    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);
    [videoWriter addInput:writerInput];

    NSMutableArray *photoImages = [[[NSMutableArray alloc] init] autorelease];

    //Create the image array. (The original loop was truncated in the post;
    //load your UIImages however your app actually does -- the file names and
    //frame count below are hypothetical placeholders.)
    for (int i = 0; i < 10; i++)
    {
        UIImage *img = [UIImage imageNamed:[NSString stringWithFormat:@"image%d.jpg", i]];
        if (img) [photoImages addObject:img];
    }

    //Start the encoding session.
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    int i = 0;
    while (1)
    {
        if (writerInput.readyForMoreMediaData)
        {
            CMTime presentTime = CMTimeMake(i, 1); //one frame per second

            if (i >= [photoImages count])
            {
                CVBufferRelease(pxbuffer);
                pxbuffer = NULL;

            } 
            else 
            {
                //Create the pixel buffer for this frame (use the requested
                //output size rather than a hard-coded one).
                [self pixelBufferFromCGImage:[[photoImages objectAtIndex:i] CGImage] size:size];
            }          


            if (pxbuffer) 
            {
                // append buffer
                [adaptor appendPixelBuffer:pxbuffer withPresentationTime:presentTime];
                CVBufferRelease(pxbuffer);
                pxbuffer = NULL;
                i++;
            } 
            else 
            {
                //Finish the session:
                [writerInput markAsFinished];
                [videoWriter finishWriting];                

                CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
                [videoWriter release];
                [writerInput release];
                NSLog (@"Done");
                break;
            }
        }
    }

    //Empty the array, releasing the images it holds.
    [photoImages removeAllObjects];
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    // Unable to save the image  
    if (error)
    {
        UIAlertView *alert;
        alert = [[UIAlertView alloc] initWithTitle:@"Error" 
                                           message:@"Unable to save image to Photo Album." 
                                          delegate:self cancelButtonTitle:@"Ok" 
                                 otherButtonTitles:nil];
        [alert show];
        [alert release];
    }
}



- (void) main
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSLog(@"Operation Started");

    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/movie.mp4"]];  
    CGSize size = CGSizeMake(640, 960);

    [self writeImageAsMovietoPath:path size:size] ;

    [pool drain];
}


@end

 

Question

  • Why is there a memory leak in VideoToolBox?
  • Is there a better way to convert a list of images to a video on iOS?
  • How can I reduce my application's memory footprint while creating the video?
Naveen Murthy
1 Answer

There is not a real leak in VideoToolBox. It looks like a leak when running in the Simulator, but when run on the device there is no leak. The answer to parts 2 and 3 of your question is yes, there is a better way: you can make use of a pixel buffer pool for better performance and less memory allocation, but you have to do it "just right".

I am not going to waste your time with cutting and pasting, since you have already experienced the downside of attempting to copy other people's code via S.O. This movie encoding operation is really complex, and the Apple APIs are not easy to use. It is better to look at an actual finished version of working code than to cut and paste your way to success. Basically, you should call CVPixelBufferPoolCreatePixelBuffer() instead of CVPixelBufferCreate() to create each pixel buffer; the pool call reuses an existing buffer when one is available instead of allocating a new one every frame.
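A minimal sketch of the pool-based variant, assuming the `videoWriter` and `writerInput` from the question's code; the drawing step is elided and left as a comment:

```objectivec
//Give the adaptor explicit source attributes so that it creates and owns a
//pixel buffer pool (passing nil, as in the question, leaves pixelBufferPool empty).
NSDictionary *attrs = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], (NSString *)kCVPixelBufferPixelFormatTypeKey,
    [NSNumber numberWithInt:640], (NSString *)kCVPixelBufferWidthKey,
    [NSNumber numberWithInt:960], (NSString *)kCVPixelBufferHeightKey,
    nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                               sourcePixelBufferAttributes:attrs];

//Once writing has started, pull a recycled buffer from the adaptor's pool
//instead of allocating a fresh one with CVPixelBufferCreate().
CVPixelBufferRef buffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     adaptor.pixelBufferPool,
                                                     &buffer);
if (status == kCVReturnSuccess && buffer != NULL)
{
    //Lock the base address and draw the frame into `buffer` with a
    //CGBitmapContext, exactly as in pixelBufferFromCGImage:size: above.
    [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
    CVBufferRelease(buffer); //hands the buffer back to the pool for reuse
}
```

Releasing the buffer after appending is what returns it to the pool; the pool itself is owned by the adaptor and should not be released by your code.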

Have a look at the class AVAssetWriterConvertFromMaxvid.m; it is part of my AVAnimator library on github. This class shows a real working example of a pixel buffer pool, and of how your code needs to interact with the run loop of the secondary thread to avoid getting stuck in certain encoding situations.
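The run-loop interaction can also be approached with AVFoundation's own pull model, which replaces the busy `while (1)` loop in the question. A hedged sketch, reusing `writerInput`, `videoWriter`, and `adaptor` from the question's code; `appendFrame:toAdaptor:` is a hypothetical helper standing in for the draw-and-append step:

```objectivec
//Let AVFoundation invoke this block on a serial queue whenever the input can
//accept more data, instead of polling readyForMoreMediaData in a tight loop.
__block int frameIndex = 0;
int numFrames = [photoImages count];
dispatch_queue_t queue = dispatch_queue_create("com.example.videowriter", NULL);

[writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while (writerInput.readyForMoreMediaData)
    {
        if (frameIndex >= numFrames)
        {
            //All frames appended: close out the file.
            [writerInput markAsFinished];
            [videoWriter finishWriting];
            break;
        }
        //Hypothetical helper: pulls a pooled buffer, draws frame
        //`frameIndex` into it, and appends it with its presentation time.
        [self appendFrame:frameIndex toAdaptor:adaptor];
        frameIndex++;
    }
}];
```

The block is re-invoked by the framework whenever the input drains, so only a handful of frames are ever in flight at once, which keeps the memory footprint flat.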

MoDJ