
I am writing an OS X application that creates a video from a series of images. It is based on the code from Make movie file with picture Array and song file, using AVAsset, minus the audio portion.

The code runs and produces the movie file.

The problem is memory pressure: the app never appears to free any memory. Using Xcode Instruments I found the biggest culprits are:

CVPixelBufferCreate
[image TIFFRepresentation];
CGImageSourceCreateWithData
CGImageSourceCreateImageAtIndex

I tried adding code to release, but ARC should already be doing that. Eventually OS X hangs and/or crashes.

I'm not sure how to handle the memory issue. There are no mallocs in the code, and it appears that many others have used this same code. I'm open to suggestions.

This is the code that is based on the link above:

- (void)ProcessImagesToVideoFile:(NSError **)error_p size:(NSSize)size videoFilePath:(NSString *)videoFilePath jpegs:(NSMutableArray *)jpegs fileLocation:(NSString *)fileLocation
{

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                              [NSURL fileURLWithPath:videoFilePath]
                                                       fileType:AVFileTypeMPEG4
                                                          error:error_p];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                               [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                               nil];
    AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput
                                            assetWriterInputWithMediaType:AVMediaTypeVideo
                                            outputSettings:videoSettings];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];



    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);

    videoWriterInput.expectsMediaDataInRealTime = YES;

    [videoWriter addInput:videoWriterInput];
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];


    CVPixelBufferRef buffer = NULL;

    //Write all picture array in movie file.

    int frameCount = 0;

    for(int i = 0; i<[jpegs count]; i++)
    {
        NSString *filePath = [NSString stringWithFormat:@"%@%@", fileLocation, [jpegs objectAtIndex:i]];
        NSImage *jpegImage = [[NSImage alloc ]initWithContentsOfFile:filePath];
        CMTime frameTime = CMTimeMake(frameCount,(int32_t) 24);

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30)
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData)
            {
                if ((frameCount % 25) == 0)
                {
                    NSLog(@"appending %d to %@ attempt %d\n", frameCount, videoFilePath, j);
                }


                buffer = [self pixelBufferFromCGImage:jpegImage  andSize:size];
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                if (append_ok == NO) //fails on 3GS, but works on iPhone 4
                {
                    NSLog(@"failed to append buffer");
                    NSLog(@"The error is %@", [videoWriter error]);
                }
                //CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
                //NSParameterAssert(bufferPool != NULL);

                if(buffer)
                {
                    CVPixelBufferRelease(buffer);
                    //CVBufferRelease(buffer);
                }
            }
            else
            {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }

        if (!append_ok)
        {
            printf("error appending image %d times %d\n", frameCount, j);
        }

        frameCount++;
            //CVBufferRelease(buffer);
        jpegImage = nil;
        buffer = nil;
    }

    //Finish writing picture:
    [videoWriterInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^(){
    NSLog (@"finished writing");

    }];
}

- (CVPixelBufferRef) pixelBufferFromCGImage: (NSImage *) image andSize:(CGSize) size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES],     kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES],     kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                      size.width,
                                      size.height,
                                      kCVPixelFormatType_32ARGB,
                                      (__bridge CFDictionaryRef) options,
                                      &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height,
                                             8, 4*size.width, rgbColorSpace,
                                             kCGImageAlphaPremultipliedFirst);
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGImageRef imageRef = [self nsImageToCGImageRef:image];
    CGRect imageRect = CGRectMake(0, 0, CGImageGetWidth(imageRef),     CGImageGetHeight(imageRef));
    CGContextDrawImage(context, imageRect, imageRef);


    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    imageRef = nil;
    context = nil;
    rgbColorSpace = nil;
    return pxbuffer;
}

- (CGImageRef)nsImageToCGImageRef:(NSImage*)image;
{
    NSData * imageData = [image TIFFRepresentation];// memory hog
    CGImageRef imageRef;
    if(!imageData) return nil;
    CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    imageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);

    imageData = nil;
    imageSource = nil;
    return imageRef;
}
Mike Weber

2 Answers


ARC works only for retainable object pointers. ARC documentation defines them as

A retainable object pointer (or “retainable pointer”) is a value of a retainable object pointer type (“retainable type”). There are three kinds of retainable object pointer types:

  1. block pointers (formed by applying the caret (^) declarator sigil to a function type)

  2. Objective-C object pointers (id, Class, NSFoo*, etc.)

  3. typedefs marked with __attribute__((NSObject))

Other pointer types, such as int* and CFStringRef, are not subject to ARC’s semantics and restrictions.

You already explicitly call release here:

CGContextRelease(context);

You should do the same for the other Core Foundation-style objects, e.g.

CVPixelBufferRelease(pxbuffer);

for pxbuffer.
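In the question's code, for example, nsImageToCGImageRef: creates a CGImageSourceRef that is never released, and the CGImageRef it returns is never released by the caller, so both leak on every frame. A sketch of the fix (the same method as in the question, with the missing releases added):

- (CGImageRef)nsImageToCGImageRef:(NSImage *)image
{
    NSData *imageData = [image TIFFRepresentation];
    if (!imageData) return NULL;

    CGImageSourceRef imageSource =
        CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    if (!imageSource) return NULL;

    CGImageRef imageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
    CFRelease(imageSource); // CF object: ARC will not release this for you

    // Per the Create rule the caller owns imageRef and must CGImageRelease() it.
    return imageRef;
}

The caller (pixelBufferFromCGImage:andSize:) would then call CGImageRelease(imageRef) after CGContextDrawImage; assigning nil to a CF-style pointer does nothing.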

Avt
  • @AminNegm-Awad ARC *can* work for every retainable object but *unless you specify otherwise* it will not touch anything that was created by a C function, including `CGContext` and `CVPixelBuffer`. You have to manually release those, or else manually tell ARC to release them. – Abhi Beckert Apr 27 '14 at 19:14
  • 1. "ARC works only for subclasses of NSObject." is wrong in many ways. 2. The place of creation (C function or Objective-C method) is without any impact on that. 3. It works in a documented way for *every retainable object* without any exception. This has nothing to do with C. 4. You mix up completely different things. 5. Please read the documentation at llvm.org. – Amin Negm-Awad Apr 27 '14 at 19:24
  • @AminNegm-Awad: ARC does not automatically manage CoreFoundation-style objects, despite the fact that they are retainable. – Chuck Apr 27 '14 at 19:29
  • I know, that CF Objects are not handled. He said: "ARC works only for subclasses of NSObject." This is wrong. You say: "Despite the fact that they are retainable". This is wrong. For you, too: Please read the documentation at llvm.org. There is a definition for "retainable object pointer". – Amin Negm-Awad Apr 27 '14 at 19:38
  • ARC works for every retainable *Objective-C* object. – Wevah Apr 27 '14 at 19:53
  • Isn't OP already doing that in `ProcessImagesToVideoFile:size:videoFilePath:jpegs:fileLocation:`? – Chuck Apr 27 '14 at 20:19
  • @Chuck OP changed his code after I posted my answer. In the first code version some Core Video objects were not released. I had no time to recheck his code. – Avt Apr 27 '14 at 21:51
  • @Wevah: From llvm: "block pointers (…) Objective-C object pointers (…) typedefs marked with __attribute__((NSObject))" So there are two simple examples for objects ARC works with, even they are no member of a NSObject: 1. NSProxy, 2. Blocks. But the answer is edited. Everything is fine. – Amin Negm-Awad Apr 28 '14 at 05:03
  • `NSProxy` and blocks (under the hood) are both Objective-C objects though. :P – Wevah Apr 28 '14 at 20:59

Your code is using ARC, but the libraries you are calling might not be. They might be relying on the older autorelease pool system to free up memory.

You should read up on how it works; this is fundamental stuff that every Obj-C developer needs to memorise. Basically, any object can be added to the current "pool" of objects, which will be released when the pool is drained.

By default, the pool on the main thread is drained each time the app enters an idle state. This usually works fine, since the main thread should never be busy for more than a few hundredths of a second, and you can't really build up much memory in that amount of time.

When you do a lengthy, memory-intensive operation you need to set up an autorelease pool manually. It is most commonly put inside a for or while loop (although you can actually put it anywhere you want; that's just the most useful scenario):

for ( ... ) {
  @autoreleasepool {
    // do somestuff
  }
}

Also, ARC is only for Objective-C code. It does not apply to objects created by C functions such as CGColorSpaceCreateDeviceRGB() and CVPixelBufferCreate(). Make sure you are manually releasing all of those.
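Applied to the question's frame loop, the combination of both fixes would look roughly like this (a sketch, keeping the names from the question and omitting the retry/ready-check logic):

for (int i = 0; i < [jpegs count]; i++)
{
    @autoreleasepool {
        // Everything autoreleased for this frame (the TIFF data,
        // the image source, etc.) is freed when this pool drains.
        NSString *filePath = [fileLocation stringByAppendingString:jpegs[i]];
        NSImage *jpegImage = [[NSImage alloc] initWithContentsOfFile:filePath];
        CMTime frameTime = CMTimeMake(frameCount, 24);

        CVPixelBufferRef buffer = [self pixelBufferFromCGImage:jpegImage andSize:size];
        [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
        if (buffer) CVPixelBufferRelease(buffer); // CF-style object: release manually

        frameCount++;
    }
}

This keeps the per-frame memory bounded instead of letting it accumulate until the writer finishes.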

Abhi Beckert
  • Your first paragraph seems to imply that autorelease pools don't exist under ARC, which I don't think is what you meant to say. – jscs Apr 27 '14 at 19:29
  • @JoshCaswell my understanding is ARC will typically release objects immediately after the last line of code that uses them, instead of autoreleasing them on creation? So any object maintained under ARC will be released without ever entering an autorelease pool. – Abhi Beckert Apr 27 '14 at 19:33
  • You can have ARC manage objects returned from frameworks that were compiled without ARC just fine, providing the framework follows the standard method-naming conventions. The magical never-enters-autoreleasepool stuff only happens when the two things were compiled under ARC (because ARC looks up the call stack for specific calls). – Wevah Apr 27 '14 at 19:52
  • Both the ColorSpace and PixelBuffer are released by calling CGColorSpaceRelease and CVPixelBufferRelease. I also wrapped the code that generates the video file within a @autoreleasepool{} block, but I do not see the memory being freed after the code exits that block. – Mike Weber Apr 28 '14 at 00:16