
I'm working on a macOS program in Objective-C that needs to produce in-memory thumbnails to send to a server. The following code is used to perform this operation. As the program runs, a leak of about 40 MB occurs each time this method is called. I suspect I'm missing something really basic, but I don't see the source of the problem.

I should add that I've also tried creating one context to use over the life of the program and the problem, if anything, seems somewhat worse.

When I run Instruments, the allocations for the category "VM: ImageIO_JPEG_Data" grow by one 40 MB allocation each time the method is called. The responsible library is "ImageIO" and the responsible caller is "ImageIO_Malloc".

- (void) createPhotoThumbnail
{
    NSURL* fileURL = [NSURL fileURLWithPath : _imagePath];

    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGContextRef bitmapContext = CGBitmapContextCreate(NULL, MAX_THUMB_DIM, MAX_THUMB_DIM, 8, 0,
                                                       colorspace, (CGBitmapInfo)kCGImageAlphaNoneSkipLast);
    CIContext *ciContext = [CIContext contextWithCGContext: bitmapContext options: @{}];

    if (fileURL)
    {
        CIImage *image = [[CIImage alloc] initWithContentsOfURL: fileURL];

        if (image)
        {
            // scale the image
            CIFilter *scaleFilter = [CIFilter filterWithName: @"CILanczosScaleTransform"];
            [scaleFilter setValue: image forKey: @"inputImage"];
            NSNumber *scaleFactor = [[NSNumber alloc] initWithFloat: ((float) MAX_THUMB_DIM) /
                                     ((float)MAX(_processedWidth, _processedHeight))];
            [scaleFilter setValue: scaleFactor forKey: @"inputScale"];
            [scaleFilter setValue: @1.0 forKey: @"inputAspectRatio"];
            CIImage *scaledImage = [scaleFilter valueForKey: @"outputImage"];

            NSMutableData* thumbJpegData = [[NSMutableData alloc] init];
            CGImageDestinationRef dest = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)thumbJpegData,
                                                                          (__bridge CFStringRef)@"public.jpeg",
                                                                          1,
                                                                          NULL);
            if (dest)
            {
                CGImageRef img = [ciContext createCGImage:scaledImage
                                                  fromRect:[scaledImage extent]];
                CGImageDestinationAddImage(dest, img, nil);
                if (CGImageDestinationFinalize(dest))
                {
                    // encode it as a string for later
                    _thumbnail = [thumbJpegData base64EncodedStringWithOptions: 0];
                }
                else
                {
                    DDLogError(@"Failed to generate photo thumbnail");
                }
                CGImageRelease(img);
                CFRelease(dest);
            }
            else
            {
                DDLogError(@"Failed to finalize photo thumbnail image");
            }
            thumbJpegData = nil;
        }
    }

    CGContextRelease(bitmapContext);
    CGColorSpaceRelease(colorspace);
    ciContext = nil;

}

UPDATE: I switched the code to use a CGAffineTransform instead of the "CILanczosScaleTransform" filter and the symptom did not change. Next I used a completely new method (snippet below) and yet the problem persists.

NSImage *thumbnail = [[NSImage alloc] initWithSize: newSize];

[thumbnail lockFocus];
[sourceImage setSize: newSize];
[[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
[sourceImage compositeToPoint: NSZeroPoint operation: NSCompositeCopy];
[thumbnail unlockFocus];

NSData *tiff = [thumbnail TIFFRepresentation];
NSBitmapImageRep *imageRep = [NSBitmapImageRep imageRepWithData: tiff];
NSDictionary *imageProps = [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.9] forKey:NSImageCompressionFactor];
NSData *thumbJpegData = [imageRep representationUsingType:NSJPEGFileType properties:imageProps];
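
For reference, the affine-transform variant mentioned above wasn't posted; a minimal sketch of that approach (a reconstruction for illustration, not the original code) looks roughly like this, reusing image and ciContext from the method in the question:

// Reconstruction for illustration only, not the code actually used.
// Scale the CIImage directly instead of going through CILanczosScaleTransform.
CGFloat scale = ((CGFloat) MAX_THUMB_DIM) / ((CGFloat) MAX(_processedWidth, _processedHeight));
CIImage *scaledImage = [image imageByApplyingTransform: CGAffineTransformMakeScale(scale, scale)];
CGImageRef img = [ciContext createCGImage: scaledImage fromRect: [scaledImage extent]];
// ... hand img to the CGImageDestination code as before, then CGImageRelease(img) ...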

This is making me think the problem is something inherent in the way I'm doing this; I find it hard to believe that two different methods of image scaling would exhibit the same sort of leak.

– Fitter Man
  • How big are your input images you are scaling? Are you using ARC? – Sven Mar 13 '15 at 23:38
  • @Sven - Despite the OP's self-deprecating remarks, this isn't the usual "simple failure to observe the 'Create Rule'" nor is it a non-ARC question. I can reproduce the behavior that the OP describes, using ARC and I see nothing obviously wrong here. For me, it's leaking the uncompressed image size (so my 2888x1800 screen snapshot is leaking 20mb for each image). The use of ivars is a little sloppy here, but even after one remedies that, the fundamental issue of leaking memory looks legit to me. – Rob Mar 14 '15 at 01:23
  • Yes, using ARC. My images are various photos, so the size varies quite a bit (from older 3MP camera to a newer 12MP camera) yet the size seems to be consistent at around 40MB per image. – Fitter Man Mar 14 '15 at 03:25

2 Answers


Thanks to this answer I was able to identify the need for an autorelease pool, something I was completely unaware of. The method in the question is one of a series of methods that are called repeatedly inside a tight loop, so the enclosing autorelease pool never gets a chance to drain and the autoreleased image buffers pile up until the loop finishes. The block now looks like this:

@autoreleasepool {
    [self findRelevantAdjustments];
    [self adjustForStraightenCrop];
    [self moveFacesRelativeToTopLeftOrigin];
    [self createPhotoThumbnail];
    [self sendPhotoToServer];
}
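
For context, the pool wraps each pass through the loop rather than the loop as a whole, so each photo's autoreleased buffers are freed before the next one is processed. The surrounding loop looks roughly like this (the collection and loop-variable names are placeholders, not from the original code):

for (NSString *photoPath in photosToUpload)   // illustrative names only
{
    @autoreleasepool {
        // ... the five per-photo calls shown above ...
    }
    // The pool drains here, so the autoreleased ImageIO buffers from this
    // photo are released before the next iteration instead of piling up.
}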

Moral of the story: even with ARC, there are still things to pay attention to when it comes to the memory lifecycle.

– Fitter Man

The problem is not in the CGImageDestinationRef logic, because it still leaks even if you replace that with something far simpler, such as:

NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCIImage:scaledImage];
NSData *data = [rep representationUsingType:NSJPEGFileType properties:nil];

Digging a little further, the problem appears to be an issue within CILanczosScaleTransform. If you use an inputScale of @1.0, the leak disappears. But use something less than @1.0 (even @0.5) and it leaks.

I'd suggest you consider finding a different method for resizing the image.
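
As one illustration (not part of the original answer): ImageIO can decode and downscale the source file in a single step with CGImageSourceCreateThumbnailAtIndex, avoiding Core Image for the resize entirely. A minimal sketch, assuming the same fileURL and MAX_THUMB_DIM as in the question:

// Sketch only: let ImageIO produce the downscaled image directly.
CGImageSourceRef src = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
if (src)
{
    NSDictionary *options = @{ (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
                               (id)kCGImageSourceCreateThumbnailWithTransform   : @YES,   // honor EXIF orientation
                               (id)kCGImageSourceThumbnailMaxPixelSize          : @(MAX_THUMB_DIM) };
    CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(src, 0, (__bridge CFDictionaryRef)options);
    if (thumb)
    {
        // Feed thumb to the existing CGImageDestination JPEG code, then release it.
        CGImageRelease(thumb);
    }
    CFRelease(src);
}

Both CF objects here are created and released per the Create Rule, so there is nothing extra for ARC to manage.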

– Rob
  • Do you have a specific alternate recommendation for how to resize? I have two needs: one is to create thumbnails of a specific max dimension regardless of the source image size. The other is to extract thumbnails of a fixed size from specific spots on the original photo (crop first, in other words), and in that case the scaling can be up- or down-scaling. Thanks for narrowing it down to the transform. – Fitter Man Mar 14 '15 at 03:28
  • No, I'm an iOS guy, so I probably am not the right person to ask. But searching Stack Overflow for "NSImage resize" came up with a few hits such as [this](http://stackoverflow.com/questions/11949250/how-to-resize-nsimage), [this](http://stackoverflow.com/questions/5264993/resize-and-save-nsimage) or [this](http://stackoverflow.com/questions/2531812/trying-to-resize-an-nsimage-which-turns-into-nsdata). I'm sure "NSImage crop" will yield equivalent links. But my experience with CIFilter is that it's powerful and elegant, but a tad slower, anyway, so finding alternatives is probably prudent. – Rob Mar 14 '15 at 04:04
  • I tried using a CIAffineTransform to do the rescale and the same thing happens. It looks like this is completely broken and I will be looking for another way to do this. – Fitter Man Mar 14 '15 at 19:40