I am working on a rich-graphics iOS application. At a given moment, the memory taken by our application is 250 MB. I take each frame from the camera, process it with OpenGL shaders, and extract some data. Each time I use the camera to get frames for processing, I see the memory climb to about 280 MB. When I stop capturing frames, memory comes back down to the normal 250 MB. If I repeat the process of starting the camera and exiting 10 times (let's say), I receive a memory warning (though no memory leak is observed). I am not using ARC here. I am maintaining an autorelease pool that wraps the entire processing of a frame. I don't see any leaks while profiling, and after the 10 runs the memory still stands at 250 MB, so I am not sure of the reason for the memory warning. Any insights? I am happy to provide further information.

OpenGL version: ES 2.0; iOS version: 7.0
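For reference, the per-frame autorelease-pool arrangement looks roughly like this (a minimal sketch; the delegate wiring and -processFrame: are placeholders, not the actual code):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        // Drain all per-frame Objective-C temporaries before the next frame
        // arrives; @autoreleasepool works under MRC as well as ARC.
        [self processFrame:sampleBuffer]; // placeholder for the shader pass + data extraction
    }
}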
3 Answers
Consider migrating to ARC: it releases objects for you automatically and tends to make an application's memory behavior more predictable.

According to some other questions like this one (Crash running OpenGL on iOS after memory warning) and this one (instruments with iOS: Why does Memory Monitor disagree with Allocations?) the problem may be that you aren't deleting OpenGL resources (VBOs, textures, renderbuffers, whatever) when you're done with them.
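For instance, each time capture stops you would expect a teardown along these lines (a sketch only; the handle names _fbo, _colorRenderbuffer, _cameraTexture, and _textureCache are assumptions standing in for whatever the app actually holds):

// Delete the GL objects created for the capture session...
if (_fbo)               { glDeleteFramebuffers(1, &_fbo); _fbo = 0; }
if (_colorRenderbuffer) { glDeleteRenderbuffers(1, &_colorRenderbuffer); _colorRenderbuffer = 0; }
if (_cameraTexture)     { glDeleteTextures(1, &_cameraTexture); _cameraTexture = 0; }
// ...and flush the texture cache so recycled camera textures are actually freed.
if (_textureCache)      { CVOpenGLESTextureCacheFlush(_textureCache, 0); }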
Thanks. I am issuing commands to delete all my FBOs, textures, and render buffers. – user862972 Feb 17 '14 at 08:19
Without seeing code, who knows? Are you simply rendering the frame buffer using the presentRenderbuffer method of EAGLContext? Then, what are you doing with the pixelBuffer you passed to CVOpenGLESTextureCacheCreateTextureFromImage? The pixel buffer is the only source of substantial memory in a typical use scenario.
However, if you're swapping the data in the render buffer to another buffer with, say, glReadPixels, then you've introduced one of several memory hogs. If the buffer you swapped to was a CoreGraphics buffer via, say, a CGDataProvider, did you include a data release callback, or did you pass nil as the parameter when you created the provider? Did you glFlush after you swapped buffers?
These are questions I could answer for certain if you provided code; if you would rather tackle this without doing so, but would like to see working code that successfully manages memory in the most arduous use case there could possibly be, see:
https://demonicactivity.blogspot.com/2016/11/tech-serious-ios-developers-use-every.html
For your convenience, I've provided some code below. Place it after any call to the presentRenderbuffer method, commenting out that call if you do not want to render the buffer to the display in the CAEAGLLayer (as I did in the sample below):
// [_context presentRenderbuffer:GL_RENDERBUFFER];

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {
        // To capture the output to an OpenGL render buffer...
        NSInteger myDataLength = _backingWidth * _backingHeight * 4;
        GLubyte *buffer = (GLubyte *)malloc(myDataLength);
        // glReadPixels is governed by the *pack* alignment; 4 matches the
        // tightly packed RGBA rows assumed by myDataLength above.
        glPixelStorei(GL_PACK_ALIGNMENT, 4);
        glReadPixels(0, 0, _backingWidth, _backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

        // To swap the pixel buffer to a CoreGraphics context (as a CGImage);
        // initialize to NULL so the @finally block is safe if @try throws early.
        CGDataProviderRef provider = NULL;
        CGColorSpaceRef colorSpaceRef = NULL;
        CGImageRef imageRef = NULL;
        CVPixelBufferRef pixelBuffer = NULL;
        @try {
            provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, &releaseDataCallback);
            int bitsPerComponent = 8;
            int bitsPerPixel = 32;
            int bytesPerRow = 4 * _backingWidth;
            colorSpaceRef = CGColorSpaceCreateDeviceRGB();
            CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
            CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
            imageRef = CGImageCreate(_backingWidth, _backingHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
        } @catch (NSException *exception) {
            NSLog(@"Exception: %@", [exception reason]);
        } @finally {
            if (imageRef) {
                // To convert the CGImage to a pixel buffer (for writing to a file using AVAssetWriter)
                pixelBuffer = [CVCGImageUtil pixelBufferFromCGImage:imageRef];
                // To verify the integrity of the pixel buffer (by converting it back to a CGImage, and then displaying it in a layer)
                CGImageRef verifyImage = [CVCGImageUtil cgImageFromPixelBuffer:pixelBuffer context:_ciContext];
                imageLayer.contents = (__bridge id)verifyImage;
                CGImageRelease(verifyImage);        // the layer retains its contents
                CVPixelBufferRelease(pixelBuffer);  // balance the +1 returned by pixelBufferFromCGImage
            }
            CGDataProviderRelease(provider);
            CGColorSpaceRelease(colorSpaceRef);
            CGImageRelease(imageRef);
        }
    }
});
. . .
The callback that frees the pixel data once the CGDataProvider is released. It fires when the last reference to the provider (held by the CGImage) goes away, which is why the malloc'ed buffer is not freed manually in the block above:
static void releaseDataCallback(void *info, const void *data, size_t size) {
    free((void *)data);
}
The CVCGImageUtil class interface and implementation files, respectively:
@import Foundation;
@import CoreMedia;
@import CoreGraphics;
@import QuartzCore;
@import CoreImage;
@import UIKit;
@interface CVCGImageUtil : NSObject
+ (CGImageRef)cgImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer context:(CIContext *)context;
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image;
+ (CMSampleBufferRef)sampleBufferFromCGImage:(CGImageRef)image;
@end
#import "CVCGImageUtil.h"
@implementation CVCGImageUtil
+ (CGImageRef)cgImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer context:(CIContext *)context
{
    // CVPixelBuffer to CoreImage
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    image = [image imageByApplyingTransform:CGAffineTransformMakeRotation(M_PI)];
    CGPoint origin = [image extent].origin;
    image = [image imageByApplyingTransform:CGAffineTransformMakeTranslation(-origin.x, -origin.y)];

    // CoreImage to CGImage via the CoreImage context; createCGImage returns
    // a +1 reference, so the caller is responsible for CGImageRelease.
    CGImageRef cgImage = [context createCGImage:image fromRect:[image extent]];

    // CGImage to UIImage (OPTIONAL; note that returning uiImage.CGImage would
    // hand back an object owned by the autoreleased UIImage, not by the caller)
    //UIImage *uiImage = [UIImage imageWithCGImage:cgImage];
    //return (CGImageRef)uiImage.CGImage;
    return cgImage;
}
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    // kCVPixelFormatType_32BGRA matches the little-endian, alpha-first bitmap
    // context created below; the returned buffer is +1 and must be released
    // by the caller with CVPixelBufferRelease.
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          frameSize.width, frameSize.height,
                                          kCVPixelFormatType_32BGRA,
                                          (__bridge CFDictionaryRef)options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameSize.width, frameSize.height,
                                                 8, CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGBitmapByteOrder32Little |
                                                 kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
+ (CMSampleBufferRef)sampleBufferFromCGImage:(CGImageRef)image
{
    CVPixelBufferRef pixelBuffer = [CVCGImageUtil pixelBufferFromCGImage:image];
    CMSampleBufferRef newSampleBuffer = NULL;
    CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
    CMVideoFormatDescriptionRef videoInfo = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                       pixelBuffer,
                                       true,
                                       NULL,
                                       NULL,
                                       videoInfo,
                                       &timingInfo,
                                       &newSampleBuffer);
    // The sample buffer retains both the pixel buffer and the format
    // description; release our own +1 references to avoid leaking them.
    if (videoInfo) CFRelease(videoInfo);
    CVPixelBufferRelease(pixelBuffer);
    return newSampleBuffer;
}
@end
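A hedged usage sketch (cgImage here is an assumption, standing in for whatever image the app holds): each of these class methods returns a +1 object the caller must release:

CMSampleBufferRef sample = [CVCGImageUtil sampleBufferFromCGImage:cgImage];
if (sample) {
    // ... append to an AVAssetWriterInput, etc. ...
    CFRelease(sample); // balance the +1 from sampleBufferFromCGImage
}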
