
I'm trying to make an app that processes a set of frames, stored as JPG files inside the app, using the Google Mobile Vision API.

The pipeline is simple.

1) I create the detector with some options:

     _options = @{
                  GMVDetectorFaceLandmarkType : @(GMVDetectorFaceLandmarkAll),
                  GMVDetectorFaceClassificationType : @(GMVDetectorFaceClassificationAll),
                  GMVDetectorFaceTrackingEnabled : @(NO)
                  };
    _faceDetector = [GMVDetector detectorOfType:GMVDetectorTypeFace options:_options];

2) I read a frame with this method:

    UIImage *image = [UIImage imageWithContentsOfFile:imFile];

The path contained in imFile is correct; I can see the image representation in the debugger.

3) At last, I process the frame:

    NSArray<GMVFaceFeature *> *faces = [_faceDetector featuresInImage:image options:nil];

With this code I can process some frames, but when analyzing a lot of them the app's memory usage keeps increasing until the app is killed automatically.

I've tried to track down the memory leak, and as far as I can tell it comes from the last part, inside the [detector featuresInImage:...] call.

Is there something I am doing wrong, or is there a memory leak inside the detector? I've searched for a known issue reported to Google but couldn't find one.
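
One thing I'm not sure about is whether an explicit autorelease pool per iteration would help, since the UIImage and the returned GMVFaceFeature array are autoreleased objects. A minimal sketch of how I could wrap my loop (frameFiles is a placeholder for my actual array of JPG paths):

    // Sketch: drain autoreleased objects (the UIImage and the
    // GMVFaceFeature array) after every frame instead of letting
    // them pile up until the end of the loop.
    for (NSString *imFile in frameFiles) {
        @autoreleasepool {
            UIImage *image = [UIImage imageWithContentsOfFile:imFile];
            NSArray<GMVFaceFeature *> *faces =
                [_faceDetector featuresInImage:image options:nil];
            // ... write faces[0].smilingProbability to the results file ...
        }
    }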

EDIT:

Here is what I do with each of the results of the detector:

    if ([faces count] > 0) {
        GMVFaceFeature *face = [faces objectAtIndex:0];
        NSFileHandle *myHandle = [NSFileHandle fileHandleForWritingAtPath:filename];
        [myHandle seekToEndOfFile];

        NSString *lineToWrite = [NSString stringWithFormat:@"%u,%f\n",
                                 fNumber, face.smilingProbability];
        [myHandle writeData:[lineToWrite dataUsingEncoding:NSUTF8StringEncoding]];
        // Close the handle so file descriptors don't accumulate across frames.
        [myHandle closeFile];
    }

The method ends there, so basically all I do is open a file and append a line to it.
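
In case it matters, here is how I could restructure the writing so the NSFileHandle is opened once for the whole run instead of once per frame (a sketch; resultsHandle would be created before the loop and kept around, filename and fNumber are the same variables as above):

    // Sketch: open the output file once, reuse the handle for every
    // frame, and close it when the whole run is done.
    NSFileHandle *resultsHandle = [NSFileHandle fileHandleForWritingAtPath:filename];
    [resultsHandle seekToEndOfFile];

    // Per frame:
    NSString *line = [NSString stringWithFormat:@"%u,%f\n",
                      fNumber, face.smilingProbability];
    [resultsHandle writeData:[line dataUsingEncoding:NSUTF8StringEncoding]];

    // After the last frame:
    [resultsHandle closeFile];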

Ivan
  • Is it possible that the memory is allocated in the featuresInImage method, but after the features are returned to you, you're storing that info in an array you keep around a bit too long? I suggest adding a return right after that line of code, to see whether the automatic memory management is doing its job correctly. If so, your problem is actually after the code you posted. It's possible the Google code leaks something, but it's unlikely. – Axy Jul 26 '18 at 05:24
  • I added some more details – Ivan Jul 26 '18 at 06:52
  • Can't see anything wrong in your code. Can I ask how you tracked the memory issue to that line of code? And if you comment it out, does the issue disappear? – Axy Jul 26 '18 at 06:55
  • Yep, what I did (which may be wrong) is release all the memory I could after using it, all the UIImages, everything, and the profiler showed basically the same memory allocations. One object type (coming from the featuresInImage call) keeps accumulating until it finally crashes my app. – Ivan Jul 26 '18 at 07:16
  • Maybe you can try this. I once struggled with a memory issue caused by calling UIKit render functions off the main thread. So I suggest wrapping your function in a dispatch to the main thread and profiling the app. If the memory issue disappears, you (or Google) may be using a UIKit function on a background thread, which prevents UIKit from releasing its memory. – Axy Aug 06 '18 at 06:00

0 Answers