
I’m playing with the new VNGeneratePersonSegmentationRequest Vision API to make a simple background removal filter.

I made a small project to test it and it works great, but I’m running into memory issues. After executing the request, the app’s memory consumption grows by about 300 MB that is never freed.

I’ve cycled through a bunch of images and requests in a test run and, thankfully, memory consumption remains constant even when filtering more images. But I worry about that initial memory that is never freed, even when inducing a memory warning. I suspect the Vision framework keeps that memory around after being called, but my app doesn’t handle video frames or anything like that, so it’s memory going to waste.

// Autorelease pool doesn't help
    autoreleasepool {
        // Create and configure the segmentation request
        let request = VNGeneratePersonSegmentationRequest()
        request.revision = VNGeneratePersonSegmentationRequestRevision1
        request.qualityLevel = .accurate
        request.outputPixelFormat = kCVPixelFormatType_OneComponent8

        let handler = VNImageRequestHandler(ciImage: inputs.ciImage, options: [:])

        // A jump in memory usage after running the handler.
        // Subsequent calls don't add to memory usage.
        do {
            try handler.perform([request])
        } catch {
            return
        }

        // Even if I delete this chunk of code, memory consumption remains high.
        if let maskBuffer = request.results?.first?.pixelBuffer {
            let mask = CIImage(cvPixelBuffer: maskBuffer)
            // Scale the mask up to the input image's size
            let maskScaleX = inputs.ciImage.extent.width / mask.extent.width
            let maskScaleY = inputs.ciImage.extent.height / mask.extent.height
            self.personMask = mask.transformed(by: CGAffineTransform(
                scaleX: maskScaleX, y: maskScaleY))
        }
    }
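
For context only (not part of the original code): a minimal sketch of how a scaled mask like this might then be composited to do the actual background removal, assuming a CIBlendWithMask filter and a placeholder replacement image named background.

    import CoreImage
    import CoreImage.CIFilterBuiltins

    // Sketch: blend the original image over a replacement background using the
    // person-segmentation mask (already scaled to the original's extent).
    func composite(original: CIImage, mask: CIImage, background: CIImage) -> CIImage? {
        let blend = CIFilter.blendWithMask()
        blend.inputImage = original          // foreground (person)
        blend.backgroundImage = background   // replacement background
        blend.maskImage = mask               // segmentation mask
        return blend.outputImage
    }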
  • Have you used Instruments to check which objects use up that memory? I bet Vision initializes an ML model when you perform the request for the first time and keeps it around just in case you want to perform another one. It will probably release the model again if memory pressure is too high. – Frank Rupprecht Oct 18 '21 at 16:47
  • I'm getting different readings in Xcode's memory report and in Allocations (around 20% of the report's numbers, with only about 11 MB persistent from the call to VNImageRequestHandler). I ran a Release build and still get high memory usage in the memory report; I don't know if attaching a debugger might be the reason – Ebarella Oct 19 '21 at 23:52
  • Can you please try disabling Metal API Validation and GPU Frame Capture? – Frank Rupprecht Oct 21 '21 at 05:27
  • Memory report remains the same in both Release and Debug builds with those settings. Using the approach from the link below (sketched after these comments) gives numbers similar to the Allocations instrument, so maybe there's something wrong with the memory report? https://stackoverflow.com/questions/40991912/how-to-get-memory-usage-of-my-application-and-system-in-swift-by-programatically/40992791 – Ebarella Oct 22 '21 at 00:32
  • Have you found any solution to this? I have the same issue – teodik abrami Apr 24 '22 at 19:08
  • No, there's been a few releases I don't check instruments to avoid anxiety. It also depends on the device used, an iPhone 13 seems to reduce memory usage. Might try to get into a tech talk on WWDC to see if an engineer has something to say – Ebarella Apr 25 '22 at 20:11

0 Answers