
I am trying to create a video from an array of images. It works, but only in the simulator. The main problem seems to be memory usage: I have already optimised the arrays, but it's obviously not enough, and my app crashes at the last stage, when it tries to convert the images to frames.

I said 'array', but I don't actually keep the images in an array; I create them on the fly: I grab the right frame from my video source, apply the overlay image to it, and then convert the result into a frame for the new video.

func getTheCombinedImage(frameNumber: Int)->UIImage {
    let videoURLAsset = videoAsset as! AVURLAsset
    let generator:AVAssetImageGenerator = AVAssetImageGenerator(asset: videoURLAsset)
    generator.requestedTimeToleranceBefore = kCMTimeZero
    generator.requestedTimeToleranceAfter = kCMTimeZero

    var actualTime: CMTime = CMTimeMake(0, 1) // placeholder only; copyCGImageAtTime overwrites it
    let duration:CMTime = CMTimeMake(Int64(frameNumber), Int32(30))
    let frameRef:CGImageRef = try! generator.copyCGImageAtTime(duration, actualTime: &actualTime)
    let sourceImage:UIImage = UIImage(CGImage: frameRef)
    let tempImage:UIImage = getTheTempImage(frameNumber)

    UIGraphicsBeginImageContext(sourceImage.size)
    sourceImage.drawInRect(CGRect(x: 0, y: 0, width: sourceImage.size.width, height: sourceImage.size.height), blendMode: CGBlendMode.Normal, alpha: 1.0)
    tempImage.drawInRect(CGRect(x: 0, y: 0, width: tempImage.size.width, height: tempImage.size.height), blendMode: CGBlendMode.Normal, alpha: 1.0)
    let combinedImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return combinedImage
}

This function doesn't seem very good in terms of memory (and maybe in other ways too). It looks like I'm not releasing something here, but what?

When I don't use this function my memory situation is much better:

[screenshot from Instruments]

Also, I think it's still not enough, because 180 MB of memory usage is too high, and after converting to video my memory usage stays around 100-110 MB (instead of the 55-60 MB it was before). In Instruments/Allocations I can see a lot of JVTLib instances from VideoToolbox (like JVTLib_101510(JVTLib_101496, int)*), which I assume are needed for the conversion, and _CVPixelBufferStandardMemoryLayout instances, which I think were created by me, so obviously I'm not releasing something here either.

func appendPixelBufferForImageAtURL(image: UIImage, pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor, presentationTime: CMTime) -> Bool {
    var appendSucceeded = true

    autoreleasepool {
        var pixelBuffer: CVPixelBuffer? = nil
        let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixelBufferAdaptor.pixelBufferPool!, &pixelBuffer)

        if let pixelBuffer = pixelBuffer where status == 0 {
            let managedPixelBuffer = pixelBuffer
            fillPixelBufferFromImage(image, pixelBuffer: managedPixelBuffer)
            appendSucceeded = pixelBufferAdaptor.appendPixelBuffer(pixelBuffer, withPresentationTime: presentationTime)
        } else {
            NSLog("error: Failed to allocate pixel buffer from pool")
        }
    }

    return appendSucceeded
}

I don't understand what is wrong here.

func fillPixelBufferFromImage(image: UIImage, pixelBuffer: CVPixelBufferRef) {
    let imageData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage))
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)
    let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
    let bitmapInfo = CGImageAlphaInfo.PremultipliedFirst.rawValue
    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

    let context = CGBitmapContextCreate(
        pixelData,
        Int(self.externalInputSize.width),
        Int(self.externalInputSize.height),
        8,
        CVPixelBufferGetBytesPerRow(pixelBuffer),
        rgbColorSpace,
        bitmapInfo
    )

    let imageDataProvider = CGDataProviderCreateWithCFData(imageData)
    CGContextDrawImage(context, CGRectMake(0, 0, self.externalInputSize.width, self.externalInputSize.height), image.CGImage)
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
}

I'm not releasing the context here, but it seems that I don't have to do that in Swift (Do you need to release CGContextRef in Swift?)

Update: thanks, guys.

It works now. What helped (in case someone has the same problem):

  1. Don't put images into arrays (I had fixed this before posting the question, but still).

  2. Don't panic, and wrap everything in autoreleasepool.

  3. Try to optimise memory usage everywhere (the lower your memory usage at the start, the more headroom you have).

  4. Put the processed image into a property.
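Roughly, the converting loop now looks like this (a sketch only; `writerInput`, `pixelBufferAdaptor`, and `frameCount` are my own names for the asset-writer pieces, not shown above):

```swift
// Sketch: each frame's temporaries are drained per iteration (point 2),
// and the processed image lives in a property (point 4) instead of an array.
while writerInput.readyForMoreMediaData && frameNumber < frameCount {
    autoreleasepool {
        self.combinedImage = self.getTheCombinedImage(frameNumber)  // point 4
        self.appendPixelBufferForImageAtURL(self.combinedImage!,
            pixelBufferAdaptor: pixelBufferAdaptor,
            presentationTime: CMTimeMake(Int64(frameNumber), 30))
        frameNumber += 1
    }
}
```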

lithium
    what is the resolution of images? – heximal Oct 20 '15 at 12:17
  • 1
    To emphasize @heximal's comment, a `UIImage` takes four bytes per pixel. A 1000x1000 image therefore takes 4 million bytes of RAM, and an array of 1,000 images of this size takes 4 billion bytes (not counting overhead for the array itself). – nhgrif Oct 20 '15 at 12:26
  • The resolution of images depends on video resolution. My test file is 640x360. But I'm not using array for images, I'm creating every image on demand. – lithium Oct 20 '15 at 13:02
  • You can try to use the Allocations instrument. Here is a link to the documentation and tutorial: https://developer.apple.com/library/mac/documentation/AnalysisTools/Reference/Instruments_User_Reference/AllocationsInstrument/AllocationsInstrument.html – NixSolutionsMobile Oct 20 '15 at 12:40

2 Answers


By the looks of it, you're writing non-realtime data to an AVAssetWriter. In that case, you want to be careful not to overwhelm the writer by giving it frames faster than it can encode them. The best way to do this is to let the writer pull data from you rather than pushing data at the writer. And that's most easily done with AVAssetWriterInput.requestMediaDataWhenReadyOnQueue(_:usingBlock:). There's a simple example in the docs for how to use this function.

In this pull style, you tell the writer "here's a block to call whenever you can handle more data." The writer calls it, and the block should keep adding data until readyForMoreMediaData becomes false. Then the block returns, and it will be called again whenever the writer is ready for more. You set an "I'm done now" boolean when there are no more frames.
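A rough sketch of that shape, in Swift 2 style to match your code (`writerInput`, `writer`, `pixelBufferAdaptor`, and `frameCount` are assumed to come from your existing setup):

```swift
// Sketch only: the writer pulls frames from us through this block.
let mediaQueue = dispatch_queue_create("mediaInputQueue", DISPATCH_QUEUE_SERIAL)
var frameNumber = 0

writerInput.requestMediaDataWhenReadyOnQueue(mediaQueue) {
    // Feed frames only while the writer says it can take more.
    while writerInput.readyForMoreMediaData && frameNumber < frameCount {
        autoreleasepool {
            let time = CMTimeMake(Int64(frameNumber), 30)
            let image = self.getTheCombinedImage(frameNumber)
            self.appendPixelBufferForImageAtURL(image,
                pixelBufferAdaptor: pixelBufferAdaptor,
                presentationTime: time)
            frameNumber += 1
        }
    }
    // "I'm done now": no more frames to hand over.
    if frameNumber >= frameCount {
        writerInput.markAsFinished()
        writer.finishWritingWithCompletionHandler {}
    }
}
```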

You should be careful to drain your autorelease pool periodically by making sure that any loops have autoreleasepool{} inside them, but it looks like you generally are already doing that. If you're not, and you have a loop that generates large images without an autoreleasepool, then that's certainly your problem.

As a small benefit to reduce memory churn, you should probably cache any static objects in a property, like rgbColorSpace. You also probably don't need to be generating a UIImage in order to draw into the pixel buffer. You can just use CGContextDrawImage to draw your CGImage directly. (And probably return a CGImage from getTheTempImage.)
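For example, something like this sketch (`externalInputSize` comes from your code; the cached `rgbColorSpace` property and the `CGImage`-taking signature are the suggested changes, not your current API):

```swift
// Cached once, e.g. as a stored property, instead of being recreated per frame.
let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

// Takes a CGImage directly; no intermediate UIImage.
func fillPixelBufferFromCGImage(image: CGImageRef, pixelBuffer: CVPixelBufferRef) {
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)
    let context = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(pixelBuffer),
        Int(externalInputSize.width),
        Int(externalInputSize.height),
        8,
        CVPixelBufferGetBytesPerRow(pixelBuffer),
        rgbColorSpace,  // the cached property
        CGImageAlphaInfo.PremultipliedFirst.rawValue)
    CGContextDrawImage(context,
        CGRectMake(0, 0, externalInputSize.width, externalInputSize.height),
        image)
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
}
```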

Unrelated side note: avoid prefixing methods with get. That has a specific memory management meaning that is contrary to your usage (it means the result will be passed back in a pointer-to-pointer parameter). ARC is smart enough to avoid creating a memory bug, but it's confusing to developers who know the ARC naming rules. Rather than getTheCombinedImage(_:) it should be combinedImageForFrameNumber(_:).

Rob Napier
  • Thanks a lot. I am already using `requestMediaDataWhenReadyOnQueue`, so that's not the problem here. I tried `autoreleasepool{}` in getTheCombinedImage() after katleta3000's advice, and it works: now my memory usage is around 200-250 MB instead of 400-600 MB. It's still too much, but it is definitely better. – lithium Oct 20 '15 at 13:15
  • Why do I want to use CGImage instead of UIImage here? Is it any better in terms of memory? – lithium Oct 20 '15 at 13:19
  • You've already created a `CGImage` (`frameRef`). Wrapping that into a `UIImage` is just a waste (it certainly isn't going to be *more* efficient). `UIImage` doesn't add a lot of overhead, but there's no reason to add anything in this case. – Rob Napier Oct 20 '15 at 13:21

I had a similar problem with a looping image animation (when a UIImage object is created it's cached, so many instances add up to huge memory consumption). ARC is bad in such situations because it holds references over time and doesn't release them inside loops that churn through a lot of memory. Try wrapping your code in autoreleasepool, or look into manual memory management:

autoreleasepool {
  /* code */ 
}
katleta3000
  • Thanks a lot. It's definitely better now, but the problem is still here (250 MB instead of 400-600 MB). – lithium Oct 20 '15 at 13:19
  • @lithium tried releasing CGReferences? like `CGContextRelease()` ? – katleta3000 Oct 20 '15 at 13:37
  • Yes, I tried to do that, but I get an error with the message "CGContextRelease is unavailable: Core Foundation objects are automatically memory managed" – lithium Oct 20 '15 at 13:41