
This question is different from Ios Xcode Message from debugger: Terminated due to memory issue . I am using a different device, my app is being killed in the foreground, and I cannot use Instruments to see allocations.

I am trying to merge short intervals of many AVAssets into one video file. I need to apply additional filters and transformations to them.

I implemented classes that can take one asset and do everything exactly as I want, but when I try to do the same thing with many shorter assets (about 7 assets is still OK, and the total duration can even be shorter than with one asset), the application crashes and I get only the "Message from debugger: Terminated due to memory issue" log.

I cannot even use most of the Instruments tools, because the application crashes immediately with them. I have tried many things to solve this, without success, and I would really appreciate some help.

Thank you

Relevant code snippets are here:

Creation of composition:

func export(toURL url: URL, callback: @escaping (_ url: URL?) -> Void){
    var lastTime = kCMTimeZero
    var instructions : [VideoFilterCompositionInstruction] = []
    let composition = AVMutableComposition()
    composition.naturalSize = CGSize(width: 1080, height: 1920)

    for (index, assetURL) in assets.enumerated() {
        // AVURLAsset(url:) is non-failable, so no optional or force unwrap is needed
        let asset = AVURLAsset(url: assetURL)
        guard let track = asset.tracks(withMediaType: AVMediaType.video).first else { callback(nil); return }

        let range = CMTimeRange(start: CMTime(seconds: ranges[index].lowerBound, preferredTimescale: 1000),
                                end: CMTime(seconds: ranges[index].upperBound, preferredTimescale: 1000))

        let videoTrack = composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)!
        let audioTrack = composition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)!

        do { try videoTrack.insertTimeRange(range, of: track, at: lastTime) }
        catch { callback(nil); return }

        if let audio = asset.tracks(withMediaType: AVMediaType.audio).first {
            do { try audioTrack.insertTimeRange(range, of: audio, at: lastTime) }
            catch { callback(nil); return }
        }

        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
        layerInstruction.trackID = videoTrack.trackID

        let instruction = VideoFilterCompositionInstruction(trackID: videoTrack.trackID,
                                                            filters: self.filters,
                                                            context: self.context,
                                                            preferredTransform: track.preferredTransform,
                                                            rotate : false)
        instruction.timeRange = CMTimeRange(start: lastTime, duration: range.duration)
        instruction.layerInstructions = [layerInstruction]

        instructions.append(instruction)

        lastTime = lastTime + range.duration
    }

    let videoComposition = AVMutableVideoComposition()
    videoComposition.customVideoCompositorClass = VideoFilterCompositor.self
    videoComposition.frameDuration = CMTimeMake(1, 30)
    videoComposition.renderSize = CGSize(width: 1080, height: 1920)
    videoComposition.instructions = instructions

    let session: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)!
    session.videoComposition = videoComposition
    session.outputURL = url
    session.outputFileType = AVFileType.mp4

    session.exportAsynchronously {
        DispatchQueue.main.async {
            // report the output URL only if the export actually succeeded
            callback(session.status == .completed ? url : nil)
        }
    }
}

and part of AVVideoCompositing class:

func startRequest(_ request: AVAsynchronousVideoCompositionRequest){
    autoreleasepool() {
        self.getDispatchQueue().sync{
            guard let instruction = request.videoCompositionInstruction as? VideoFilterCompositionInstruction else{
                request.finish(with: NSError(domain: "jojodmo.com", code: 760, userInfo: nil))
                return
            }
            guard let pixels = request.sourceFrame(byTrackID: instruction.trackID) else{
                request.finish(with: NSError(domain: "jojodmo.com", code: 761, userInfo: nil))
                return
            }

            // CIImage(cvPixelBuffer:) is non-failable, so no optional or force unwrap is needed
            var image = CIImage(cvPixelBuffer: pixels)

            for filter in instruction.filters {
                filter.setValue(image, forKey: kCIInputImageKey)
                image = filter.outputImage ?? image
            }

            let newBuffer: CVPixelBuffer? = self.renderContext.newPixelBuffer()

            if let buffer = newBuffer {
                instruction.context.render(image, to: buffer)
                request.finish(withComposedVideoFrame: buffer)
            }
            else {
                request.finish(withComposedVideoFrame: pixels)
            }
        }
    }
}
Tomáš Černý
    Possible duplicate of [Ios Xcode Message from debugger: Terminated due to memory issue](https://stackoverflow.com/questions/34089385/ios-xcode-message-from-debugger-terminated-due-to-memory-issue) – picciano Apr 18 '18 at 15:43
  • These problems seem to be unrelated. I am just getting the same general message "Terminated due to memory issue". Believe me, I've seen all the questions here which could be related and I have tried many solutions but none of them worked. Memory usage in Instruments is low (around 17 MB), still I can see here low memory alerts but in the application, `didReceiveMemoryWarning` isn't fired before the crash and even if it was, I don't know how to free the memory in this case. – Tomáš Černý Apr 18 '18 at 20:48
  • Any chance you managed to solve the rotation issue with a custom AVVideoCompositing class? It's not respecting the asset transform, all the CIFilters in the request are always in landscape... We stuck on this for over a week ah – Roi Mulia Dec 12 '18 at 12:06

1 Answer


This kind of memory warning usually appears when the app is processing a large amount of data, and there are a few ways to work around it.

To avoid it, get into the habit of using a `[unowned self]` (or `[weak self]`) capture list in your closures.

If a closure captures `self` strongly, it can create a retain cycle; the leaked memory accumulates and at some point the app will crash.

You can find more on `[unowned self]` from below link: Shall we always use [unowned self] inside closure in Swift

After adding `[unowned self]`, add a `deinit { }` to your class and release (or set to `nil`) any data you no longer need.
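As a minimal sketch of the idea (the `Exporter` type and its members are made up for illustration, not taken from the question's code): a closure stored by an object that captures `self` strongly keeps that object, and everything it holds, alive forever; a `[weak self]` capture list breaks the cycle so `deinit` can run and release large buffers.

```swift
import Foundation

// Hypothetical example: `Exporter` stores a closure. Capturing `self`
// strongly inside `handler` would create a retain cycle
// (Exporter -> handler -> Exporter), so the instance and any large
// buffers it owns would never be freed.
final class Exporter {
    var handler: (() -> Void)?
    var cachedFrames: [Data] = []   // stand-in for large pixel buffers
    let onDeinit: () -> Void        // lets us observe deallocation

    init(onDeinit: @escaping () -> Void = {}) {
        self.onDeinit = onDeinit
    }

    func start() {
        // `[weak self]` breaks the cycle; use `[unowned self]` only when
        // the closure provably cannot outlive the instance.
        handler = { [weak self] in
            self?.cachedFrames.removeAll()
        }
    }

    deinit {
        cachedFrames.removeAll()    // release anything still held
        onDeinit()
    }
}

var wasDeallocated = false
var exporter: Exporter? = Exporter(onDeinit: { wasDeallocated = true })
exporter?.start()
exporter = nil   // deinit runs because the capture is weak
print(wasDeallocated ? "Exporter deallocated" : "Exporter leaked")
```

With a strong capture (`handler = { self.cachedFrames.removeAll() }`) the last line would report a leak, because setting `exporter` to `nil` would not drop the instance's reference count to zero.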

Pratik Patel
  • Unfortunately I am still getting low memory warnings and crash even with `[unowned self]` and `deinit()` :( . Here is output from Instruments: ![Instruments](https://drive.google.com/file/d/1CcaXGbCSym2TBPn0tz5oTT1dPretm97p/view?usp=sharing). – Tomáš Černý Apr 19 '18 at 07:26
  • Hey Pratik, I've posted question regards this subject. Any chance you can take a look and maybe help me and my team? We are stuck on a small issue for days ah : https://stackoverflow.com/questions/51114201/updating-avplayeritem-video-composition – Roi Mulia Jun 30 '18 at 10:54