
I have a crash report (already symbolicated, or at least I would hope so, since I obtained this log from Xcode's Organizer):

Incident Identifier: F4324555-0916-4E32-82EF-3272917367BB
Beta Identifier:     80811904-A512-48A1-9593-D386703A62F0
Hardware Model:      iPhone7,2
Process:             SelfieSuperStarz [596]
Path:                /private/var/containers/Bundle/Application/BFA0D82B-274B-400B-8F84-52A1D7369C51/SelfieSuperStarz.app/SelfieSuperStarz
Identifier:          com.PuckerUp.PuckerUp
Version:             21 (1.31)
Beta:                YES
Code Type:           ARM-64 (Native)
Role:                Foreground
Parent Process:      launchd [1]
Coalition:           com.PuckerUp.PuckerUp [434]


Date/Time:           2017-07-29 20:06:11.7394 -0400
Launch Time:         2017-07-29 19:34:39.7433 -0400
OS Version:          iPhone OS 10.3.2 (14F89)
Report Version:      104

Exception Type:  EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Exception Note:  EXC_CORPSE_NOTIFY
Triggered by Thread:  0

Last Exception Backtrace:
0   CoreFoundation                  0x18bebafe0 __exceptionPreprocess + 124 (NSException.m:165)
1   libobjc.A.dylib                 0x18a91c538 objc_exception_throw + 56 (objc-exception.mm:521)
2   CoreFoundation                  0x18be26eb4 -[__NSArray0 objectAtIndex:] + 108 (CFArray.c:69)
3   SelfieSuperStarz                0x10007b708 specialized _ArrayBuffer._getElementSlowPath(Int) -> AnyObject + 116
4   SelfieSuperStarz                0x10007ea40 specialized Merger.merge(completion : () -> (), assets : [Asset]) -> () + 1444 (Merger.swift:0)
5   SelfieSuperStarz                0x100071f3c specialized AssetView.finish(UIButton) -> () + 520 (Merger.swift:0)
6   SelfieSuperStarz                0x1000712d0 @objc AssetView.finish(UIButton) -> () + 40 (AssetView.swift:0)
7   UIKit                           0x192021010 -[UIApplication sendAction:to:from:forEvent:] + 96 (UIApplication.m:4580)
8   UIKit                           0x192020f90 -[UIControl sendAction:to:forEvent:] + 80 (UIControl.m:609)
9   UIKit                           0x19200b504 -[UIControl _sendActionsForEvents:withEvent:] + 440 (UIControl.m:694)
10  UIKit                           0x192020874 -[UIControl touchesEnded:withEvent:] + 576 (UIControl.m:446)
11  UIKit                           0x192020390 -[UIWindow _sendTouchesForEvent:] + 2480 (UIWindow.m:2122)
12  UIKit                           0x19201b728 -[UIWindow sendEvent:] + 3192 (UIWindow.m:2292)
13  UIKit                           0x191fec33c -[UIApplication sendEvent:] + 340 (UIApplication.m:10778)
14  UIKit                           0x1927e6014 __dispatchPreprocessedEventFromEventQueue + 2400 (UIEventDispatcher.m:1448)
15  UIKit                           0x1927e0770 __handleEventQueue + 4268 (UIEventDispatcher.m:1671)
16  UIKit                           0x1927e0b9c __handleHIDEventFetcherDrain + 148 (UIEventDispatcher.m:1706)
17  CoreFoundation                  0x18be6942c __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24 (CFRunLoop.c:1943)
18  CoreFoundation                  0x18be68d9c __CFRunLoopDoSources0 + 540 (CFRunLoop.c:1989)
19  CoreFoundation                  0x18be669a8 __CFRunLoopRun + 744 (CFRunLoop.c:2821)
20  CoreFoundation                  0x18bd96da4 CFRunLoopRunSpecific + 424 (CFRunLoop.c:3113)
21  GraphicsServices                0x18d800074 GSEventRunModal + 100 (GSEvent.c:2245)
22  UIKit                           0x192051058 UIApplicationMain + 208 (UIApplication.m:4089)
23  SelfieSuperStarz                0x10002e990 main + 56 (AppDelegate.swift:16)
24  libdyld.dylib                   0x18ada559c start + 4

As you can see, it says the crash is in my class `Merger` at line 0, which is impossible, as you can probably assume. I am not sure how to interpret what `specialized` means or why the `@objc` is there.

3   SelfieSuperStarz                0x10007b708 specialized _ArrayBuffer._getElementSlowPath(Int) -> AnyObject + 116
4   SelfieSuperStarz                0x10007ea40 specialized Merger.merge(completion : () -> (), assets : [Asset]) -> () + 1444 (Merger.swift:0)
5   SelfieSuperStarz                0x100071f3c specialized AssetView.finish(UIButton) -> () + 520 (Merger.swift:0)
6   SelfieSuperStarz                0x1000712d0 @objc AssetView.finish(UIButton) -> () + 40 (AssetView.swift:0)

I'm just not sure where the error is occurring, since the frames say `Merger.swift:0`, and I don't know whether those prefixes (`specialized`/`@objc`) are telling me anything.

Here is my merge function inside Merger. I use a variety of loops and calculations to determine things like opacity, but I do check for nil in the places I could think of.

func merge(completion:@escaping () -> Void, assets:[Asset]) {

    self.setupAI()

    let assets = assets.sorted(by: { $0.layer.zPosition < $1.layer.zPosition })
    if let firstAsset = controller.firstAsset {

        let mixComposition = AVMutableComposition()

        let firstTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                                                     preferredTrackID: Int32(kCMPersistentTrackID_Invalid))

        do {
            try firstTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, self.controller.realDuration),
                                           of: firstAsset.tracks(withMediaType: AVMediaTypeVideo)[0],
                                           at: kCMTimeZero)
        } catch _ {
            print("Failed to load first track")
        }

        let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]

        var myTracks:[AVMutableCompositionTrack] = []
        var ranges:[ClosedRange<CMTime>] = []

        for asset in assets {

            let secondTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                                                          preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
            secondTrack.preferredTransform = asset.asset.preferredTransform
            do {
                try secondTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, asset.endTime-asset.beginTime),
                                               of: asset.asset.tracks(withMediaType: AVMediaTypeVideo)[0],
                                               at: CMTime(seconds: CMTimeGetSeconds(asset.beginTime), preferredTimescale: 600000))
            } catch _ {
                print("Failed to load second track")
            }
            if(ranges.count == 0) {
                ranges.append(asset.beginTime...asset.endTime)
            }
            else {
                var none = true
                for range in ranges {
                    let start = range.contains(asset.beginTime)
                    let end = range.contains(asset.endTime)
                    var connection = false
                    var nothing = false

                    //This range is completely encompassed (begin and end inside)
                    if(start && end) {
                    //Don't add to the range
                        none = false
                        nothing = true
                    }

                    //Begin is in range (right side)
                    else if(start && !end) {
                        connection = true
                        none = false
                    }

                    //End is in range (left side)
                    else if(!start && end) {
                        connection = true
                        none = false
                    }

                    var connected = false
                    //It connects 2 different times
                    if(connection) {
                        for range2 in ranges {
                            if(range != range2) {
                                if(start && range2.contains(asset.endTime)) {
                                    let index = ranges.index(of: range)
                                    if(index != nil) {
                                        ranges.remove(at: index!)
                                        ranges.append(range.lowerBound...range2.upperBound)
                                        connected = true
                                        break
                                    }
                                }
                                else if(end && range2.contains(asset.beginTime)) {
                                    let index = ranges.index(of: range)
                                    if(index != nil) {
                                        ranges.remove(at: index!)
                                        ranges.append(range.lowerBound...range2.upperBound)
                                        connected = true
                                        break
                                    }
                                }
                            }
                        }
                    }
                    if(!connected && !none && !nothing) {
                        if(start) {
                            let index = ranges.index(of: range)
                            if(index != nil) {
                                ranges.remove(at: index!)
                                ranges.append(range.lowerBound...asset.endTime)
                            }
                        }
                        else if(end) {
                            let index = ranges.index(of: range)
                            if(index != nil) {
                                ranges.remove(at: index!)
                                ranges.append(asset.beginTime...asset.endTime)
                            }
                        }
                    }
                }
                if(none) {
                    ranges.append(asset.beginTime...asset.endTime)
                }
            }
            myTracks.append(secondTrack)
        }

        for range in ranges {
            print(CMTimeGetSeconds(range.lowerBound), CMTimeGetSeconds(range.upperBound))
        }
        for assets in self.controller.assets {
            print(CMTimeGetSeconds(assets.beginTime), CMTimeGetSeconds(assets.endTime))
        }

        if let loadedAudioAsset = self.controller.audioAsset {
            let audioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: 0)
            do {
                try audioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, self.controller.realDuration),
                                               of: loadedAudioAsset.tracks(withMediaType: AVMediaTypeAudio)[0] ,
                                               at: kCMTimeZero)
            } catch _ {
                print("Failed to load Audio track")
            }
        }

        let mainInstruction = AVMutableVideoCompositionInstruction()
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.controller.realDuration)

        // 2.2
        let firstInstruction = self.videoCompositionInstructionForTrack(firstTrack, firstAsset)
        var instructions:[AVMutableVideoCompositionLayerInstruction] = []
        var counter:Int = 0
        for tracks in myTracks {
            let secondInstruction = self.videoCompositionInstructionForTrack(tracks, assets[counter].asset, type:true)
            let index = myTracks.index(of: tracks)

            //This should never be nil, but if it is, it might cause opacity's to go out of whack for that specific track. Only reason I can think of why I am crashing in this method.
            if(index != nil) {
                if(index! < assets.count-1) {
                    for i in (counter+1...assets.count-1) {
                        if(assets[counter].endTime > assets[i].endTime) {
                            secondInstruction.setOpacity(1.0, at: assets[i].endTime)
                            secondInstruction.setOpacity(0.0, at: assets[counter].endTime)
                            print("Bigger")
                            break
                        }
                    }
                }
                if(index! > 0) {
                    for i in (0...counter).reversed() {
                        if(assets[counter].endTime < assets[i].endTime) {
                            secondInstruction.setOpacity(0.0, at: assets[counter].endTime)
                            print("Smaller")
                            break
                        }
                    }
                }
                if(counter < myTracks.count-1) {
                    if(assets[counter].layer.zPosition <= assets[counter+1].layer.zPosition) {
                        secondInstruction.setOpacity(0.0, at: assets[counter+1].beginTime)
                    }
                    else {
                        secondInstruction.setOpacity(0.0, at: assets[counter].endTime)
                    }
                }
                instructions.append(secondInstruction)
                counter += 1
            }
        }

        for range in ranges {
            firstInstruction.setOpacity(0.0, at: range.lowerBound)
            firstInstruction.setOpacity(1.0, at: range.upperBound)
        }

        // 2.3
        mainInstruction.layerInstructions = [firstInstruction] + instructions

        let imageLayer = CALayer()
        let image = UIImage(named: "Watermark")
        imageLayer.contents = image!.cgImage

        let ratio = (firstAsset.tracks(withMediaType: AVMediaTypeVideo)[0].naturalSize.width/image!.size.width)/2
        let rect = CGRect(x: image!.size.width*ratio, y: 0, width: image!.size.width*ratio, height: image!.size.height*ratio)
        imageLayer.frame = rect
        imageLayer.backgroundColor = UIColor.clear.cgColor
        imageLayer.opacity = 0.75

        let videoLayer = CALayer()
        videoLayer.frame = CGRect(x: 0, y: 0, width: firstAsset.tracks(withMediaType: AVMediaTypeVideo)[0].naturalSize.width, height: firstAsset.tracks(withMediaType: AVMediaTypeVideo)[0].naturalSize.height)

        let parentlayer = CALayer()
        parentlayer.frame = CGRect(x: 0, y: 0, width: image!.size.width*ratio, height: image!.size.height*ratio)
        parentlayer.addSublayer(videoLayer)
        parentlayer.addSublayer(imageLayer)

        let mainComposition = AVMutableVideoComposition()
        mainComposition.instructions = [mainInstruction]
        mainComposition.frameDuration = CMTimeMake(1, 30)
        mainComposition.renderSize = self.controller.firstAsset!.tracks(withMediaType: AVMediaTypeVideo)[0].naturalSize
        mainComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentlayer)
  • Is there an array of assets that you are trying to access at some index? Or any other array in that function from which you are trying to access an object that is not there? – adev Jul 30 '17 at 05:37
  • I added my code for the class affiliated with this situation – impression7vx Jul 30 '17 at 05:40
  • A lot of places in your code have this `[0]`. You should check that the count is greater than zero before doing it, and log an error if it is zero. That should be the reason for this crash. – adev Jul 30 '17 at 06:12
  • This is NOT fully symbolicated as line numbers are missing. Please fix that. – meaning-matters Jul 30 '17 at 06:46
  • This is fully symbolicated (to my knowledge). I obtained this from Xcode's Organizer, and that contains everything. From there, I downloaded the file and posted part of it here (the part that matters). I think it shows 0 because it is specialized (situational, from my understanding) – impression7vx Jul 30 '17 at 16:30
  • Whether it's fully symbolicated or not depends on whether it has all the information or not. So in this case, it's not "fully" symbolicated as line numbers are missing. I'm not saying this to nitpick, but to point out that perhaps you have to look further in that direction. I have not seen this myself yet, but I had cases where the Organizer did not symbolicate all things. For me, it was using libraries (cocoapods, e.g.). I had to follow https://possiblemobile.com/2015/03/symbolicating-your-ios-crash-reports/ and tinker a bit to get all symbols, so maybe you need to "redo" symbolication. – Gero Aug 04 '17 at 07:41
  • I would agree with you except https://developer.apple.com/library/content/technotes/tn2151/_index.html#//apple_ref/doc/uid/DTS40008184-CH1-SYMBOLICATEWITHXCODE states that "When you archive the application for distribution, Xcode will gather the application binary along with the .dSYM file and store them at a location inside your home folder. You can find all of your archived applications in the Xcode Organizer under the "Archived" section. For more information about creating an archive, refer to the App Distribution Guide" – impression7vx Aug 04 '17 at 17:06
  • I know this documentation very well. You can believe me I was sure my dSYMs must be there, too. But I have _seen_ missing symbolication despite this. My assumption is that it has to do with how your executable is built. In my case I guess the various cocoapods resulted in the .dSYM file being incomplete, but I never dug down that hole. All I am certain of is that it is very possible to build and upload an app without all debug symbols and that _might_ have happened in your case. You can ignore this, but considering nobody seems to know an answer you might want to consider this and investigate. – Gero Aug 08 '17 at 09:15
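
Regarding adev's point about the unchecked `[0]` subscripts: below is a minimal plain-Swift sketch (no AVFoundation; an array of strings stands in for the track array) of replacing a trapping subscript with an optional `.first`. The names here are illustrative, not from the question's code:

```swift
// A video recorded without the relevant permission can expose zero tracks,
// so indexing [0] on the track array raises an NSRangeException-style trap.
let tracks: [String] = []   // stand-in for asset.tracks(withMediaType:)

// Unsafe: tracks[0] would crash here.
// Safe: .first returns an Optional you can guard on.
if let firstTrack = tracks.first {
    print("using \(firstTrack)")
} else {
    print("no track available; skipping this asset")
}
```

The same pattern applies to every `tracks(withMediaType:)[0]` in the question's merge function.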

2 Answers


It's quite clear that the method "Merger.merge" accesses a non-existing element. The debugger will show you where that is.

I'd guess that some thread is modifying some array behind your back, so that an index that was valid at some point becomes invalid. And iterating with `for range2 in ranges` while you modify `ranges` later on is asking for trouble.
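
One way to avoid the remove/append-while-iterating pattern this answer warns about is to build a fresh array on each insertion. A hedged sketch, using `ClosedRange<Int>` as a stand-in for the question's `ClosedRange<CMTime>` (the function name is mine, not from the question's code):

```swift
// Merge `new` into `ranges`, coalescing any ranges that overlap it.
// Builds a fresh array instead of mutating `ranges` mid-iteration,
// which avoids the stale indices you get from index(of:) after remove(at:).
func merging(_ new: ClosedRange<Int>, into ranges: [ClosedRange<Int>]) -> [ClosedRange<Int>] {
    var lower = new.lowerBound
    var upper = new.upperBound
    var untouched: [ClosedRange<Int>] = []
    for range in ranges {
        if range.overlaps(new) {
            // Absorb every overlapping range into one combined range.
            lower = min(lower, range.lowerBound)
            upper = max(upper, range.upperBound)
        } else {
            untouched.append(range)
        }
    }
    return untouched + [lower...upper]
}
```

With this shape, the whole begin/end/connection bookkeeping in the question's inner loop collapses into one pass, and nothing is removed from an array that is still being scanned.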

gnasher729
  • I got the crash report from Xcode's Organizer via a tester, and I cannot reproduce the issue myself. So while a breakpoint test would help, it is not an option; that's why I am working from the crash log from Apple and Xcode's Organizer. If you notice, I remove one range and always add another. The idea is: I have (1-10) but now I want (1-15), so I delete (1-10) and append (1-15). – impression7vx Jul 30 '17 at 19:28

I am guessing that the problem is caused by line 129, considering the +1444 offset and the array access. Have you tried a video without audio, such as a video taken without microphone permission? I once hit an out-of-bounds crash by assuming that videos taken in my app always had audio tracks. You would be surprised by what videos users feed into your app. I even had a user crash my app with an FLV video selected from their phone's album.
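
The missing-audio-track scenario can be sketched in plain Swift; `FakeAsset` and `insertAudio` below are hypothetical stand-ins of my own, since the real check would be on `asset.tracks(withMediaType: AVMediaTypeAudio)`:

```swift
// Hypothetical stand-in for AVAsset: a video may carry zero audio tracks,
// e.g. when recorded without microphone permission.
struct FakeAsset {
    var audioTracks: [String]
}

// Insert audio only when a track exists; returns whether anything was inserted.
func insertAudio(from asset: FakeAsset) -> Bool {
    guard let audio = asset.audioTracks.first else {
        // tracks(withMediaType: AVMediaTypeAudio)[0] would trap here
        return false
    }
    print("inserting \(audio)")
    return true
}
```

In the question's code the audio insert is already wrapped in an `if let` on `audioAsset`, but the `[0]` subscript on the track array itself is still unguarded.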

(I am not sure if this will work for Swift.) It is also possible to locate the line in Xcode using the debugger. First, run the app with the same optimization settings as the release build. Then set breakpoints on different lines in the function to check the assembly code (see "How to see assembly code in Xcode 6"). You might need to scroll to the top to double-check the function name and the meaning of the offset.

_ArrayBuffer._getElementSlowPath is part of the Swift standard library's internal implementation, as the underscore prefix indicates. You will probably not find any credible and/or official sources about it.

The incompleteness in your crash log is probably caused by bugs/immaturity of Swift (or rather, the tools behind it). Welcome to one of the most trending and hyped languages. Even under normal conditions, Swift reports line 0 for compiler-generated code that does not correspond to any meaningful part of the source. That is not your case here, though.

keithyip
  • The problem with debugging is that I cannot seem to reproduce the error; only one of the testers can. That makes this process incredibly complicated – impression7vx Aug 07 '17 at 11:37
  • Also note, the audio file I am pulling is not from their audio, it's from our video. It is also a guaranteed video. We use a preset video that is already in the app and go from there. – impression7vx Aug 07 '17 at 11:42
  • You removed the stack traces of all the other threads, so the multi-threading question is still open. Assuming it is not a multi-threading problem, being unable to reproduce it means you have not found the critical step yet. A video can have no audio tracks. I am not sure what a preset is. Have you tried hard enough to feed your app unexpected videos? – keithyip Aug 07 '17 at 11:45
  • Do you mean that the audio track 0 is from an app bundled video? – keithyip Aug 07 '17 at 11:46
  • Correct (assuming bundle means from within the app and not downloaded/from user). – impression7vx Aug 07 '17 at 11:47
  • I did not read all your code. Another possibility is that the tester did something through the UI to mess up the data used in this method. – keithyip Aug 07 '17 at 12:07
  • Yea. Working on figuring that part out lol. Gonna meet with the tester and grab crash logs from phone and see how that goes – impression7vx Aug 07 '17 at 12:09
  • @keithyip, I am facing the specialized _ArrayBuffer._getElementSlowPath(Int) issue on some devices (mostly iOS 11 and some 10.0), randomly (now it works fine without any changes). My project is Swift 3.0 (Xcode 8.3) and I'm using Firebase Crashlytics with pods. Could you please figure it out? Fatal Exception: NSRangeException 0 CoreFoundation 0x18a74a1c0 __exceptionPreprocess 1 libobjc.A.dylib 0x18918455c objc_exception_throw 2 CoreFoundation 0x18a6b53dc __CFArrayGetCallBacks 3 USU ACCESS 0x10017174c specialized _ArrayBuffer._getElementSlowPath(Int) -> AnyObject (CalendarView.swift) – Jamshed Alam Aug 19 '18 at 06:51
  • It is better to post a new question when your problem is totally unrelated, in your case CalendarView.swift. Do not focus on ArrayBuffer. Check your exception, signal, calls before and after ArrayBuffer. – keithyip Aug 21 '18 at 02:51