
The app I’m working on loops a video a specified number of times by adding the same AVAssetTrack (created from the original video URL) multiple times to the same AVComposition at successive intervals. The app similarly inserts a new video clip into an existing composition by removing the corresponding time range from the composition's AVMutableCompositionTrack (for AVMediaTypeVideo) and inserting the new clip's AVAssetTrack into the removed time range.
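For concreteness, here's a minimal sketch of that approach (sourceURL, loopCount, newClipTrack, and rangeToReplace are illustrative names, not my actual code):

    #import <AVFoundation/AVFoundation.h>

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    // Append the same source track loopCount times, back to back.
    CMTime cursor = kCMTimeZero;
    NSError *error = nil;
    for (NSUInteger i = 0; i < loopCount; i++) {
        [compositionVideoTrack insertTimeRange:videoTrack.timeRange
                                       ofTrack:videoTrack
                                        atTime:cursor
                                         error:&error];
        cursor = CMTimeAdd(cursor, videoTrack.timeRange.duration);
    }

    // Replacing a range with a new clip: remove the range, then insert the
    // new clip's track at the start of the removed range.
    [compositionVideoTrack removeTimeRange:rangeToReplace];
    [compositionVideoTrack insertTimeRange:newClipTrack.timeRange
                                   ofTrack:newClipTrack
                                    atTime:rangeToReplace.start
                                     error:&error];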

However, occasionally (though rarely), after inserting a new clip as described above into a time range that falls within a repeat of the original looping video, blank frames appear at the video loop’s transition points within the composition - but only during playback. The video exports correctly, without gaps.

This leads me to believe the issue lies with the AVPlayer or AVPlayerItem and how frames are buffered for playback, rather than with how I'm inserting/looping the clips or choosing the CMTime stamps to do so. The app is doing several things at once (loop visualization in the UI via an NSTimer, audio playback via Amazing Audio Engine) - could my issue be a result of competition for resources?

One more note: I understand that discrepancies between audio and video in an asset can cause glitches (e.g. the underlying audio being slightly longer than the video), but as I'm not setting an audioEncodingTarget on the GPUImageMovieWriter that I'm using to record and save the video, the videos have no audio components.
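A quick way to confirm that (recordedVideoURL is an illustrative name):

    // Verify the recorded asset really contains no audio tracks.
    AVURLAsset *recorded = [AVURLAsset URLAssetWithURL:recordedVideoURL options:nil];
    NSArray *audioTracks = [recorded tracksWithMediaType:AVMediaTypeAudio];
    NSLog(@"audio track count: %lu", (unsigned long)audioTracks.count); // expect 0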

Any thoughts or directions you can point me in would be greatly appreciated! Many thanks in advance.

Update: the flashes coincide with the "Had to drop a video frame" message logged by the GPUImage library, which according to its creator means the phone can't process video fast enough. Could multi-threading solve this?

Update 2: So the flashes actually don't always correspond to the "Had to drop a video frame" message. I have also disabled all of the AVRecorder/Amazing Audio Engine code and the issue still persists, so it is not a problem of resource competition between those engines. I have been logging properties of the AVPlayerItem and notice that 'playbackLikelyToKeepUp' is always NO and 'playbackBufferFull' is always YES.
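The logging itself is just polling the standard AVPlayerItem buffering properties, roughly like this (playerItem is an illustrative name for my existing item):

    NSLog(@"playbackLikelyToKeepUp: %d", playerItem.playbackLikelyToKeepUp); // always NO
    NSLog(@"playbackBufferFull:     %d", playerItem.playbackBufferFull);     // always YES
    NSLog(@"playbackBufferEmpty:    %d", playerItem.playbackBufferEmpty);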

  • "could my issue be a result of competition for resources" Absolutely. As you rightly say, the problem doesn't exist when exporting - only when trying to do all this _live_. – matt Jul 17 '15 at 18:39
  • Thanks matt. In the CPU report while running the app, the AURemoteIO::IOThread is consistently maxed out (as well as 'thread 2'). Is this also indicative of my problem, and can something like GCD possibly help? – Christopher Maier Jul 18 '15 at 17:15
  • I don't see what GCD has to do with it. The same work on another thread is still the same work... If anything, the problem sounds like you are _already_ trying to do too much at once... You might do a google search on GPUImage and Amazing Audio Engine, since I see that other people trying to use them together also experience difficulties (e.g. http://stackoverflow.com/questions/31485744/no-audio-in-video-recording-using-gpuimage-after-initializing-the-amazing-audi) – matt Jul 18 '15 at 17:29
  • Sorry if my questions seem silly - this is my first time digging into a complicated audio/video app. Looks like there are some tweaks I can make to things like buffer duration to improve efficiency. Is there a way in Xcode to know if I'm definitively pushing the phone to its capacity? Maybe it's possible that running 3 separate audio/video systems (GPU image/AVPlayer/Amazing Audio Engine) simultaneously is just not feasible. Thanks again. – Christopher Maier Jul 19 '15 at 14:57
  • "Is there a way in Xcode to know if I'm definitively pushing the phone to its capacity?" You run it on a device and use Instruments, which tells you all about the use of the CPU, the energy drain, the memory, etc. But it sounds like you already know that! :) And yes, "not feasible" is exactly what I'm suggesting, but I have not push the phone to these kinds of limits, so I don't really know. Keep in mind that there are still devices out there that have _only one CPU_ that can run iOS 8/9... – matt Jul 19 '15 at 15:57

1 Answer


So the problem is solved - it's sort of frustrating how brutally simple the fix is. I just used a time range one frame shorter when adding the videos to the composition, rather than the AVAssetTrack's full time range. No more flashes. Hopefully the users won't miss that 30th of a second :)

    // Trim one frame (at 30 fps) off the end of the track's duration.
    CMTime shortened_duration = CMTimeSubtract(originalVideoAssetTrack.timeRange.duration, CMTimeMake(1, 30));
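In context, each segment is then inserted with the shortened range rather than the track's full range, roughly like this (compositionVideoTrack and cursor are illustrative names from the looping code, not the exact implementation):

    // Insert only the shortened range so the final frame never lands on a
    // loop boundary during playback.
    CMTimeRange insertRange = CMTimeRangeMake(kCMTimeZero, shortened_duration);
    NSError *error = nil;
    [compositionVideoTrack insertTimeRange:insertRange
                                   ofTrack:originalVideoAssetTrack
                                    atTime:cursor
                                     error:&error];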