
My problem is: I am trying to do seamless looping (I intend to make my AVPlayer or AVQueuePlayer loop without any delay between playbacks). For example, if I record a video and go to its playback, it should loop endlessly, with no blips or delays between iterations.

I have written the code below (it's taken straight from example code too):

    var playerQQ: AVQueuePlayer!
    var playerLayur: AVPlayerLayer!
    var playerEyetem: AVPlayerItem!
    var playerLooper: AVPlayerLooper!

    func playRecordedVideo(videoURL: URL) {
        // Player and layer that host the looping playback
        playerQQ = AVQueuePlayer()
        playerLayur = AVPlayerLayer(player: playerQQ)
        playerLayur.frame = (camBaseLayer?.bounds)!
        camBaseLayer?.layer.insertSublayer(playerLayur, above: previewLayer)

        // Template item that AVPlayerLooper keeps re-enqueueing
        playerEyetem = AVPlayerItem(url: videoURL)
        playerLooper = AVPlayerLooper(player: playerQQ, templateItem: playerEyetem)
        playerQQ.play()
    }

The code above does not loop seamlessly; there are blips between the end of the current item and the start of the next one. I have tried hard to find the problem and searched online, but have not found a solution. I've also tried NSNotifications and other approaches, including calling player.seek(to: .zero) when the player finishes playback, but nothing has worked.

Any help would be appreciated :)

  • Can you give us any indication of what other questions you have looked at? Have you looked here? https://stackoverflow.com/questions/39441621/im-trying-to-use-avqueueplayer-to-create-a-seamless-audio-loop-however-i-don?noredirect=1&lq=1 –  Dec 20 '18 at 23:13
  • yeah, I looked at these: [https://stackoverflow.com/questions/48759288/recommended-way-of-updating-timerange-property-on-avplayerlooper], [https://cocoacasts.com/key-value-observing-kvo-and-swift-3/] <-- and tried to figure out KVO stuff but was unable to. –  Dec 20 '18 at 23:14
  • `AVPlayerLooper` should be the right way to go. If the video you want to loop has video *and* audio, did you make sure both tracks start at the same time and have exactly the same length? For some sources, especially recorded material, it's not uncommon for tracks to have a slight offset or to differ a little in length, which, when looped, can result in noticeable blips. – NoHalfBits Dec 21 '18 at 00:10
  • @NoHalfBits How can I check if the audio and video are the same length? –  Dec 21 '18 at 00:41
  • as soon as you have playerEyetem, do `for track in playerEyetem.asset.tracks { print( track.mediaType); CMTimeRangeShow( track.timeRange); }`. Or use an app to inspect the file; I still use QuickTime Player 7 for this a lot. – NoHalfBits Dec 21 '18 at 09:19
  • @NoHalfBits you are right. There **is** a difference; for example, the audio was 2.088 while the video was 2.102. So, how can I make the times equal? Thanks for the help! –  Dec 21 '18 at 19:19
  • Just off the top of my head: Assuming the video is not under your control at build time (if it *is*, well, just trim it with an editor before adding it to the project): Get the common time range of audio and video in the asset (`CMTimeRangeGetIntersection` might be handy). Create a new empty `AVMutableComposition` and copy the common timeRange from the asset into the composition (`insertTimeRange`). Create an `AVPlayerItem` with the composition (note it's an `AVAsset` subclass), then hand the player item to the `AVPlayerLooper`... does this solve the problem? – NoHalfBits Dec 21 '18 at 23:28
  • @NoHalfBits yes, the video is created in-app while running, so I can't just trim it. I implemented your answer and I think the loop blip is gone (from the limited testing I've done); however, now the video gets oriented sideways for some reason. –  Dec 22 '18 at 01:45
  • @NoHalfBits I put an update at the bottom of the question with the current code I'm working with. It flips the orientation to horizontal in playback for some reason. –  Dec 22 '18 at 03:44
  • Sorry, coming from macOS, I always forget the orientation... copying the preferredTransform from the source video track to the composition video track should do the trick. And one more remark: I would suggest inserting at kCMTimeZero instead of intersectionTimeRange.start; the latter may be problematic if the time range's start time isn't zero for some reason. Let me know if this resolves the problem, and I'll post this all as a complete answer – NoHalfBits Dec 22 '18 at 14:03
  • @NoHalfBits thanks so much for all the help. Sorry, I haven't done Swift in a long time. How do I set the preferredTransform for the mutableComposition? Also, when I print the preferredTransform for the original asset (asset1) and for the mutableComposition, I get the same value. –  Dec 22 '18 at 18:50
  • Assuming there is only one video track, `mutableComposition.tracks(withMediaType: .video).first?.preferredTransform = (asset1.tracks(withMediaType: .video).first?.preferredTransform)!` (*after* the `insertTimeRange`) should result in a composition with video in the same orientation as the source, especially for portrait videos (again just off the top of my head; sorry, I can't do any testing right now) – NoHalfBits Dec 22 '18 at 22:21
  • @NoHalfBits perfect! Yes, it works now! Thank you so much! Great work. Post it as an answer and I'll accept it and upvote it. –  Dec 22 '18 at 22:38

4 Answers


It looks like a problem with .mp4 files; convert the .mp4 file to a .mov file. AVPlayer and AVQueuePlayer then work fine with it. Here is my code:

NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { [weak self] (noty) in
    self?.player?.seek(to: CMTime.zero)
    self?.player?.play()
}

or

let asset = AVAsset(url: URL(fileURLWithPath: movPath))
let playerItem = AVPlayerItem(asset: asset)
let player = AVQueuePlayer(playerItem: playerItem)
playerLooper = AVPlayerLooper(player: player, templateItem: playerItem)
playerLayer.frame = CGRect(x: 0, y: 88, width: kWidth, height: kWidth * 0.75)
playerLayer.videoGravity = .resizeAspectFill
playerLayer.player = player
view.layer.addSublayer(playerLayer)
player.play()
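
The answer recommends converting .mp4 to .mov but doesn't show how. Below is a minimal, hypothetical sketch of one way to do that with AVAssetExportSession; the URLs are placeholders supplied by the caller, and the passthrough preset simply rewraps the tracks without re-encoding:

import AVFoundation

// Hypothetical helper illustrating the .mp4 -> .mov conversion mentioned above.
func convertToMov(from mp4URL: URL, to movURL: URL, completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: mp4URL)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetPassthrough) else {
        completion(false)
        return
    }
    export.outputURL = movURL          // destination should end in .mov
    export.outputFileType = .mov
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}
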
Dragon.Yao

One thing to keep in mind when looping assets is that audio and video tracks can have different offsets and different durations, resulting in 'blips' when looping. Such small differences are quite common in recorded assets.

Iterating over the tracks and printing their time ranges can help to detect such situations:

for track in asset.tracks {
    print(track.mediaType)
    CMTimeRangeShow(track.timeRange)
}

To trim audio and video tracks to equal start times and equal durations, get the common time range of the tracks, and then insert this time range from the original asset into a new AVMutableComposition. Normally, you also want to preserve properties like the orientation of the video track:

let asset: AVAsset = (your asset initialization here)

let videoTrack: AVAssetTrack = asset.tracks(withMediaType: .video).first!
let audioTrack: AVAssetTrack = asset.tracks(withMediaType: .audio).first!

// calculate common time range of audio and video track
let timeRange: CMTimeRange = CMTimeRangeGetIntersection( (videoTrack.timeRange), (audioTrack.timeRange))

let composition: AVMutableComposition = AVMutableComposition()

try composition.insertTimeRange(timeRange, of: asset, at: kCMTimeZero)

// preserve orientation
composition.tracks(withMediaType: .video).first!.preferredTransform = videoTrack.preferredTransform

Since AVMutableComposition is a subclass of AVAsset, it can be used for AVPlayerLooper-based looping playback, or exporting with AVAssetExportSession.
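
For completeness, here is a minimal sketch (not part of the original answer) of feeding the trimmed composition back into AVPlayerLooper for the looping playback the question asks about; hostView stands in for whatever view hosts the player layer:

// Keep the player, layer, and looper alive (e.g. as properties); if the looper
// is deallocated, looping stops.
let queuePlayer = AVQueuePlayer()
let loopItem = AVPlayerItem(asset: composition)
let looper = AVPlayerLooper(player: queuePlayer, templateItem: loopItem)

let playerLayer = AVPlayerLayer(player: queuePlayer)
playerLayer.frame = hostView.bounds      // hostView is a placeholder for your container view
hostView.layer.addSublayer(playerLayer)

queuePlayer.play()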

I've put a more complete trimming implementation on GitHub: https://github.com/fluthaus/NHBAVAssetTrimming. It's more robust, handles multiple tracks, preserves more properties, and can either be easily integrated into projects or be built as a standalone macOS command-line movie-trimming utility.

NoHalfBits

To restart playback when the item reaches the end, try

NotificationCenter.default.addObserver(self,
                                       selector: #selector(playerItemDidReachEnd(notification:)),
                                       name: Notification.Name.AVPlayerItemDidPlayToEndTime,
                                       object: avPlayer?.currentItem)

@objc func playerItemDidReachEnd(notification: Notification) {
    if let playerItem: AVPlayerItem = notification.object as? AVPlayerItem {
        playerItem.seek(to: kCMTimeZero, completionHandler: nil)
    }
}

If not, I would suggest managing the timing yourself (fire an NSTimer every 1/30 of a second or so) and seeking with something like this

player.seek(to: seekTimeInProgress, toleranceBefore: kCMTimeZero,
            toleranceAfter: kCMTimeZero, completionHandler: ...

The kCMTimeZero tolerances are extremely important; without them the seek time won't be exact. Finally, I've found there is a load time when restarting videos, depending on the iOS device, the length of the video, and how many you're playing, so if you're still getting that lag after you eliminate the timing issues, you may have to account for it in your UX.
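
A rough sketch of the timer-driven restart described above, assuming it lives in your view controller; the 1/30 s interval, itemDuration, and the helper name are illustrative, not from the original answer:

var loopTimer: Timer?

func startLoopTimer(for player: AVPlayer, itemDuration: CMTime) {
    let oneFrame = CMTime(value: 1, timescale: 30)
    loopTimer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { _ in
        // When playback is within roughly one frame of the end, jump back to the
        // start with zero tolerance so the seek lands exactly on the first frame.
        if player.currentTime() >= CMTimeSubtract(itemDuration, oneFrame) {
            player.seek(to: kCMTimeZero, toleranceBefore: kCMTimeZero,
                        toleranceAfter: kCMTimeZero) { _ in
                player.play()
            }
        }
    }
}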

John Lanzivision
  • I get "Use of local variable 'playerItemDidReachEnd(notification:)' before its declaration" when trying to implement this. –  Dec 20 '18 at 23:38
  • Thanks for the answer; it did not work perfectly, but thanks for the effort :). –  Dec 24 '18 at 01:59

The answer from @NoHalfBits works great, but I also found another solution. I basically got the intersection time range of the video and audio tracks from the playerItem's asset. After that, I passed the intersection as the timeRange parameter when calling:

playerLooper = AVPlayerLooper(player: _, templateItem: _, timeRange: intersectionTimeRange)

This works! To get the time range of each track, loop over the player item's asset's tracks, as in the sketch below.
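
A minimal sketch of this approach, reusing the question's playerQQ and playerEyetem names and assuming the asset has both a video and an audio track; intersection(_:) is the Swift overlay equivalent of CMTimeRangeGetIntersection:

let asset = playerEyetem.asset
if let videoRange = asset.tracks(withMediaType: .video).first?.timeRange,
   let audioRange = asset.tracks(withMediaType: .audio).first?.timeRange {
    // Loop only over the span where both tracks actually have media
    let intersectionTimeRange = videoRange.intersection(audioRange)
    playerLooper = AVPlayerLooper(player: playerQQ,
                                  templateItem: playerEyetem,
                                  timeRange: intersectionTimeRange)
    playerQQ.play()
}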