73

I'm working on an application where there is a collection view, and cells of the collection view can contain video. Right now I'm displaying the video using AVPlayer and AVPlayerLayer. Unfortunately, the scrolling performance is terrible. It seems like AVPlayer, AVPlayerItem, and AVPlayerLayer do a lot of their work on the main thread. They are constantly taking out locks, waiting on semaphores, etc., which blocks the main thread and causes severe frame drops.

Is there any way to tell AVPlayer to stop doing so many things on the main thread? So far nothing I've tried has solved the problem.

I also tried building a simple video player using AVSampleBufferDisplayLayer. Using that I can make sure that everything happens off the main thread, and I can achieve ~60fps while scrolling and playing video. Unfortunately that approach is much lower-level, and it doesn't provide things like audio playback and time scrubbing out of the box. Is there any way to get similar performance with AVPlayer? I'd much rather use that.
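For reference, here's a rough sketch of that lower-level route (illustrative names only; it handles video frames and nothing else, assuming a local asset):

```swift
import AVFoundation

// Serial queue so all decoding and enqueueing stays off the main thread
let decodeQueue = dispatch_queue_create("video.decode", DISPATCH_QUEUE_SERIAL)

func startPlayback(asset: AVAsset, displayLayer: AVSampleBufferDisplayLayer) {
    guard let track = asset.tracksWithMediaType(AVMediaTypeVideo).first,
          let reader = try? AVAssetReader(asset: asset) else { return }
    let output = AVAssetReaderTrackOutput(track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                         Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)])
    reader.addOutput(output)
    reader.startReading()
    // The layer pulls frames on decodeQueue; nothing touches the main thread
    displayLayer.requestMediaDataWhenReadyOnQueue(decodeQueue) {
        while displayLayer.readyForMoreMediaData {
            guard let buffer = output.copyNextSampleBuffer() else {
                displayLayer.stopRequestingMediaData()
                return
            }
            displayLayer.enqueueSampleBuffer(buffer)
        }
    }
}
```

As noted, this gives you raw frame display only; audio, looping, and scrubbing all have to be built on top.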

Edit: After looking into this more, it doesn't look like it's possible to achieve good scrolling performance when using AVPlayer. Creating an AVPlayer and associating it with an AVPlayerItem instance kicks off a bunch of work which trampolines onto the main thread, where it then waits on semaphores and tries to acquire a bunch of locks. The amount of time this stalls the main thread increases quite dramatically as the number of videos in the scroll view increases.

AVPlayer dealloc also seems to be a huge problem. Dealloc'ing an AVPlayer also tries to synchronize a bunch of stuff. Again, this gets extremely bad as you create more players.

This is pretty depressing, and it makes AVPlayer almost unusable for what I'm trying to do. Blocking the main thread like this is such an amateur thing to do that it's hard to believe Apple engineers would have made this kind of mistake. Anyway, hopefully they can fix this soon.

Antonio
  • "it doesn't look like it's possible to achieve good scrolling performance when using AVPlayer" This is simply not the case. There are many apps that rely on AVFoundation for playing back media in scrolling feeds. Vine, Facebook, and Instagram all use AVPlayers to play back media in feeds. It's very tricky, but what damian outlined below is a good start for getting it going. – Andy Poes Jul 02 '15 at 03:09
  • Vine, Facebook, and Instagram are actually all fairly choppy when scrolling past videos. There are some very noticeable frame drops for all of them. Instagram has the best performance, but I also had trouble finding a screen with lots of long, high definition videos going at the same time. It looks like they haven't solved the problem either, even though they all have great engineers and tons of resources. I'm pretty sure that AVPlayer is the problem here. If you don't believe me, fire up Instruments and take a look at how often the main thread gets blocked. – Antonio Jul 03 '15 at 00:11
  • Another good example is Storehouse. It's a great app with amazing scrolling performance, and obviously it was made by people who knew what they were doing. Even still, create a Storehouse article and fill it with a bunch of videos and try scrolling through the page. It is extremely choppy. I'm pretty sure the problem here is AVPlayer. There's just no way to get it to stop blocking the main thread. Even if there is some magical way to get AVPlayer to behave, this is inexcusably bad API design as it shouldn't take a ridiculous amount of effort to get not terrible scrolling performance. – Antonio Jul 03 '15 at 00:21
  • Yea, no doubt AVFoundation will cause frame drops on main thread. But you can mitigate as best as you can through a lot of various techniques. Audio is a huge hit – avoiding audio playback will help lessen the frame drops. Also - with scrolling, aggressive controlling of when videos play / pause is essential. For instance, you get a frame hit when tearing down a video, so you can avoid some frame drops by not stopping videos until you start a new one. Anyways, there's a few techniques that can be applied that are outside AVFoundation that could help the issue as well. – Andy Poes Jul 06 '15 at 19:53
  • Did you find any solution? I have a similar issue playing videos in place in a UITableView; nothing from loadValuesAsynchronouslyForKeys to AVSampleBufferDisplayLayer helps – Roman Truba Apr 05 '16 at 18:27
  • @RomanTruba, unfortunately there is no solution that'll be performant with `AVPlayer`. As I stated in my **Edit** section, the performance problems are inherent in the implementation of `AVFoundation`, and there's nothing you can do about it. If you want to play video in some sort of view that scrolls, and you don't want to drop a ton of frames, then the only real way to do that is to build your own video player from scratch, bypassing `AVPlayer` completely. Or, you can wait for iOS 10 and hope that Apple will have gotten its act together by then. – Antonio Apr 18 '16 at 18:54
  • Is there any way to tell AVPlayer to stop doing so many things on the main thread? So far nothing I've tried has solved the problem... Create your own dispatch queues using your own context. – James Bush Jul 08 '16 at 17:31
  • @JamesBush, no there's nothing you can do. No matter what you do, no matter what thread you try to set everything up on, it will jump onto the main thread itself and do a bunch of blocking calls. Look at all of Apple's apps. There's a reason none of them have video that plays as you're scrolling. The video will always animate to fullscreen before it does anything. – Antonio Jul 12 '16 at 07:48
  • Sorry, Antonio; you're wrong on that one. How do I know? Because I just finished doing it yesterday. I can scroll very smoothly at any velocity in a collection view of videos playing also smoothly. It wasn't indeed all about threads. Anyone want the code? I'll post here shortly – James Bush Jul 13 '16 at 18:43
  • @james-bush, I am interested in the solution, would like to have a look at the code. Please, share, if you don't mind. – Richard Topchii Sep 25 '16 at 21:17
  • @RichardTopchiy You can download my sample app from my blog at http://demonicactivity.blogspot.com/2016/08/draft-what-in-hell-this-demoniac-is-up.html?m=1 That's an older one that uses AVPlayer; I have code you can swap in that still uses AVPlayer but renders with OpenGL. The limit is still 16 at a time, but it's also real-time – James Bush Sep 28 '16 at 04:02
  • @Antonio Wrong-a-roony on this one. – James Bush Sep 28 '16 at 04:03
  • @JamesBush Create a sample app that consists of a table view or collection view, where each cell in the list plays a video that is either stored locally in the app bundle, or on a remote server, but NOT in the photo library. Make sure there are at least 5 - 10 video files, and over 100 cells in the list. If you can smoothly scroll through that list of videos at 60fps and have all the videos play, I would love to see that answer and I'm sure everyone else who has upvoted this question would love to see that answer too. – Antonio Sep 29 '16 at 00:16
  • @Antonio You can download the sample app you described at the above-stated link. You can watch a video of it there, too. – James Bush Sep 29 '16 at 00:53
  • @JamesBush Thanks for putting the sample code together. It's very interesting. I noticed that the videos I was loading were stuttering and playing at an inconsistent rate, though main thread interactions were almost perfectly responsive. Are there any concerns with ditching `AVPlayer` and syncing sample buffers correctly? – CIFilter Feb 24 '17 at 00:36
  • For what it's worth, we were able to get good scrolling performance by a) having a pool of reusable players and player layers, b) setting up players (and layers) in advance (e.g., +/- 1 view off screen), and, **most importantly**, c) adding the player layer to the window, hidden, in advance as well. `AVPlayerLayer` has a lot of internal setup that will not happen until it can conceivably be visible; that is, it has to exist inside a window, even if it's just for one runloop. Doing this, we get very little main thread blockage when rapidly adding/removing players. – CIFilter Feb 25 '17 at 00:58
  • I should add that, in our system, we basically set up players and layers well in advance. We request them in our view controllers relatively lazily (i.e., +/- one offscreen view controller). Finally, we insert the actual player layer into the view hierarchy and begin playing as lazily as possible (i.e., the moment any part of the view controller is visible). This was sufficient for our use case for AV Foundation to stop blocking the main thread during user interaction. – CIFilter Feb 25 '17 at 01:05
  • I'm not sure if I commented on this yet; but loading sample buffers using an asset reader, and then displaying the sample buffers in an AVSampleBufferDisplayLayer, would work better than having multiple player objects. That's because the sample buffers are vended on a separate thread, and so is Core Animation. Going this route takes the guesswork out of multithreaded programming, and leaves you with a better-informed solution than you could provide on your own. – James Bush May 03 '17 at 20:01
  • Do you have any solution for this? I am dying to implement a collection view of videos that can be shown all at the same time – Farid Al Haddad Jun 04 '17 at 21:45
  • @Antonio, so have you found a way to do it? – John Aug 19 '17 at 19:35
  • @John, unfortunately I don't think there's any way to get good scroll performance with AVPlayer if your use case is showing videos inside the cells of a table view or collection view. – Antonio Aug 21 '17 at 07:59
  • But how come Instagram and other apps can do it? – John Aug 21 '17 at 18:44
  • @John, Instagram probably doesn't use AVPlayer. They probably wrote their own video player from scratch. Most likely because they ran into the same issues that this post brings up. – Antonio Aug 22 '17 at 01:13
  • Yeah probably, could you please take a look at this problem for me? Thanks https://stackoverflow.com/questions/45777602/firebase-database-indexpath?noredirect=1#comment78514612_45777602 – John Aug 22 '17 at 01:48
  • @Antonio Have you got it solved? Is there any open-source player available which can perform these heavy tasks in the background? – SandeepAggarwal Oct 16 '17 at 12:29
  • @SandeepAggarwal, nope. The best you can do is write your own video player using the video toolbox APIs or something. – Antonio Oct 18 '17 at 05:44
  • Sorry I cannot answer your question; however, if you don't mind, I do have a question, as you seem knowledgeable. When your app enters the background and you reopen it, have your videos stopped playing? If not, how have you made them continue playing? My question is similar to this one: https://stackoverflow.com/questions/58241061/avplayer-inside-of-uicollectionviewcell-stops-playing-when-app-enters-background. Thank you very much and best of luck! – Wert Oct 05 '19 at 20:38
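For reference, the reuse-pool technique CIFilter describes in the comments above can be sketched roughly like this (the `PlayerPool` class, its capacity, and all names are illustrative, not from any actual answer):

```swift
import AVFoundation
import UIKit

// Pre-warmed pool of AVPlayer/AVPlayerLayer pairs, built once at launch.
class PlayerPool {
    private var free: [(AVPlayer, AVPlayerLayer)] = []

    init(capacity: Int, window: UIWindow) {
        for _ in 0..<capacity {
            let player = AVPlayer()
            let layer = AVPlayerLayer(player: player)
            layer.hidden = true
            // The key trick: parking the hidden layer in the window forces
            // AVPlayerLayer's lazy internal setup to run now, not mid-scroll.
            window.layer.addSublayer(layer)
            free.append((player, layer))
        }
    }

    // Hand a pre-warmed pair to a cell that is about to come on screen.
    func dequeue() -> (AVPlayer, AVPlayerLayer)? {
        return free.isEmpty ? nil : free.removeLast()
    }

    // Return a pair when the cell scrolls well off screen; the player and
    // layer are kept alive so the expensive dealloc path is never hit.
    func enqueue(pair: (AVPlayer, AVPlayerLayer)) {
        pair.0.replaceCurrentItemWithPlayerItem(nil)
        pair.1.hidden = true
        free.append(pair)
    }
}
```

Cells then only ever swap `AVPlayerItem`s in and out of pooled players instead of creating and destroying players while scrolling.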

6 Answers

25

Build your AVPlayerItem on a background queue as much as possible (some operations you have to do on the main thread, but you can do setup operations and wait for video properties to load on background queues - read the docs very carefully). This involves voodoo dances with KVO and is really not fun.

The hiccups happen while the AVPlayer is waiting for the AVPlayerItem's status to become AVPlayerItemStatusReadyToPlay. To reduce the length of the hiccups you want to do as much as you can to bring the AVPlayerItem closer to AVPlayerItemStatusReadyToPlay on a background thread before assigning it to the AVPlayer.

It's been a while since I actually implemented this, but IIRC the main thread blocks are caused because the underlying AVURLAsset's properties are lazy-loaded, and if you don't load them yourself, they get busy-loaded on the main thread when the AVPlayer wants to play.

Check out the AVAsset documentation, especially the stuff around AVAsynchronousKeyValueLoading. I think we needed to load the values for duration and tracks before using the asset on an AVPlayer to minimize the main thread blocks. It's possible we also had to walk through each of the tracks and do AVAsynchronousKeyValueLoading on each of the segments, but I don't remember 100%.
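The KVO dance looks roughly like this (a sketch only; `ItemObserver` and `statusContext` are illustrative names, in Swift 2-era syntax to match the rest of the thread):

```swift
import AVFoundation
import Foundation

private var statusContext = 0

// Observes an AVPlayerItem's status and fires a callback once it is ready,
// so playback is only started when no more main-thread loading remains.
class ItemObserver: NSObject {
    let item: AVPlayerItem
    let onReady: () -> Void

    init(item: AVPlayerItem, onReady: () -> Void) {
        self.item = item
        self.onReady = onReady
        super.init()
        item.addObserver(self, forKeyPath: "status",
                         options: .New, context: &statusContext)
    }

    override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?,
                                         change: [String : AnyObject]?,
                                         context: UnsafeMutablePointer<Void>) {
        guard context == &statusContext else {
            super.observeValueForKeyPath(keyPath, ofObject: object,
                                         change: change, context: context)
            return
        }
        if item.status == .ReadyToPlay { onReady() }
    }

    deinit {
        item.removeObserver(self, forKeyPath: "status", context: &statusContext)
    }
}
```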

damian
  • Thanks a lot for the tips! Can you elaborate on "building the `AVPlayerItem` on a background queue as much as possible"? I've tried creating the `AVPlayerItem` on a background thread and loading its asset's tracks; however, its status is always AVPlayerItemStatusUnknown. It seems like it doesn't transition to AVPlayerItemStatusReadyToPlay until it's associated with an AVPlayer, and that AVPlayer is hooked up to an AVPlayerLayer. I'm sure I'm doing something wrong here. – Antonio May 21 '15 at 19:24
  • @Antonio Hey, did you manage to figure this out? I'm also trying to create this – YYfim Jan 12 '16 at 12:02
  • Hey @YuviGr, I looked into this a lot more and it doesn't seem like there's anything you can do. AVPlayer and AVFoundation are going to block the main thread, and there's no good way to work around that. It's really depressing, but it's an architectural issue that only people at Apple are capable of fixing. Honestly, the only way to simultaneously get video playback and smooth scrolling is to implement your own video player from scratch, bypassing pretty much all of AVFoundation. Either that or wait a year for iOS 10 and hope Apple gets it together by then. – Antonio Jan 21 '16 at 01:22
  • Hey @Antonio, I actually managed to create this using a few instances of AVPlayer and it works awesome. If you are still interested in accomplishing this, I'll be happy to elaborate further – YYfim Jan 21 '16 at 04:43
  • @YuviGr, ya please tell me more! How are you managing to get the main thread unblocked? Have you tried embedding a bunch of videos in a scrollable list, like a collection view or something? – Antonio Jan 21 '16 at 09:21
  • @Antonio I added an answer, hope it helps you – YYfim Jan 22 '16 at 07:25
  • @Antonio take a look at AndyPoes answer. He's using the same method I (tried to) outline above. – damian Mar 30 '16 at 08:52
  • This answer is contrary to Apple's documentation, where the `AVPlayer` is constructed with the just-created `AVPlayerItem` before the status has become `AVPlayerItemStatusReadyToPlay`. In fact I never reached that state without having previously constructed the `AVPlayer`. – Gobe Sep 14 '16 at 21:24
  • The asynchronous key loading stuff has nothing to do with downloading sample buffers from the file. Completely unrelated. You do not need to do any work whatsoever related to multithreaded programming. If you truly need to break up tasks among multiple queues and threads, simply use an asset reader to read sample buffers and then display them in an AVSampleBufferDisplayLayer. – James Bush May 03 '17 at 20:03
  • @gobe yeah, you're trying to get the `AVPlayerItem` *closer* to `AVPlayerItemStatusReadyToPlay` than it is when it is freshly `alloc`'d. – damian Feb 19 '18 at 19:34
  • @JamesBush If you do everything on the UI thread your scroll view will not scroll smoothly (it will noticeably pause scrolling) when you start a video playing. There are no techniques to make that completely go away but you can make the pauses much smaller by moving particular tasks to a background thread. – damian Feb 19 '18 at 19:37
  • @damian I don't see that I'm doing anything wrong here; I have a video that shows this working, and I can send you working code so you can see it for yourself. Please completely evaluate what anyone suggests before you criticize. I don't give bad advice, and my code is always tried-and-true before I pass it out. – James Bush Feb 22 '18 at 15:55
  • @JamesBush In that case, please consider reflecting on how your comment might have come across to me. – damian Feb 23 '18 at 09:00
20

Don't know if this will help – but here's some code I'm using to load videos on a background queue that definitely helps with main-thread blocking (apologies if it doesn't compile 1:1; I abstracted it from a larger code base I'm working on):

func loadSource() {
    self.status = .Unknown

    let operation = NSBlockOperation()
    operation.addExecutionBlock { () -> Void in
        // create the asset
        let asset = AVURLAsset(URL: self.mediaUrl, options: nil)
        // load values for track keys
        let keys = ["tracks", "duration"]
        asset.loadValuesAsynchronouslyForKeys(keys, completionHandler: { () -> Void in
            // Loop through and check to make sure keys loaded
            var keyStatusError: NSError?
            for key in keys {
                var error: NSError?
                let keyStatus: AVKeyValueStatus = asset.statusOfValueForKey(key, error: &error)
                if keyStatus == .Failed {
                    let userInfo = [NSUnderlyingErrorKey : key]
                    keyStatusError = NSError(domain: MovieSourceErrorDomain, code: MovieSourceAssetFailedToLoadKeyValueErrorCode, userInfo: userInfo)
                    println("Failed to load key: \(key), error: \(error)")
                }
                else if keyStatus != .Loaded {
                    println("Warning: Ignoring key status: \(keyStatus), for key: \(key), error: \(error)")
                }
            }
            if keyStatusError == nil {
                if operation.cancelled == false {
                    let composition = self.createCompositionFromAsset(asset)
                    // register notifications
                    let playerItem = AVPlayerItem(asset: composition)
                    self.registerNotificationsForItem(playerItem)
                    self.playerItem = playerItem
                    // create the player
                    let player = AVPlayer(playerItem: playerItem)
                    self.player = player
                }
            }
            else {
                println("Failed to load asset: \(keyStatusError)")
            }
        })
    }

    // add operation to the queue
    SomeBackgroundQueue.addOperation(operation)
}

func createCompositionFromAsset(asset: AVAsset, repeatCount: UInt8 = 16) -> AVMutableComposition {
    let composition = AVMutableComposition()
    let timescale = asset.duration.timescale
    let duration = asset.duration.value
    let editRange = CMTimeRangeMake(CMTimeMake(0, timescale), CMTimeMake(duration, timescale))
    var error: NSError?
    let success = composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
    if success {
        for _ in 0 ..< repeatCount - 1 {
            composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
        }
    }
    return composition
}
Andy Poes
  • Why did you iterate when creating the AVMutableComposition? I don't see any difference in timeRange. Don't you reuse the AVPlayer? I'm struggling with scrolling performance, so every hint is like a blessing :) – Josip B. Dec 10 '15 at 14:34
  • @JosipB. I just iterated to make the video loop. You can see it adds at time composition.duration, so every time it iterates, composition.duration is updated. – Andy Poes Jan 19 '16 at 02:50
  • Great answer @AndyPoes! I've been breaking my head on this for more than 10 hours... – Tal Zion Nov 22 '16 at 10:20
  • Hi @AndyPoes, I'm automatically playing videos while the user is scrolling the collection view. The play action seems to block the UI, and even if it's wrapped in a background queue, AVFoundation goes back to the main thread to proceed with the play action. Do you have any suggestions or hints to resolve that? Instagram has a very smooth scrolling experience; I'm really wondering how they do that. – manonthemoon Dec 31 '16 at 20:41
  • @manonthemoon There's a few reasons this can happen: (1) Audio is costly. If you remove the audio tracks from your video you'll notice less hit to main thread. (2) More megabytes = more processing. Smaller / shorter videos load and play with better performance. (3) Only play when necessary. Don't try to play videos as soon as they come on screen. Threshold playback to scroll velocity. (4) Pause videos instead of stopping videos. Tearing down videos can be costly, so only teardown video when the cell is reused. (5) Release mode improves playback a ton! – Andy Poes Jan 02 '17 at 23:31
  • There's no such thing as a background queue. There are queues that run tasks in threads that have a background priority, though. – James Bush May 03 '17 at 20:05
  • Terrible answer. There's no such thing as loading videos on a background queue when Apple has already done that for you. Even if they didn't, all this would do is issue the AVFoundation class methods you invoked on one thread, whereas the actual classes might put them right back into another. They aren't stupid; they know the most efficient way to load videos. Even still, videos load one sample buffer at a time, so even if you could somehow queue the load operation on the main queue, the UI and whatever else would never be affected because you're not loading the whole video at once...never. – James Bush Feb 22 '18 at 16:01
  • This worked perfectly for my case, thank you. – Brandon WongKS Feb 22 '22 at 15:16
8

If you look into Facebook's AsyncDisplayKit (the engine behind the Facebook and Instagram feeds), you can render video for the most part on background threads using their ASVideoNode. If you subnode that into an ASDisplayNode and add the displayNode.view to whatever view you are scrolling (table/collection/scroll), you can achieve perfectly smooth scrolling (just make sure you create the node and assets and all that on a background thread). The only issue is when having to change the video item, as this forces itself onto the main thread. If you only have a few videos on that particular view, you are fine to use this method!

        dispatch_async(dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0), {
            self.mainNode = ASDisplayNode()
            self.videoNode = ASVideoNode()
            self.videoNode!.asset = AVAsset(URL: self.videoUrl!)
            self.videoNode!.frame = CGRectMake(0.0, 0.0, self.bounds.width, self.bounds.height)
            self.videoNode!.gravity = AVLayerVideoGravityResizeAspectFill
            self.videoNode!.shouldAutoplay = true
            self.videoNode!.shouldAutorepeat = true
            self.videoNode!.muted = true
            self.videoNode!.playButton.hidden = true
            
            dispatch_async(dispatch_get_main_queue(), {
                self.mainNode!.addSubnode(self.videoNode!)
                self.addSubview(self.mainNode!.view)
            })
        })
Saurabh
Gregg
  • Currently, in order to use ASVideoNode, you need to work off the AsyncDisplayKit master branch – Kevin Mar 23 '16 at 17:38
  • Installing from a pod is fine; you just need to include ASVideoNode.h in your bridging header file. But @Kevin is correct, ASVideoNode is extremely new and thus not officially supported yet. – Gregg Mar 23 '16 at 21:05
  • @Gregg Did you actually try this? `ASVideoNode` is just a wrapper for `AVPlayerLayer`. It's not magic, and it won't give you any scroll performance benefit for video playback. – Antonio Mar 26 '16 at 21:25
  • hey @Antonio, have you looked at https://developer.apple.com/library/ios/samplecode/AVPlayerDemo/Listings/ReadMe_txt.html? It gives a good example of how to use loadValuesAsync and KVO to play a video. If you are able to do all the main thread tasks while the user is not scrolling, I believe you can achieve your goal. – chourobin Apr 24 '16 at 14:09
  • Yes, I've looked through that demo, and yes I know how to get video playing using loadValuesAsync and KVO. The fundamental problem is not what I'm doing on the main thread, it's what AVPlayer is doing on the main thread. Its internal implementation will jump onto the main thread where it will then do a bunch of blocking operations which kills scroll performance. I could wait for the user to stop scrolling, but that would completely ruin the magic of what I'm trying to do. I want to have video play immediately as it scrolls onscreen, and I want the entire experience to be smooth and seamless. – Antonio Apr 27 '16 at 02:00
  • Unfortunately, AVPlayer and the rest of AVFoundation haven't been implemented well enough for building something like that to be possible. The hardware on iPhones is more than fast enough to do what I want to do, but AVFoundation is seriously dropping the ball here. – Antonio Apr 27 '16 at 02:02
  • It makes no sense to go this route when you already have the better queue/thread choices made for you by Core Animation and Core Media. Why not load sample buffers with an asset reader and then display them in a sample buffer display layer? Apple made the perfect choices for you with regard to threading and queues for both of these... and you don't have to use someone's kit to make them work. – James Bush May 03 '17 at 20:08
  • @JamesBush do you have any solution for this ?? I am dying to implement a collection view of videos that can be shown all at the same time – Farid Al Haddad Jun 04 '17 at 21:44
  • I've posted both sample code and videos demonstrating the sample code long ago. They are on my blog. – James Bush Jun 05 '17 at 16:23
  • @JamesBush can you please post the direct link to your blog with sample code demonstrating the workaround for the AVPlayer. it would be great help. – Bhavin Kansagara Jul 03 '17 at 09:03
  • 1
    @BhavinKansagara That's exactly what I did until some jealous person deleted it. Here is both a video and the link to the source code: http://www.mediafire.com/download/ivecygnlhqxwynr/VideoWallCollectionView.zip https://youtu.be/7QlaO7WxjGg Somebody is VERY JEALOUS OF MY PRESENCE HERE. HOW PATHETIC. Anyway, enjoy the code. – James Bush Jul 03 '17 at 21:58
  • I need to achieve autoplay functionality in a table view using ASVideoNode. Using the above code, the videos listed in the table view cells all start playing simultaneously – Paul.V Dec 11 '18 at 09:35
0

Here's a working solution for displaying a "video wall" in a UICollectionView:

1) Store all of your cells in an NSMapTable (henceforth, you will only access a cell object from the NSMapTable):

self.cellCache = [[NSMapTable alloc] initWithKeyOptions:NSPointerFunctionsWeakMemory valueOptions:NSPointerFunctionsStrongMemory capacity:AppDelegate.sharedAppDelegate.assetsFetchResults.count];
for (NSInteger i = 0; i < AppDelegate.sharedAppDelegate.assetsFetchResults.count; i++) {
    [self.cellCache setObject:(AssetPickerCollectionViewCell *)[self.collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:[NSIndexPath indexPathForItem:i inSection:0]] forKey:[NSIndexPath indexPathForItem:i inSection:0]];
}

2) Add this method to your UICollectionViewCell subclass:

- (void)setupPlayer:(PHAsset *)phAsset {
typedef void (^player) (void);
player play = ^{
    NSString __autoreleasing *serialDispatchCellQueueDescription = ([NSString stringWithFormat:@"%@ serial cell queue", self]);
    dispatch_queue_t __autoreleasing serialDispatchCellQueue = dispatch_queue_create([serialDispatchCellQueueDescription UTF8String], DISPATCH_QUEUE_SERIAL);
    dispatch_async(serialDispatchCellQueue, ^{
        __weak typeof(self) weakSelf = self;
        __weak typeof(PHAsset) *weakPhAsset = phAsset;
        [[PHImageManager defaultManager] requestPlayerItemForVideo:weakPhAsset options:nil
                                                     resultHandler:^(AVPlayerItem * _Nullable playerItem, NSDictionary * _Nullable info) {
                                                         if(![[info objectForKey:PHImageResultIsInCloudKey] boolValue]) {
                                                             AVPlayer __autoreleasing *player = [AVPlayer playerWithPlayerItem:playerItem];
                                                             __block typeof(AVPlayerLayer) *weakPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
                                                             [weakPlayerLayer setFrame:weakSelf.contentView.bounds]; //CGRectMake(self.contentView.bounds.origin.x, self.contentView.bounds.origin.y, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height * (9.0/16.0))];
                                                             [weakPlayerLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                                                             [weakPlayerLayer setBorderWidth:0.25f];
                                                             [weakPlayerLayer setBorderColor:[UIColor whiteColor].CGColor];
                                                             [player play];
                                                             dispatch_async(dispatch_get_main_queue(), ^{
                                                                 [weakSelf.contentView.layer addSublayer:weakPlayerLayer];
                                                             });
                                                         }
                                                     }];
    });

    }; play();
}

3) Call the method above from your UICollectionView delegate this way:

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{

    if ([[self.cellCache objectForKey:indexPath] isKindOfClass:[AssetPickerCollectionViewCell class]])
        [self.cellCache setObject:(AssetPickerCollectionViewCell *)[collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:indexPath] forKey:indexPath];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        NSInvocationOperation *invOp = [[NSInvocationOperation alloc]
                                        initWithTarget:(AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath]
                                        selector:@selector(setupPlayer:) object:AppDelegate.sharedAppDelegate.assetsFetchResults[indexPath.item]];
        [[NSOperationQueue mainQueue] addOperation:invOp];
    });

    return (AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath];
}

By the way, here's how you would populate a PHFetchResult collection with all videos in the Video folder of the Photos app:

// Collect all videos in the Videos folder of the Photos app
- (PHFetchResult *)assetsFetchResults {
    __block PHFetchResult *i = self->_assetsFetchResults;
    if (!i) {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumVideos options:nil];
            PHAssetCollection *collection = smartAlbums.firstObject;
            if (![collection isKindOfClass:[PHAssetCollection class]]) collection = nil;
            PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
            allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
            i = [PHAsset fetchAssetsInAssetCollection:collection options:allPhotosOptions];
            self->_assetsFetchResults = i;
        });
    }
    NSLog(@"assetsFetchResults (%ld)", self->_assetsFetchResults.count);

    return i;
}

If you want to filter videos that are local (and not in iCloud), which is what I'd assume, seeing as you're looking for smooth-scrolling:

// Keep only videos stored in the local library (filter out iCloud-only assets)
- (NSArray *)phAssets {
    NSMutableArray *assets = [NSMutableArray arrayWithCapacity:self.assetsFetchResults.count];
    [[self assetsFetchResults] enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
        if (asset.sourceType == PHAssetSourceTypeUserLibrary)
            [assets addObject:asset];
    }];

    return [assets copy];
}
James Bush
  • 1,485
  • 14
  • 19
  • Reuse isn't the same as unique. How does it know what to reuse? That's just reusing the view; my other objects are independent of the view. When you swap the view, you still have the old AVPlayerItem/AVPlayer/etc. To get a new one, you have to associate the distinction by assigning those objects to a specific cell at a specific index. When I did not do it this way, scrolling was smooth, but only the cells initially loaded would appear; the rest were duplicates of them. – James Bush Jul 15 '16 at 06:26
  • I'm amending my comment to better answer your question about mapping the right videos to cells. In short, I don't do that anymore. What I do to ensure that an asset matches the intended cell is to use the asset identifier to set the tag of the cell. Then, I just check to make sure that the cell tag and the asset identifier match before I proceed; it's one call after another. This prevents the problem I was having. Another solution I have successfully used is to reference all views in a cell by tag number, never by name. And I always make new references to them every time I use them. – James Bush May 03 '17 at 20:14
  • @John Sorry, I don't; but, the translation shouldn't be too hard. – James Bush Aug 21 '17 at 17:42
  • @JamesBush can you help me out in this https://stackoverflow.com/questions/51787983/autoplay-video-in-uitableviewcell-hiccupps ? – Rahul Vyas Aug 16 '18 at 05:31
-1

I managed to create a horizontal feed-like view with an AVPlayer in each cell. I did it like so:

  1. Buffering - create a manager so you can preload (buffer) the videos. The number of AVPlayers you want to buffer depends on the experience you are looking for. In my app I manage only 3 AVPlayers: the one playing now, plus the previous and next players, which are being buffered. All the buffering manager does is make sure that the correct video is being buffered at any given point.

  2. Reused cells - let the TableView / CollectionView reuse the cells in cellForRowAtIndexPath:. All you have to do after you dequeue the cell is pass it its correct player (I just give the buffering manager the cell's indexPath and it returns the correct one).

  3. AVPlayer KVO's - every time the buffering manager gets a call to load a new video to buffer, create all of the AVPlayer's assets and notifications, like so:

// player

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
    self.videoContainer.playerLayer.player = self.videoPlayer;
    self.asset = [AVURLAsset assetWithURL:[NSURL URLWithString:self.videoUrl]];
    NSString *tracksKey = @"tracks";
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
                NSError *error;
                AVKeyValueStatus status = [self.asset statusOfValueForKey:tracksKey error:&error];

                if (status == AVKeyValueStatusLoaded) {
                    self.playerItem = [AVPlayerItem playerItemWithAsset:self.asset];
                    // Register the KVO notifications we need at run time on the player & item:
                    // a notification if the current item's status has changed
                    [self.playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:contextItemStatus];
                    // a notification if the playing item has not yet started to buffer
                    [self.playerItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferEmpty];
                    // a notification if the playing item has fully buffered
                    [self.playerItem addObserver:self forKeyPath:@"playbackBufferFull" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferFull];
                    // a notification if the playing item is likely to keep up with the current buffering rate
                    [self.playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:contextPlaybackLikelyToKeepUp];
                    // a notification to get information about the duration of the playing item
                    [self.playerItem addObserver:self forKeyPath:@"duration" options:NSKeyValueObservingOptionNew context:contextDurationUpdate];
                    // a notification for when the item has finished playing
                    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(itemDidFinishedPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.playerItem];
                    self.didRegisterWhenLoad = YES;

                    self.videoPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];

                    // a notification if the player has changed its rate (play/pause)
                    [self.videoPlayer addObserver:self forKeyPath:@"rate" options:NSKeyValueObservingOptionNew context:contextRateDidChange];
                    // a notification to get the buffering rate of the current playing item
                    [self.videoPlayer addObserver:self forKeyPath:@"currentItem.loadedTimeRanges" options:NSKeyValueObservingOptionNew context:contextTimeRanges];
                }
            });
        }];
    });
});

where videoContainer is the view you want to add the player to.
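The three-player buffering idea from step 1 could be sketched roughly like this in Swift. Note this is a hypothetical illustration, not the answer's actual code: the class name `PlayerBufferManager` and the methods `focus(on:)` and `player(at:)` are made up for the example.

```swift
import AVFoundation

// Hypothetical sketch of the three-player buffering manager described in step 1.
// Keeps an AVPlayer for the current index plus its two neighbours; everything
// else is dropped, so at most three players exist at once.
final class PlayerBufferManager {
    private var players: [Int: AVPlayer] = [:]   // index -> preloaded player
    private let urls: [URL]

    init(urls: [URL]) { self.urls = urls }

    // Call when the visible page settles on `index`.
    func focus(on index: Int) {
        let window = Set([index - 1, index, index + 1].filter { urls.indices.contains($0) })
        // Drop players outside the window so only three stay alive.
        for stale in players.keys where !window.contains(stale) {
            players[stale] = nil
        }
        // Preload the missing ones; the asset loads its keys off the main thread.
        for i in window where players[i] == nil {
            let asset = AVURLAsset(url: urls[i])
            asset.loadValuesAsynchronously(forKeys: ["tracks"]) { [weak self] in
                DispatchQueue.main.async {
                    self?.players[i] = AVPlayer(playerItem: AVPlayerItem(asset: asset))
                }
            }
        }
    }

    // Called from cellForItemAt: hand the dequeued cell its already-buffered player.
    func player(at index: Int) -> AVPlayer? { players[index] }
}
```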

Let me know if you need any help or more explanations

Good luck :)

YYfim
  • 1,402
  • 1
  • 9
  • 24
  • Thanks for doing this, but I don't really understand how any of this would really solve the performance problems that I've observed with AVFoundation. What's your scroll performance like? I'm pretty sure that you're still going to drop a bunch of frames every time you associate an AVPlayer with an AVPlayerItem. Is that not the case? If possible, can you upload a little screencast of what the scroll performance looks like? – Antonio Jan 22 '16 at 19:46
  • I'm using paging enabled in the collection view and when a video starts playing and no scroll is happening I associate all the items and the assets of the buffered players – YYfim Jan 22 '16 at 20:00
  • 2
    So you don't actually do any work until the page has settled? Because that would explain how you get an acceptable user experience. You're still stalling the main thread like crazy, but there aren't any scroll animations to interrupt, so it's not noticeable. Does that sound right? – Antonio Jan 22 '16 at 20:51
  • That's not accurate; there is a main thread block (according to Instruments, about 150 ms +/-) when the AVPlayerItem gets created. All other AVPlayer initializations happen on a background queue. – YYfim Jan 23 '16 at 05:43
  • 2
    Which part is not accurate? A main thread stall of 150ms is going to cause a lot of frame drops, and it'll ruin any scroll view animations that are occurring at the same time. You haven't actually found a way to prevent that, right? – Antonio Jan 25 '16 at 20:44
  • Antonio: see the answer I submitted above; with that code, I can display up to 15 videos at a time with lossless playback. Moreover, I can scroll to another 15 without any stuttering. – James Bush Jul 13 '16 at 21:15
  • 1
    This is nothing. This is absolutely nothing that would solve any kind of problem whatsoever. What you're describing that you've done is not reflected in that code. – James Bush May 03 '17 at 20:16
  • Could you please explain why you dispatched an already thread-dispatched method to other threads? Makes zero sense... – James Bush Aug 17 '18 at 18:50
-1

I've played around with all the answers above and found out that they're only true up to a point.

The easiest and simplest approach that has worked for me so far is to assign your AVPlayerItem to your AVPlayer instance on a background thread. I noticed that assigning the AVPlayerItem to the player on the main thread (even after the AVPlayerItem object is ready) always takes a toll on your performance and frame rate.

Swift 4

ex.

let mediaUrl: URL = ... // your media URL
let player = AVPlayer()
let playerItem = AVPlayerItem(url: mediaUrl)

DispatchQueue.global(qos: .default).async {
    player.replaceCurrentItem(with: playerItem)
}
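Since the question's edit also notes that deallocating an AVPlayer stalls the main thread, the same trick can be applied to teardown. This is a hedged sketch (the `tearDown` helper is illustrative, not part of the answer): let a background block capture the last strong reference to the player, so the final release and its synchronization happen off the main thread.

```swift
import AVFoundation

// Illustrative helper: release a player off the main thread so its
// teardown work doesn't drop frames during scrolling.
func tearDown(_ player: AVPlayer) {
    player.pause()
    DispatchQueue.global(qos: .default).async {
        // The block captures `player`; if the caller drops its own
        // reference, the final release (and dealloc) happens on this
        // background queue instead of the main thread.
        player.replaceCurrentItem(with: nil)
    }
}
```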
melaka
  • 699
  • 7
  • 23