168

I have a short .mp4 video file that I've added to my current Xcode 6 beta project.

I want to play the video in my app.

After hours of searching, I can't find anything remotely helpful. Is there a way to accomplish this with Swift, or do you have to use Objective-C? Can I get pointed in the right direction? I can't be the only one wondering this.

hbk
By Design

9 Answers

322

Sure you can use Swift!

1. Adding the video file

Add the video (let's call it video.m4v) to your Xcode project.

2. Checking that your video is in the bundle

Open the Project Navigator (cmd + 1).

Then select your project root > your Target > Build Phases > Copy Bundle Resources.

Your video MUST be here. If it's not, then you should add it using the plus button

[Screenshot: the video file listed under Copy Bundle Resources]
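
If you want to double-check at runtime that the file really made it into the bundle, here is a minimal sketch (assuming the same video.m4v used below):

// Prints the bundled URL if the file was copied, otherwise logs a hint.
if let url = Bundle.main.url(forResource: "video", withExtension: "m4v") {
    print("Found bundled video at \(url)")
} else {
    print("video.m4v is missing from the bundle, re-check Copy Bundle Resources")
}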

3. Code

Open your View Controller and write this code.

import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        playVideo()
    }

    private func playVideo() {
        guard let path = Bundle.main.path(forResource: "video", ofType:"m4v") else {
            debugPrint("video.m4v not found")
            return
        }
        let player = AVPlayer(url: URL(fileURLWithPath: path))
        let playerController = AVPlayerViewController()
        playerController.player = player
        present(playerController, animated: true) {
            player.play()
        }
    }
}
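
For reference, the same method can be written a bit more compactly by asking the bundle for a URL directly; a minimal sketch, assuming the same video.m4v file:

private func playVideo() {
    // url(forResource:withExtension:) returns a file URL, so no path-to-URL conversion is needed.
    guard let url = Bundle.main.url(forResource: "video", withExtension: "m4v") else {
        debugPrint("video.m4v not found")
        return
    }
    let playerController = AVPlayerViewController()
    playerController.player = AVPlayer(url: url)
    present(playerController, animated: true) {
        playerController.player?.play()
    }
}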
Luca Angeletti
  • 7
    The most important part of this solution is to check the "Add to targets" -> project name field when copying the resource into the project. Thanks, this solution worked for me. – Pramod Jan 21 '15 at 06:26
  • 2
    The MPMoviePlayerController class is formally deprecated in iOS 9. (The MPMoviePlayerViewController class is also formally deprecated.) To play video content in iOS 9 and later, instead use the AVPictureInPictureController or AVPlayerViewController class from the AVKit framework, or the WKWebView class from WebKit. [Apple reference guide](https://developer.apple.com/library/prerelease/ios/documentation/MediaPlayer/Reference/MPMoviePlayerController_Class/index.html) – HGDev Sep 20 '15 at 15:28
  • can you please check these http://stackoverflow.com/questions/32952056/video-file-not-playing-that-is-on-server –  Oct 06 '15 at 08:31
  • Having trouble with this in latest xcode beta 7.2 - `initializer for conditional binding must have optional type not nsurl` – Brian Wigginton Nov 04 '15 at 06:04
  • That's because the `NSURL` initialiser no longer returns an optional. Remove the line `url = NSURL(fileURLWithPath: path),` and on the next line replace the variable `url` with `NSURL(fileURLWithPath: path)`. – Luca Angeletti Nov 04 '15 at 10:26
  • @appzYourLife great answer! but how can I play streaming video using your solution on swift 2.1? I searched the web but i couldn't play streaming video with avplayer ! – anonymox Dec 21 '15 at 10:29
  • hi @appzYourLife any chance you could help on this question about capturing video? http://stackoverflow.com/questions/34704032/swift-video-records-at-one-size-but-renders-at-wrong-size – Crashalot Jan 14 '16 at 01:39
  • One question , as the ViewController screen will always create a new instance when the screen appears on screen ... does this mean that the firstAppear Bool will always be true ? – PoolHallJunkie Apr 11 '16 at 10:18
  • Hello, my application only supports portrait mode. If I start the video in landscape and tap the Done button, my view stays in landscape, but I want it back in portrait since the app only supports portrait. How can I do this in Swift? – Akash Raghani Jun 10 '16 at 05:29
  • this did not work for me. AVPlayerViewController does not exist. If it has to be user defined vc, then it should not prefix with AV -- confusing. – mobibob Aug 11 '17 at 21:43
  • 1
    @mobibob Of course [AVPlayerViewController](https://developer.apple.com/documentation/avkit/avplayerviewcontroller) exists :-) Did you `import AVKit`? – Luca Angeletti Aug 11 '17 at 22:11
  • @LucaAngeletti i was experiencing a problem with Xcode auto-complete. Once i fixed that, AVPlayerViewController showed and compiled. I did not think to update my comment. – mobibob Aug 21 '17 at 14:34
  • @LucaAngeletti I want to play my local system .mp4 file on my real device. The location of the mp4 file is Desktop or Network. – Raksha Saini Aug 24 '17 at 12:50
  • @Rex Please [ask](https://stackoverflow.com/help/how-to-ask) a new question – Luca Angeletti Aug 24 '17 at 13:45
  • 1
    tip: I made a stupid mistake to use `URL(string: fileString)` that will never load the video file. Be careful. – Developer Sheldon May 13 '19 at 22:12
  • Thank you. I don't know why, after all this time, Apple has not automated this as it has for Assets. – Jeremy Andrews Aug 17 '19 at 16:39
34

Another Swift 3 example; the solution above did not work for me.

// Expects `file` in the form "name.extension", e.g. "video.mp4".
private func playVideo(from file: String) {
    let file = file.components(separatedBy: ".")

    guard let path = Bundle.main.path(forResource: file[0], ofType: file[1]) else {
        debugPrint( "\(file.joined(separator: ".")) not found")
        return
    }
    let player = AVPlayer(url: URL(fileURLWithPath: path))

    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.bounds
    self.view.layer.addSublayer(playerLayer)
    player.play()
}

Usage:

playVideo(from: "video.extension")

Note: Check Copy Bundle Resources under Build Phases to ensure that the video is available to the Project.
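
Since playerLayer.frame is set once from view.bounds, the video layer will not follow rotation or other layout changes. A minimal sketch of one way to handle that in the same view controller, assuming the layer is kept in a property instead of a local constant:

private var playerLayer: AVPlayerLayer?

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Keep the video layer in sync with the view's current bounds.
    playerLayer?.frame = view.bounds
}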

nyxee
  • This solution works with other layers together, it is a better solution than the majority voted one, thanks! – Mike Lee Feb 23 '18 at 10:57
15

You can set up AVPlayer in another way that opens up full customization of your video player screen.

Swift 2.3

  1. Create a UIView subclass for playing video (basically you can use any UIView object; all that's really needed is an AVPlayerLayer. I set it up this way because it's much clearer to me).

    import AVFoundation
    import UIKit
    
    class PlayerView: UIView {
    
    override class func layerClass() -> AnyClass {
        return AVPlayerLayer.self
    }
    
    var player:AVPlayer? {
        set {
            if let layer = layer as? AVPlayerLayer {
                layer.player = newValue
            }
        }
        get {
            if let layer = layer as? AVPlayerLayer {
                return layer.player
            } else {
                return nil
            }
        }
    }
    }
    
  2. Set up your player

    import AVFoundation
    import Foundation
    
    protocol VideoPlayerDelegate {
        func downloadedProgress(progress:Double)
        func readyToPlay()
        func didUpdateProgress(progress:Double)
        func didFinishPlayItem()
        func didFailPlayToEnd()
    }
    
    let videoContext:UnsafeMutablePointer<Void> = nil
    
    class VideoPlayer : NSObject {
    
        private var assetPlayer:AVPlayer?
        private var playerItem:AVPlayerItem?
        private var urlAsset:AVURLAsset?
        private var videoOutput:AVPlayerItemVideoOutput?
    
        private var assetDuration:Double = 0
        private var playerView:PlayerView?
    
        private var autoRepeatPlay:Bool = true
        private var autoPlay:Bool = true
    
        var delegate:VideoPlayerDelegate?
    
        var playerRate:Float = 1 {
            didSet {
                if let player = assetPlayer {
                    player.rate = playerRate > 0 ? playerRate : 0.0
                }
            }
        }
    
        var volume:Float = 1.0 {
            didSet {
                if let player = assetPlayer {
                    player.volume = volume > 0 ? volume : 0.0
                }
            }
        }
    
        // MARK: - Init
    
        convenience init(urlAsset:NSURL, view:PlayerView, startAutoPlay:Bool = true, repeatAfterEnd:Bool = true) {
            self.init()
    
            playerView = view
            autoPlay = startAutoPlay
            autoRepeatPlay = repeatAfterEnd
    
            if let playView = playerView, let playerLayer = playView.layer as? AVPlayerLayer {
                playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
            }
            initialSetupWithURL(urlAsset)
            prepareToPlay()
        }
    
        override init() {
            super.init()
        }
    
        // MARK: - Public
    
        func isPlaying() -> Bool {
            if let player = assetPlayer {
                return player.rate > 0
            } else {
                return false
            }
        }
    
        func seekToPosition(seconds:Float64) {
            if let player = assetPlayer {
                pause()
                if let timeScale = player.currentItem?.asset.duration.timescale {
                    player.seekToTime(CMTimeMakeWithSeconds(seconds, timeScale), completionHandler: { (complete) in
                        self.play()
                    })
                }
            }
        }
    
        func pause() {
            if let player = assetPlayer {
                player.pause()
            }
        }
    
        func play() {
            if let player = assetPlayer {
                if (player.currentItem?.status == .ReadyToPlay) {
                    player.play()
                    player.rate = playerRate
                }
            }
        }
    
        func cleanUp() {
            if let item = playerItem {
                item.removeObserver(self, forKeyPath: "status")
                item.removeObserver(self, forKeyPath: "loadedTimeRanges")
            }
            NSNotificationCenter.defaultCenter().removeObserver(self)
            assetPlayer = nil
            playerItem = nil
            urlAsset = nil
        }
    
        // MARK: - Private
    
        private func prepareToPlay() {
            let keys = ["tracks"]
            if let asset = urlAsset {
                asset.loadValuesAsynchronouslyForKeys(keys, completionHandler: {
                    dispatch_async(dispatch_get_main_queue(), {
                        self.startLoading()
                    })
                })
            }
        }
    
        private func startLoading(){
            var error:NSError?
            guard let asset = urlAsset else {return}
            let status:AVKeyValueStatus = asset.statusOfValueForKey("tracks", error: &error)
    
            if status == AVKeyValueStatus.Loaded {
                assetDuration = CMTimeGetSeconds(asset.duration)
    
                let videoOutputOptions = [kCVPixelBufferPixelFormatTypeKey as String : Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)]
                videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: videoOutputOptions)
                playerItem = AVPlayerItem(asset: asset)
    
                if let item = playerItem {
                    item.addObserver(self, forKeyPath: "status", options: .Initial, context: videoContext)
                    item.addObserver(self, forKeyPath: "loadedTimeRanges", options: [.New, .Old], context: videoContext)
    
                    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(playerItemDidReachEnd), name: AVPlayerItemDidPlayToEndTimeNotification, object: nil)
                    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(didFailedToPlayToEnd), name: AVPlayerItemFailedToPlayToEndTimeNotification, object: nil)
    
                    if let output = videoOutput {
                        item.addOutput(output)
    
                        item.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
                        assetPlayer = AVPlayer(playerItem: item)
    
                        if let player = assetPlayer {
                            player.rate = playerRate
                        }
    
                        addPeriodicalObserver()
                        if let playView = playerView, let layer = playView.layer as? AVPlayerLayer {
                            layer.player = assetPlayer
                            print("player created")
                        }
                    }
                }
            }        
        }
    
        private func addPeriodicalObserver() {
            let timeInterval = CMTimeMake(1, 1)
    
            if let player = assetPlayer {
                player.addPeriodicTimeObserverForInterval(timeInterval, queue: dispatch_get_main_queue(), usingBlock: { (time) in
                    self.playerDidChangeTime(time)
                })
            }
        }
    
        private func playerDidChangeTime(time:CMTime) {
            if let player = assetPlayer {
                let timeNow = CMTimeGetSeconds(player.currentTime())
                let progress = timeNow / assetDuration
    
                delegate?.didUpdateProgress(progress)
            }
        }
    
        @objc private func playerItemDidReachEnd() {
            delegate?.didFinishPlayItem()
    
            if let player = assetPlayer {
                player.seekToTime(kCMTimeZero)
                if autoRepeatPlay == true {
                    play()
                }
            }
        }
    
        @objc private func didFailedToPlayToEnd() {
            delegate?.didFailPlayToEnd()
        }
    
        private func playerDidChangeStatus(status:AVPlayerStatus) {
            if status == .Failed {
                print("Failed to load video")
            } else if status == .ReadyToPlay, let player = assetPlayer {
                volume = player.volume
                delegate?.readyToPlay()
    
                if autoPlay == true && player.rate == 0.0 {
                    play()
                }
            }
        }
    
        private func moviewPlayerLoadedTimeRangeDidUpdated(ranges:Array<NSValue>) {
            var maximum:NSTimeInterval = 0
            for value in ranges {
                let range:CMTimeRange = value.CMTimeRangeValue
                let currentLoadedTimeRange = CMTimeGetSeconds(range.start) + CMTimeGetSeconds(range.duration)
                if currentLoadedTimeRange > maximum {
                    maximum = currentLoadedTimeRange
                }
            }
            let progress:Double = assetDuration == 0 ? 0.0 : Double(maximum) / assetDuration
    
            delegate?.downloadedProgress(progress)
        }
    
        deinit {
            cleanUp()
        }
    
        private func initialSetupWithURL(url:NSURL) {
            let options = [AVURLAssetPreferPreciseDurationAndTimingKey : true]
            urlAsset = AVURLAsset(URL: url, options: options)
        }
    
        // MARK: - Observations
    
        override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
            if context == videoContext {
                if let key = keyPath {
                    if key == "status", let player = assetPlayer {
                        playerDidChangeStatus(player.status)
                    } else if key == "loadedTimeRanges", let item = playerItem {
                        moviewPlayerLoadedTimeRangeDidUpdated(item.loadedTimeRanges)
                    }
                }
            }
        }
    
    }
    
  3. Usage:

Assume you have a view:

@IBOutlet private weak var playerView: PlayerView!
private var videoPlayer:VideoPlayer?

and in viewDidLoad()

    private func preparePlayer() {
        if let filePath = NSBundle.mainBundle().pathForResource("intro", ofType: "m4v") {
            let fileURL = NSURL(fileURLWithPath: filePath)
            videoPlayer = VideoPlayer(urlAsset: fileURL, view: playerView)
            if let player = videoPlayer {
                player.playerRate = 0.67
            }
        }
    }
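
For completeness, a minimal sketch of the call site (assuming the outlet above is wired up in the storyboard):

    override func viewDidLoad() {
        super.viewDidLoad()
        preparePlayer() // creates the VideoPlayer, which auto-plays by default
    }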

Objective-C

PlayerView.h

    #import <AVFoundation/AVFoundation.h>
    #import <UIKit/UIKit.h>

    /*!
     @class PlayerView
     @discussion Represents a view for playing video. Layer - AVPlayerLayer
     @availability iOS 7 and Up
     */
    @interface PlayerView : UIView

    /*!
     @var player
     @discussion Player object
     */
    @property (strong, nonatomic) AVPlayer *player;

    @end

PlayerView.m

    #import "PlayerView.h"

    @implementation PlayerView

    #pragma mark - LifeCycle

    + (Class)layerClass
    {
        return [AVPlayerLayer class];
    }

    #pragma mark - Setter/Getter

    - (AVPlayer*)player
    {
        return [(AVPlayerLayer *)[self layer] player];
    }

    - (void)setPlayer:(AVPlayer *)player
    {
        [(AVPlayerLayer *)[self layer] setPlayer:player];
    }

    @end

VideoPlayer.h

    #import <AVFoundation/AVFoundation.h>
    #import <UIKit/UIKit.h>
    #import "PlayerView.h"

    /*!
     @protocol VideoPlayerDelegate
     @discussion Events from VideoPlayer
     */
    @protocol VideoPlayerDelegate <NSObject>

    @optional

    /*!
     @brief Called whenever the playback progress of the played item changes
     @param progress
     Playing progress
     */
    - (void)progressDidUpdate:(CGFloat)progress;

    /*!
     @brief Called whenever the download progress of the item changes
     @param progress
     Playing progress
     */
    - (void)downloadingProgress:(CGFloat)progress;

    /*!
     @brief Called when playing time changed
     @param time
     Playing progress
     */
    - (void)progressTimeChanged:(CMTime)time;

    /*!
     @brief Called when the player finishes playing the item
    */
    - (void)playerDidPlayItem;

    /*!
     @brief Called when the player is ready to play the item
     */
    - (void)isReadyToPlay;

    @end

    /*!
     @class VideoPlayer
     @discussion Video Player
     @code
         self.videoPlayer = [[VideoPlayer alloc] initVideoPlayerWithURL:someURL playerView:self.playerView];
         [self.videoPlayer prepareToPlay];
         self.videoPlayer.delegate = self; //optional

         //after when required play item
         [self.videoPlayer play];
     @endcode
     */
    @interface VideoPlayer : NSObject

    /*!
     @var delegate
     @abstract Delegate for VideoPlayer
     @discussion Set object to this property for getting response and notifications from this class
     */
    @property (weak, nonatomic) id <VideoPlayerDelegate> delegate;

    /*!
     @var volume
     @discussion volume of played asset
     */
    @property (assign, nonatomic) CGFloat volume;

    /*!
     @var autoRepeat
     @discussion indicates whether the player should repeat the content when it finishes playing
     */
    @property (assign, nonatomic) BOOL autoRepeat;

    /*!
     @brief Create player with asset URL
     @param urlAsset
     Source URL
     @result
     instance of VideoPlayer
     */
    - (instancetype)initVideoPlayerWithURL:(NSURL *)urlAsset;

    /*!
     @brief Create player with asset URL and configure selected view for showing result
     @param urlAsset
     Source URL
     @param view
     View on which the result will be shown
     @result
     instance of VideoPlayer
     */
    - (instancetype)initVideoPlayerWithURL:(NSURL *)urlAsset playerView:(PlayerView *)view;

    /*!
     @brief Call this method after creating player to prepare player to play
    */
    - (void)prepareToPlay;

    /*!
     @brief Play item
     */
    - (void)play;
    /*!
     @brief Pause item
     */
    - (void)pause;
    /*!
     @brief Stop item
     */
    - (void)stop;

    /*!
     @brief Seek to the required position in the item and play if required
     @param progressValue
     % of position to seek to
     @param isPlaying
     YES if the player should start playing the item implicitly
     */
    - (void)seekPositionAtProgress:(CGFloat)progressValue withPlayingStatus:(BOOL)isPlaying;

    /*!
     @brief Player state
     @result
     YES - if playing, NO if not playing
     */
    - (BOOL)isPlaying;

    /*!
     @brief Indicates whether the player can provide a CVPixelBufferRef frame from the item
     @result
     YES / NO
     */
    - (BOOL)canProvideFrame;

    /*!
     @brief CVPixelBufferRef frame from item
     @result
     CVPixelBufferRef frame
     */
    - (CVPixelBufferRef)getCurrentFramePicture;

    @end

VideoPlayer.m

    #import "VideoPlayer.h"

    typedef NS_ENUM(NSUInteger, InternalStatus) {
        InternalStatusPreparation,
        InternalStatusReadyToPlay,
    };

    static const NSString *ItemStatusContext;

    @interface VideoPlayer()

    @property (strong, nonatomic) AVPlayer *assetPlayer;
    @property (strong, nonatomic) AVPlayerItem *playerItem;
    @property (strong, nonatomic) AVURLAsset *urlAsset;
    @property (strong, atomic) AVPlayerItemVideoOutput *videoOutput;

    @property (assign, nonatomic) CGFloat assetDuration;
    @property (strong, nonatomic) PlayerView *playerView;

    @property (assign, nonatomic) InternalStatus status;

    @end

    @implementation VideoPlayer

    #pragma mark - LifeCycle

    - (instancetype)initVideoPlayerWithURL:(NSURL *)urlAsset
    {
        if (self = [super init]) {
            [self initialSetupWithURL:urlAsset];
        }
        return self;
    }

    - (instancetype)initVideoPlayerWithURL:(NSURL *)urlAsset playerView:(PlayerView *)view
    {
        if (self = [super init]) {
            ((AVPlayerLayer *)view.layer).videoGravity = AVLayerVideoGravityResizeAspectFill;
            [self initialSetupWithURL:urlAsset playerView:view];
        }
        return self;
    }

    #pragma mark - Public

    - (void)play
    {
        if ((self.assetPlayer.currentItem) && (self.assetPlayer.currentItem.status == AVPlayerItemStatusReadyToPlay)) {
            [self.assetPlayer play];
        }
    }

    - (void)pause
    {
        [self.assetPlayer pause];
    }

    - (void)seekPositionAtProgress:(CGFloat)progressValue withPlayingStatus:(BOOL)isPlaying
    {
        [self.assetPlayer pause];
        int32_t timeScale = self.assetPlayer.currentItem.asset.duration.timescale;

        __weak typeof(self) weakSelf = self;
        [self.assetPlayer seekToTime:CMTimeMakeWithSeconds(progressValue, timeScale) completionHandler:^(BOOL finished) {
            DLog(@"SEEK To time %f - success", progressValue);
            if (isPlaying && finished) {
                [weakSelf.assetPlayer play];
            }
        }];
    }

    - (void)setPlayerVolume:(CGFloat)volume
    {
        self.assetPlayer.volume = volume > .0 ? MAX(volume, 0.7) : 0.0f;
        [self.assetPlayer play];
    }

    - (void)setPlayerRate:(CGFloat)rate
    {
        self.assetPlayer.rate = rate > .0 ? rate : 0.0f;
    }

    - (void)stop
    {
        [self.assetPlayer seekToTime:kCMTimeZero];
        self.assetPlayer.rate =.0f;
    }

    - (BOOL)isPlaying
    {
        return self.assetPlayer.rate > 0 ? YES : NO;
    }

    #pragma mark - Private

    - (void)initialSetupWithURL:(NSURL *)url
    {
        self.status = InternalStatusPreparation;
        [self setupPlayerWithURL:url];
    }

    - (void)initialSetupWithURL:(NSURL *)url playerView:(PlayerView *)view
    {
        [self setupPlayerWithURL:url];
        self.playerView = view;
    }

    - (void)setupPlayerWithURL:(NSURL *)url
    {
        NSDictionary *assetOptions = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES };
        self.urlAsset = [AVURLAsset URLAssetWithURL:url options:assetOptions];
    }

    - (void)prepareToPlay
    {
        NSArray *keys = @[@"tracks"];
        __weak VideoPlayer *weakSelf = self;
        [weakSelf.urlAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                [weakSelf startLoading];
            });
        }];
    }

    - (void)startLoading
    {
        NSError *error;
        AVKeyValueStatus status = [self.urlAsset statusOfValueForKey:@"tracks" error:&error];
        if (status == AVKeyValueStatusLoaded) {
            self.assetDuration = CMTimeGetSeconds(self.urlAsset.duration);
            NSDictionary* videoOutputOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
            self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:videoOutputOptions];
            self.playerItem = [AVPlayerItem playerItemWithAsset: self.urlAsset];

            [self.playerItem addObserver:self
                              forKeyPath:@"status"
                                 options:NSKeyValueObservingOptionInitial
                                 context:&ItemStatusContext];
            [self.playerItem addObserver:self
                              forKeyPath:@"loadedTimeRanges"
                                 options:NSKeyValueObservingOptionNew|NSKeyValueObservingOptionOld
                                 context:&ItemStatusContext];
            [[NSNotificationCenter defaultCenter] addObserver:self
                                                     selector:@selector(playerItemDidReachEnd:)
                                                         name:AVPlayerItemDidPlayToEndTimeNotification
                                                       object:self.playerItem];
            [[NSNotificationCenter defaultCenter] addObserver:self
                                                     selector:@selector(didFailedToPlayToEnd)
                                                         name:AVPlayerItemFailedToPlayToEndTimeNotification
                                                       object:nil];

            [self.playerItem addOutput:self.videoOutput];
            self.assetPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];
            [self addPeriodicalObserver];
            [((AVPlayerLayer *)self.playerView.layer) setPlayer:self.assetPlayer];
            DLog(@"Player created");
        } else {
            DLog(@"The asset's tracks were not loaded:\n%@", error.localizedDescription);
        }
    }

    #pragma mark - Observation

    - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
    {
        BOOL isOldKey = [change[NSKeyValueChangeNewKey] isEqual:change[NSKeyValueChangeOldKey]];

        if (!isOldKey) {
            if (context == &ItemStatusContext) {
                if ([keyPath isEqualToString:@"status"] && !self.status) {
                    if (self.assetPlayer.status == AVPlayerItemStatusReadyToPlay) {
                        self.status = InternalStatusReadyToPlay;
                    }
                    [self moviePlayerDidChangeStatus:self.assetPlayer.status];
                } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
                    [self moviewPlayerLoadedTimeRangeDidUpdated:self.playerItem.loadedTimeRanges];
                }
            }
        }
    }

    - (void)moviePlayerDidChangeStatus:(AVPlayerStatus)status
    {
        if (status == AVPlayerStatusFailed) {
            DLog(@"Failed to load video");
        } else if (status == AVPlayerStatusReadyToPlay) {
            DLog(@"Player ready to play");
            self.volume = self.assetPlayer.volume;

            if (self.delegate && [self.delegate respondsToSelector:@selector(isReadyToPlay)]) {
                [self.delegate isReadyToPlay];
            }
        }
    }

    - (void)moviewPlayerLoadedTimeRangeDidUpdated:(NSArray *)ranges
    {
        NSTimeInterval maximum = 0;

        for (NSValue *value in ranges) {
            CMTimeRange range;
            [value getValue:&range];
            NSTimeInterval currenLoadedRangeTime = CMTimeGetSeconds(range.start) + CMTimeGetSeconds(range.duration);
            if (currenLoadedRangeTime > maximum) {
                maximum = currenLoadedRangeTime;
            }
        }
        CGFloat progress = (self.assetDuration == 0) ? 0 : maximum / self.assetDuration;
        if (self.delegate && [self.delegate respondsToSelector:@selector(downloadingProgress:)]) {
            [self.delegate downloadingProgress:progress];
        }
    }

    - (void)playerItemDidReachEnd:(NSNotification *)notification
    {
        if (self.delegate && [self.delegate respondsToSelector:@selector(playerDidPlayItem)]){
            [self.delegate playerDidPlayItem];
        }
        [self.assetPlayer seekToTime:kCMTimeZero];
        if (self.autoRepeat) {
            [self.assetPlayer play];
        }
    }

    - (void)didFailedToPlayToEnd
    {
        DLog(@"Failed play video to the end");
    }

    - (void)addPeriodicalObserver
    {
        CMTime interval = CMTimeMake(1, 1);
        __weak typeof(self) weakSelf = self;
        [self.assetPlayer addPeriodicTimeObserverForInterval:interval queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
            [weakSelf playerTimeDidChange:time];
        }];
    }

    - (void)playerTimeDidChange:(CMTime)time
    {
        double timeNow = CMTimeGetSeconds(self.assetPlayer.currentTime);
        if (self.delegate && [self.delegate respondsToSelector:@selector(progressDidUpdate:)]) {
            [self.delegate progressDidUpdate:(CGFloat) (timeNow / self.assetDuration)];
        }
    }

    #pragma mark - Notification

    - (void)setupAppNotification
    {
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(didEnterBackground) name:UIApplicationDidEnterBackgroundNotification object:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(willEnterForeground) name:UIApplicationWillEnterForegroundNotification object:nil];
    }

    - (void)didEnterBackground
    {
        [self.assetPlayer pause];
    }

    - (void)willEnterForeground
    {
        [self.assetPlayer pause];
    }

    #pragma mark - GetImagesFromVideoPlayer

    - (BOOL)canProvideFrame
    {
        return self.assetPlayer.status == AVPlayerStatusReadyToPlay;
    }

    - (CVPixelBufferRef)getCurrentFramePicture
    {
        CMTime currentTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
        if (self.delegate && [self.delegate respondsToSelector:@selector(progressTimeChanged:)]) {
            [self.delegate progressTimeChanged:currentTime];
        }
        if (![self.videoOutput hasNewPixelBufferForItemTime:currentTime]) {
            return 0;
        }
        CVPixelBufferRef buffer = [self.videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:NULL];

        return buffer;
    }

    #pragma mark - CleanUp

    - (void)removeObserversFromPlayer
    {
        @try {
            [self.playerItem removeObserver:self forKeyPath:@"status"];
            [self.playerItem removeObserver:self forKeyPath:@"loadedTimeRanges"];
            [[NSNotificationCenter defaultCenter] removeObserver:self];
            [[NSNotificationCenter defaultCenter] removeObserver:self.assetPlayer];        
        }
        @catch (NSException *ex) {
            DLog(@"Cant remove observer in Player - %@", ex.description);
        }
    }

    - (void)cleanUp
    {
        [self removeObserversFromPlayer];

        self.assetPlayer.rate = 0;
        self.assetPlayer = nil;
        self.playerItem = nil;
        self.urlAsset = nil;
    }

    - (void)dealloc
    {
        [self cleanUp];    
    }

    @end

Of course, the resource (video file) should have its target membership set to your project.

[Screenshot: target membership checkbox for the video file]

Additionally, see the Apple Developer guide.

hbk
  • Shouldn't the line `layer.player = player` (in Swift's `PlayerView`) be `layer.player = newValue`? Since you want to set the new value not the current one, which is nil. – Ferran Maylinch Apr 12 '17 at 14:19
11

Here is a solution for Swift 5.2:

PlayerView.swift:

import AVFoundation
import UIKit

class PlayerView: UIView {

    var player: AVPlayer? {
        get {
            return playerLayer.player
        }
        set {
            playerLayer.player = newValue
        }
    }

    var playerLayer: AVPlayerLayer {
        return layer as! AVPlayerLayer
    }

    // Override UIView property
    override static var layerClass: AnyClass {
        return AVPlayerLayer.self
    }
}

VideoPlayer.swift

import AVFoundation
import Foundation

protocol VideoPlayerDelegate {
    func downloadedProgress(progress:Double)
    func readyToPlay()
    func didUpdateProgress(progress:Double)
    func didFinishPlayItem()
    func didFailPlayToEnd()
}

let videoContext: UnsafeMutableRawPointer? = nil

class VideoPlayer : NSObject {

    private var assetPlayer:AVPlayer?
    private var playerItem:AVPlayerItem?
    private var urlAsset:AVURLAsset?
    private var videoOutput:AVPlayerItemVideoOutput?

    private var assetDuration:Double = 0
    private var playerView:PlayerView?

    private var autoRepeatPlay:Bool = true
    private var autoPlay:Bool = true

    var delegate:VideoPlayerDelegate?

    var playerRate:Float = 1 {
        didSet {
            if let player = assetPlayer {
                player.rate = playerRate > 0 ? playerRate : 0.0
            }
        }
    }

    var volume:Float = 1.0 {
        didSet {
            if let player = assetPlayer {
                player.volume = volume > 0 ? volume : 0.0
            }
        }
    }

    // MARK: - Init

    convenience init(urlAsset:NSURL, view:PlayerView, startAutoPlay:Bool = true, repeatAfterEnd:Bool = true) {
        self.init()

        playerView = view
        autoPlay = startAutoPlay
        autoRepeatPlay = repeatAfterEnd

        if let playView = playerView, let playerLayer = playView.layer as? AVPlayerLayer {
            playerLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        }
        initialSetupWithURL(url: urlAsset)
        prepareToPlay()
    }

    override init() {
        super.init()
    }

    // MARK: - Public

    func isPlaying() -> Bool {
        if let player = assetPlayer {
            return player.rate > 0
        } else {
            return false
        }
    }

    func seekToPosition(seconds:Float64) {
        if let player = assetPlayer {
            pause()
            if let timeScale = player.currentItem?.asset.duration.timescale {
                player.seek(to: CMTimeMakeWithSeconds(seconds, preferredTimescale: timeScale), completionHandler: { (complete) in
                    self.play()
                })
            }
        }
    }

    func pause() {
        if let player = assetPlayer {
            player.pause()
        }
    }

    func play() {
        if let player = assetPlayer {
            if (player.currentItem?.status == .readyToPlay) {
                player.play()
                player.rate = playerRate
            }
        }
    }

    func cleanUp() {
        if let item = playerItem {
            item.removeObserver(self, forKeyPath: "status")
            item.removeObserver(self, forKeyPath: "loadedTimeRanges")
        }
        NotificationCenter.default.removeObserver(self)
        assetPlayer = nil
        playerItem = nil
        urlAsset = nil
    }

    // MARK: - Private

    private func prepareToPlay() {
        let keys = ["tracks"]
        if let asset = urlAsset {
            asset.loadValuesAsynchronously(forKeys: keys, completionHandler: {
                DispatchQueue.main.async {
                    self.startLoading()
                }
            })
        }
    }

    private func startLoading(){
        var error:NSError?
        guard let asset = urlAsset else {return}
        let status:AVKeyValueStatus = asset.statusOfValue(forKey: "tracks", error: &error)

        if status == AVKeyValueStatus.loaded {
            assetDuration = CMTimeGetSeconds(asset.duration)

            let videoOutputOptions = [kCVPixelBufferPixelFormatTypeKey as String : Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)]
            videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: videoOutputOptions)
            playerItem = AVPlayerItem(asset: asset)

            if let item = playerItem {
                item.addObserver(self, forKeyPath: "status", options: .initial, context: videoContext)
                item.addObserver(self, forKeyPath: "loadedTimeRanges", options: [.new, .old], context: videoContext)

                NotificationCenter.default.addObserver(self, selector: #selector(playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil)
                NotificationCenter.default.addObserver(self, selector: #selector(didFailedToPlayToEnd), name: NSNotification.Name.AVPlayerItemFailedToPlayToEndTime, object: nil)

                if let output = videoOutput {
                    item.add(output)

                    item.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithm.varispeed
                    assetPlayer = AVPlayer(playerItem: item)

                    if let player = assetPlayer {
                        player.rate = playerRate
                    }

                    addPeriodicalObserver()
                    if let playView = playerView, let layer = playView.layer as? AVPlayerLayer {
                        layer.player = assetPlayer
                        print("player created")
                    }
                }
            }
        }
    }

    private func addPeriodicalObserver() {
        let timeInterval = CMTimeMake(value: 1, timescale: 1)

        if let player = assetPlayer {
            player.addPeriodicTimeObserver(forInterval: timeInterval, queue: DispatchQueue.main, using: { (time) in
                self.playerDidChangeTime(time: time)
            })
        }
    }

    private func playerDidChangeTime(time:CMTime) {
        if let player = assetPlayer {
            let timeNow = CMTimeGetSeconds(player.currentTime())
            let progress = timeNow / assetDuration

            delegate?.didUpdateProgress(progress: progress)
        }
    }

    @objc private func playerItemDidReachEnd() {
        delegate?.didFinishPlayItem()

        if let player = assetPlayer {
            player.seek(to: CMTime.zero)
            if autoRepeatPlay == true {
                play()
            }
        }
    }

    @objc private func didFailedToPlayToEnd() {
        delegate?.didFailPlayToEnd()
    }

    private func playerDidChangeStatus(status:AVPlayer.Status) {
        if status == .failed {
            print("Failed to load video")
        } else if status == .readyToPlay, let player = assetPlayer {
            volume = player.volume
            delegate?.readyToPlay()

            if autoPlay == true && player.rate == 0.0 {
                play()
            }
        }
    }

    private func moviewPlayerLoadedTimeRangeDidUpdated(ranges:Array<NSValue>) {
        var maximum:TimeInterval = 0
        for value in ranges {
            let range:CMTimeRange = value.timeRangeValue
            let currentLoadedTimeRange = CMTimeGetSeconds(range.start) + CMTimeGetSeconds(range.duration)
            if currentLoadedTimeRange > maximum {
                maximum = currentLoadedTimeRange
            }
        }
        let progress:Double = assetDuration == 0 ? 0.0 : Double(maximum) / assetDuration

        delegate?.downloadedProgress(progress: progress)
    }

    deinit {
        cleanUp()
    }

    private func initialSetupWithURL(url:NSURL) {
        let options = [AVURLAssetPreferPreciseDurationAndTimingKey : true]
        urlAsset = AVURLAsset(url: url as URL, options: options)
    }

    // MARK: - Observations
    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {

        if context == videoContext {
            if let key = keyPath {
                if key == "status", let player = assetPlayer {
                    playerDidChangeStatus(status: player.status)
                } else if key == "loadedTimeRanges", let item = playerItem {
                    moviewPlayerLoadedTimeRangeDidUpdated(ranges: item.loadedTimeRanges)
                }
            }
        }
    }
}

Usage:

private var playerView: PlayerView = PlayerView()
private var videoPlayer:VideoPlayer?

and inside viewDidLoad():

view.addSubview(playerView)
preparePlayer()

// set Constraints (if you do it purely in code)
playerView.translatesAutoresizingMaskIntoConstraints = false
playerView.topAnchor.constraint(equalTo: view.topAnchor, constant: 10.0).isActive = true
playerView.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 10.0).isActive = true
playerView.trailingAnchor.constraint(equalTo: view.trailingAnchor, constant: -10.0).isActive = true
playerView.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -10.0).isActive = true  // negative constant insets the view 10 points from the bottom

private func preparePlayer() {
    if let filePath = Bundle.main.path(forResource: "my video", ofType: "mp4") {
        let fileURL = NSURL(fileURLWithPath: filePath)
        videoPlayer = VideoPlayer(urlAsset: fileURL, view: playerView)
        if let player = videoPlayer {
            player.playerRate = 0.67
        }
    }
}
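
If you also want the callbacks, here is a minimal sketch of adopting VideoPlayerDelegate (assuming the surrounding view controller is called ViewController, and that you set videoPlayer?.delegate = self after creating the player in preparePlayer()):

extension ViewController: VideoPlayerDelegate {

    func downloadedProgress(progress: Double) {
        print("buffered: \(progress)")          // placeholder
    }

    func readyToPlay() {
        print("ready to play")                  // placeholder
    }

    func didUpdateProgress(progress: Double) {
        print("playback progress: \(progress)") // placeholder
    }

    func didFinishPlayItem() {
        print("finished playing")               // placeholder
    }

    func didFailPlayToEnd() {
        print("failed to play to end")          // placeholder
    }
}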
iKK
10

Swift 3

if let filePath = Bundle.main.path(forResource: "small", ofType: ".mp4") {
    let filePathURL = NSURL.fileURL(withPath: filePath)

    let player = AVPlayer(url: filePathURL)
    let playerController = AVPlayerViewController()
    playerController.player = player
    self.present(playerController, animated: true) {
        player.play()
    }
}
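
Note that this snippet assumes it runs inside a view controller with AVKit and AVFoundation imported. A minimal sketch of the surrounding boilerplate (the class name VideoViewController is just an example):

import AVKit
import AVFoundation
import UIKit

class VideoViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Same idea as above: look up the bundled file and present a player for it.
        guard let filePath = Bundle.main.path(forResource: "small", ofType: "mp4") else { return }
        let player = AVPlayer(url: URL(fileURLWithPath: filePath))
        let playerController = AVPlayerViewController()
        playerController.player = player
        present(playerController, animated: true) {
            player.play()
        }
    }
}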
Aayushi
8

None of the above worked for me in Swift 5 for a local video player.

After reading the Apple documentation, I was able to create a simple example for playing a video from local resources.

Here is the code snippet:

import UIKit
import AVKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        //TODO: Make sure "SampleVideo.mp4" has been added and copied into the project before playing
    }

    @IBAction func playLocalVideo(_ sender: Any) {

        guard let path = Bundle.main.path(forResource: "SampleVideo", ofType: "mp4") else {
            return
        }
        let videoURL = NSURL(fileURLWithPath: path)

        // Create an AVPlayer, passing it the local video url path
        let player = AVPlayer(url: videoURL as URL)
        let controller = AVPlayerViewController()
        controller.player = player
        present(controller, animated: true) {
            player.play()
        }
    }

}

PS: Don't forget to add the video named "SampleVideo.mp4" to the project and its target.
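
If you would rather show the video inline instead of presenting it full screen, AVPlayerViewController can also be embedded as a child view controller. A minimal sketch of the body of playLocalVideo, where videoContainerView is an assumed UIView outlet for the video area:

guard let path = Bundle.main.path(forResource: "SampleVideo", ofType: "mp4") else { return }
let player = AVPlayer(url: URL(fileURLWithPath: path))

let controller = AVPlayerViewController()
controller.player = player

// Standard child view controller containment instead of present(_:animated:).
addChild(controller)
videoContainerView.addSubview(controller.view)
controller.view.frame = videoContainerView.bounds
controller.didMove(toParent: self)
player.play()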

swiftBoy
5

This code is gbk's answer converted to Swift 4.

1. In your main view controller:

 if let filePath = Bundle.main.path(forResource: "clip", ofType: "mp4") {
            let fileURL = NSURL(fileURLWithPath: filePath)
            videoPlayer = VideoPlayer(urlAsset: fileURL, view: playerView)
            if let player = videoPlayer {
                player.playerRate = 1.00
            }
        }
  2. You need the VideoPlayer class:

    import AVFoundation
    
    import Foundation
    
    protocol VideoPlayerDelegate {
        func downloadedProgress(progress:Double)
        func readyToPlay()
        func didUpdateProgress(progress:Double)
        func didFinishPlayItem()
        func didFailPlayToEnd()
    }
    
    let videoContext: UnsafeMutableRawPointer? = nil
    class VideoPlayer : NSObject {
        private var assetPlayer:AVPlayer?
        private var playerItem:AVPlayerItem?
        private var urlAsset:AVURLAsset?
        private var videoOutput:AVPlayerItemVideoOutput?
    
        private var assetDuration:Double = 0
        private var playerView:PlayerView?
    
        private var autoRepeatPlay:Bool = true
        private var autoPlay:Bool = true
    
        var delegate:VideoPlayerDelegate?
    
        var playerRate:Float = 1 {
            didSet {
                if let player = assetPlayer {
                    player.rate = playerRate > 0 ? playerRate : 0.0
                }
            }
        }
    
        var volume:Float = 1.0 {
            didSet {
                if let player = assetPlayer {
                    player.volume = volume > 0 ? volume : 0.0
                }
            }
        }
    
        // MARK: - Init
    
        convenience init(urlAsset:NSURL, view:PlayerView, startAutoPlay:Bool = true, repeatAfterEnd:Bool = true) {
            self.init()
    
            playerView = view
            autoPlay = startAutoPlay
            autoRepeatPlay = repeatAfterEnd
    
            if let playView = playerView{
                if let playerLayer = playView.layer as? AVPlayerLayer {
                    playerLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
                }
            }
    
            initialSetupWithURL(url: urlAsset)
            prepareToPlay()
        }
    
        override init() {
            super.init()
        }
    
        // MARK: - Public
    
        func isPlaying() -> Bool {
            if let player = assetPlayer {
                return player.rate > 0
            } else {
                return false
            }
        }
    
        func seekToPosition(seconds:Float64) {
            if let player = assetPlayer {
                pause()
                if let timeScale = player.currentItem?.asset.duration.timescale {
                    player.seek(to: CMTimeMakeWithSeconds(seconds, timeScale), completionHandler: { (complete) in
                        self.play()
                    })
                }
            }
        }
    
        func pause() {
            if let player = assetPlayer {
                player.pause()
            }
        }
    
        func play() {
            if let player = assetPlayer {
                if (player.currentItem?.status == .readyToPlay) {
                    player.play()
                    player.rate = playerRate
                }
            }
        }
    
        func cleanUp() {
            if let item = playerItem {
                item.removeObserver(self, forKeyPath: "status")
                item.removeObserver(self, forKeyPath: "loadedTimeRanges")
            }
            NotificationCenter.default.removeObserver(self)
            assetPlayer = nil
            playerItem = nil
            urlAsset = nil
        }
    
        // MARK: - Private
    
        private func prepareToPlay() {
            let keys = ["tracks"]
            if let asset = urlAsset {
                asset.loadValuesAsynchronously(forKeys: keys, completionHandler: {
                    DispatchQueue.global(qos: .userInitiated).async {
                        // Bounce back to the main thread to update the UI
                        DispatchQueue.main.async {
                            self.startLoading()
                        }
                    }
                })
    
            }
        }
    
        private func startLoading(){
            var error:NSError?
    
            guard let asset = urlAsset else {return}
    
    
            let status:AVKeyValueStatus = asset.statusOfValue(forKey: "tracks", error: &error)
    
    
            if status == AVKeyValueStatus.loaded {
                assetDuration = CMTimeGetSeconds(asset.duration)
    
                let videoOutputOptions = [kCVPixelBufferPixelFormatTypeKey as String : Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)]
                videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: videoOutputOptions)
                playerItem = AVPlayerItem(asset: asset)
    
                if let item = playerItem {
                    item.addObserver(self, forKeyPath: "status", options: .initial, context: videoContext)
                    item.addObserver(self, forKeyPath: "loadedTimeRanges", options: [.new, .old], context: videoContext)
    
                    NotificationCenter.default.addObserver(self, selector: #selector(playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil)
                    NotificationCenter.default.addObserver(self, selector: #selector(didFailedToPlayToEnd), name:NSNotification.Name.AVPlayerItemFailedToPlayToEndTime, object: nil)
    
    
                    if let output = videoOutput {
                        item.add(output)
    
                        item.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithm.varispeed
                        assetPlayer = AVPlayer(playerItem: item)
    
                        if let player = assetPlayer {
                            player.rate = playerRate
                        }
    
                        addPeriodicalObserver()
                        if let playView = playerView, let layer = playView.layer as? AVPlayerLayer {
                            layer.player = assetPlayer
                        }
                    }
                }
            }
        }
    
        private func addPeriodicalObserver() {
            let timeInterval = CMTimeMake(1, 1)

            if let player = assetPlayer {
                player.addPeriodicTimeObserver(forInterval: timeInterval, queue: DispatchQueue.main, using: { (time) in
                    self.playerDidChangeTime(time: time)
                })
            }
        }
    
        private func playerDidChangeTime(time:CMTime) {
            if let player = assetPlayer {
                let timeNow = CMTimeGetSeconds(player.currentTime())
                let progress = timeNow / assetDuration
    
                delegate?.didUpdateProgress(progress: progress)
            }
        }
    
        @objc private func playerItemDidReachEnd() {
            delegate?.didFinishPlayItem()
    
            if let player = assetPlayer {
                player.seek(to: kCMTimeZero)
                if autoRepeatPlay == true {
                    play()
                }
            }
        }
    
        @objc private func didFailedToPlayToEnd() {
            delegate?.didFailPlayToEnd()
        }
    
        private func playerDidChangeStatus(status:AVPlayerStatus) {
            if status == .failed {
                print("Failed to load video")
            } else if status == .readyToPlay, let player = assetPlayer {
                volume = player.volume
                delegate?.readyToPlay()
    
                if autoPlay == true && player.rate == 0.0 {
                    play()
                }
            }
        }
    
        private func moviewPlayerLoadedTimeRangeDidUpdated(ranges:Array<NSValue>) {
            var maximum:TimeInterval = 0
            for value in ranges {
                let range:CMTimeRange = value.timeRangeValue
                let currentLoadedTimeRange = CMTimeGetSeconds(range.start) + CMTimeGetSeconds(range.duration)
                if currentLoadedTimeRange > maximum {
                    maximum = currentLoadedTimeRange
                }
            }
            let progress:Double = assetDuration == 0 ? 0.0 : Double(maximum) / assetDuration
    
            delegate?.downloadedProgress(progress: progress)
        }
    
        deinit {
            cleanUp()
        }
    
        private func initialSetupWithURL(url:NSURL) {
            let options = [AVURLAssetPreferPreciseDurationAndTimingKey : true]
            urlAsset = AVURLAsset(url: url as URL, options: options)
        }
    
        // MARK: - Observations

        override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?){
            if context == videoContext {
                if let key = keyPath {
                    if key == "status", let player = assetPlayer {
                        playerDidChangeStatus(status: player.status)
                    } else if key == "loadedTimeRanges", let item = playerItem {
                        moviewPlayerLoadedTimeRangeDidUpdated(ranges: item.loadedTimeRanges)
                    }
                }
            }
        }
    
    
    }
    
  3. And a PlayerView class to create the video view:

    import AVFoundation
    import UIKit

    class PlayerView: UIView {

        override class var layerClass: AnyClass {
            return AVPlayerLayer.self
        }

        var player: AVPlayer? {
            set {
                if let layer = layer as? AVPlayerLayer {
                    layer.player = newValue
                }
            }
            get {
                if let layer = layer as? AVPlayerLayer {
                    return layer.player
                } else {
                    return nil
                }
            }
        }
    }
    
Dhaval Bhimani
iman kazemayni
  • You have 2 retain cycles in this code: the first is in prepareToPlay, the second in addPeriodicalObserver. You need to capture self weakly in those blocks. – cekisakurek Aug 31 '18 at 14:37
  • what is videoPlayer in mainControllerView and where you define playerView? – Chandan Jee Sep 06 '19 at 13:31
  • @ChandanJee as you can see, the video player is an instance of VideoPlayer. Instead of using a local variable in that function, I have used this variable in my Class. if you want to use it in that function please write like this: var videoPlayer:VideoPlayer ... playerView inherits from UIView. You can create a class with its name and then copy and paste the codes. after that behave with it like other views. add it in your nib or xib views – iman kazemayni Sep 07 '19 at 05:02
5

PlayerView for Swift 4.2

import AVFoundation
import UIKit

class PlayerView: UIView {

    var player: AVPlayer? {
        get {
            return playerLayer.player
        }
        set {
            playerLayer.player = newValue
        }
    }

    var playerLayer: AVPlayerLayer {
        return layer as! AVPlayerLayer
    }

    // Override UIView property
    override static var layerClass: AnyClass {
        return AVPlayerLayer.self
    }
}
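
A minimal usage sketch for this view from a view controller (with AVFoundation imported; the file name video.mp4 is just an assumption):

let playerView = PlayerView(frame: view.bounds)
view.addSubview(playerView)

if let url = Bundle.main.url(forResource: "video", withExtension: "mp4") {
    playerView.playerLayer.videoGravity = .resizeAspectFill
    playerView.player = AVPlayer(url: url)
    playerView.player?.play()
}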
Sangwon Park
3

gbk's solution in Swift 3

PlayerView

import AVFoundation
import UIKit

class PlayerView: UIView {

    override class var layerClass: AnyClass {
        return AVPlayerLayer.self
    }

    var player: AVPlayer? {
        set {
            if let layer = layer as? AVPlayerLayer {
                layer.player = newValue
            }
        }
        get {
            if let layer = layer as? AVPlayerLayer {
                return layer.player
            } else {
                return nil
            }
        }
    }
}

VideoPlayer

import AVFoundation
import Foundation

protocol VideoPlayerDelegate {
    func downloadedProgress(progress:Double)
    func readyToPlay()
    func didUpdateProgress(progress:Double)
    func didFinishPlayItem()
    func didFailPlayToEnd()
}

let videoContext:UnsafeMutableRawPointer? = nil

class VideoPlayer : NSObject {



private var assetPlayer:AVPlayer?
private var playerItem:AVPlayerItem?
private var urlAsset:AVURLAsset?
private var videoOutput:AVPlayerItemVideoOutput?

private var assetDuration:Double = 0
private var playerView:PlayerView?

private var autoRepeatPlay:Bool = true
private var autoPlay:Bool = true

var delegate:VideoPlayerDelegate?

var playerRate:Float = 1 {
    didSet {
        if let player = assetPlayer {
            player.rate = playerRate > 0 ? playerRate : 0.0
        }
    }
}

var volume:Float = 1.0 {
    didSet {
        if let player = assetPlayer {
            player.volume = volume > 0 ? volume : 0.0
        }
    }
}

// MARK: - Init

convenience init(urlAsset: URL, view:PlayerView, startAutoPlay:Bool = true, repeatAfterEnd:Bool = true) {
    self.init()

    playerView = view
    autoPlay = startAutoPlay
    autoRepeatPlay = repeatAfterEnd

    if let playView = playerView, let playerLayer = playView.layer as? AVPlayerLayer {
        playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    }
    initialSetupWithURL(url: urlAsset)
    prepareToPlay()
}

override init() {
    super.init()
}

// MARK: - Public

func isPlaying() -> Bool {
    if let player = assetPlayer {
        return player.rate > 0
    } else {
        return false
    }
}

func seekToPosition(seconds:Float64) {
    if let player = assetPlayer {
        pause()
        if let timeScale = player.currentItem?.asset.duration.timescale {
            player.seek(to: CMTimeMakeWithSeconds(seconds, timeScale), completionHandler: { (complete) in
                self.play()
            })
        }
    }
}

func pause() {
    if let player = assetPlayer {
        player.pause()
    }
}

func play() {
    if let player = assetPlayer {
        if (player.currentItem?.status == .readyToPlay) {
            player.play()
            player.rate = playerRate
        }
    }
}

func cleanUp() {
    if let item = playerItem {
        item.removeObserver(self, forKeyPath: "status")
        item.removeObserver(self, forKeyPath: "loadedTimeRanges")
    }
    NotificationCenter.default.removeObserver(self)
    assetPlayer = nil
    playerItem = nil
    urlAsset = nil
}

// MARK: - Private

private func prepareToPlay() {
    let keys = ["tracks"]
    if let asset = urlAsset {
        asset.loadValuesAsynchronously(forKeys: keys, completionHandler: {
            DispatchQueue.main.async {
                self.startLoading()
            }
        })
    }
}

private func startLoading(){
    var error:NSError?
    guard let asset = urlAsset else {return}
    let status:AVKeyValueStatus = asset.statusOfValue(forKey: "tracks", error: &error)

    if status == AVKeyValueStatus.loaded {
        assetDuration = CMTimeGetSeconds(asset.duration)

        let videoOutputOptions = [kCVPixelBufferPixelFormatTypeKey as String : Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)]
        videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: videoOutputOptions)
        playerItem = AVPlayerItem(asset: asset)

        if let item = playerItem {
            item.addObserver(self, forKeyPath: "status", options: .initial, context: videoContext)
            item.addObserver(self, forKeyPath: "loadedTimeRanges", options: [.new, .old], context: videoContext)

            NotificationCenter.default.addObserver(self, selector: #selector(playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil)
            NotificationCenter.default.addObserver(self, selector: #selector(didFailedToPlayToEnd), name: NSNotification.Name.AVPlayerItemFailedToPlayToEndTime, object: nil)

            if let output = videoOutput {
                item.add(output)

                item.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
                assetPlayer = AVPlayer(playerItem: item)

                if let player = assetPlayer {
                    player.rate = playerRate
                }

                addPeriodicalObserver()
                if let playView = playerView, let layer = playView.layer as? AVPlayerLayer {
                    layer.player = assetPlayer
                    print("player created")
                }
            }
        }
    }
}

private func addPeriodicalObserver() {
    let timeInterval = CMTimeMake(1, 1)

    if let player = assetPlayer {
        player.addPeriodicTimeObserver(forInterval: timeInterval, queue: DispatchQueue.main, using: { (time) in
            self.playerDidChangeTime(time: time)
        })
    }
}

private func playerDidChangeTime(time:CMTime) {
    if let player = assetPlayer {
        let timeNow = CMTimeGetSeconds(player.currentTime())
        let progress = timeNow / assetDuration

        delegate?.didUpdateProgress(progress: progress)
    }
}

@objc private func playerItemDidReachEnd() {
    delegate?.didFinishPlayItem()

    if let player = assetPlayer {
        player.seek(to: kCMTimeZero)
        if autoRepeatPlay == true {
            play()
        }
    }
}

@objc private func didFailedToPlayToEnd() {
    delegate?.didFailPlayToEnd()
}

private func playerDidChangeStatus(status:AVPlayerStatus) {
    if status == .failed {
        print("Failed to load video")
    } else if status == .readyToPlay, let player = assetPlayer {
        volume = player.volume
        delegate?.readyToPlay()

        if autoPlay == true && player.rate == 0.0 {
            play()
        }
    }
}

private func moviewPlayerLoadedTimeRangeDidUpdated(ranges:Array<NSValue>) {
    var maximum:TimeInterval = 0
    for value in ranges {
        let range:CMTimeRange = value.timeRangeValue
        let currentLoadedTimeRange = CMTimeGetSeconds(range.start) + CMTimeGetSeconds(range.duration)
        if currentLoadedTimeRange > maximum {
            maximum = currentLoadedTimeRange
        }
    }
    let progress:Double = assetDuration == 0 ? 0.0 : Double(maximum) / assetDuration

    delegate?.downloadedProgress(progress: progress)
}

deinit {
    cleanUp()
}

private func initialSetupWithURL(url: URL) {
    let options = [AVURLAssetPreferPreciseDurationAndTimingKey : true]
    urlAsset = AVURLAsset(url: url, options: options)
}

// MARK: - Observations

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if context == videoContext {
        if let key = keyPath {
            if key == "status", let player = assetPlayer {
                playerDidChangeStatus(status: player.status)
            } else if key == "loadedTimeRanges", let item = playerItem {
                moviewPlayerLoadedTimeRangeDidUpdated(ranges: item.loadedTimeRanges)
            }
        }
    }
}
}
Woody Jean-louis