5

How can I get the video size of a video from AVPlayer to set the geometry size of my node?

For example, I have an SCNPlane with a width and height

let planeGeo = SCNPlane(width: 5, height: 5)

So now I instantiate my video player

let videoURL = NSURL(string: someURL)
let player = AVPlayer(URL: videoURL!)

and my SKVideoNode

let spriteKitScene = SKScene(size: CGSize(width: 1920, height: 1080))
spriteKitScene.scaleMode = .AspectFit

videoSpriteKitNode = SKVideoNode(AVPlayer: player)
videoSpriteKitNode.anchorPoint = CGPointMake(0,0)
videoSpriteKitNode.size.width = spriteKitScene.size.width
videoSpriteKitNode.size.height = spriteKitScene.size.height

spriteKitScene.addChild(videoSpriteKitNode)

planeGeo!.firstMaterial?.diffuse.contents = spriteKitScene
videoSpriteKitNode.play()

So now I want to use the video size to resize my plane to the correct aspect ratio. I already fiddled around with AVPlayerLayer, but it always gives me 0:

let avLayer = AVPlayerLayer(player: player)
print(avLayer.videoRect.width) //0
print(avLayer.videoRect.height) //0

I also tried this, but it doesn't work either:

let avLayer = AVPlayerLayer(player: player)
let layer = avLayer.sublayers![0]
let transformedBounds = CGRectApplyAffineTransform(layer.bounds, CATransform3DGetAffineTransform(layer.sublayerTransform))
print(transformedBounds.width) //0
print(transformedBounds.height) //0
Andy Jazz
Nico S.
  • Noting that this question involves SpriteKit. Ten yrs later, you don't have to use SpriteKit for this! It's now very easy https://stackoverflow.com/a/74667281/294884 – Fattie Dec 03 '22 at 14:19

3 Answers

2

OK, I figured it out: KVO is the way to go. Add this in viewDidLoad:

player.currentItem?.addObserver(self, forKeyPath: "presentationSize", options: .New, context: nil)

in deinit:

player.currentItem?.removeObserver(self, forKeyPath: "presentationSize")

and then add:

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    
    if keyPath == "presentationSize" {
        if let item = object as? AVPlayerItem {
            let size = item.presentationSize
            let width = size.width
            let height = size.height

            //Set size of geometry here
        }
    }
}
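If you can target iOS 11+/Swift 4+, block-based KVO avoids the string key path and the manual removeObserver call entirely. A minimal sketch of the same idea (the class and property names here are illustrative, not from the answer above):

```swift
import AVFoundation
import SceneKit

final class VideoAspectObserver {
    private var observation: NSKeyValueObservation?

    // Watch the item's presentationSize and resize the plane once it is known.
    func observe(_ player: AVPlayer, plane: SCNPlane) {
        observation = player.currentItem?.observe(\.presentationSize,
                                                  options: [.new]) { [weak self] item, _ in
            let size = item.presentationSize
            guard size != .zero else { return } // fires with .zero before the stream is parsed
            plane.height = plane.width * size.height / size.width
            self?.observation = nil // one-shot: stop observing after the first real size
        }
    }
}
```

The observation is invalidated automatically when the `NSKeyValueObservation` is deallocated, so there is no matching removeObserver to forget.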

Typical code for 2022 syntax:

var py: AVPlayer?
private var pyContext = 0
...

guard let url = URL(string: "https:// .. /test.m4v") else { return }
py = AVPlayer(url: url)
someNode.geometry?.firstMaterial?.diffuse.contents = py

py?.currentItem?.addObserver(self,
   forKeyPath: "presentationSize",
   context: &pyContext)    
...

override func observeValue(forKeyPath keyPath: String?,
                           of object: Any?,
                           change: [NSKeyValueChangeKey : Any]?,
                           context: UnsafeMutableRawPointer?) {

    if context == &pyContext && keyPath == "presentationSize" {
        print("Found it ...")

        guard let item = object as? AVPlayerItem else { return }

        let ps = item.presentationSize
        let aspect: Float = Float(ps.width) / Float(ps.height)
        someNode.geometry?.firstMaterial?
           .diffuse.contentsTransform =
           SCNMatrix4MakeScale( .. , .. , 1)
    }
}

Addendum. Calculating the correct scaling is tricky

Unfortunately, any time you work with video whose exact streaming size isn't always the same, shaping the video is a huge pain. There are also many other considerations, such as letterboxing. In some simple cases the calculation looks like this:

// 1. You have finally received the info on the video from the HLS stream:

let ps = item.presentationSize
let vidAspect: Float = Float(ps.width) / Float(ps.height)

// 2. Over in your 3D code, you need to know the current facts on the mesh:

let planeW = ... width of your mesh
let planeH = ... height of your mesh
let planeAspect = planeW / planeH

It's possible you are using Apple's simple provided flat square mesh, such as

var simplePlane = SCNPlane()
simplePlane.width = 2.783
simplePlane.height = 1.8723

One way or another you need the w/h of your mesh. And then,

// 3. In many (not all) cases, the solution is:

let final = vidAspect / planeAspect
print("we'll try this: \(final)")
yourNode.geometry?
  .firstMaterial?.diffuse
  .contentsTransform = SCNMatrix4MakeScale(1.0, final, 1.0)
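To make the arithmetic concrete, a quick worked example with made-up numbers: a 16:9 video on a 4:3 plane needs a vertical texture scale of (16/9) / (4/3) = 4/3.

```swift
// Worked example of the aspect calculation above (illustrative numbers).
let vidAspect: Float = 1920.0 / 1080.0   // 16:9 video
let planeAspect: Float = 4.0 / 3.0       // 4:3 plane
let final = vidAspect / planeAspect      // 4/3 ≈ 1.333: stretch the texture vertically
```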
Fattie
Nico S.
2

As you noticed, the SKVideoNode size property is loaded asynchronously after the video is actually presented, which may take a second. If you need a synchronous solution, try this: instantiate an AVURLAsset; it gives you access to the video track (AVAssetTrack), which has a naturalSize property:

let videoURL = URL(string: someURL)!
let asset = AVURLAsset(url: videoURL)

let videoSize: CGSize
if let size = asset.tracks(withMediaType: .video).first?.naturalSize {
    videoSize = size
} else {
    // Handle error (shouldn't happen normally if the asset is
    // an actual video.)
    videoSize = .zero
}

let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
// Alternatively:
// let player = AVPlayer(url: videoURL)

videoSpriteKitNode = SKVideoNode(avPlayer: player)
// continue node setup...
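Note that for remote assets, reading `naturalSize` synchronously may block or return nothing until the property is loaded. On iOS 15+/macOS 12+ the track properties can be loaded with async/await instead; a sketch under that assumption (the function name is illustrative):

```swift
import AVFoundation

// Load the video's pixel size asynchronously; also works for remote assets.
func videoSize(for url: URL) async throws -> CGSize {
    let asset = AVURLAsset(url: url)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else {
        return .zero
    }
    // naturalSize ignores rotation metadata; apply preferredTransform
    // so portrait video reports portrait dimensions.
    let (size, transform) = try await track.load(.naturalSize, .preferredTransform)
    let transformed = size.applying(transform)
    return CGSize(width: abs(transformed.width), height: abs(transformed.height))
}
```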
kelin
  • But surely that only works with local files ??? @kelin – Fattie Nov 30 '22 at 12:34
  • @Fattie, works with local files, remote files - not tested. – kelin Nov 30 '22 at 18:27
  • Thanks. Will test when I can! You know, it's hard to see how it could work with remote files because, you don't get that info until a few frames stream in. But I'll give it a test! – Fattie Nov 30 '22 at 18:57
  • @Fattie, maybe you'll get some metadata first. Also, it's a synchronous solution, which means you don't need to wait here; for remote videos the async solution provided by Nico S. would fit better. – kelin Nov 30 '22 at 19:15
  • hi @kelin , right but "you'll get some metadata first" that can take quite a while (often a second or two) ... you could well be right that it is quicker than waiting for `presentationSize` .. will check it out cheers ! – Fattie Nov 30 '22 at 19:18
1

If you previously set videoNode.size equal to skScene.size when coding with SpriteKit, you can get the video node's size, expressed in points, using SceneKit. Read this post for details.

import SceneKit
import SpriteKit

let model = scene.rootNode.childNode(withName: "someNode", recursively: true)
    
let skScene = model?.geometry?.firstMaterial?.diffuse.contents as? SKScene
    
let skVideoNode = skScene?.children[0] as? SKVideoNode
    
guard let size = skVideoNode?.scene?.size,
      let position = skVideoNode?.position,
      let frame = skVideoNode?.frame
else { return }
    
print(String(format: "%.1f, %.1f", size.width, size.height))
print(position)
print(frame)
print(skVideoNode?.speed as Any)
print(skVideoNode?.description as Any)
Andy Jazz
  • While this is awesome information, (1) these days you generally don't have to use SpriteKit in any way to put video on a node, and (2) I have a concern that, when streaming in a video, regardless of 3D, SpriteKit, UIView or anything else, it's absolutely impossible to know the dimensions of the incoming video until enough of the HLS stream has arrived. The only way to do this is asynchronously (and indeed, the only mechanism for doing that with AVPlayer is using observeValueForKeyPath on presentationSize, as far as I know), so one would need that before calling this? ... IDK – Fattie Dec 03 '22 at 14:08
  • In general, I agree with your point of view, but this question is exclusively about SKVideoNode. – Andy Jazz Dec 03 '22 at 14:14
  • Ah, crap! Very good point, thank you. – Fattie Dec 03 '22 at 14:17