
I have a video layer I want to render onto an SCNPlane. It works on the Simulator, but not on the device.

Here's a visual:

[screenshots: the plane rendered green without the video layer; the blank plane on the device; the video playing when the layer is added directly to the view]

Here's the code:

    //DISPLAY PLANE
    SCNPlane * displayPlane = [SCNPlane planeWithWidth:displayWidth height:displayHeight];
    displayPlane.cornerRadius = cornerRadius;
    
    SCNNode * displayNode = [SCNNode nodeWithGeometry:displayPlane];
    [scene.rootNode addChildNode:displayNode];
    
    //apply material
    SCNMaterial * displayMaterial = [SCNMaterial material];
    displayMaterial.diffuse.contents = [[UIColor greenColor] colorWithAlphaComponent:1.0f];
    [displayNode.geometry setMaterials:@[displayMaterial]];
    
    //move to front + position for rim
    displayNode.position = SCNVector3Make(0, rimTop - 0.08, /*0.2*/ 1);
 
    //create video item
    NSBundle * bundle = [NSBundle mainBundle];
    NSString * path = [bundle pathForResource:@"tv_preview" ofType:@"mp4"];
    NSURL * url = [NSURL fileURLWithPath:path];
 
    AVAsset * asset = [AVAsset assetWithURL:url];
    AVPlayerItem * item = [AVPlayerItem playerItemWithAsset:asset];
    queue = [AVQueuePlayer playerWithPlayerItem:item];
    looper = [AVPlayerLooper playerLooperWithPlayer:queue templateItem:item];
    queue.muted = YES;
    
    layer = [AVPlayerLayer playerLayerWithPlayer:queue];
    layer.frame = CGRectMake(0, 0, w, h);
    layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    displayMaterial.diffuse.contents = layer;
    displayMaterial.doubleSided = YES;
    [queue play];
    
    //[self.view.layer addSublayer:layer];

I can confirm that the plane itself exists: it renders green (first image above) if the AVPlayerLayer isn't applied to it. And if the video layer is added directly to the parent view's layer (the commented-out last line above), the video plays fine (final image above). I thought it might be a file-system issue, but then I imagine (?) the video wouldn't play in the final image either.
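To separate a loading problem from a rendering problem, one sketch (using `AVAsset`'s asynchronous key loading, and assuming the same `tv_preview.mp4` resource) would be to confirm on the device that the asset's tracks actually load before wiring up the player:

```objectivec
// Sketch: verify the bundled asset loads its tracks on the device,
// to rule out a file-system / asset-loading issue.
NSString *path = [[NSBundle mainBundle] pathForResource:@"tv_preview"
                                                 ofType:@"mp4"];
NSLog(@"bundle path: %@", path); // nil here would point to a bundling problem

AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:path]];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks"
                                                   error:&error];
    if (status == AVKeyValueStatusLoaded) {
        NSLog(@"tracks loaded: %@", asset.tracks);
    } else {
        NSLog(@"tracks failed to load: %@", error);
    }
}];
```

If the tracks load and the path is non-nil, the file system can be ruled out and the problem is on the rendering side.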

EDIT: setting queue (the AVQueuePlayer) directly as the material contents works on the Simulator, albeit ugly as hell, but crashes on the device with the following error log:

Error: Could not get pixel buffer (CVPixelBufferRef)
validateFunctionArguments:3797: failed assertion `Fragment Function(commonprofile_frag): incorrect type of texture (MTLTextureType2D) bound at texture binding at index 4 (expect MTLTextureTypeCube) for u_radianceTexture[0].'
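For context, `u_radianceTexture` is a cube texture SceneKit binds for environment/image-based lighting, so one hedged guess at a workaround is to use a constant lighting model on the material, which should keep the shader from sampling that texture at all:

```objectivec
// Guess at a workaround (unverified): constant lighting skips the
// radiance (cube) texture sampling that the Metal assertion names.
displayMaterial.lightingModelName = SCNLightingModelConstant;
```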
asked by Johnny Rockex (edited by Andy Jazz)
  • Is there anything in the logs? Does it work with other movie files? – mnuages Nov 10 '21 at 15:00
  • @mnuages unfortunately nothing in the log files. Tried with another mp4, with the same result. – Johnny Rockex Nov 10 '21 at 15:33
  • Have you tried setting the `AVQueuePlayer` directly as the material contents? (with no `AVPlayerLayer` involved). – mnuages Nov 10 '21 at 16:03
  • It works on the Simulator; setting it on the device results in a crash (error log added above). – Johnny Rockex Nov 11 '21 at 12:44
  • I suspect it may have to do with different loading characteristics on the device. Try out the code in the answer to this: https://stackoverflow.com/questions/47816292/avasset-tracks-is-empty/47816755#47816755 – Gerd K Nov 15 '21 at 15:04
  • Looking at the logs... I wonder if applying the material is what's causing the crash. – dispatchswift Nov 21 '21 at 07:29

1 Answer


I think my solution will suit you; only small changes will be required. Import the following frameworks:

#import <SceneKit/SceneKit.h>
#import <AVKit/AVKit.h>
#import <SpriteKit/SpriteKit.h>

@interface ViewController: UIViewController

@end

It works on the Simulator (in Xcode 13.2.1) and on a device (I ran it on an iPhone X with iOS 15.3). I used a 640x360 H.265 video. The only remaining problem is that the video loop stutters...

#import "ViewController.h"

@implementation ViewController

AVQueuePlayer *queue;
AVPlayerLooper *looper;
SKVideoNode *videoNode;

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    
    SCNView *sceneView = (SCNView *)self.view;
    sceneView.backgroundColor = [UIColor blackColor];
    sceneView.autoenablesDefaultLighting = YES;
    sceneView.allowsCameraControl = YES;
    sceneView.playing = YES;
    sceneView.loops = YES;
    
    SCNScene *scene = [SCNScene scene];
    sceneView.scene = scene;

    // Display
    SCNPlane *displayPlane = [SCNPlane planeWithWidth:1.6 height:0.9];
    SCNNode *displayNode = [SCNNode nodeWithGeometry:displayPlane];

    SCNMaterial *displayMaterial = [SCNMaterial material];
    displayMaterial.lightingModelName = SCNLightingModelConstant;
    displayMaterial.doubleSided = YES;
    
    // Video
    NSBundle *bundle = [NSBundle mainBundle];
    NSString *path = [bundle pathForResource:@"art.scnassets/hevc" 
                                                    ofType:@"mp4"];
    NSURL *url = [NSURL fileURLWithPath:path];

    AVAsset *asset = [AVAsset assetWithURL:url];
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    queue = [AVQueuePlayer playerWithPlayerItem:item];
    queue.muted = NO;
    
    looper = [AVPlayerLooper playerLooperWithPlayer:queue templateItem:item];
    
    // SpriteKit video node
videoNode = [SKVideoNode videoNodeWithAVPlayer:queue];
// rotating 180° and mirroring horizontally gives a net vertical flip,
// so the video isn't upside-down on the SceneKit plane
videoNode.zRotation = -M_PI;
videoNode.xScale = -1;
    
// SpriteKit scenes are sized in points, so scale the plane's
// meter-based dimensions up to get a usable texture resolution
SKScene *skScene = [SKScene sceneWithSize: CGSizeMake(
                                            displayPlane.width * 1000,
                                            displayPlane.height * 1000)];
    
    videoNode.position = CGPointMake(skScene.size.width / 2,
                                     skScene.size.height / 2);

    videoNode.size = skScene.size;
    [videoNode play];
    [skScene addChild:videoNode];
    
    displayMaterial.diffuse.contents = skScene;
    [displayNode.geometry setMaterials:@[displayMaterial]];
    
    [scene.rootNode addChildNode:displayNode];
}

@end
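One assumption worth making explicit: the `(SCNView *)self.view` cast above only works if the controller's view class is already set to SCNView (e.g. in the storyboard). If it isn't, a minimal sketch of the programmatic alternative is to create the SCNView in `loadView`:

```objectivec
// Sketch: supply an SCNView programmatically instead of relying on
// the storyboard's view class being set to SCNView.
- (void)loadView
{
    self.view = [[SCNView alloc] initWithFrame:UIScreen.mainScreen.bounds];
}
```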
answered by Andy Jazz