
I create an SKScene containing an SKVideoNode, then apply it as a material to a sphere geometry.

Here is the key code:

// Create a SKScene to play video
NSString* filePath = [[NSBundle mainBundle] pathForResource:@"2222" ofType:@"mp4"];
NSURL* sourceMovieURL = [NSURL fileURLWithPath:filePath];
AVPlayer* player = [AVPlayer playerWithURL:sourceMovieURL];
SKVideoNode* videoNode = [SKVideoNode videoNodeWithAVPlayer:player];

//CGSize size = CGSizeMake(512, 512);
CGSize size = [UIScreen mainScreen].bounds.size;
videoNode.size = size;
videoNode.position = CGPointMake(size.width/2.0, size.height/2.0);
SKScene* spriteScene = [SKScene sceneWithSize:size];
[spriteScene addChild:videoNode];


// create a material with SKScene
SCNMaterial* material = [SCNMaterial material];
material.doubleSided = YES;
material.diffuse.contents = spriteScene;


[sphereNode.geometry replaceMaterialAtIndex:0 withMaterial:material];

[videoNode play];

[_scnScene.rootNode addChildNode:sphereNode];

// create SCNRenderer to render the scene
_renderer = [SCNRenderer rendererWithContext:cardboardView.context options:nil];
_renderer.scene = _scnScene;
_renderer.pointOfView = _scnCameraNode;

In the drawEye function:

- (void)cardboardView:(GVRCardboardView *)cardboardView drawEye:(GVREye)eye withHeadTransform:(GVRHeadTransform *)headTransform
{
//CGRect viewport = [headTransform viewportForEye:eye];

// Get the head matrix.
const GLKMatrix4 head_from_start_matrix = [headTransform headPoseInStartSpace];

// Get this eye's matrices.
GLKMatrix4 projection_matrix = [headTransform projectionMatrixForEye:eye near:_scnCamera.zNear far:_scnCamera.zFar];


GLKMatrix4 eye_from_head_matrix = [headTransform eyeFromHeadMatrix:eye];

// Compute the model view projection matrix.
GLKMatrix4 view_projection_matrix = GLKMatrix4Multiply(projection_matrix,
    GLKMatrix4Multiply(eye_from_head_matrix, head_from_start_matrix));
// Set the projection matrix to camera
[_scnCamera setProjectionTransform:SCNMatrix4FromGLKMatrix4(view_projection_matrix)];


// Render the scene
[_renderer renderAtTime:0];

}

When I run the code, it crashes at

[_renderer renderAtTime:0]

and the output is:

Failed to create IOSurface image (texture)
Assertion failed: (result), function create_texture_from_IOSurface, file /BuildRoot/Library/Caches/com.apple.xbs/Sources/Jet/Jet-2.6.1/Jet/jet_context_OpenGL.mm, line 570.

When I remove the SKVideoNode from the SKScene, everything is OK.

Any help? Thanks.

guiyuan
  • We have the same issue, except it doesn't occur on all of our devices... or possibly versions of iOS. What device(s) and version(s) of iOS, xcode, and mac are you running? – caseyh Jul 08 '16 at 19:31
  • Device is iPhone 6s, iOS 9.3.2; Mac mini (Late 2014), OS X El Capitan 10.11.5, with Xcode 7.3.1 – guiyuan Jul 11 '16 at 07:56
  • So, our working hypothesis is that on the 5s and above with ios 9+ there is a change or bug introduced by the OS. We think it's possible it could be related to iOS's preference on those devices to use metal over OpenGL, but that's more speculation right now. We'll post more if we find more. – caseyh Jul 11 '16 at 16:32
  • OK, thanks! But the GVR SDK uses OpenGL over Metal; I hope Google will fix this someday! If you find any solutions please let me know, I'll appreciate it. – guiyuan Jul 12 '16 at 03:05
  • We are seeing this on the iOS 10 beta with a sphere and SKVideoNode; it was working fine on iOS 9. In a sample app, setting PrefersOpenGL to NO works, but YES gives the same problem on the iOS 10 beta while still working on iOS 9. – ort11 Aug 31 '16 at 14:59
  • Same thing happening here: updated device to iOS 10 and it's now failing. Also using SPHERE and SKVIDEONODE :) – Guig Sep 14 '16 at 17:58
  • I've created a simple reproduction with a cube in a SCNScene that uses a video texture. It works with Metal and fails with opengl: https://github.com/gsabran/ios10OpenGLVideoRenderingBug – Guig Sep 14 '16 at 21:39

2 Answers


Update: This works OK for low-quality videos, but high-quality ones don't perform well because so many bytes are copied each frame. I haven't tried it yet, but my next approach is to use an OpenGL texture backed by a CVPixelBuffer for better performance.
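A sketch of that untried approach, using a `CVOpenGLESTextureCache` so the GL texture shares the pixel buffer's memory instead of copying it into a `CGImage` each frame. The function names here are my own, and I haven't verified this inside the GVR pipeline:

```swift
import CoreVideo
import OpenGLES

// Create the cache once, with the same EAGLContext the renderer draws into.
func makeTextureCache(context: EAGLContext) -> CVOpenGLESTextureCache? {
    var cache: CVOpenGLESTextureCache? = nil
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, context, nil, &cache)
    return cache
}

// Per frame: wrap the pixel buffer as a GL texture - no CPU-side byte copy.
func textureForPixelBuffer(cache: CVOpenGLESTextureCache, pixelBuffer: CVPixelBuffer) -> CVOpenGLESTexture? {
    var texture: CVOpenGLESTexture? = nil
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
        cache,
        pixelBuffer,
        nil,
        GLenum(GL_TEXTURE_2D),
        GL_RGBA,
        GLsizei(CVPixelBufferGetWidth(pixelBuffer)),
        GLsizei(CVPixelBufferGetHeight(pixelBuffer)),
        GLenum(GL_BGRA),
        GLenum(GL_UNSIGNED_BYTE),
        0,
        &texture)
    return texture
}
```

After drawing each frame you'd also want to call `CVOpenGLESTextureCacheFlush` so recycled textures get released.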

-

I don't know if you're still looking for a solution but I've had some luck getting a video sphere working with SceneKit / GVR on iOS 9.3 on a 6s and iOS 8 sim. I can't vouch for all platforms though!

I've completely dropped SpriteKit and instead use a CALayer whose contents I set to a CGImage created from a CVPixelBuffer.

The video code: (initially based on this OpenGL 360 video tutorial)

init(url: NSURL)
{
    self.videoURL = url
    super.init()
    self.configureVideoPlayback()
}

private override init()
{
    self.videoURL = nil
    super.init()
}

deinit
{
    self.playerItem.removeOutput(self.videoOutput)
}

private func configureVideoPlayback()
{
    let pixelBufferAttributes = [
        kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32ARGB),
        kCVPixelBufferCGImageCompatibilityKey as String: true,
        kCVPixelBufferOpenGLESCompatibilityKey as String: true
    ]
    self.videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: pixelBufferAttributes)
    
    self.playerItem = AVPlayerItem(URL: self.videoURL)
    self.playerItem.addOutput(self.videoOutput)
    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(VideoReader.playerItemDidPlayToEndTime(_:)), name: AVPlayerItemDidPlayToEndTimeNotification, object: self.playerItem)

    self.player = AVPlayer(playerItem: self.playerItem)
    self.player.play()
}   

func currentPixelBuffer() -> CVPixelBuffer? {
    guard self.playerItem?.status == .ReadyToPlay else {
        return nil
    }

    let currentTime = self.playerItem.currentTime()
    return self.videoOutput.copyPixelBufferForItemTime(currentTime, itemTimeForDisplay: nil)
}

func playerItemDidPlayToEndTime(notification: NSNotification)
{
    self.player.seekToTime(kCMTimeZero)
    self.player.play()
}

Scene:

func setupScene() {
  self.scene = SCNScene()

  self.imageLayer = CALayer()
  self.imageLayer.frame = CGRectMake(0, 0, 2048, 2048) // Doesn't work if the size isn't a power of 2 (or certain increments in between) - need to investigate

  let material = SCNMaterial()
  material.doubleSided = true
  material.diffuse.contents = self.imageLayer

  let geometry = SCNSphere(radius: 10)

  let sphere = SCNNode(geometry: geometry)
  sphere.geometry?.replaceMaterialAtIndex(0, withMaterial: material)
  sphere.position = SCNVector3(0, 0, 0)
  sphere.scale.y = 1
  sphere.scale.z = -1

  self.scene!.rootNode.addChildNode(sphere)
}

Cardboard Draw Frame Prep:

func cardboardView(cardboardView: GVRCardboardView!, prepareDrawFrame headTransform: GVRHeadTransform!) {
  
  // .. boilerplate code

  if let pixelBuffer = self.videoReader.currentPixelBuffer() {
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let pixels = CVPixelBufferGetBaseAddress(pixelBuffer);

    let pixelWrapper = CGDataProviderCreateWithData(nil, pixels, CVPixelBufferGetDataSize(pixelBuffer), nil);

    // Get a color-space ref... can't this be done only once?
    let colorSpaceRef = CGColorSpaceCreateDeviceRGB();

    // Get a CGImage from the data (the CGImage is used in the drawLayer: delegate method above)

    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.NoneSkipFirst.rawValue)
    if let currentCGImage = CGImageCreate(width,
      height,
      8,
      32,
      4 * width,
      colorSpaceRef, [.ByteOrder32Big, bitmapInfo],
      pixelWrapper,
      nil,
      false,
      .RenderingIntentDefault) {
      self.imageLayer.contents = currentCGImage
    }

    // Clean up
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

  }
}

My initial implementation of GVR and SceneKit is based on this boilerplate

The frame rate still seems to be high enough for me - it's not optimised yet but I'll update this answer when I do.

Note:

  • The CALayer bounds seem to need to be a power of 2 (or certain increments in between) - no video shows when I use in-between values like 1960.
  • It runs really slow in Simulator but at 60fps on device for me
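Given the power-of-two note above, a small helper (my own sketch, not part of the original answer) can round a video dimension up to the next power of two before sizing the layer:

```swift
// Round a dimension up to the next power of two; the CALayer reportedly
// shows no video for in-between sizes like 1960.
func nextPowerOfTwo(_ n: Int) -> Int {
    var p = 1
    while p < n { p <<= 1 }
    return p
}

// e.g. for a 1920x1080 video:
// imageLayer.frame = CGRectMake(0, 0, CGFloat(nextPowerOfTwo(1920)), CGFloat(nextPowerOfTwo(1920)))
```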
Allan Weir
  • Thanks for your answer! It worked! But there is a new problem: it breaks after several seconds at // Render the scene [_renderer renderAtTime:0];. Then I tried the VRBoilerplate mentioned above; it also breaks at the same point: if glGetError() == GLenum(GL_NO_ERROR) { eyeRenderer.renderAtTime(0) } – guiyuan Jul 18 '16 at 10:04
  • That's not good. Have you tried it with a different video? Which phone are you using? I didn't put the full video code up as I didn't think it was relevant but I'll update it now so it uses 'loadValuesAsynchronouslyForKeys' on the AVURLAsset - the video reader part is based on this tutorial https://www.nomtek.com/video-360-in-opengl-ios-part-4-video-reader/ – Allan Weir Jul 19 '16 at 09:55
  • Actually it might be easier to just use an AVPlayer rather than AVURLAsset. Assuming the problem is coming from video playback if the GL code runs smoothly to begin with. I've simplified the code in my answer, hopefully it helps. If not try adding error notification listeners to the AVPlayerItem – Allan Weir Jul 19 '16 at 10:20
  • OK, thank you! I'll try it. – guiyuan Jul 20 '16 at 02:36
  • That seems to work ok. Do you have a sense on the additional pressure it's putting on the CPU/GPU? Also, I've the impression that `cardboardView` is called at every rendering pass while it should be enough to not call it if the video has not moved yet to a new frame. Have you tried with large videos, like 4k etc? – Guig Sep 15 '16 at 01:08
  • on a 4k video, I get 20fps :(( – Guig Sep 15 '16 at 01:17
  • Hey, it's definitely not optimised and I've moved away from using this. An alternative which I've not looked into yet is taking the CVPixelBuffer and using it as an OpenGL texture in SceneKit - rather than copy the bytes to a CGImage I think it will read them directly from the buffer. – Allan Weir Sep 15 '16 at 10:20

I think the reason is that the SDK sets the SCNView's rendering API to OpenGL ES. Setting the API to the default (Metal) will fix this problem.

Check whether you have code like:

[[SCNView alloc] initWithFrame:CGRectZero options:@{SCNPreferredRenderingAPIKey: [NSNumber numberWithInt:SCNRenderingAPIOpenGLES2]}]

Change it to:

[[SCNView alloc] initWithFrame:CGRectZero options:nil] or [[SCNView alloc] initWithFrame:CGRectZero]
lusnaow