
I put some objects in AR space using ARKit and SceneKit. That works well. Now I'd like to add an additional camera (SCNCamera) that is placed elsewhere in the scene, attached to and positioned by a common SCNNode. It is oriented to show me the current scene from another (fixed) perspective.

Now I'd like to show this additional SCNCamera feed on, for example, an SCNPlane (as the first material's diffuse contents), like a TV screen. Of course, I am aware that it will only display the SceneKit content that is in that camera's view and not the rest of the ARKit image (which is only possible with the main camera, of course). A simple colored background would then be fine.

I have seen tutorials that describe how to play a video file on a virtual display in AR space, but I need a real-time camera feed from my own current scene.

I defined these objects:

let camera = SCNCamera()
let cameraNode = SCNNode()

Then in viewDidLoad I do this:

camera.usesOrthographicProjection = true
camera.orthographicScale = 9
camera.zNear = 0
camera.zFar = 100
cameraNode.camera = camera
sceneView.scene.rootNode.addChildNode(cameraNode)

Then I call my setup function to place the virtual display next to all my AR content and to position the cameraNode as well (pointing in the direction where the objects are in the scene):

cameraNode.position = SCNVector3(initialStartPosition.x, initialStartPosition.y + 0.5, initialStartPosition.z)

let cameraPlane = SCNNode(geometry: SCNPlane(width: 0.5, height: 0.3))
cameraPlane.geometry?.firstMaterial?.diffuse.contents = cameraNode.camera
cameraPlane.position = SCNVector3(initialStartPosition.x - 1.0, initialStartPosition.y + 0.5, initialStartPosition.z)

sceneView.scene.rootNode.addChildNode(cameraPlane)

Everything compiles and loads... The display shows up at the given position, but it stays entirely gray. Nothing from the SCNCamera I put in the scene is displayed at all. Everything else in the AR scene works fine; I just don't get any feed from that camera.

[Screenshot: gray plane instead of camera feed]

Does anyone have an approach to get this scenario working?

To visualize it better, I've added some more screenshots.

The following shows the image through the SCNCamera according to ARGeo's input. But it takes up the whole screen, instead of displaying its contents on an SCNPlane as I need.

[Screenshot: view from the SCNCamera]

The next screenshot shows the actual AR view result I get with my posted code. As you can see, the gray display plane remains gray; it shows nothing.

[Screenshot: current AR view]

The last screenshot is a photomontage showing the expected result I'd like to get.

[Photomontage: expected/desired AR view]

How could this be realized? Am I missing something fundamental here?

ZAY

3 Answers


After some research and sleep, I came to the following working solution (including some inexplicable obstacles):

Currently, the additional SCNCamera feed is not linked to an SCNMaterial on an SCNPlane, as was the initial idea; instead I use an additional SCNView (for the moment).

In the definitions I add another view like so:

let overlayView   = SCNView() // (also tested with ARSCNView(), no difference)
let camera        = SCNCamera()
let cameraNode    = SCNNode()

Then, in viewDidLoad, I set things up like so:

camera.automaticallyAdjustsZRange = true
camera.usesOrthographicProjection = false
cameraNode.camera                 = camera
cameraNode.camera?.focalLength    = 50
sceneView.scene.rootNode.addChildNode(cameraNode) // add the node to the default scene

overlayView.scene                    = scene // the same scene as sceneView
overlayView.allowsCameraControl      = false
overlayView.isUserInteractionEnabled = false
overlayView.pointOfView              = cameraNode // this links the new SCNView to the created SCNCamera
self.view.addSubview(overlayView)    // don't forget to add as subview

// Size and place the view on the bottom
overlayView.frame  = CGRect(x: 0, y: 0, width: self.view.bounds.width * 0.8, height: self.view.bounds.height * 0.25)
overlayView.center = CGPoint(x: self.view.bounds.width * 0.5, y: self.view.bounds.height - 175)

Then, in some other function, I place the node containing the SCNCamera at my desired position and angle.

// (exemplary)
cameraNode.position = initialStartPosition + SCNVector3(x: -0.5, y: 0.5, z: -(Float(shiftCurrentDistance * 2.0 - 2.0)))          
cameraNode.eulerAngles = SCNVector3(-15.0.degreesToRadians, -15.0.degreesToRadians, 0.0)
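
Note that degreesToRadians is not a SceneKit API; the snippet above presumably relies on a small helper extension along these lines (my assumption, not shown in the original code):

// Assumed helper: converts an angle in degrees to radians,
// so that e.g. -15.0.degreesToRadians can be passed to SCNVector3.
extension FloatingPoint {
    var degreesToRadians: Self { return self * .pi / 180 }
}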

The result is a kind of window (the new SCNView) at the bottom of the screen, displaying the same SceneKit content as the main sceneView, viewed through the perspective of the SCNCamera at its node's position, and it looks very nice.

[Screenshot: main AR view plus additional view from the other perspective]

In a common iOS/Swift/ARKit project, this construct produces some side effects that one may run into.

1) Mainly, the new SCNView shows the SceneKit content from the desired perspective, but the background is always the actual physical camera feed. I could not figure out how to make the background a static color while still displaying all the SceneKit content. Changing the new scene's background property also affects the whole main scene, which is NOT desired.

2) It might sound confusing, but as soon as the following code gets included (which is essential to make it work):

overlayView.scene = scene

the animation speed of both scenes DOUBLES! (Why? Presumably because both views' render loops now advance the same scene's actions and physics.)

I got this corrected by adding/changing the following property, which restores the animation speed almost to its normal (default) behaviour:

// add or change this in the scene setup
scene.physicsWorld.speed = 0.5

3) If there are actions like SCNAction.playAudio in the project, none of those effects will play any more unless I do this:

overlayView.scene = nil

Of course, the additional SCNView then stops working, but everything else goes back to normal.
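
If the extra view is only needed part of the time, one possible compromise (a minimal, untested sketch based on the workaround above) is to detach the overlay's scene while such audio actions play and reattach it afterwards:

// Hypothetical helper: detach the overlay so SCNAction.playAudio works again,
// reattach it when the second perspective is needed.
func setOverlayEnabled(_ enabled: Bool) {
    overlayView.scene = enabled ? sceneView.scene : nil
}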

ZAY
  • You might already have it in your code somewhere, but I had to add `overlayView.isPlaying = true` to make `overlayView` update with the main `sceneView`. – Remy Cilia Jun 24 '20 at 20:58

Use this code (as a starting point) to find out how to set up a virtual camera.

Just create a default ARKit project in Xcode and copy-paste my code:

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene

        let cameraNode = SCNNode()
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3(0, 0, 1)
        cameraNode.camera?.focalLength = 70
        cameraNode.camera?.categoryBitMask = 1
        scene.rootNode.addChildNode(cameraNode)

        sceneView.pointOfView = cameraNode
        sceneView.allowsCameraControl = true
        sceneView.backgroundColor = UIColor.darkGray

        let plane = SCNNode(geometry: SCNPlane(width: 0.8, height: 0.45))
        plane.position = SCNVector3(0, 0, -1.5)

        // ASSIGN A VIDEO STREAM FROM SCENEKIT-RECORDER TO YOUR MATERIAL
        plane.geometry?.materials.first?.diffuse.contents = capturedVideoFromSceneKitRecorder
        scene.rootNode.addChildNode(plane)
    }
    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}

UPDATED:

Here's a SceneKit Recorder App that you can tailor to your needs (you don't need to write a video to disk; just use a CVPixelBuffer stream and assign it as a texture for a diffuse material).
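
For illustration, here is a minimal (untested) sketch of that idea without a full recorder pipeline, placed inside the ARSCNViewDelegate view controller above: render the scene offscreen from the second camera with an SCNRenderer on every frame and push the resulting image into the plane's diffuse material. The names displayPlane and secondCameraNode are placeholders for your own nodes, and snapshotting every frame is not cheap, so you may want to throttle it.

// Untested sketch: offscreen rendering of the same scene from a second camera.
let offscreenRenderer = SCNRenderer(device: nil, options: nil)

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    offscreenRenderer.scene = sceneView.scene
    offscreenRenderer.pointOfView = secondCameraNode   // the extra SCNCamera's node

    // Render the scene from the second camera into an image ...
    let image = offscreenRenderer.snapshot(atTime: time,
                                           with: CGSize(width: 512, height: 288),
                                           antialiasingMode: .multisampling4X)

    // ... and use it as the texture of the "TV screen" plane.
    displayPlane.geometry?.firstMaterial?.diffuse.contents = image
}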


Hope this helps.

Andy Jazz
  • This switches the current AR view to the SCNView of my placed camera. Indeed, that works well. But I am looking for a solution where I can fetch that camera feed and embed it, like a sub-scene placed inside a kind of window or SCNPlane or whatever, to display that second camera feed in addition to my current AR scene. This would probably mean that the scene has to be rendered twice. Is such a scenario even possible? – ZAY May 09 '19 at 09:19
  • On the image I added above, you can see this gray SCNPlane that hovers in the air. Then, where I drew the CAM arrow and the cam on the screenshot, an SCNNode containing an SCNCamera is set in place (like in the code). I'd like to have a permanent camera feed on that gray "display", coming from this additional camera. On the gray display I should now see the blue half-tube from the lower position's perspective, but it stays gray instead of showing me the video feed through this indicated camera at its position. I hope I described my requirement in an understandable manner. – ZAY May 09 '19 at 12:09
  • I don't have such a visual example - or I completely misunderstand you… I could create more screenshots, if this helps... – ZAY May 09 '19 at 12:27
  • If you have any approach for getting this working, it would be very welcome. – ZAY May 09 '19 at 13:45
  • I actually fetched your meanwhile-deleted comment with the guard let on the AVCaptureDevice and quickly tested it in the project. It produces the exact opposite of what I would like to have. The gray display was showing the physical camera feed, while the entire rest of the scene was displayed through the SCNCamera. :) Is it possible to have the SCNCamera as input to the AVCaptureDevice, or is this only for physical cameras? – ZAY May 09 '19 at 15:14
  • Sorry, I thought it would be a solution to my issue… – ZAY May 09 '19 at 15:21
  • Yeah, I've had a look at it - and this is unfortunately more something for someone with SO reputation like yours. In other words, I don't understand anything of it. Could you give me a helping hand, so that I don't drown myself in that code… I have no idea how I could link my SCNCamera construct to that screen recorder, and then how to fetch that pixel buffer and link it to the material. (Is it really that complicated to get the stream from an SCNCamera?) Yesterday I tried to add another SCNView with an SCNScene to link it as point of view to the SCNCamera, but with no success. – ZAY May 10 '19 at 07:15
  • By the way: I'm also missing some module called BrightFutures - never heard of it. – ZAY May 10 '19 at 07:17

I'm a little late to the party, but I've had a similar issue recently.

As far as I can tell, you cannot directly connect a camera to a node's material. You can, however, use a scene's layer as a texture for a node.

The code below is not verified, but should be more or less ok:

import UIKit
import SceneKit

class MyViewController: UIViewController {
    override func loadView() {
        let projectedScene = createProjectedScene()
        let receivingScene = createReceivingScene()
        let projectionPlane = receivingScene.scene?.rootNode.childNode(withName: "ProjectionPlane", recursively: true)

        // Here's the important part:
        // You can't directly connect a camera to a material's diffuse texture.
        // But you can connect a scene's layer as a texture.
        projectionPlane?.geometry?.firstMaterial?.diffuse.contents = projectedScene.layer
        projectedScene.layer.contentsScale = 1

        // Note how we only need to connect the receiving view to the controller.
        // The projected view is not directly connected as a subview,
        // but updates in projectedScene will still be reflected in receivingScene.
        self.view = receivingScene
    }

    func createProjectedScene() -> SCNView {
        let view = SCNView()
        view.scene = SCNScene()
        // ... set up scene ...
        return view
    }

    func createReceivingScene() -> SCNView {
        let view = SCNView()
        view.scene = SCNScene()
        // ... set up scene ...

        let projectionPlane = SCNNode(geometry: SCNPlane(width: 2, height: 2))
        projectionPlane.name = "ProjectionPlane"
        view.scene?.rootNode.addChildNode(projectionPlane)

        return view
    }
}
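
To tie this back to the question: inside loadView, the projected view could then be pointed at the extra camera and kept rendering even though it is never added to the view hierarchy. Here, cameraNode is hypothetical and stands for a node holding the extra SCNCamera that has been added to the projected view's scene:

// Hypothetical wiring for the question's setup:
projectedScene.pointOfView = cameraNode   // the node holding the extra SCNCamera
projectedScene.isPlaying = true           // keep the offscreen view updating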