32

How can I use the horizontal and vertical planes tracked by ARKit to hide objects behind walls or behind real objects? Currently, added 3D objects can be seen through walls when you leave a room, and in front of real-world objects that they should be behind. So is it possible to use the data ARKit gives me to provide a more natural AR experience, without the objects appearing through walls?

Benjohn
Steve
  • Can you add pictures from what you see and what you expect? (I know nothing of ARKit, but still can't even understand your question) – mfaani Jul 09 '17 at 23:58

5 Answers

35

You have two issues here.

(And you didn't even use regular expressions!)

How to create occlusion geometry for ARKit/SceneKit?

If you set a SceneKit material's colorBufferWriteMask to an empty value ([] in Swift), any objects using that material won't appear in the view, but they'll still write to the z-buffer during rendering, which affects the rendering of other objects. In effect, you'll get a "hole" shaped like your object, through which the background shows (the camera feed, in the case of ARSCNView), but which can still obscure other SceneKit objects.

You'll also need to make sure the occluder renders before any other nodes it's supposed to obscure. You can do this using node hierarchy (I can't remember offhand whether parent nodes render before their children or the other way around, but it's easy enough to test). Nodes that are peers in the hierarchy don't have a deterministic order, but you can force an order regardless of hierarchy with the renderingOrder property. That property defaults to zero, so setting it to -1 renders the occluder before everything else. (Or for finer control, set the renderingOrder of several nodes to a sequence of values.) A minimal sketch of both steps follows.
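
Here's a minimal sketch putting both pieces together, assuming an ARSCNView outlet named sceneView; the plane size and placement are arbitrary placeholders:

let occluderGeometry = SCNPlane(width: 2.0, height: 2.5)   // placeholder size

let occlusionMaterial = SCNMaterial()
occlusionMaterial.colorBufferWriteMask = []   // draw no color...
occlusionMaterial.isDoubleSided = true
occluderGeometry.firstMaterial = occlusionMaterial

let occluderNode = SCNNode(geometry: occluderGeometry)
occluderNode.renderingOrder = -1   // ...but write depth before other nodes render
sceneView.scene.rootNode.addChildNode(occluderNode)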

How to detect walls/etc so you know where to put occlusion geometry?

In iOS 11.3 and later (aka "ARKit 1.5"), you can turn on vertical plane detection. (Note that when you get vertical plane anchors back from that, they're automatically rotated. So if you attach models to the anchor, their local "up" direction is normal to the plane.) Also new in iOS 11.3, you can get a more detailed shape estimate for each detected plane (see ARSCNPlaneGeometry), regardless of its orientation.
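
For example, a minimal sketch of enabling both (assuming an ARSCNView named sceneView; the .vertical option requires iOS 11.3):

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]   // .horizontal alone was the original ARKit behavior
sceneView.session.run(configuration)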

However, even with both horizontal and vertical detection, the outer limits of a plane are just estimates that change over time. That is, ARKit can quickly detect where part of a wall is, but it doesn't know where the edges of the wall are without the user spending some time waving the device around to map out the space. And even then, the mapped edges might not line up precisely with those of the real wall.

So... if you use detected vertical planes to occlude virtual geometry, you might find places where virtual objects that are supposed to be hidden still show through, either by not quite being hidden right at the edge of the wall, or by being visible through places where ARKit hasn't mapped the entire real wall. (You might be able to solve the latter issue by assuming a larger extent than ARKit does; a sketch of that follows.)
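
A hedged sketch of that workaround, creating an oversized occluder for each detected vertical plane in an ARSCNViewDelegate; the 0.5 m padding is an arbitrary assumption:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          planeAnchor.alignment == .vertical else { return }

    // Pad the estimated extent so the occluder covers more wall than ARKit has mapped so far.
    let padding: CGFloat = 0.5
    let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x) + padding,
                         height: CGFloat(planeAnchor.extent.z) + padding)
    plane.firstMaterial?.colorBufferWriteMask = []   // invisible, but still writes depth

    let occluder = SCNNode(geometry: plane)
    occluder.simdPosition = planeAnchor.center
    occluder.eulerAngles.x = -.pi / 2   // lay the SCNPlane into the anchor's x-z plane
    occluder.renderingOrder = -1        // render before the virtual content it should hide
    node.addChildNode(occluder)
}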

rickster
  • So the example code in the documentation has the option for `configuration.planeDetection = .horizontal` that does nothing? – Steve Jul 05 '17 at 13:11
  • That code turns plane detection on. Without it, ARKit doesn't report planes at all. – rickster Jul 05 '17 at 15:08
  • Yes, but I meant that if you could set plane detection to horizontal you should be able to set it to vertical – Steve Jul 11 '17 at 11:43
  • @Steve (In Xcode) you can Jump to the Definition (⌃⌘click) of `.horizontal`, and you'll find no other options there. I wouldn't be surprised if Apple extends the option set with "vertical"—and possibly other types of planes in the future though. – PDK Aug 04 '17 at 14:06
  • Hi Rick, would partial occlusion be significantly more complex than this technique? – Benjohn Sep 04 '17 at 17:20
  • @Benjohn what do you mean by partial occlusion ? Only the part that’s actually behind the occluder (from the camera’s perspective) won’t be visible. – mnuages Sep 04 '17 at 19:13
  • I was unclear, sorry: I am asking about simulating a partially transparent "occluder" rather than a completely opaque one. The occluded part of an object might be rendered with reduced alpha, for example. So in the case of a table occluder with a block pushing in to it, you would still see the piece of block under the table, but it would be faint. – Benjohn Sep 04 '17 at 21:15
  • Ah! Got it working. As noted [here](https://stackoverflow.com/a/45167391/2547229) it is also necessary to set the occluder to be rendered before the other objects with: `occluderNode.renderingOrder = -1` – should this be part of the answer? – Benjohn Sep 05 '17 at 12:40
2

For creating an occlusion material (also known as a blackhole material or blocking material) you have to use the following instance properties: .colorBufferWriteMask, .readsFromDepthBuffer, .writesToDepthBuffer and .renderingOrder.

You can use them this way:

plane.geometry?.firstMaterial?.isDoubleSided = true
plane.geometry?.firstMaterial?.colorBufferWriteMask = .alpha   // write alpha only, so no visible color
plane.geometry?.firstMaterial?.writesToDepthBuffer = true      // still occupies the z-buffer
plane.geometry?.firstMaterial?.readsFromDepthBuffer = true
plane.renderingOrder = -100                                    // render before regular content

...or this way:

func occlusion() -> SCNMaterial {

    let occlusionMaterial = SCNMaterial()
    occlusionMaterial.isDoubleSided = true
    occlusionMaterial.colorBufferWriteMask = []
    occlusionMaterial.readsFromDepthBuffer = true
    occlusionMaterial.writesToDepthBuffer = true

    return occlusionMaterial
}

plane.geometry?.firstMaterial = occlusion()
plane.renderingOrder = -100
Andy Jazz
1

Creating an occlusion material is really simple:

    let boxGeometry = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)

    // Define an occlusion material
    let occlusionMaterial = SCNMaterial()
    occlusionMaterial.colorBufferWriteMask = []

    boxGeometry.materials = [occlusionMaterial]
    self.box = SCNNode(geometry: boxGeometry)
    // Render this box before the other models so it can occlude them
    self.box.renderingOrder = -1
kakashy
0

Great solution:

GitHub: arkit-occlusion

Worked for me.

But in my case I wanted to set the walls in code. So if you don't want the user to set the walls manually, use plane detection to detect the walls and place them in code.

Alternatively, the iPhone's depth sensing works within a range of about 4 meters, and you can detect obstacles with an AR hit test (sketch below).
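
A rough sketch of the hit-test idea, probing the screen center from an ARSCNView named sceneView (the probe point and result types are illustrative choices):

let center = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)
let results = sceneView.hitTest(center, types: [.existingPlaneUsingExtent, .featurePoint])

if let hit = results.first {
    // hit.distance is meters from the camera; worldTransform holds the obstacle's position
    let position = hit.worldTransform.columns.3
    print("Obstacle at \(hit.distance) m:", position.x, position.y, position.z)
}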

MoD
0

ARKit 6.0 and LiDAR scanner

You can hide any object behind a virtual invisible wall that replicates real wall geometry. iPhones and iPads Pro equipped with a LiDAR scanner can reconstruct a 3D topological map of the surrounding environment. The LiDAR scanner greatly improves the quality of the Z channel, which allows you to occlude or remove humans from the AR scene.

LiDAR also improves features such as object occlusion, motion tracking, and raycasting. With a LiDAR scanner you can reconstruct a scene even in an unlit environment, or in a room with white featureless walls. 3D reconstruction of the surrounding environment became possible thanks to the sceneReconstruction instance property, introduced in ARKit 3.5. Having a reconstructed mesh of your walls, it's now super easy to hide any object behind real walls.

To activate the sceneReconstruction option, use the following code:

@IBOutlet var arView: ARView!

arView.automaticallyConfigureSession = false   // we'll run our own configuration

// Scene reconstruction requires a LiDAR-equipped device
guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh)
else { return }

let config = ARWorldTrackingConfiguration()
config.sceneReconstruction = .mesh

arView.debugOptions.insert(.showSceneUnderstanding)                // visualize the reconstructed mesh
arView.environment.sceneUnderstanding.options.insert(.occlusion)   // hide virtual content behind real geometry
arView.session.run(config)


Also, if you're using SceneKit, try the following approach:

@IBOutlet var sceneView: ARSCNView!

func renderer(_ renderer: SCNSceneRenderer, 
          nodeFor anchor: ARAnchor) -> SCNNode? {

    guard let meshAnchor = anchor as? ARMeshAnchor 
    else { return nil }

    let geometry = SCNGeometry(arGeometry: meshAnchor.geometry)

    // `colorizer` is a helper (not shown) that assigns a stable color per anchor;
    // for wall occlusion, use an occlusion material here instead of a visible color.
    geometry.firstMaterial?.diffuse.contents = 
                            colorizer.assignColor(to: meshAnchor.identifier)

    let node = SCNNode()
    node.name = "Node_\(meshAnchor.identifier)"
    node.geometry = geometry
    return node
}

func renderer(_ renderer: SCNSceneRenderer,
          didUpdate node: SCNNode,
              for anchor: ARAnchor) {

    guard let meshAnchor = anchor as? ARMeshAnchor 
    else { return }

    let newGeometry = SCNGeometry(arGeometry: meshAnchor.geometry)

    newGeometry.firstMaterial?.diffuse.contents = 
                               colorizer.assignColor(to: meshAnchor.identifier)

    node.geometry = newGeometry
}

And here are SCNGeometry and SCNGeometrySource extensions:

extension SCNGeometry {
    convenience init(arGeometry: ARMeshGeometry) {
        let verticesSource = SCNGeometrySource(arGeometry.vertices, 
                                               semantic: .vertex)
        let normalsSource = SCNGeometrySource(arGeometry.normals, 
                                               semantic: .normal)
        let faces = SCNGeometryElement(arGeometry.faces)
        self.init(sources: [verticesSource, normalsSource], elements: [faces])
    }
}

extension SCNGeometrySource {
    convenience init(_ source: ARGeometrySource, semantic: Semantic) {
        self.init(buffer: source.buffer, vertexFormat: source.format,
                                             semantic: semantic,
                                          vertexCount: source.count,
                                           dataOffset: source.offset,
                                           dataStride: source.stride)
    }
}

...and SCNGeometryElement and SCNGeometryPrimitiveType extensions:

extension SCNGeometryElement {
    convenience init(_ source: ARGeometryElement) {
        let pointer = source.buffer.contents()
        let byteCount = source.count * 
                        source.indexCountPerPrimitive * 
                        source.bytesPerIndex
        let data = Data(bytesNoCopy: pointer, 
                              count: byteCount, 
                        deallocator: .none)
        self.init(data: data, primitiveType: .of(source.primitiveType),
                             primitiveCount: source.count,
                              bytesPerIndex: source.bytesPerIndex)
    }
}

extension SCNGeometryPrimitiveType {
    static func of(_ type: ARGeometryPrimitiveType) -> SCNGeometryPrimitiveType {
        switch type {
            case .line: return .line
            case .triangle: return .triangles
            @unknown default: return .triangles   // future ARKit primitive types
        }
    }
}
Andy Jazz
  • This only works if you have an ARView and not an ARSCNView. Do you know how this would be accomplished with an ARSCNView? – Mikael Nov 23 '20 at 09:58
  • Here you can find out how: https://developer.apple.com/forums/thread/654431. Use an occlusion material instead of a coloured one. – Andy Jazz Nov 23 '20 at 10:00
  • Can you please describe a little bit how it could be done with ARSCNView? Thank you. – pavelcauselov Dec 23 '20 at 16:29
  • @pavelcauselov, I've added code touching SceneKit's LiDAR scanning implementation. – Andy Jazz Dec 27 '20 at 09:58
  • @AndyFedoroff thank you! But could you please share working code via GitHub maybe, because I had no luck and my "sticky note" on the wall is still in front of real objects... – pavelcauselov Dec 27 '20 at 14:46
  • Sorry @pavelcauselov, I'm not able to share a full version of the code 'cause I'm writing on a smartphone. You can copy-paste just this code... – Andy Jazz Dec 27 '20 at 14:48