51

How is it possible to implement a vertical plane detection (i.e. for walls)?

let configuration = ARWorldTrackingSessionConfiguration()
configuration.planeDetection = .horizontal //TODO
Andy Jazz
Jan F.

7 Answers

66

Edit: This is now supported as of ARKit 1.5 (iOS 11.3). Simply use .vertical. I have kept the previous post below for historical purposes.
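A minimal sketch of the updated configuration (assuming sceneView is an ARSCNView; the variable name is illustrative):

// Requires iOS 11.3+ (ARKit 1.5).
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .vertical
sceneView.session.run(configuration)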


TL;DR

Vertical plane detection is not (yet) a feature that exists in ARKit. The name .horizontal suggests that vertical detection could be in the works and might be added in the future; if plane detection were just a Boolean value, that would suggest the API is final.

Confirmation

This suspicion was confirmed by a conversation that I had with an Apple engineer at WWDC17.

Explanation

You could argue that implementing this would be difficult, as there are infinitely many more orientations for a vertical plane than for a horizontal one, but as rodamn said, this is probably not the case.

From rodamn’s comment: At its simplest, a plane is defined to be three coplanar points. You have a surface candidate once there are sufficient coplanar features detected along a surface (vertical, horizontal, or at any arbitrary angle). It's just that the normal for horizontals will be along the up/down axis, while verticals' normals will be parallel to the ground plane. The challenge is that unadorned drywall tends to generate few visual features, and plain walls may often go undetected. I strongly suspect that this is why the .vertical feature is not yet released.
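To make the point about normals concrete, here is a hedged Swift sketch; the function name and the 10° tolerance are illustrative assumptions, not part of rodamn's comment:

import Foundation
import simd

// Classifies a candidate plane by its unit normal in world space,
// where +Y is the gravity (up/down) axis.
enum SurfaceOrientation { case horizontal, vertical, oblique }

func classify(normal: simd_float3, toleranceDegrees: Float = 10) -> SurfaceOrientation {
    // Angle between the plane normal and the world up axis.
    let up = simd_float3(0, 1, 0)
    let angle = acos(abs(simd_dot(simd_normalize(normal), up))) * 180 / .pi
    if angle < toleranceDegrees { return .horizontal }     // normal along up/down axis
    if angle > 90 - toleranceDegrees { return .vertical }  // normal parallel to ground
    return .oblique
}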

However, there is a counter argument to this. See comments from rickster for more information.

Zack
  • Hey @Zack, did that engineer give you any hint as to when that feature might be added? Hopefully that's not something they hold off until iOS 12. – T. Steele Jul 11 '17 at 14:30
  • No @T.Steele, I didn’t get any hint, but I wouldn’t think that they would hold it off that long. After all, it would seem to be that hard to implement. – Zack Jul 11 '17 at 22:49
  • I think you meant "wouldn't seem to be that hard", but what you wrote is probably closer to correct. A horizontal plane can have only one possible orientation, but there are infinitely many ways for a plane to be "vertical"... – rickster Jul 19 '17 at 07:32
  • Yes that is what I meant, though I do see what you are saying now, @rickster. – Zack Jul 19 '17 at 21:35
  • It's not that hard. At its simplest, a plane is defined to be three coplanar points. You have a surface candidate once there are sufficient coplanar features detected along a surface (vertical, horizontal, or at any arbitrary angle). It's just that the normal for horizontals will be along the up/down axis, while verticals' normals will be parallel to the ground plane. The challenge is that unadorned drywall tends to generate few visual features, and plain walls may often go undetected. I strongly suspect that this is why the .vertical feature is not yet released. – rodamn Sep 18 '17 at 05:12
  • Starting from iOS 11.3, ARKit will support vertical plane detection: https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration.planedetection/2867271-vertical – Alexander Vasenin Jan 25 '18 at 02:34
  • This answer should really be updated, as of today, the latest stable iOS version comes with ARKit 1.5 and vertical surface tracking. – irreal Mar 30 '18 at 14:50
  • @rodamn Belated clarification: the difference between one vs many possible orientations isn't about what's possible, but what's a best guess for the imprecise data you have. Plane estimation is driven by feature point detection, but feature point positions aren't necessarily accurate. If you have *n* feature points that are all within some tolerance of the same *z* position, it's a realistic best guess for ARKit use cases to assume there's a horizontal plane (even if regression could fit a not-quite-horizontal plane to those points). – rickster Sep 05 '18 at 17:31
  • Continued: if you have *n* feature points and you're trying to fit a vertical plane, there's no one preferred orientation that it might be in. You have to use regression or a similar technique to guess the orientation, and account for some "fudge factor" in that your detected feature points aren't precise, so they aren't necessarily coplanar with each other or with the real-world surface you're trying to detect. If your detected plane is even a fraction of a degree out of alignment with the real-world surface, the AR user experience you build on top of detection won't be great. – rickster Sep 05 '18 at 17:41
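To make rickster's regression point concrete, here is a hedged sketch of fitting a near-horizontal plane y = a·x + b·z + c to noisy feature points by least squares; the function name and epsilon are illustrative assumptions:

import simd

// Fits y = a*x + b*z + c by ordinary least squares (normal equations),
// a reasonable "best guess" for a roughly horizontal surface.
func fitNearHorizontalPlane(to points: [simd_float3]) -> (a: Float, b: Float, c: Float)? {
    guard points.count >= 3 else { return nil }

    // Accumulate AᵀA and Aᵀy, where each row of A is (x, z, 1).
    var ata = simd_float3x3()          // starts as the zero matrix
    var aty = simd_float3.zero
    for p in points {
        let row = simd_float3(p.x, p.z, 1)
        ata += simd_float3x3(columns: (row * row.x, row * row.y, row * row.z))
        aty += row * p.y
    }

    // A degenerate point set (e.g. all samples collinear) has no unique fit.
    guard abs(simd_determinant(ata)) > 1e-6 else { return nil }

    let coeffs = ata.inverse * aty     // solves (AᵀA)·x = Aᵀy
    return (coeffs.x, coeffs.y, coeffs.z)
}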
6

Support for this is coming with iOS 11.3:

static var vertical: ARWorldTrackingConfiguration.PlaneDetection

The session detects surfaces that are parallel to gravity (regardless of other orientation).

https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration.planedetection
https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration.planedetection/2867271-vertical
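Once on iOS 11.3, usage might look like this (a sketch; sceneView is assumed to be an ARSCNView):

if #available(iOS 11.3, *) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .vertical   // surfaces parallel to gravity
    sceneView.session.run(configuration)
}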

janpio
5

Apple has released iOS 11.3, which features various updates for AR, including ARKit 1.5. In this update, ARKit gains the ability to recognize and place virtual objects on vertical surfaces such as walls and doors.

Vertical plane detection is now supported in ARWorldTrackingConfiguration:

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
sceneView.session.run(configuration)
shri
3

In ARKit 1.0 there was just the .horizontal enum case for detecting horizontal surfaces like a table or a floor. In ARKit 1.5 and higher there are .horizontal and .vertical type properties of the PlaneDetection struct, which conforms to the OptionSet protocol.

To implement vertical plane detection in ARKit 2.0 through ARKit 6.0, use the following code:

configuration.planeDetection = .vertical

Or you can use values for both types of detected planes:

private func configureSceneView(_ sceneView: ARSCNView) {

    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]         // BOTH TYPES
    // Scene reconstruction requires a LiDAR-equipped device (ARKit 3.5+),
    // so check for support before opting in; setting it unconditionally
    // crashes on unsupported devices.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }
    config.isLightEstimationEnabled = true
    sceneView.session.run(config)
}

You can also add an extension to your class to handle the delegate calls:

extension ARSceneManager: ARSCNViewDelegate {

    // Called when ARKit adds a node for a newly detected anchor;
    // filter for plane anchors to react only to detected surfaces.
    func renderer(_ renderer: SCNSceneRenderer, 
                 didAdd node: SCNNode, 
                  for anchor: ARAnchor) {

        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Found plane: \(planeAnchor)")
    }
}
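As a follow-up, here is a hedged sketch of visualizing a detected plane's extent; the helper name and translucent material are illustrative assumptions, and you would call it from the didAdd method above:

import ARKit
import SceneKit
import UIKit

extension ARSceneManager {

    // Illustrative helper: overlays a translucent SCNPlane matching the
    // anchor's estimated extent, so detected walls/floors become visible.
    func addVisualization(for planeAnchor: ARPlaneAnchor, to node: SCNNode) {
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.3)

        let planeNode = SCNNode(geometry: plane)
        planeNode.simdPosition = planeAnchor.center
        // SCNPlane stands upright in local space; rotate it to lie
        // flat in the anchor's X-Z plane.
        planeNode.eulerAngles.x = -.pi / 2
        node.addChildNode(planeNode)
    }
}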
Andy Jazz
2

As the iPhone X features a front-facing depth camera, my suspicion is that a rear-facing one will appear in the next version, and perhaps the .vertical capability will be delayed until then.

Mike M
1

I did it with Unity, but I had to do the math myself.

I use Random Sample Consensus (RANSAC) to detect a vertical plane from the point cloud returned by ARKit. It's like having a loop that randomly picks three points to create a plane, counts the points that match it, and sees which attempt is the best.

It works, but because ARKit can't return many feature points when the wall is a plain color, it fails in many situations.
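For reference, a minimal Swift sketch of that RANSAC loop over ARKit feature points (e.g. from ARFrame.rawFeaturePoints); all names and thresholds are illustrative assumptions:

import simd

// Minimal RANSAC sketch: repeatedly pick three random points, form a
// plane through them, and keep the hypothesis with the most inliers.
struct PlaneHypothesis {
    let point: simd_float3    // a point on the plane
    let normal: simd_float3   // unit normal
}

func ransacVerticalPlane(points: [simd_float3],
                         iterations: Int = 200,
                         inlierTolerance: Float = 0.02,    // 2 cm
                         maxNormalY: Float = 0.1) -> PlaneHypothesis? {
    guard points.count >= 3 else { return nil }
    var best: (plane: PlaneHypothesis, inliers: Int)?

    for _ in 0..<iterations {
        // Randomly pick three points and form a plane through them.
        guard let a = points.randomElement(),
              let b = points.randomElement(),
              let c = points.randomElement() else { break }
        let cross = simd_cross(b - a, c - a)
        guard simd_length(cross) > 1e-6 else { continue }   // degenerate pick
        let normal = simd_normalize(cross)

        // Keep only roughly vertical hypotheses: a vertical plane's normal
        // is nearly parallel to the ground, i.e. its Y (gravity-axis)
        // component is close to zero.
        guard abs(normal.y) < maxNormalY else { continue }

        // Count the points lying within the tolerance of the candidate plane.
        let inliers = points.filter { abs(simd_dot($0 - a, normal)) < inlierTolerance }.count
        if inliers > (best?.inliers ?? 0) {
            best = (PlaneHypothesis(point: a, normal: normal), inliers)
        }
    }
    return best?.plane
}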

Sunny Chow
0

Apple is said to be working on extra AR capabilities for the new iPhone, i.e. extra sensors for the camera. Maybe this will become a feature once those device capabilities are known. Some speculation here:

http://uk.businessinsider.com/apple-iphone-8-rumors-3d-laser-camera-augmented-reality-2017-7
https://www.fastcompany.com/40440342/apple-is-working-hard-on-an-iphone-8-rear-facing-3d-laser-for-ar-and-autofocus-source

Alex McPherson
  • The sensors ended up being for the front camera and have too many limitations (1 meter range, IR can be washed out in sunlight) to be useful for AR. Will have to see what arrives with the iPhone XI. – rodamn Sep 18 '17 at 05:19
  • @rodamn, Apple did announce that the iPhone 8 and iPhone X would both have an A11 Bionic chip. This is said to be very powerful and useful for AR applications, so it's not as if there was no progress. – Zack Sep 18 '17 at 06:01