
I am trying to play with Augmented Reality using RealityKit.

I want to have my program do one of the following things, selectable by the user:

  1. Detect horizontal surfaces only.
  2. Detect vertical surfaces only.
  3. Detect both horizontal and vertical surfaces.
  4. Detect images, e.g. I print a target, attach it to an object in the real world, and the app detects it.

In order to do that, as far as I understand, I have to adjust three things:

ARWorldTrackingConfiguration

doing something like this:

func initSession() {

    let config = ARWorldTrackingConfiguration()
    config.planeDetection = .vertical
  
    arView.session.delegate = self
    arView.session.run(config)
}

Create scenes inside Experience.rcproject

One for each type of anchoring I need. I have created three "scenes" with the following anchor types: horizontal, vertical and image.

Create an ARCoachingOverlayView

To instruct the user to make the detection work properly.
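
For the coaching overlay I have something like this in mind (just a sketch; the addCoaching(to:goal:) helper is my own name for it, not an API):

import ARKit
import RealityKit

// Adds a coaching overlay that covers the whole ARView and follows its session.
func addCoaching(to arView: ARView, goal: ARCoachingOverlayView.Goal) {

    let coachingOverlay = ARCoachingOverlayView()
    coachingOverlay.session = arView.session          // drive it from the same AR session
    coachingOverlay.goal = goal                       // .horizontalPlane, .verticalPlane, .anyPlane or .tracking
    coachingOverlay.activatesAutomatically = true     // shows and hides itself with tracking quality
    coachingOverlay.translatesAutoresizingMaskIntoConstraints = false
    arView.addSubview(coachingOverlay)

    NSLayoutConstraint.activate([
        coachingOverlay.topAnchor.constraint(equalTo: arView.topAnchor),
        coachingOverlay.bottomAnchor.constraint(equalTo: arView.bottomAnchor),
        coachingOverlay.leadingAnchor.constraint(equalTo: arView.leadingAnchor),
        coachingOverlay.trailingAnchor.constraint(equalTo: arView.trailingAnchor)
    ])
}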

These are the problems:

  1. ARWorldTrackingConfiguration has only two options for planeDetection: horizontal or vertical.

  2. The scenes inside Experience.rcproject can only be of three kinds: horizontal, vertical or image.

  3. The options for ARCoachingOverlayView.goal are: tracking (that is difficult to figure out without proper documentation), horizontalPlane, verticalPlane and anyPlane.

Questions:

  1. How do I configure ARWorldTrackingConfiguration and ARCoachingOverlayView.goal so the app can detect horizontal only, vertical only, horizontal and vertical, or images, given that neither offers all four options?

  2. I have three scenes inside Experience.rcproject: one for horizontal, one for vertical and one for image detection. Is that the right way to do it?

  • I haven't worked with it, but it appears that `.planeDetection` is a `Set` .. this compiles without complaint, so it may answer one of your questions: `config.planeDetection = [.vertical, .horizontal]` ... other than that, I expect you know you can get better answers here by asking **one question at a time**. – DonMag Aug 05 '20 at 17:11
  • ok, it is a set, it solves the problem for horizontal, vertical or both, but there is no option for images. – Duck Aug 05 '20 at 19:35
  • also the 3 elements have to have the same options... the problem is that there is no explanation on how to do it. – Duck Aug 05 '20 at 19:48
  • I just grabbed these two tutorials: https://www.appcoda.com/arkit-image-recognition/ and https://www.appcoda.com/arkit-horizontal-plane/ ... set both `configuration.detectionImages = referenceImages` and `configuration.planeDetection = [.vertical, .horizontal]` ... "merged together" the two `renderer` funcs... and I have a view controller that detects Horizontal and Vertical planes AND images. Obviously, more to do depending on what your app's goal is, but I expect that could be a good starting point (and it took me about 5 minutes). – DonMag Aug 05 '20 at 19:53
  • Here's a link to my forked repo on GitHub: https://github.com/DonMag/ARKitImageRecognition – DonMag Aug 05 '20 at 20:11
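
Putting DonMag's comments together, a single configuration that detects planes and images at the same time could look roughly like this (a sketch; the "AR Resources" group name and the runSession(on:) helper are assumptions, not part of the original code):

import ARKit
import RealityKit

// Runs a world-tracking session that looks for planes and reference images at once.
func runSession(on arView: ARView) {

    let config = ARWorldTrackingConfiguration()

    // planeDetection is an option set: [], [.horizontal], [.vertical] or both.
    config.planeDetection = [.horizontal, .vertical]

    // Image detection can be enabled in the same configuration.
    if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                              bundle: nil) {
        config.detectionImages = referenceImages
        config.maximumNumberOfTrackedImages = 1
    }

    arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}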

1 Answer


Let's assume that we've created three scenes in Reality Composer: BoxScene for horizontal plane detection (world tracking), StarScene for vertical plane detection (world tracking) and PrismScene for image detection (image tracking). In each scene we gave our models names – goldenBox, plasticStar and paintedPrism – and accessor properties are generated automatically from these names.

To switch from a world-tracking setup to an image-tracking setup in RealityKit, we use the corresponding AnchorEntity initializers – .image and .plane – inside the buttons' @IBActions.

Look at the following code to find out how to do what you want.

import RealityKit
import UIKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    
    let cubeScene = try! Experience.loadBoxScene()
    let starScene = try! Experience.loadStarScene()
    let prismScene = try! Experience.loadPrismScene()



    // IMAGE TRACKING
    @IBAction func image(_ button: UIButton) {
        
        arView.scene.anchors.removeAll()
        
        let anchor = AnchorEntity(.image(group: "AR Resources", 
                                          name: "image"))
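        // "AR Resources" / "image" refer to an AR Resource Group and a reference image in the asset catalog.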
        
        let prism = prismScene.paintedPrism!
        anchor.addChild(prism)
        arView.scene.anchors.append(anchor)
    }

    
    // WORLD TRACKING
    @IBAction func verticalAndHorizontal(_ button: UIButton) {
        
        arView.scene.anchors.removeAll()
        
        let cube = cubeScene.goldenBox!
        let star = starScene.plasticStar!
        
        // Pin the cube to the first sufficiently large horizontal plane...
        let anchor1 = AnchorEntity(.plane(.horizontal,
                          classification: .any,
                           minimumBounds: [0.1, 0.1]))
        anchor1.addChild(cube)
        arView.scene.anchors.append(anchor1)
        
        // ...and the star to the first sufficiently large vertical plane.
        let anchor2 = AnchorEntity(.plane(.vertical,
                          classification: .any,
                           minimumBounds: [0.1, 0.1]))
        anchor2.addChild(star)
        arView.scene.anchors.append(anchor2)
    }
}
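
As for the coaching overlay mentioned in the question, its goal could be switched inside the same @IBActions, for example (a sketch; it assumes a coachingOverlay that has already been added as a subview of arView):

import ARKit
import RealityKit

// Points an existing coaching overlay at a new goal.
func setCoachingGoal(_ goal: ARCoachingOverlayView.Goal,
                     for coachingOverlay: ARCoachingOverlayView,
                     in arView: ARView) {
    coachingOverlay.session = arView.session
    coachingOverlay.goal = goal   // .horizontalPlane, .verticalPlane, .anyPlane or .tracking
}

Calling it with .anyPlane from verticalAndHorizontal(_:) and with .tracking from image(_:) would keep the coaching hints in line with the selected mode.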

P. S.

At the moment I have no computer with me; I've written this on an iPhone, so I don't know whether there are any errors in this code...

  • Sorry about the delay, but just today I was able to test your answer deeply and I am experiencing several problems: (1) the lines like `cubeScene.goldenBox!` sometimes come back nil and crash the app. This is strange. These lines work without a crash 8 out of 10 times and suddenly they are nil. (Bug?). (2) The `if` lines are exactly the same? Something is wrong there. (3) After selecting horizontal, for example, the objects do not appear in the scene. (4) Coaching is not changing from horizontal to vertical. – Duck Aug 13 '20 at 17:27
  • Hi @Duck. (1) Yes, it's a bug. (2) Yep, I definitely need to rewrite the `if` statement. (3) I haven't checked it because I have no computer with me, sorry(( and (4) I haven't tried to use a coaching view with such a scene. – Andy Jazz Aug 13 '20 at 19:34