What is raycasting in RealityKit and ARKit for?

And when do I need to use the makeRaycastQuery instance method?

func makeRaycastQuery(from point: CGPoint, 
                 allowing target: ARRaycastQuery.Target, 
                       alignment: ARRaycastQuery.TargetAlignment) -> ARRaycastQuery?

Any help appreciated.

Andy Jazz

1 Answer

Simple ray-casting, in the same way as hit-testing, helps locate a 3D point on a real-world surface by projecting an imaginary ray from a 2D screen point onto a detected plane. Apple's documentation (2019) gave the following definition of ray-casting:

Ray-casting is the preferred method for finding positions on surfaces in the real-world environment, but the hit-testing functions remain present for compatibility. With tracked raycasting, ARKit and RealityKit continue to refine the results to increase the position's accuracy of virtual content you placed with a raycast.
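The "tracked" flavor the quote mentions is available through ARView's trackedRaycast(from:allowing:alignment:updateHandler:) method, which keeps delivering refined results as ARKit's understanding of the surface improves. Here's a minimal sketch of that idea; the function name placeWithTrackedRaycast and the passed-in anchor are illustrative assumptions, not code from Apple's samples:

```swift
import RealityKit
import ARKit

// Sketch: start a tracked raycast from the screen center and keep
// repositioning an already-added anchor each time ARKit refines
// its estimate of the surface. `arView` and `anchor` are assumed
// to exist elsewhere in your view controller.
func placeWithTrackedRaycast(arView: ARView,
                             anchor: AnchorEntity) -> ARTrackedRaycast? {
    return arView.trackedRaycast(from: arView.center,
                             allowing: .estimatedPlane,
                            alignment: .horizontal) { results in
        // The update handler fires whenever a refined result arrives
        if let result = results.first {
            anchor.setTransformMatrix(result.worldTransform, relativeTo: nil)
        }
    }
}
```

Keep the returned ARTrackedRaycast alive for as long as you want updates, and call its stopTracking() method when you're done; letting it run indefinitely wastes work once the content is placed.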

When the user wants to place virtual content onto a detected surface, it's a good idea to give them a visual hint. Many AR apps draw a focus circle or square that gives the user visual confirmation of the shape and alignment of the surfaces that RealityKit or ARKit is aware of. So, to find out where to put a focus circle or square in the real world, you may use an ARRaycastQuery to ask the framework where any surfaces exist in the real world.


UIKit implementation

Here's an example showing how to implement the session's raycast(_:) instance method:

import UIKit
import RealityKit

class ViewController: UIViewController {
    
    @IBOutlet var arView: ARView!
    let model = try! Entity.loadModel(named: "usdzModel")
    
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        self.raycasting()
    }

    fileprivate func raycasting() {
        // Build a query that casts a ray from the screen center
        // onto any estimated horizontal plane
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                              allowing: .estimatedPlane,
                                             alignment: .horizontal)
        else { return }

        // Execute the query and take the nearest surface intersection
        guard let result = arView.session.raycast(query).first
        else { return }

        // Anchor the model at the raycast result's world transform
        let raycastAnchor = AnchorEntity(world: result.worldTransform)
        raycastAnchor.addChild(model)
        arView.scene.anchors.append(raycastAnchor)
    }
}

If you want to know how to use convex ray-casting in RealityKit, read this post.


If you want to know how to use hit-testing in RealityKit, read this post.


SwiftUI implementation

Here's sample code showing how to implement raycasting logic in SwiftUI:

import SwiftUI
import RealityKit

struct ContentView: View {
    
    @State private var arView = ARView(frame: .zero)
    var model = try! Entity.loadModel(named: "robot")
    
    var body: some View {
        ARViewContainer(arView: $arView)
            .onTapGesture(count: 1) { self.raycasting() }
            .ignoresSafeArea()
    }
    
    fileprivate func raycasting() {
        // Cast a ray from the screen center onto any estimated horizontal plane
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                              allowing: .estimatedPlane,
                                             alignment: .horizontal)
        else { return }

        // Take the nearest intersection with a detected surface
        guard let result = arView.session.raycast(query).first
        else { return }

        // Anchor the model at the raycast result's world transform
        let raycastAnchor = AnchorEntity(world: result.worldTransform)
        raycastAnchor.addChild(model)
        arView.scene.anchors.append(raycastAnchor)
    }
}

and then...

struct ARViewContainer: UIViewRepresentable {
    
    @Binding var arView: ARView
    
    func makeUIView(context: Context) -> ARView { return arView }
    func updateUIView(_ uiView: ARView, context: Context) { }
}

P.S.

If you're building either of these two app variations from scratch (i.e. not using an Xcode AR template), don't forget to add the Privacy - Camera Usage Description key in the Info tab.
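In the raw Info.plist source that key is NSCameraUsageDescription; the description string below is just an example wording:

```xml
<key>NSCameraUsageDescription</key>
<string>This app needs camera access for augmented reality.</string>
```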

  • Thanks! Where do we have to call the raycasting methods from? From the `ARSessionDelegate` – `func session(_ session: ARSession, didUpdate frame: ARFrame)`? Doing so with the trackedRaycast leads to weird race conditions trying to update a scene root anchor in my case. – HelloTimo Sep 17 '19 at 11:21
  • Thanks a lot! I forgot to add that I was attempting to have the raycast result continuously follow the camera position. Trying to derive from Apple's iOS 13 ARKit 3 + SceneKit sample code [Placing Objects and Handling 3D Interaction](https://developer.apple.com/documentation/arkit/placing_objects_and_handling_3d_interaction), I got my requisites here working by indeed putting the call to the raycast into `session(_ session: ARSession, didUpdate frame: ARFrame)`. In fact it must not be the `trackedRaycast()` apparently. Does your code get called continuously? I am going to try it again. – HelloTimo Sep 17 '19 at 13:24
  • Can test it only in 4 days)) – Andy Jazz Sep 17 '19 at 16:44
  • @HelloTimo you shouldn't be calling trackedRaycast() from `session:didUpdateFrame:` because that would create tracked raycasts 60 times a second, and that's a lot of tracked raycasts. You should instead call it from somewhere like a tap gesture handler, but it depends on what you are trying to do with your raycast. – Praveen Gowda I V Sep 25 '19 at 23:28