
For some reason I can't get people occlusion to work, even though I looked at someone's question on Stack Overflow. Here is my code:

//Load ARView
let arView = ARView(frame: .zero)

//Load people occlusion
let session = ARSession()

if let configuration = session.configuration as? ARWorldTrackingConfiguration {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
    session.run(configuration)
}

//Load custom model (not in use)
let model = try! Entity.loadModel(named: "Mug")

//Load Anchor + Entity
let anchor = AnchorEntity(plane: .horizontal)
let box = MeshResource.generateBox(size: 0.1)
let material = SimpleMaterial(color: .red, isMetallic: true)
let entity = ModelEntity(mesh: box, materials: [material])
arView.scene.anchors.append(anchor)
anchor.addChild(entity)
return arView

What am I missing?

1 Answer


Your code should look like this:

let arView = ARView(frame: .zero)

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    self.switchOcclusion()
}

fileprivate func switchOcclusion() {

    // session.configuration is nil until the session has been run at least once.
    guard let config = arView.session.configuration as? ARWorldTrackingConfiguration
    else { return }

    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth)
    else { return }

    // Toggle people occlusion on and off.
    switch config.frameSemantics {
        case [.personSegmentationWithDepth]: config.frameSemantics.remove(.personSegmentationWithDepth)
        default: config.frameSemantics.insert(.personSegmentationWithDepth)
    }
    arView.session.run(config)
}
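
For context, here's a minimal sketch, assuming a plain UIKit setup, of where arView and switchOcclusion() are meant to live; the class name and the viewDidLoad wiring are illustrative, not part of the original answer. If your project wraps ARView in a SwiftUI ARViewContainer instead, the same logic belongs in a hosted UIViewController or the container's coordinator.

import UIKit
import ARKit
import RealityKit

class ViewController: UIViewController {

    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Run a world-tracking configuration first, otherwise
        // arView.session.configuration stays nil and the guard below always fails.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        arView.session.run(config)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        switchOcclusion()
    }

    fileprivate func switchOcclusion() {
        guard let config = arView.session.configuration as? ARWorldTrackingConfiguration,
              ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth)
        else { return }

        // Toggle people occlusion on and off.
        if config.frameSemantics.contains(.personSegmentationWithDepth) {
            config.frameSemantics.remove(.personSegmentationWithDepth)
        } else {
            config.frameSemantics.insert(.personSegmentationWithDepth)
        }
        arView.session.run(config)
    }
}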


Or there's also a cool solution using the type(of:) method:

let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal]

if type(of: config).supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics = .personSegmentationWithDepth
} else {
    print("This device doesn't support segmentation with depth")
}
arView.session.run(config)
Andy Jazz

  • For some reason I tried to do exactly what you answered and I get "Value of type 'ARViewContainer' has no member 'arView'", even when I switch to Generic iOS Device – Elementio Apr 02 '20 at 15:06
  • I have this exact code (except `print` has to be `fatalError` in a `guard` block) and it doesn't work. It seems like there's some time while `arView.session.configuration` is `nil`. – Boris Verkhovskiy Jun 08 '21 at 12:59
  • @Boris, what's your iPhone model? – Andy Jazz Jun 08 '21 at 14:58
  • @AndyFedoroff 11-inch iPad Pro, 3rd generation. Bought it last month. – Boris Verkhovskiy Jun 08 '21 at 15:56
  • @Boris, Updated my answer. I've checked it, it works. – Andy Jazz Jun 08 '21 at 16:10
  • @AndyFedoroff do you mind explaining why `touchesBegan` is needed? I don't want to wait for user input to enable occlusion. I just want to have it enabled as soon as possible. – Boris Verkhovskiy Jun 08 '21 at 16:55
  • It's just for toggling Occlusion on and off when needed, because Occlusion is extremely CPU/GPU intensive. – Andy Jazz Jun 08 '21 at 16:58
  • @AndyFedoroff but why can't I (or how can I) start the app with people occlusion enabled? If I try to access the `.configuration` while creating the view, it seems to be `nil` – Boris Verkhovskiy Jun 09 '21 at 14:49
  • There are several cases when you may get `nil`. Let your AR app collect some data about the scene before enabling occlusion. Use the `.asyncAfter(deadline:execute:)` method to schedule execution at a specified time (for instance `deadline: .now() + 1.0`). – Andy Jazz Jun 09 '21 at 14:56
  • Take into consideration that you start getting depth data from the Depth API only after starting a session, plus some time for the camera tracking procedure and for synchronously uploading your model. Depth data isn't generated from the first frame of the ARSession, because occlusion comes from the second stage of ARKit – Scene Understanding... – Andy Jazz Jun 09 '21 at 15:01
  • Hope it helps, Boris! – Andy Jazz Jun 09 '21 at 15:04
  • @AndyFedoroff thanks for the help. Is there a way of detecting when it's available instead of just hard-coding a wait of N seconds? Or will I have to poll? – Boris Verkhovskiy Jun 10 '21 at 17:40
  • Try something like this – `let frame = arView.session.currentFrame` and then `guard frame?.estimatedDepthData != nil else { print("depth isn't available yet"); return }` inside a delegate method like `renderer(...)` or `session(...)`. – Andy Jazz Jun 10 '21 at 18:03
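
Putting the last few comments together, here is a minimal sketch, building on the hypothetical ViewController above, that avoids a hard-coded delay: an ARSessionDelegate enables occlusion once camera tracking is stable and then uses frame.estimatedDepthData to tell whether depth frames have actually started arriving. The names and structure are illustrative, not part of the original answer.

// Remember to set arView.session.delegate = self (e.g. in viewDidLoad).
extension ViewController: ARSessionDelegate {

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let config = session.configuration as? ARWorldTrackingConfiguration else { return }

        if !config.frameSemantics.contains(.personSegmentationWithDepth) {
            // Turn occlusion on once tracking settles (roughly what the
            // suggested one-second asyncAfter delay approximates).
            if case .normal = frame.camera.trackingState {
                switchOcclusion()
            }
        } else if frame.estimatedDepthData == nil {
            // Occlusion is on, but depth data hasn't started flowing yet.
            print("Depth isn't available yet")
        }
    }
}

The simpler fallback from the comments is to schedule the switch once after running the session, e.g. `DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { self.switchOcclusion() }`.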