
I have an augmented reality app with a simple Reality Composer project. It works fine on an iPad running iPadOS 14.4, but I'm having problems on higher versions (14.7 and 15).

Anchor detection is much more sensitive, which causes my scenes to restart with each new image detection. In addition, the scenes are interrupted as soon as the anchor image is no longer visible to the camera.

I am using Xcode 13.1.

I use this simple code:

import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load the "Debut" scene from the Enigme1 Reality Composer project
        guard let anchor2 = try? Enigme1.loadDebut() else { return }

        // Add the scene's anchor to the ARView
        arView.scene.anchors.append(anchor2)
    }
}

Thank you very much for any help you can give me.


1 Answer


Reality Composer's and RealityKit's AnchorEntity(.image) behaves the same way as ARKit's anchor in ARImageTrackingConfiguration: when the tracked image is no longer visible in the view, there is no ARImageAnchor, and therefore no 3D model. That is why your scene stops as soon as the camera loses sight of the anchor image.
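For reference, here is a minimal sketch of the same image-based anchoring set up manually in RealityKit. The resource group "AR Resources" and image name "anchor" are placeholders for whatever is defined in your asset catalog:

// Anchor a simple box to a reference image from the asset catalog.
// "AR Resources" and "anchor" are placeholder names.
let imageAnchor = AnchorEntity(.image(group: "AR Resources", name: "anchor"))
let box = ModelEntity(mesh: .generateBox(size: 0.05))
imageAnchor.addChild(box)
arView.scene.anchors.append(imageAnchor)

// As with ARImageTrackingConfiguration, the box is rendered only
// while the reference image is being tracked by the camera.

If you need the model to persist after the image disappears, you have to re-anchor it in world space yourself; AnchorEntity(.image) does not do that automatically.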

Also, when using AnchorEntity(.image), if your 3D model has more than 100,000 polygons, every time the tracked image reappears on the screen the model will cause a slight freeze.
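If the initial loading of a heavy scene also causes a hitch, you can load it asynchronously instead of blocking viewDidLoad(). This sketch assumes the asynchronous loader that Reality Composer generates alongside loadDebut() is named loadDebutAsync; check your generated Enigme1 code for the exact name:

// Load the scene asynchronously so the main thread is not blocked
// while a heavy model's assets are prepared.
Enigme1.loadDebutAsync { [weak self] result in
    switch result {
    case .success(let anchor):
        self?.arView.scene.anchors.append(anchor)
    case .failure(let error):
        print("Unable to load the scene:", error.localizedDescription)
    }
}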
