The ARKit/RealityKit world tracking system is based on a combination of five sensors:
- Rear RGB Camera
- LiDAR Scanner
- Gyroscope
- Accelerometer
- Magnetometer
The latter three are known as the Inertial Measurement Unit (IMU), which operates at 1000 Hz. But what your RGB camera (running at 60 fps) and LiDAR scanner (also at 60 fps) see is very important too.
Hence, the stability of world tracking greatly depends on the quality of the camera image.
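To illustrate how these sensors are brought together in code, here is a minimal sketch of running a world tracking session that opts into LiDAR scene reconstruction when the hardware supports it (the `session` variable is assumed to be your `ARView.session` or a standalone `ARSession`):

```swift
import ARKit

// A minimal sketch, assuming an existing ARSession named `session`.
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal, .vertical]

// Scene reconstruction is only available on LiDAR-equipped devices,
// so check for support before enabling it.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
}

session.run(config, options: [.resetTracking, .removeExistingAnchors])
```

On devices without a LiDAR scanner the configuration simply falls back to camera-plus-IMU tracking, which is why the recommendations below matter more there.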
Here are some recommendations for high-quality tracking:
- Track only a well-lit environment (if you don't have LiDAR)
- Track only static objects (not moving ones)
- Don't track poorly-textured surfaces like white walls (if you don't have LiDAR)
- Don't track surfaces with repetitive texture patterns (like polka dots)
- Don't track mirrors, chrome, or glass objects (reflective and refractive)
- Move your iPhone slowly when tracking
- Don't shake your iPhone when tracking
- Track as much of the environment as possible
- Track high-contrast objects in the environment (if you don't have LiDAR)
If you follow these recommendations, the coordinate system in ARKit will be stable.
Also, look at the picture in this SO post – it shows a good example for tracking and a bad one.
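You can also detect at runtime when these recommendations are being violated. The sketch below (the class name is illustrative) uses `ARSessionDelegate` to observe `ARCamera.TrackingState`, whose limited-tracking reasons map directly onto the points above:

```swift
import ARKit

// A sketch of an ARSessionDelegate that reacts to tracking-quality changes.
// The .limited reasons correspond to the recommendations above.
final class TrackingMonitor: NSObject, ARSessionDelegate {

    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .normal:
            print("Tracking is stable")
        case .notAvailable:
            print("Tracking is not available")
        case .limited(.excessiveMotion):
            print("Move the device more slowly")         // "Move your iPhone slowly"
        case .limited(.insufficientFeatures):
            print("Point at a well-lit, textured area")  // avoid white walls, dim light
        case .limited(.initializing), .limited(.relocalizing):
            print("Tracking is warming up")
        case .limited:
            print("Tracking limited for another reason")
        }
    }
}
```

Assign an instance of this class as your session's `delegate` before calling `run(_:)` so you can prompt the user as soon as tracking degrades.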