I learned from the documentation that ARKit is CoreMotion plus AVFoundation, but how do I get the sensor data:
- gravity
- acceleration
- rotation matrix
that CoreMotion provides, from ARKit instead of setting up a Core Motion listener?
ARKit (and its new satellite RealityKit) contains not only some classes, methods and properties from the CoreMotion and AVFoundation frameworks, but also from several other frameworks.
However, you can't get any raw data from the iPhone's sensors through ARKit (Apple doesn't allow it), but you can definitely use the data ARKit does give you access to. For instance (a consolidated sketch follows this list):
1. A pixel buffer containing the image captured by the camera:
let pixelBuffer: CVPixelBuffer? = sceneView.session.currentFrame?.capturedImage
2. The position and orientation of the camera in world coordinate space (simd_float4x4):
let cameraMatrix: simd_float4x4? = sceneView.session.currentFrame?.camera.transform
3. Options for how ARKit constructs a scene coordinate system based on real-world device motion:
ARConfiguration.WorldAlignment.gravityAndHeading
ARConfiguration.WorldAlignment.gravity
ARConfiguration.WorldAlignment.camera
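Putting these together, here is a minimal sketch of reading that data through an ARSessionDelegate (the MotionReader class name and the gravity alignment choice are just illustrations, not part of ARKit or the original snippets):

import ARKit

// Illustrative sketch: reads the captured image and the camera transform
// for every frame ARKit delivers, with gravity-based world alignment.
final class MotionReader: NSObject, ARSessionDelegate {

    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // Option 3: how ARKit aligns the world coordinate system (here: gravity).
        configuration.worldAlignment = .gravity
        session.delegate = self
        session.run(configuration)
    }

    // Called for every new ARFrame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Option 1: pixel buffer with the captured camera image.
        let pixelBuffer: CVPixelBuffer = frame.capturedImage

        // Option 2: camera position and orientation in world space.
        let cameraTransform: simd_float4x4 = frame.camera.transform
        let cameraPosition = simd_make_float3(cameraTransform.columns.3)

        print("Frame \(CVPixelBufferGetWidth(pixelBuffer))x\(CVPixelBufferGetHeight(pixelBuffer)) px, camera at \(cameraPosition)")
    }
}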
Alas, you can't get pure acceleration data from the IMU through ARKit.
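So if you need gravity, user acceleration, or the rotation matrix themselves, you still have to fall back to Core Motion and run it alongside the ARSession. A minimal sketch, assuming a 60 Hz update interval and delivery on the main queue (both arbitrary choices):

import CoreMotion

// Illustrative sketch: CMDeviceMotion delivers gravity, user acceleration
// and the attitude rotation matrix independently of ARKit.
let motionManager = CMMotionManager()

func startIMUUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        let gravity = motion.gravity                          // CMAcceleration
        let userAcceleration = motion.userAcceleration        // CMAcceleration
        let rotationMatrix = motion.attitude.rotationMatrix   // CMRotationMatrix
        print(gravity, userAcceleration, rotationMatrix)
    }
}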