
I learned from the documentation that ARKit is CoreMotion plus AVFoundation, but how do I get the sensor data:

  • gravity
  • acceleration
  • rotation matrix

that CoreMotion provides, from ARKit itself instead of setting up a CoreMotion listener?

1 Answer


ARKit (and its new satellite RealityKit) not only contains classes, methods and properties from the CoreMotion and AVFoundation frameworks, but also from:

  • UIKit
  • SceneKit
  • SpriteKit
  • Metal
  • CoreML
  • CoreLocation
  • MultipeerConnectivity
  • etc.

However, you can't get any raw data from the iPhone's sensors (Apple doesn't allow it), but you can definitely use the data that ARKit does expose.

For instance:

1. A pixel buffer containing the image captured by the camera:

let pixelBuffer: CVPixelBuffer? = sceneView.session.currentFrame?.capturedImage
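
If you need the image on every frame rather than polling currentFrame, the usual place to read it is an ARSessionDelegate. A minimal sketch (the FrameReceiver class name is just illustrative):

import ARKit

// Illustrative delegate that receives every new ARKit frame (roughly 60 Hz).
final class FrameReceiver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The camera image for this frame, as a CVPixelBuffer.
        let pixelBuffer: CVPixelBuffer = frame.capturedImage
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        print("Captured image: \(width) x \(height)")
    }
}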

2. The position and orientation of the camera in world coordinate space (simd_float4x4):

let cameraMatrix = (sceneView.session.currentFrame?.camera.transform)!
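
The 4×4 matrix packs both position and orientation together; a minimal sketch of pulling them apart (ARKit also exposes the orientation directly via camera.eulerAngles):

if let camera = sceneView.session.currentFrame?.camera {
    let transform: simd_float4x4 = camera.transform

    // Translation lives in the last column of the matrix.
    let position = SIMD3<Float>(transform.columns.3.x,
                                transform.columns.3.y,
                                transform.columns.3.z)

    // Orientation as roll, pitch and yaw, in radians.
    let eulerAngles: simd_float3 = camera.eulerAngles

    print("Camera position: \(position), Euler angles: \(eulerAngles)")
}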

3. Options for how ARKit constructs a scene coordinate system based on real-world device motion:

ARConfiguration.WorldAlignment.gravityAndHeading
ARConfiguration.WorldAlignment.gravity
ARConfiguration.WorldAlignment.camera
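
A minimal sketch of applying one of these options before running the session (assuming sceneView is an ARSCNView):

let configuration = ARWorldTrackingConfiguration()

// .gravity           – Y axis is anti-parallel to gravity, origin at session start
// .gravityAndHeading – same, but the X/Z axes follow compass East/North
// .camera            – coordinate system locked to the camera's initial orientation
configuration.worldAlignment = .gravityAndHeading

sceneView.session.run(configuration)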

Alas, you can't get pure acceleration data from the IMU through ARKit.

  • But how can I map Core Motion data to ARKit to ensure that two users are using the same world coordinate space? https://stackoverflow.com/questions/60507826/device-camera-direction-excluding-device-landscape-portrait-orientation/60508070?noredirect=1#comment107116833_60508070 – user426132 Mar 06 '20 at 13:17
  • If you need a multiuser experience in an AR app, use the MultipeerConnectivity framework (see the sketch below). – Andy Jazz Mar 06 '20 at 14:14
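
A hedged sketch of what that comment suggests: one device captures its ARWorldMap and sends it to peers over MultipeerConnectivity, and the receiver relocalizes against it so both sessions share one world coordinate space. The function names and the mcSession instance are illustrative only.

import ARKit
import MultipeerConnectivity

// Sender side: serialize the current world map and broadcast it to connected peers.
func shareWorldMap(from sceneView: ARSCNView, via mcSession: MCSession) {
    sceneView.session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}

// Receiver side: unarchive the map and relocalize against it.
func receiveWorldMap(_ data: Data, into sceneView: ARSCNView) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data)
    else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}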