
Is it possible to read the current 6 degrees of freedom movement values (e.g. translation and rotation vectors) when using ARKit with ARWorldTrackingConfiguration?

I am referring to ARWorldTrackingConfiguration with its 6 degrees of freedom as explained at https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration

I would like to obtain the current values of the device movement such as translation and rotation vectors, relative to an origin (e.g. the starting point of the AR session).

Xartec
jakob.j

2 Answers

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
arSceneView.session.run(configuration)

This will give you 6DOF. Just make sure you detect a plane before moving around.

You can use the touch location to move your object in the ARKit scene by ray tracing (hit testing) from that screen point. It all works on top of the horizontal plane that you detect through your camera.

let hitResult = sceneView.hitTest(touchLocation, types: .existingPlane)

The results in this hitTest array give you world positions you can use to place your object. For instance, to move an object with a pan gesture:

let velocity: CGPoint = recognizer.velocity(in: self.arSceneView)
self.objectModel.node.position.y += Float(velocity.y * -0.0001)

That's your translation. For rotation, use the pan gesture's translation:

    let translation = recognizer.translation(in: recognizer.view!)

    let x = Float(translation.x)
    let y = Float(-translation.y)

    let anglePan = (sqrt(pow(GLKMathDegreesToRadians(x),2)+pow(GLKMathDegreesToRadians(y),2)))
    var rotationVector = SCNVector4()
    rotationVector.x = -y
    rotationVector.y = x
    rotationVector.z = 0
    rotationVector.w = anglePan

    self.objectModel.node.rotation = rotationVector
    self.sphereNode.rotation = rotationVector

That's your rotation on a model in SceneKit. These are just examples of how to do translation and rotation in an ARScene. Make changes as required.
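The placement step mentioned above can be sketched as follows. This is a minimal sketch, reusing the answer's `arSceneView` and `objectModel` names (assumed to exist) and assuming a tap gesture recognizer is already attached:

    // Sketch: place the model node where a tap hits a detected plane.
    @objc func handleTap(_ recognizer: UITapGestureRecognizer) {
        let touchLocation = recognizer.location(in: arSceneView)
        let hitResults = arSceneView.hitTest(touchLocation, types: .existingPlane)
        if let hit = hitResults.first {
            // The hit's worldTransform holds the world position in its last column.
            let t = hit.worldTransform.columns.3
            objectModel.node.position = SCNVector3(t.x, t.y, t.z)
        }
    }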

arSceneView.pointOfView is your camera. The rotation and position transform of this node should give you the device's position and rotation.

arSceneView.pointOfView?.transform // The camera's/device's transform (SCNMatrix4)
arSceneView.pointOfView?.eulerAngles // The camera's rotation as pitch/yaw/roll angles (SCNVector3)
arSceneView.pointOfView?.position // The camera's position (SCNVector3)
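Putting that together, here is a minimal sketch of reading the device pose every frame via ARSessionDelegate. The `arSceneView` outlet name matches the answer above; the delegate wiring is illustrative:

    import ARKit

    class ViewController: UIViewController, ARSessionDelegate {
        @IBOutlet var arSceneView: ARSCNView!

        override func viewDidLoad() {
            super.viewDidLoad()
            arSceneView.session.delegate = self
            arSceneView.session.run(ARWorldTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            guard let pov = arSceneView.pointOfView else { return }
            let p = pov.position     // translation relative to the session origin
            let e = pov.eulerAngles  // rotation in radians
            print(String(format: "t = (%.2f, %.2f, %.2f)  r = (%.2f, %.2f, %.2f)",
                         p.x, p.y, p.z, e.x, e.y, e.z))
        }
    }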
SWAT
  • Plane detection doesn't have anything to do with whether you get 6DOF tracking — it's an added feature for when you do have 6DOF tracking. – rickster Feb 22 '18 at 19:27
  • The 6DOF is attained with a sort of image recognition. It uses anchor points detected through the data that comes from the camera and thereby makes a virtual world into which you can add your SceneKit / SpriteKit objects. – SWAT Feb 22 '18 at 19:50
  • True. But you don't need to turn plane detection on to get 6DOF poses. You get that merely from running `ARWorldTrackingConfiguration`. – rickster Feb 22 '18 at 19:52
  • If you don't have plane detection enabled and you are supposed to apply translation and rotation vectors to ARScene objects, how would you propose to do it in a convincing manner? I mean, why not use plane detection when you are given one of the most useful features in ARKit? – SWAT Feb 22 '18 at 20:02
  • And why the downvote when you agreed just there that ARWorldTrackingConfiguration gets you the 6DOF. And that the question is just about that? – SWAT Feb 22 '18 at 20:06
  • Plane detection, placing objects, and rotating them based on a gesture are useful, but don't seem to be relevant to the OP's question. – rickster Feb 22 '18 at 20:54
  • Translation and rotation of the device with respect to an origin. I gave these for the device and other objects in the scene. You just downvoted because you have your own answer posted for the question, and nothing more. – SWAT Feb 23 '18 at 03:33
  • Thanks, this helped a lot. The `sceneView.pointOfView.transform` matrix seems to be a direct way to get the current values - since they are identical to the values of the `sceneView.session.currentFrame.camera.transform` matrix. However (as I pointed out in my comment to rickster's answer), the last column of the matrix (translation vector?) does not update as I move around. It always stays (0,0,0,1) (vertical). Can you explain this? In contrast, the `sceneView.pointOfView.position` vector behaves as I would expect the translation vector to behave - the values update as I move around. – jakob.j Feb 27 '18 at 12:14
  • @SWAT can you please explain me these lines? let velocity :CGPoint = recognizer.velocity(in: self.arSceneView) self.objectModel.node.position.y = (self.objectModel.node.position.y + Float(velocity.y * -0.0001)) – Roshan Bade Aug 20 '19 at 09:59
  • I found using velocity much more reliable than position. I tried to move the node in the y axis using that code. I chose velocity because I'm only concerned if the movement is in +/- y direction and not the actual values of the velocity itself. I add or substract a very small amount from the current y when pan is activated. This would give a favourable output. – SWAT Aug 20 '19 at 13:13
  • @jakob.j I know this is a very late reply and you probably got your answer already. Still, the last column of the transform matrix will remain the same no matter what you do to the node. The first three values of the first row is your position vector. The remaining 3x3 matrix is a mix of your scale and rotation vectors. – SWAT Aug 20 '19 at 13:36

ARCamera represents the device pose in any ARKit session. If you're running a world tracking session, the camera's transform matrix is the concatenation of both rotation and translation transforms. (And that transform is relative to the world coordinate origin, which is based on where you were when you started the session.) If you don't have a world tracking session, there's no translation (the transform is just a rotation matrix).

If you need help decomposing a transform matrix into rotation/translation vectors, that's nothing specific to ARKit; it's a common 3D graphics question, so read up on that if you want to see how it works. Some shortcuts, though:

  • The translation vector is the last column of the matrix (e.g. transform.columns.3)
  • You can get rotation expressed as pitch/roll/yaw angles through the ARCamera.eulerAngles property.
  • You can get rotation as a quaternion by passing the whole matrix to a simd_quatf initializer.
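The shortcuts above can be sketched in a few lines. This assumes a running world-tracking session and a hypothetical helper name; the API calls (`transform.columns.3`, `simd_quatf(_:)`, `eulerAngles`) are the ones described above:

    import ARKit
    import simd

    func logCameraPose(_ frame: ARFrame) {
        let transform = frame.camera.transform   // simd_float4x4
        let translation = transform.columns.3    // x, y, z relative to the world origin
        let rotation = simd_quatf(transform)     // rotation as a quaternion
        let euler = frame.camera.eulerAngles     // pitch, yaw, roll in radians
        print("position: \(translation.x), \(translation.y), \(translation.z)")
        print("quaternion: \(rotation.vector), euler: \(euler)")
    }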
rickster
  • Just to add: an easy way to decompose the transform matrix would be to assign it to an empty SCNNode’s worldTransform property. After that the node’s worldPosition axis can be accessed individually as well as the various rotation and orientation properties. – Xartec Feb 23 '18 at 17:41
  • Thanks, using `sceneView.session.currentFrame.camera.transform`, I am able to get the transform matrix of the ARCamera. (The current values seem to be identical to the values of the `sceneView.pointOfView.transform` matrix which is suggested in SWAT's answer.) However, the translation vector (last column of the matrix) does not change as I move around. It always stays (0,0,0,1) (vertical). Can you explain this? In contrast, the `sceneView.pointOfView.position` vector behaves as I would expect the translation vector to behave - the values update as I move around. Can I use this one instead? – jakob.j Feb 27 '18 at 12:01
  • Just in advance, I verified that ARWorldTrackingConfiguration is active with `print(sceneView.session.configuration.debugDescription)`. – jakob.j Feb 27 '18 at 12:03