
Trying to use CoreMotion to correctly rotate a SceneKit camera. The scene I've built is rather simple: all I do is create a bunch of boxes distributed in an area, and the camera just points down the Z axis.

Unfortunately, the data coming back from device motion doesn't seem to relate to the device's physical position and orientation in any way. It just seems to meander randomly.

As suggested in this SO post, I'm passing the attitude's quaternion directly to the camera node's orientation property.

Am I misunderstanding what data Core Motion is giving me here? Shouldn't the attitude reflect the device's physical orientation? Or is it incremental movement, and I should be building upon the prior orientation?
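For reference, the quaternion hand-off described above looks roughly like this (a sketch, untested outside an iOS device; `cameraNode` stands in for the camera node from the scene described in the question):

```swift
import CoreMotion
import SceneKit

let cameraNode = SCNNode()   // camera pointing down the Z axis
cameraNode.camera = SCNCamera()

let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let q = motion?.attitude.quaternion else { return }
    // The attitude is absolute (relative to the reference frame),
    // not an incremental delta, so it can be assigned directly.
    cameraNode.orientation = SCNQuaternion(Float(q.x), Float(q.y),
                                           Float(q.z), Float(q.w))
}
```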

Joel Martinez

2 Answers


This snippet here might help you:

    let motionManager = CMMotionManager()
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(
        to: OperationQueue.main,
        withHandler: { (motion: CMDeviceMotion?, error: Error?) in
            guard let currentAttitude = motion?.attitude else { return }

            // Shift roll by 90° so the camera lines up with the device
            // held in landscape right.
            let roll = Float(currentAttitude.roll) + 0.5 * Float.pi
            let yaw = Float(currentAttitude.yaw)
            let pitch = Float(currentAttitude.pitch)

            self.cameraNode.eulerAngles = SCNVector3(
                x: -roll,
                y: yaw,
                z: -pitch)
        })

This setting is for the device in landscape right. You can play around with different orientations by changing the + and - signs.

Import CoreMotion.

Nico S.
  • How would portrait orientation work? I've been fiddling with this for a while and can't get it to work in portrait orientation, only landscape. – stanley Aug 30 '16 at 03:32

For anyone who stumbles on this, here's a more complete answer so you can understand the need for negations and pi/2 shifts. You first need to know your reference frame. Spherical coordinate systems define points as vectors angled away from the z- and x-axes. For the earth, let's define the z-axis as the line from the earth's center to the north pole and the x-axis as the line from the center through the equator at the prime meridian (mid-Africa in the Atlantic).

For (lat, lon, alt), we can then define roll and yaw around the z- and y- axes in radians:

    let roll = lon * Float.pi / 180
    let yaw = (90 - lat) * Float.pi / 180

I'm pairing roll, pitch, and yaw with z, x, and y, respectively, as defined for eulerAngles.

The extra 90 degrees accounts for the north pole being at 90 degrees latitude instead of zero.
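A quick numeric check of this mapping (plain Swift, no SceneKit needed; the helper name is my own):

```swift
// Map a geographic coordinate to the arm node's roll/yaw, as above.
func armAngles(lat: Float, lon: Float) -> (roll: Float, yaw: Float) {
    let roll = lon * Float.pi / 180
    let yaw = (90 - lat) * Float.pi / 180
    return (roll, yaw)
}

// North pole: both angles are zero, so the camera stays where it starts.
let pole = armAngles(lat: 90, lon: 0)      // (0, 0)
// Equator at the prime meridian: yaw swings a quarter turn.
let equator = armAngles(lat: 0, lon: 0)    // (0, π/2)
```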

To place my SCNCamera on the globe, I used two SCNNodes: an 'arm' node and the camera node:

    let scnCamera = SCNNode()
    scnCamera.camera = SCNCamera()
    scnCamera.position = SCNVector3(x: 0.0, y: 0.0, z: alt + EARTH_RADIUS)

    let scnCameraArm = SCNNode()
    scnCameraArm.position = SCNVector3(x: 0, y: 0, z: 0)
    scnCameraArm.addChildNode(scnCamera)

The arm is positioned at the center of the earth, and the camera is placed alt + EARTH_RADIUS away, i.e. the camera is now at the north pole. To move the camera on every location update, we can now just rotate the arm node with new roll and yaw values:

    scnCameraArm.eulerAngles.z = roll
    scnCameraArm.eulerAngles.y = yaw
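Wiring this to actual GPS updates might look like the following (a sketch; `GlobeCameraDriver` is a hypothetical name, `scnCameraArm` is the arm node from above, and the CLLocationManager setup is assumed):

```swift
import CoreLocation
import SceneKit

class GlobeCameraDriver: NSObject, CLLocationManagerDelegate {
    let scnCameraArm: SCNNode

    init(arm: SCNNode) {
        self.scnCameraArm = arm
        super.init()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let coord = locations.last?.coordinate else { return }
        // Longitude spins the arm around z; colatitude tilts it around y.
        scnCameraArm.eulerAngles.z = Float(coord.longitude) * .pi / 180
        scnCameraArm.eulerAngles.y = Float(90 - coord.latitude) * .pi / 180
    }
}
```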

Without changing the camera's orientation, its virtual lens always faces the ground and its virtual 'up' direction points westward.

To change the virtual camera's orientation, the CMMotion callback returns a CMAttitude with roll, pitch, and yaw values relative to a different z- and x-axis reference of your choosing. The magnetometer-based ones use a z-axis pointed away from gravity and an x-axis pointed at the north pole. So a phone with zero pitch, roll, and yaw would have its screen facing away from gravity, its back camera pointed at the ground, and the right side of its portrait mode facing north. Notice that this orientation is relative to gravity, not to the phone's portrait/landscape mode (which is also relative to gravity). So portrait/landscape is irrelevant.

If you imagine the phone's camera in this orientation near the north pole on the prime meridian, you'll notice that the CMMotion reference is in a different orientation than the virtual camera (SCNCamera). Both cameras are facing the ground, but their respective y-axes (and x) are 180 degrees apart. To line them up, we need to spin one around its respective z-axis, i.e. add/subtract 180 degrees to the roll ...or, since they're expressed in radians, negate them for the same effect.

Also, as far as I can tell, CMAttitude doesn't explicitly document that its roll value means a rotation about the z-axis coming out of the phone's screen. From experimenting, attitude.roll and attitude.yaw seem to have opposite definitions from those in eulerAngles, though this may be an artifact of the order in which eulerAngles applies its rotational transformations in virtual space. Anyway, the callback:

    motionManager?.startDeviceMotionUpdates(
        using: .xTrueNorthZVertical,
        to: OperationQueue.main,
        withHandler: { (motion: CMDeviceMotion?, err: Error?) in
            guard let m = motion else { return }
            scnCamera.eulerAngles.z = Float(m.attitude.yaw - Double.pi)
            scnCamera.eulerAngles.x = Float(m.attitude.pitch)
            scnCamera.eulerAngles.y = Float(m.attitude.roll)
        })

You can also start with a different reference frame for your virtual camera, e.g. z-axis pointing through the prime meridian at the equator and x-axis pointing through the north pole (i.e. the CMMotion reference), but you'll still need to invert the longitude somewhere.

With this setup, you can build a scene heavily reliant on GPS locations pretty easily.

Dylan Knight