
There are two ways to get the three rotation values (azimuth, pitch, roll).

One is to register a listener of type TYPE_ORIENTATION. It's the easiest way, and I get a correct range of values from every rotation, as the documentation says: azimuth: [0, 359], pitch: [-180, 180], roll: [-90, 90].

The other way is the most precise, and the most complex to understand the first time you see it. Android recommends it, so I want to use it, but I get different values:

azimuth: [-180, 180]. -180/180 is S, 0 is N, 90 is E and -90 is W.
pitch: [-90, 90]. 90 is 90, -90 is -90, 0 is 0, but -180/180 (lying with the screen downwards) is 0.
roll: [-180, 180].

I should get the same values but with decimals, right?

I have the following code:

aValues = new float[3]; // latest accelerometer reading
mValues = new float[3]; // latest magnetic field reading

sensorListener = new SensorEventListener() {
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                aValues = event.values.clone();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                mValues = event.values.clone();
                break;
        }

        float[] R = new float[16];
        float[] orientationValues = new float[3];

        SensorManager.getRotationMatrix(R, null, aValues, mValues);
        SensorManager.getOrientation(R, orientationValues);

        // getOrientation() returns radians; convert to degrees.
        orientationValues[0] = (float) Math.toDegrees(orientationValues[0]);
        orientationValues[1] = (float) Math.toDegrees(orientationValues[1]);
        orientationValues[2] = (float) Math.toDegrees(orientationValues[2]);

        azimuthText.setText("azimuth: " + orientationValues[0]);
        pitchText.setText("pitch: " + orientationValues[1]);
        rollText.setText("roll: " + orientationValues[2]);
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {}
};

Please help. It's very frustrating.

Do I have to process those values further, or am I doing something wrong?

Thanks.

Gabriel Llamas
  • I've been working on this myself for about 2 weeks now. Your code looks the way it's supposed to (according to the documentation I've been able to find), but as you've noted it doesn't match the TYPE_ORIENTATION sensor results. It seemed a simple thing to check orientationValues[0] for a negative value and add to it, but that doesn't quite do it. You don't show the frequency of your sensor updates; I found that faster updates make for better results, even though the TYPE_ORIENTATION results appear to be rather stable. If you'd like to work together, contact davemac327@gmail.com – Dave MacLean Nov 14 '10 at 00:41
  • All the sensor rates are GAME, but I don't think the problem is related to speed. It's weird, because all the blogs and forums (and the book I'm reading!) implement the sensors in the same way. – Gabriel Llamas Nov 14 '10 at 08:21
  • Well, everybody says how to use the accelerometer and magnetic field, but no one says that the returned values ARE DIFFERENT from TYPE_ORIENTATION, not even the official documentation. Good job, people. Waiting for an answer... – Gabriel Llamas Nov 14 '10 at 19:42
  • Have a look at my question [here](http://stackoverflow.com/questions/4819626/android-phone-orientation-overview-including-compass). It might be of use. – Tim Mar 04 '11 at 21:46

3 Answers


I know I'm playing thread necromancer here, but I've been working on this stuff a lot lately, so I thought I'd throw in my 2¢.

The device doesn't contain a compass or inclinometers, so it doesn't measure azimuth, pitch, or roll directly. (We call those Euler angles, BTW.) Instead, it uses accelerometers and magnetometers, both of which produce 3-space XYZ vectors. These are used to compute the azimuth, etc. values.

Vectors are in device coordinate space:

[image: device coordinate axes]

World coordinates have Y facing north, X facing east, and Z facing up:

[image: world coordinate axes]

Thus, a device's "neutral" orientation is lying flat on its back on a table, with the top of the device facing north.

The accelerometer produces a vector in the "UP" direction. The magnetometer produces a vector in the "north" direction. (Note that in the northern hemisphere, this tends to point downward due to magnetic dip.)

The accelerometer vector and magnetometer vector can be combined mathematically through SensorManager.getRotationMatrix(), which returns a 3x3 matrix that maps vectors in device coordinates to world coordinates or vice versa. For a device in the neutral position, this function returns the identity matrix.
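As a rough, idealized sketch of the math behind getRotationMatrix() (the real implementation also rejects degenerate inputs such as free-fall and can fill a 4x4 matrix; the class and method names here are mine, for illustration): east is the magnetic vector crossed with gravity, north is up crossed with east, and those three unit vectors form the rows of the matrix.

```java
// Hedged sketch of the idea inside SensorManager.getRotationMatrix():
// build an east/north/up basis from the gravity and magnetic-field vectors.
public class RotationSketch {
    static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static float[] normalize(float[] v) {
        float n = (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new float[] { v[0] / n, v[1] / n, v[2] / n };
    }

    // Returns a 3x3 row-major matrix whose rows are east, north, up.
    static float[] rotationMatrix(float[] gravity, float[] geomagnetic) {
        float[] up    = normalize(gravity);
        float[] east  = normalize(cross(geomagnetic, up)); // E = M x G
        float[] north = cross(up, east);                   // N = G x E
        return new float[] {
            east[0],  east[1],  east[2],
            north[0], north[1], north[2],
            up[0],    up[1],    up[2]
        };
    }

    public static void main(String[] args) {
        // Device flat on its back, top edge toward magnetic north.
        // The magnetic field points north and (in the northern
        // hemisphere) downward, due to magnetic dip.
        float[] gravity  = { 0f, 0f, 9.81f };
        float[] magnetic = { 0f, 20f, -40f };
        // In this neutral position the result is (near) the identity matrix.
        System.out.println(java.util.Arrays.toString(
                rotationMatrix(gravity, magnetic)));
    }
}
```

Note that the downward component of the magnetic vector drops out of the cross product, which is why the dip angle doesn't corrupt the heading.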

This matrix does not vary with the screen orientation. This means your application needs to be aware of orientation and compensate accordingly.

SensorManager.getOrientation() takes the transformation matrix and computes azimuth, pitch, and roll values. These are taken relative to a device in the neutral position.

I have no idea what the difference is between calling this function and just using TYPE_ORIENTATION sensor, except that the function lets you manipulate the matrix first.

If the device is tilted up at 90° or near it, then the use of Euler angles falls apart. This is a degenerate case mathematically. In this realm, how is the device supposed to know if you're changing azimuth or roll?

The function SensorManager.remapCoordinateSystem() can be used to manipulate the transformation matrix to compensate for what you may know about the orientation of the device. However, my experiments have shown that this doesn't cover all cases, not even some of the common ones. For example, if you want to remap for a device held upright (e.g. to take a photo), you would want to multiply the transformation matrix by this matrix:

1 0 0
0 0 1
0 1 0

before calling getOrientation(), and this is not one of the orientation remappings that remapCoordinateSystem() supports [someone please correct me if I've missed something here].
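Applying that Y/Z swap by hand is just a 3x3 multiply on the row-major matrix before getOrientation() is called. A minimal sketch (names are mine; whether the remap belongs on the left or the right of the product depends on which side of the transform you are adjusting, so verify against your device):

```java
// Hedged sketch: pre-multiply the rotation matrix with the upright remap
// matrix described above, using plain row-major 3x3 multiplication.
public class RemapSketch {
    // Swap the Y and Z axes: the remap for a device held upright.
    static final float[] UPRIGHT_REMAP = {
        1, 0, 0,
        0, 0, 1,
        0, 1, 0
    };

    static float[] multiply(float[] a, float[] b) { // a * b, 3x3 row-major
        float[] out = new float[9];
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 3; col++)
                for (int k = 0; k < 3; k++)
                    out[3 * row + col] += a[3 * row + k] * b[3 * k + col];
        return out;
    }

    public static void main(String[] args) {
        float[] identity = { 1, 0, 0, 0, 1, 0, 0, 0, 1 };
        // Remapping the neutral (identity) matrix yields the remap itself.
        System.out.println(java.util.Arrays.toString(
                multiply(UPRIGHT_REMAP, identity)));
    }
}
```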

OK, so this has all been a long-winded way of saying that if you're using orientation, either from the TYPE_ORIENTATION sensor or from getOrientation(), you're probably doing it wrong. The only times you actually want the Euler angles are to display orientation information in a user-friendly form, to annotate a photograph, to drive a flight-instrument display, or something similar.

If you want to do computations related to device orientation, you're almost certainly better off using the transformation matrix and working with XYZ vectors.
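For example, "which way in the world is the top of the device pointing?" is just the matrix applied to the device's +Y axis, with no angles involved. A sketch (my own helper names; it assumes a 3x3 row-major matrix that maps device coordinates to world coordinates):

```java
// Hedged sketch: transform a device-space vector into world coordinates
// with a 3x3 row-major rotation matrix, instead of reasoning about angles.
public class VectorSketch {
    static float[] deviceToWorld(float[] R, float[] v) {
        return new float[] {
            R[0] * v[0] + R[1] * v[1] + R[2] * v[2],
            R[3] * v[0] + R[4] * v[1] + R[5] * v[2],
            R[6] * v[0] + R[7] * v[1] + R[8] * v[2]
        };
    }

    public static void main(String[] args) {
        // A device rotated 90 degrees about the world Z (up) axis,
        // so its X axis points north:
        float[] R = {
            0, -1, 0,
            1,  0, 0,
            0,  0, 1
        };
        // Where does the top of the screen (device +Y) point in the world?
        float[] top = deviceToWorld(R, new float[] { 0, 1, 0 });
        System.out.println(java.util.Arrays.toString(top)); // world -X, i.e. west
    }
}
```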

Working as a consultant, whenever someone comes to me with a problem involving Euler angles, I back up and ask them what they're really trying to do, and then find a way to do it with vectors instead.

Looking back at your original question, getOrientation() should return three values in [-180, 180], [-90, 90], and [-180, 180] (after converting from radians). In practice, we think of azimuth as a number in [0, 360), so you should simply add 360 to any negative number you receive. Your code looks correct as written. It would help if I knew exactly what results you were expecting and what you were getting instead.
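That normalization is one line; a modulo form (the helper name is mine) also handles values that have wrapped more than once:

```java
// Normalize an azimuth in degrees from [-180, 180] into [0, 360).
public class AzimuthNormalize {
    static float normalizeAzimuth(float degrees) {
        return (degrees % 360f + 360f) % 360f;
    }

    public static void main(String[] args) {
        System.out.println(normalizeAzimuth(-90f)); // west -> 270.0
        System.out.println(normalizeAzimuth(45f));  // unchanged -> 45.0
    }
}
```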

Edited to add: A couple more thoughts. Modern versions of Android use something called "sensor fusion", which basically means that all available inputs -- accelerometer, magnetometer, gyro -- are combined together in a mathematical black box (typically a Kalman filter, but it depends on the vendor). All of the different sensors -- acceleration, magnetic field, gyros, gravity, linear acceleration, and orientation -- are taken as outputs from this black box.

Whenever possible, you should use TYPE_GRAVITY rather than TYPE_ACCELEROMETER as the input to getRotationMatrix().

Edward Falk
  • Thanks for your explanation. Can you guide me on this question? http://stackoverflow.com/questions/27137239/get-moving-device-direction-without-gps – Arash Nov 26 '14 at 12:12
  • I've added an answer to that question. You probably won't like it; it's a very, very hard problem to solve. – Edward Falk Nov 26 '14 at 17:20

I might be shooting in the dark here, but if I understand your question correctly, you are wondering why you get [-179..179] instead of [0..360]?

Note that -180 is the same as +180, which is the same as 180 + N*360, where N is a whole number (integer).

In other words, if you want to get the same numbers as with the orientation sensor, you can do this:

// x = orientationValues[0];
// y = orientationValues[1];
// z = orientationValues[2];
x = (x + 360.0) % 360.0;
y = (y + 360.0) % 360.0;
z = (z + 360.0) % 360.0;

This will give you values in the [0..360) range, as you wanted.

johndodo

You are missing one critical computation in your calculations: the remapCoordinateSystem() call after you do getRotationMatrix().

Add that to your code and all will be fine.
You can read more about it here.

TheCodeArtist