12

I wish to get my phone's current orientation by the following method:

  1. Get the initial orientation (azimuth) via getRotationMatrix() and getOrientation().
  2. Add the integration of the gyroscope readings over time to it to get the current orientation.

Phone Orientation:

The phone's x-y plane is fixed parallel to the ground plane, i.e., the phone is held in a "texting-while-walking" orientation.

"getOrientation()" Returnings:

The Android API allows me to easily get the orientation, i.e., azimuth, pitch, and roll, from getOrientation().

Please note that this method always returns its values within the range [-PI, PI].

My Problem:

Since the integration of the gyroscope reading, denoted by dR, may be quite big, when I do CurrentOrientation += dR, the CurrentOrientation may exceed the [-PI, PI] range.

What manipulations are needed so that I can ALWAYS get the current orientation within the [-PI, PI] range?
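For illustration, this is the kind of wrap-around manipulation I have in mind (a minimal sketch in Java; the helper name wrapToPi is made up by me):

// Hypothetical helper: wrap an arbitrary angle (in radians) into [-PI, PI).
static double wrapToPi(double angle) {
    double wrapped = angle % (2 * Math.PI);   // now in (-2*PI, 2*PI)
    if (wrapped >= Math.PI) {
        wrapped -= 2 * Math.PI;
    } else if (wrapped < -Math.PI) {
        wrapped += 2 * Math.PI;
    }
    return wrapped;
}

// e.g. currentOrientation = wrapToPi(currentOrientation + dR);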

I have tried the following in Python, but I highly doubt its correctness.

import numpy as np
import scipy.integrate

# gyroSeries, timeSeries, azimuth, stepNo, i, ALPHA and the previous
# headingDirection are assumed to be defined earlier in the program.
rotation = scipy.integrate.trapz(gyroSeries, timeSeries)  # integrate gyro rate over the step
# Wrap the gyro-propagated heading back into [-pi, pi]
if (headingDirection - rotation) < -np.pi:
    headingDirection += 2 * np.pi
elif (headingDirection - rotation) > np.pi:
    headingDirection -= 2 * np.pi
# Complementary filter: blend the gyro-propagated heading with the mean
# compass azimuth observed during the current step
headingDirection = ALPHA * (headingDirection - rotation) + (1 - ALPHA) * np.mean(azimuth[np.array(stepNo.tolist()) == i])
# Wrap the filtered heading back into [-pi, pi]
if headingDirection < -np.pi:
    headingDirection += 2 * np.pi
elif headingDirection > np.pi:
    headingDirection -= 2 * np.pi

Remarks

This is NOT that simple, because it involves the following troublemakers:

  1. The orientation sensor reading goes from 0 to -PI, and then DIRECTLY JUMPS to +PI and gradually gets back to 0 via +PI/2.
  2. The integration of the gyroscope reading also leads to some trouble: should I add dR to the orientation or subtract it?

Please refer to the Android documentation first before giving a confirmed answer.

Speculative answers will not help.

Sibbs Gambling
  • Why are you first using orientation and then gyro? what do you want to achieve as the final result? – pxm Aug 08 '13 at 22:31
  • @pxm I get the INITIAL orientation first, and then I add the integrated gyro reading to get the CURRENT orientation. The final result is the current orientation. – Sibbs Gambling Aug 09 '13 at 02:29
  • What is your reason for using the gyro here? Why not use TYPE_MAGNETIC_FIELD and TYPE_GRAVITY to get the azimuth? – Hoan Nguyen Aug 11 '13 at 22:10
  • @HoanNguyen I get your point, but the magnetic field is heavily distorted here, so I decided to use the gyro. – Sibbs Gambling Aug 12 '13 at 00:36
  • If the magnetic is distorted then your initial orientation is not accurate so using gyro would not help as your subsequent azimuth values depend on the initial value. – Hoan Nguyen Aug 12 '13 at 00:42
  • @HoanNguyen I only use the magnetic field ONCE for the INITIAL orientation. This is justified by experiment. – Sibbs Gambling Aug 12 '13 at 01:02
  • Say your true initial value is PI/2, but because of the magnetic field distortion your reading is 3PI/4. How would using the gyro help here? As I understand it, the gyro helps in filtering the accelerometer values to get as accurate a value of the gravity as possible. An accurate gravity value gives you a more accurate rotation matrix, as the getRotationMatrix gravity parameter assumes that this is a vector in the gravity direction. As for magnetic field interference, I do not think there is anything you can do to eliminate it. – Hoan Nguyen Aug 12 '13 at 01:20
  • My implementation of a compass is pretty stable and accurate using the two sensors in the above comment. If you post another question like "how to obtain stable azimuth using sensors", I will post my code. – Hoan Nguyen Aug 12 '13 at 01:21
  • @HoanNguyen Thanks, I also heard about something like what you have mentioned. But my concern is: by doing so, how do you know the INITIAL orientation? I mean, the gyro only gives the variation. Without the initial orientation, how would the variation help? – Sibbs Gambling Aug 12 '13 at 01:23
  • @HoanNguyen Yes sure, please! Maybe your code will just help solve the problem. :) – Sibbs Gambling Aug 12 '13 at 01:24
  • After you post your question, post the link. Also, do you want the azimuth when the device is flat? You can just rephrase your title above, since if you post a question without code you may get downvoted. – Hoan Nguyen Aug 12 '13 at 01:45
  • @HoanNguyen could you post your code as an answer? – Sibbs Gambling Aug 12 '13 at 01:54
  • Do you want just flat, or can the device be in any position? If the device is not flat, it only makes sense to calculate the direction of the back camera. – Hoan Nguyen Aug 12 '13 at 02:09
  • @HoanNguyen just flat will do. :) – Sibbs Gambling Aug 12 '13 at 02:14

4 Answers

9

"The orientation sensor actually derives its readings from the real magnetometer and the accelerometer."

I guess maybe this is the source of the confusion. Where is this stated in the documentation? More importantly, does the documentation somewhere explicitly state that the gyro readings are ignored? As far as I know the method described in this video is implemented:

Sensor Fusion on Android Devices: A Revolution in Motion Processing

This method uses the gyros and integrates their readings. This pretty much renders the rest of the question moot; nevertheless I will try to answer it.


The orientation sensor is already integrating the gyro readings for you, that is how you get the orientation. I don't understand why you are doing it yourself.

You are not doing the integration of the gyro readings properly, it is more complicated than CurrentOrientation += dR (which is incorrect). If you need to integrate the gyro readings (I don't see why, the SensorManager is already doing it for you) please read Direction Cosine Matrix IMU: Theory how to do it properly (Equation 17).
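To give an idea of the kind of update Equation 17 describes, here is a rough sketch (my own notation, not the manuscript's code; renormalization and drift correction are omitted):

// Sketch: propagate the rotation matrix R by one gyro sample
// (wx, wy, wz in rad/s over dt seconds): R <- R * (I + [w]x * dt)
static void integrateGyroStep(double[][] R, double wx, double wy, double wz, double dt) {
    double[][] dR = {
        {  1.0,     -wz * dt,  wy * dt },
        {  wz * dt,  1.0,     -wx * dt },
        { -wy * dt,  wx * dt,  1.0     }
    };
    double[][] out = new double[3][3];
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            for (int k = 0; k < 3; k++)
                out[i][j] += R[i][k] * dR[k][j];
    for (int i = 0; i < 3; i++)
        System.arraycopy(out[i], 0, R[i], 0, 3);
}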

Don't try integrating with Euler angles (aka azimuth, pitch, roll); nothing good will come of it.

Please use either quaternions or rotation matrices in your computations instead of Euler angles. If you work with rotation matrices, you can always convert them to Euler angles, see

Computing Euler angles from a rotation matrix by Gregory G. Slabaugh

(The same is true for quaternions.) There are (in the non-degenerate case) two ways to represent a rotation; that is, you will get two sets of Euler angles. Pick the one that is in the range you need. (In case of gimbal lock, there are infinitely many Euler angles; see the PDF above.) Just promise you won't start using Euler angles again in your computations after the rotation-matrix-to-Euler-angles conversion.
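As a sketch of what that conversion looks like (following the R = Rz(yaw)·Ry(pitch)·Rx(roll) convention of the PDF; the helper name and the row-major layout are mine, and only one of the two solutions is returned):

// r is a row-major 3x3 rotation matrix: r[0] = R11, r[1] = R12, ..., r[8] = R33.
// Returns {yaw, pitch, roll}; the other solution has pitch' = PI - pitch.
static double[] eulerFromRotationMatrix(double[] r) {
    double yaw, pitch, roll;
    if (Math.abs(r[6]) < 0.999999) {           // non-degenerate case, R31 != +-1
        pitch = -Math.asin(r[6]);              // R31 = -sin(pitch)
        double c = Math.cos(pitch);
        roll = Math.atan2(r[7] / c, r[8] / c); // from R32, R33
        yaw  = Math.atan2(r[3] / c, r[0] / c); // from R21, R11
    } else if (r[6] < 0) {                     // gimbal lock, pitch = +PI/2
        pitch = Math.PI / 2;
        yaw   = 0;                             // yaw is arbitrary here; pick 0
        roll  = Math.atan2(r[1], r[2]);
    } else {                                   // gimbal lock, pitch = -PI/2
        pitch = -Math.PI / 2;
        yaw   = 0;
        roll  = Math.atan2(-r[1], -r[2]);
    }
    return new double[]{ yaw, pitch, roll };
}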

It is unclear what you are doing with the complementary filter. You can implement a pretty damn good sensor fusion based on the Direction Cosine Matrix IMU: Theory manuscript, which is basically a tutorial. It's not trivial to do it but I don't think you will find a better, more understandable tutorial than this manuscript.

One thing that I had to discover myself when I implemented sensor fusion based on this manuscript was that the so-called integral windup can occur. I took care of it by bounding the TotalCorrection (page 27). You will understand what I am talking about if you implement this sensor fusion.
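In case it helps, the bounding I mean is nothing more than a clamp (a hypothetical sketch; the name is mine, not the manuscript's):

// Clamp each component of the accumulated integral feedback term so that
// the integral term cannot wind up without bound.
static void clampTotalCorrection(float[] totalCorrection, float limit) {
    for (int i = 0; i < totalCorrection.length; i++) {
        totalCorrection[i] = Math.max(-limit, Math.min(limit, totalCorrection[i]));
    }
}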



UPDATE: Here I answer your questions that you posted in comments after accepting the answer.

"I think the compass gives me my current orientation by using gravity and the magnetic field, right? Is the gyroscope used in the compass?"

Yes, if the phone is more or less stationary for at least half a second, you can get a good orientation estimate by using gravity and the compass only. Here is how to do it: Can anyone tell me whether gravity sensor is as a tilt sensor to improve heading accuracy?

No, the gyroscopes are not used in the compass.

"Could you please kindly explain why the integration done by me is wrong? I understand that if my phone's pitch points up, Euler angles fail. But is anything else wrong with my integration?"

There are two unrelated things: (i) the integration should be done differently, and (ii) Euler angles are trouble because of gimbal lock. I repeat, these two are unrelated.

As for the integration: here is a simple example how you can actually see what is wrong with your integration. Let x and y be the axes of the horizontal plane in the room. Get a phone in your hands. Rotate the phone around the x axis (of the room) by 45 degrees, then around the y axis (of the room) by 45 degrees. Then, repeat these steps from the beginning but now rotate around the y axis first, and then around the x axis. The phone ends up in a totally different orientation. If you do the integration according to CurrentOrientation += dR you will see no difference! Please read the above linked Direction Cosine Matrix IMU: Theory manuscript if you want to do the integration properly.
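If you want to see this numerically, here is a small self-contained check (plain Java, nothing Android-specific):

// 45 degrees about the room's x axis followed by 45 degrees about its y axis
// gives a different orientation than y followed by x, while naive per-axis
// angle sums report (45, 45, 0) in both cases.
static double[][] rotX(double a) {
    double c = Math.cos(a), s = Math.sin(a);
    return new double[][]{ {1, 0, 0}, {0, c, -s}, {0, s, c} };
}
static double[][] rotY(double a) {
    double c = Math.cos(a), s = Math.sin(a);
    return new double[][]{ {c, 0, s}, {0, 1, 0}, {-s, 0, c} };
}
static double[][] mul(double[][] a, double[][] b) {
    double[][] c = new double[3][3];
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            for (int k = 0; k < 3; k++)
                c[i][j] += a[i][k] * b[k][j];
    return c;
}
public static void main(String[] args) {
    double a = Math.toRadians(45);
    double[][] xThenY = mul(rotY(a), rotX(a)); // rotate about x first, then y
    double[][] yThenX = mul(rotX(a), rotY(a)); // rotate about y first, then x
    System.out.println(xThenY[0][1] + " vs " + yThenX[0][1]); // 0.5 vs 0.0
}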

As for the Euler angles: they screw up the stability of the application, and that is enough for me not to use them for arbitrary rotations in 3D.

I still don't understand why you are trying to do it yourself, why you don't want to use the orientation estimate provided by the platform. Chances are, you cannot do better than that.

Ali
  • 1. Sorry for any confusion I may have caused: I am using the compass instead of the orientation sensor. I think the compass gives me my current orientation by using gravity and the magnetic field, right? Is the gyroscope used in the compass? Because the compass readings are heavily distorted by metallic objects, I wish to use it only once and add the orientation change reported by the gyroscope. 2. Could you please kindly explain why the integration done by me is wrong? I understand that if my phone's pitch points up, Euler angles fail. But is anything else wrong with my integration? – Sibbs Gambling Aug 15 '13 at 07:17
  • Thanks! Please read this: http://www.thousand-thoughts.com/2012/03/android-sensor-fusion-tutorial/ I am doing the same thing as in this. The orientation directly obtained from the platform is very inaccurate due to the magnetic field distortion, whereas the gyroscope does not suffer from that. So I wish to use the orientation derived from the platform ONLY ONCE, as the initial orientation, and then rely FULLY on the gyroscope. – Sibbs Gambling Aug 16 '13 at 01:51
  • @perfectionm1ng I took a shallow look at that blog post. As far as I can tell, what he is doing is probably OK. I didn't check it in detail though. He encourages you to contact him if you have questions (see page 3), and that is what I suggest too. – Ali Aug 16 '13 at 08:40
  • Could you explain why I cannot directly integrate the gyroscope readings and then do the `CurrentOrientation += dR`? Why do I have to turn to quaternions and all that? Thanks! – Sibbs Gambling Aug 31 '13 at 04:59
  • @perfectionm1ng Please re-read the paragraph starting with: *"As for the integration: here is a simple example how you can actually see what is wrong with your integration."* – Ali Aug 31 '13 at 09:58
2

I think you should avoid the deprecated "Orientation Sensor" and use sensor fusion methods like the rotation vector sensor and getRotationMatrix(), which rely on fusion algorithms (notably Invensense's) that already use gyroscope data.

If you want a simple sensor fusion algorithm, a so-called balance filter (see http://www.filedump.net/dumped/filter1285099462.pdf) can be used. The approach is illustrated in:

http://postimg.org/image/9cu9dwn8z/

This integrates the gyroscope to get the angle, then high-pass filters the result to remove drift, and adds it to the smoothed accelerometer and compass results. The integrated, high-pass-filtered gyro data and the accelerometer/compass data are combined with weights that sum to one, so that the output is an accurate estimate in units that make sense. For the balance filter, the time constant may be tweaked to tune the response: the shorter the time constant, the better the response, but the more acceleration noise is allowed to pass through.

To see how this works, imagine you have the newest gyro data point (in rad/s) stored in gyro, the newest angle measurement from the accelerometer stored in angle_acc, and dt is the time from the last gyro data until now. Then your new angle would be calculated using

angle = b * (angle + gyro * dt) + (1 - b) * angle_acc;

You may start by trying b = 0.98 for instance. You will also probably want to use a fast gyroscope measurement time dt so the gyro doesn’t drift more than a couple of degrees before the next measurement is taken. The balance filter is useful and simple to implement, but is not the ideal sensor fusion approach. Invensense’s approach involves some clever algorithms and probably some form of Kalman filter.

Source: Professional Android Sensor Programming, Adam Stroud.
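For concreteness, here is a minimal sketch of that update loop in Java (single axis only; the class and method names are illustrative, not taken from the book):

// One-axis balance filter: gyro rate in rad/s, accelerometer/compass angle in rad.
class BalanceFilter {
    private final double b; // weight on the gyro path, e.g. 0.98
    private double angle;   // current fused estimate, in rad

    BalanceFilter(double b, double initialAngle) {
        this.b = b;
        this.angle = initialAngle;
    }

    // Call on every gyro sample; dt is the time since the previous sample.
    double update(double gyroRate, double angleAcc, double dt) {
        angle = b * (angle + gyroRate * dt) + (1 - b) * angleAcc;
        return angle;
    }
}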

pxm
  • Actually, what I used is the getRotationMatrix. I did not use the orientation sensor. – Sibbs Gambling Aug 09 '13 at 02:30
  • @pxm Your first paragraph basically repeats mine. Complementary filters are good but **you are giving the wrong example here:** The balance filter is **not** for tracking arbitrary orientations in 3D. How can a single angle determine the orientation in 3D? You have a 3 dimensional vector as `gyro`; how would you like to add a 3D vector and a number? If you turn `angle` into a 3D vector, it is wrong again, it is not how you integrate gyro readings in 3D. – Ali Aug 09 '13 at 09:39
2

If the azimuth value is inaccurate due to magnetic interference, there is nothing you can do to eliminate it as far as I know. To get a stable reading of the azimuth, you need to filter the accelerometer values if TYPE_GRAVITY is not available. If TYPE_GRAVITY is not available, then I am pretty sure the device does not have a gyro, so the only filter you can use is a low-pass filter. The following code is an implementation of a stable compass using TYPE_GRAVITY and TYPE_MAGNETIC_FIELD.

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

import java.util.LinkedList;

public class Compass implements SensorEventListener
{
    public static final float TWENTY_FIVE_DEGREE_IN_RADIAN = 0.436332313f;
    public static final float ONE_FIFTY_FIVE_DEGREE_IN_RADIAN = 2.7052603f;

    private SensorManager mSensorManager;
    private float[] mGravity;
    private float[] mMagnetic;
    // If the device is flat mOrientation[0] = azimuth, mOrientation[1] = pitch
    // and mOrientation[2] = roll, otherwise mOrientation[0] is equal to Float.NAN
    private float[] mOrientation = new float[3];
    private LinkedList<Float> mCompassHist = new LinkedList<Float>();
    private float[] mCompassHistSum = new float[]{0.0f, 0.0f};
    private int mHistoryMaxLength;

    public Compass(Context context)
    {
         mSensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
         // Adjust the history length to fit your need; the faster the sensor rate,
         // the larger the value needed for a stable result.
         mHistoryMaxLength = 20;
    }

    public void registerListener(int sensorRate)
    {
        Sensor magneticSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        if (magneticSensor != null)
        {
            mSensorManager.registerListener(this, magneticSensor, sensorRate);
        }
        Sensor gravitySensor = mSensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);
        if (gravitySensor != null)
        {
            mSensorManager.registerListener(this, gravitySensor, sensorRate);
        }
    }

    public void unregisterListener()
    {
         mSensorManager.unregisterListener(this);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy)
    {

    }

    @Override
    public void onSensorChanged(SensorEvent event)
    {
        if (event.sensor.getType() == Sensor.TYPE_GRAVITY)
        {
            mGravity = event.values.clone();
        }
        else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        {
            mMagnetic = event.values.clone();
        }
        if (!(mGravity == null || mMagnetic == null))
        {
            getOrientation(); // updates mOrientation in place
        } 
    }

    private void getOrientation()
    {
        float[] rotMatrix = new float[9];
        if (SensorManager.getRotationMatrix(rotMatrix, null, 
            mGravity, mMagnetic))
        {
            float inclination = (float) Math.acos(rotMatrix[8]);
            // device is flat
            if (inclination < TWENTY_FIVE_DEGREE_IN_RADIAN 
                || inclination > ONE_FIFTY_FIVE_DEGREE_IN_RADIAN)
            {
                float[] orientation = SensorManager.getOrientation(rotMatrix, mOrientation);
                mCompassHist.add(orientation[0]);
                mOrientation[0] = averageAngle();
            }
            else
            {
                mOrientation[0] = Float.NAN;
                clearCompassHist();
            }
        }
    }

    private void clearCompassHist()
    {
        mCompassHistSum[0] = 0;
        mCompassHistSum[1] = 0;
        mCompassHist.clear();
    }

    public float averageAngle()
    {
        int totalTerms = mCompassHist.size();
        if (totalTerms > mHistoryMaxLength)
        {
            float firstTerm = mCompassHist.removeFirst();
            mCompassHistSum[0] -= Math.sin(firstTerm);
            mCompassHistSum[1] -= Math.cos(firstTerm);
            totalTerms -= 1;
        }
        float lastTerm = mCompassHist.getLast();
        mCompassHistSum[0] += Math.sin(lastTerm);
        mCompassHistSum[1] += Math.cos(lastTerm);
        float angle = (float) Math.atan2(mCompassHistSum[0] / totalTerms, mCompassHistSum[1] / totalTerms);

        return angle;
    }
}

In your activity, instantiate a Compass object (say in onCreate), call registerListener in onResume, and unregisterListener in onPause:

private Compass mCompass;

@Override
protected void onCreate(Bundle savedInstanceState)
{
    super.onCreate(savedInstanceState);

    mCompass = new Compass(this);
}

@Override
protected void onPause()
{
    super.onPause();

    mCompass.unregisterListener();
}

@Override
protected void onResume()
{
    super.onResume();

    mCompass.registerListener(SensorManager.SENSOR_DELAY_NORMAL);
}
Hoan Nguyen
  • I cut, pasted, and modified code from several files in my project. I hope there are no errors; otherwise let me know and I will make a project and run it to find the error. – Hoan Nguyen Aug 12 '13 at 04:16
  • Change && to || in if (!(mGravity == null || mMagnetic == null)) – Hoan Nguyen Aug 12 '13 at 04:47
  • Could you please elaborate how your implementation is different from the off-the-shelf `getRotationMatrix()` and `getOrientation()` by Google? – Sibbs Gambling Aug 13 '13 at 06:58
  • The only difference is that instead of using the azimuth value as returned by getOrientation(), I keep a history of the azimuths returned by getOrientation() and then set the azimuth as the average of these values. Also, I use the inclination to determine when the device is flat, so that the "compass" direction makes sense, as the azimuth returned by getOrientation() is the direction of the device's y-axis. – Hoan Nguyen Aug 13 '13 at 07:13
2

It's better to let Android's implementation of orientation detection handle it. Yes, the values you get are from -PI to PI, and you can convert them to degrees (0-360). Some relevant parts:

Saving data to be processed:

@Override
public void onSensorChanged(SensorEvent sensorEvent) {
    switch (sensorEvent.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            mAccValues[0] = sensorEvent.values[0];
            mAccValues[1] = sensorEvent.values[1];
            mAccValues[2] = sensorEvent.values[2];
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            mMagValues[0] = sensorEvent.values[0];
            mMagValues[1] = sensorEvent.values[1];
            mMagValues[2] = sensorEvent.values[2];
            break;
    }

}

Calculating roll, pitch, and yaw (azimuth). mR and mI are arrays that hold the rotation and inclination matrices, and mO is a temporary array. The array mResults holds the values in degrees at the end:

private void updateData() {
    SensorManager.getRotationMatrix(mR, mI, mAccValues, mMagValues);

    /**
     * arg 2: what world(according to app) axis , device's x axis aligns with
     * arg 3: what world(according to app) axis , device's y axis aligns with
     * world x = app's x = app's east
     * world y = app's y = app's north
     * device x = device's left side = device's east
     * device y = device's top side  = device's north
     */

    switch (mDispRotation) {
        case Surface.ROTATION_90:
            SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, mR2);
            break;
        case Surface.ROTATION_270:
            SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_X, mR2);
            break;
        case Surface.ROTATION_180:
            SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_MINUS_X, SensorManager.AXIS_MINUS_Y, mR2);
            break;
        case Surface.ROTATION_0:
        default:
            mR2 = mR;
    }

    SensorManager.getOrientation(mR2, mO);


    //--upside down when abs roll > 90--
    if (Math.abs(mO[2]) > PI_BY_TWO) {
        //--fix, azimuth always to true north, even when device upside down, realistic --
        mO[0] = -mO[0];

        //--fix, roll never upside down, even when device upside down, unrealistic --
        //mO[2] = mO[2] > 0 ? PI - mO[2] : - (PI - Math.abs(mO[2]));

        //--fix, pitch comes from opposite , when device goes upside down, realistic --
        mO[1] = -mO[1];
    }

    CircleUtils.convertRadToDegrees(mO, mOut);
    CircleUtils.normalize(mOut);

    //--write--
    mResults[0] = mOut[0];
    mResults[1] = mOut[1];
    mResults[2] = mOut[2];
}
S.D.