I've searched for a while for how to get the most accurate azimuth reading from the device, and I came up with this:
// Activity members
private SensorManager sensorManager;
private float mAzimuth = 0.0f;
private float[] mRotation = null;
private float[] rMat = new float[16];       // 4x4 rotation matrix
private float[] orientation = new float[3]; // azimuth, pitch, roll (radians)
// Later inside the Activity: the rotation listener
SensorEventListener rotationListener = new SensorEventListener() {
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            mRotation = event.values.clone();
            float[] rMatVector = new float[16];
            // Convert the rotation vector into a 4x4 rotation matrix
            SensorManager.getRotationMatrixFromVector(rMatVector, mRotation);
            // Remap for a device held upright (the camera/AR case from the docs);
            // a device lying flat would use rMatVector directly
            SensorManager.remapCoordinateSystem(rMatVector,
                    SensorManager.AXIS_X, SensorManager.AXIS_Z, rMat);
            SensorManager.getOrientation(rMat, orientation);
            // orientation[0] is the azimuth in radians; normalize to [0, 360)
            mAzimuth = (float) Math.toDegrees(orientation[0]);
            if (mAzimuth < 0) {
                mAzimuth += 360;
            }
        }
    }
};
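For completeness, the listener is registered for the rotation-vector sensor along these lines (a minimal sketch; SENSOR_DELAY_GAME is just the rate I happened to use):

// In onResume() (sketch; the delay constant is an arbitrary choice of rate)
sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor rotationSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
if (rotationSensor != null) {
    sensorManager.registerListener(rotationListener, rotationSensor,
            SensorManager.SENSOR_DELAY_GAME);
}

// In onPause(), to stop the updates
sensorManager.unregisterListener(rotationListener);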
What I tried to do: implement all the newest guidelines for achieving the most accurate azimuth reading:
- When testing I was well aware of all the usual caveats (calibration, magnetic bias from nearby electrical devices, etc.)
- As can be seen in the code, I've also followed this Google advice (see the sketch after the quote):
TYPE_ORIENTATION: This sensor was deprecated in Android 2.2 (API Level 8). The sensor framework provides alternate methods for acquiring device orientation, which are discussed in Using the Orientation Sensor.
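For reference, the "alternate method" the docs point to besides the rotation vector is deriving orientation from the accelerometer and magnetometer. A minimal sketch of that approach, assuming both sensors are registered and delivering events:

// Sketch: azimuth from accelerometer + magnetometer instead of the rotation vector
private final float[] mGravity = new float[3];
private final float[] mGeomagnetic = new float[3];

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        System.arraycopy(event.values, 0, mGravity, 0, 3);
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        System.arraycopy(event.values, 0, mGeomagnetic, 0, 3);
    }
    float[] r = new float[9];
    if (SensorManager.getRotationMatrix(r, null, mGravity, mGeomagnetic)) {
        float[] o = new float[3];
        SensorManager.getOrientation(r, o);
        float azimuth = (float) Math.toDegrees(o[0]); // relative to magnetic north
    }
}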
The problem is that the resulting azimuth is still significantly inaccurate, and this was tested on several good devices.
So: do I have some software bug in my implementation, or is the whole approach wrong?
Thanks,