I want to track the movement of the phone in the user's hand.
Given that the app must be started while the phone is stationary, I want to track the phone's position relative to where it was when the app started, using the orientation of the phone and the values I read from the accelerometer.
Here's some code I tried. I'm transforming the accelerometer sensor values (which are relative to the phone's screen) into the Earth coordinate system (relative to the magnetic north pole) using the rotation matrix I got from the magnetometer and accelerometer values. Then I integrate the acceleration values to get the speed, and integrate the speed values to get the position.
I'm not sure if there are problems in my code or if the error from the sensors is just too big. Any advice would be appreciated.
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values.clone();
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        mGeomagnetic = event.values.clone();
    } else if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        mAcceleration[0] = event.values[0];
        mAcceleration[1] = event.values[1];
        mAcceleration[2] = event.values[2];
        mAcceleration[3] = 0; // pad to a 4-vector for Matrix.multiplyMV()
    }

    if (mGeomagnetic != null && mGravity != null
            && event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        // mRotationMatrix and mInclinationMatrix are outputs
        SensorManager.getRotationMatrix(mRotationMatrix, mInclinationMatrix,
                mGravity, mGeomagnetic);
        // mOrientation is the output vector (azimuth, pitch, roll in radians)
        SensorManager.getOrientation(mRotationMatrix, mOrientation);
        // mRotationMatrixInverse is the output matrix
        Matrix.invertM(mRotationMatrixInverse, 0, mRotationMatrix, 0);
        // mAccelerationEarthRelative is the output vector
        Matrix.multiplyMV(mAccelerationEarthRelative, 0,
                mRotationMatrixInverse, 0, mAcceleration, 0);

        long currTimestamp = event.timestamp;
        if (lastTimestamp == -1) {
            lastTimestamp = currTimestamp;
            return;
        }
        // event timestamps are in nanoseconds; convert the delta to seconds
        float timeDelta = (currTimestamp - lastTimestamp) * 1.0e-9f;
        lastTimestamp = currTimestamp;

        // first integration: acceleration -> speed
        mSpeed[0] += mAccelerationEarthRelative[0] * timeDelta;
        mSpeed[1] += mAccelerationEarthRelative[1] * timeDelta;
        mSpeed[2] += mAccelerationEarthRelative[2] * timeDelta;
        // second integration: speed -> position
        mPosition[0] += mSpeed[0] * timeDelta;
        mPosition[1] += mSpeed[1] * timeDelta;
        mPosition[2] += mSpeed[2] * timeDelta;

        mTextView.setText(String.format(
                "AZM: %+05.2f PTC: %+05.2f ROL: %+05.2f\n\n" +
                "ACC:\n X: %+05.2f Y: %+05.2f Z: %+05.2f\n\n" +
                "SPD:\n X: %+05.2f Y: %+05.2f Z: %+05.2f\n\n" +
                "POS:\n X: %+05.2f Y: %+05.2f Z: %+05.2f\n\n",
                Math.toDegrees(mOrientation[0]),
                Math.toDegrees(mOrientation[1]),
                Math.toDegrees(mOrientation[2]),
                mAccelerationEarthRelative[0],
                mAccelerationEarthRelative[1],
                mAccelerationEarthRelative[2],
                mSpeed[0], mSpeed[1], mSpeed[2],
                mPosition[0], mPosition[1], mPosition[2]
        ));
    }
}
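For what it's worth, I worked out how fast error should accumulate even if the transform is perfect: a small constant bias b in the acceleration grows quadratically in the position estimate (roughly 0.5*b*t^2) because of the double integration. This is a minimal plain-Java sketch of that, with no Android dependencies; the 0.05 m/s^2 bias is an assumed illustrative figure, not a measured one:

```java
// Demonstrates why double-integrating accelerometer data drifts:
// a constant acceleration bias b produces a position error of ~0.5*b*t^2.
public class DriftDemo {
    // Euler-integrate a constant acceleration bias, exactly like the
    // speed/position updates in onSensorChanged() above.
    static double driftAfter(double biasMs2, double dtSeconds, int steps) {
        double speed = 0.0, position = 0.0;
        for (int i = 0; i < steps; i++) {
            speed += biasMs2 * dtSeconds;   // first integration: m/s
            position += speed * dtSeconds;  // second integration: m
        }
        return position;
    }

    public static void main(String[] args) {
        // assumed 0.05 m/s^2 bias, sampled at 50 Hz for 10 seconds
        double drift = driftAfter(0.05, 0.02, 500);
        System.out.printf("Position error after 10 s: %.3f m%n", drift);
    }
}
```

With those assumed numbers the position is off by about 2.5 meters after only 10 seconds, which is why I suspect the sensor error alone might sink this approach.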
To give more context to what I'm trying to build: it's an AR application that renders 3D objects on top of the camera input as if they were present in the actual scene. My idea was to use the sensors to track the phone's position in space and move the camera in the 3D scene accordingly.
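As a standalone sanity check for the transform step, I also verified the math on a toy case in plain Java (no Android dependencies). For a pure rotation matrix the inverse equals the transpose, which is why I believe Matrix.transposeM would do the same job as Matrix.invertM in my code; the 90-degree rotation below is just a made-up example frame, not my actual sensor data:

```java
// Sanity check: rotate a device-frame vector into a "world" frame with a
// rotation matrix R, then map it back with R's transpose (= R's inverse,
// since R is a pure rotation).
public class RotationCheck {
    // Multiply a 3x3 matrix by a 3-vector.
    static double[] mul(double[][] m, double[] v) {
        double[] r = new double[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                r[i] += m[i][j] * v[j];
        return r;
    }

    // Transpose of a 3x3 matrix (equals the inverse for a rotation matrix).
    static double[][] transpose(double[][] m) {
        double[][] t = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                t[j][i] = m[i][j];
        return t;
    }

    public static void main(String[] args) {
        // toy rotation: 90 degrees about the Z axis
        double[][] r = { {0, -1, 0}, {1, 0, 0}, {0, 0, 1} };
        double[] deviceAccel = { 1.0, 0.0, 0.0 };
        double[] world = mul(r, deviceAccel);              // x-axis maps to y-axis
        double[] backToDevice = mul(transpose(r), world);  // recovers the original
        System.out.printf("world=(%.0f,%.0f,%.0f) device=(%.0f,%.0f,%.0f)%n",
                world[0], world[1], world[2],
                backToDevice[0], backToDevice[1], backToDevice[2]);
    }
}
```

The round trip recovers the original vector exactly, so I'm fairly confident the frame-change math itself is sound and the problem is elsewhere (integration drift or sensor noise).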