I have a GoPro Hero4 Black mounted on a drone, receiving the drone's telemetry, which tends to be inaccurate at times. I need to verify the telemetry by computing the camera's rotation and translation between two frames.
I have computed the essential matrix using the intrinsic parameters extracted from my camera matrix, which I obtained through OpenCV's chessboard calibration.
Problem
The results I am getting are rubbish unless I compare a photo to itself. They also differ depending on which feature detector I use and on whether I pick RANSAC or LMEDS, but I am fairly sure the keypoint matching itself works well. Therefore I believe I
Either decompose the essential matrix and retrieve its corresponding angles incorrectly,
Or have a camera that is not calibrated well, even though the reprojection error was 0.23.
However, I have no idea how to verify which of these is true (if either). I have found other ways of decomposing the essential matrix, such as this one: Decomposition of Essential Matrix: Validation of the four possible solutions for R and T, and I do not know how that approach differs from mine, which uses org.opencv.calib3d.Calib3d.decomposeEssentialMat(..). Would anyone care to clarify?
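For context, my understanding of what both methods compute (following the standard SVD-based decomposition): the linked approach and decomposeEssentialMat should yield the same four candidate solutions,

```latex
% SVD of the essential matrix:
E = U \,\mathrm{diag}(1,1,0)\, V^{\top}, \qquad
W = \begin{pmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}

% Two rotation candidates and a translation known only up to sign:
R_1 = U W V^{\top}, \qquad R_2 = U W^{\top} V^{\top}, \qquad t = \pm u_3
```

where u_3 is the last column of U. decomposeEssentialMat returns exactly R_1, R_2, and t, so picking the physically valid one of the four (R, t) combinations is left to the caller.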
Question 1: Could anyone please check whether my angle retrieval is correct?
Question 2: I am not sure how to check whether my camera calibration is correct, so I am posting it below along with the parameters retrieved by Calib3d.calibrationMatrixValues(..). I suspect it is not, since the retrieved field of view differs from the one in my camera's specs. Could you please confirm this?
I am using Java and OpenCV 3.1.
I appreciate your help.
Main function
// Match keypoints
// ....
MatOfPoint2f firstImageMop2f = new MatOfPoint2f();
firstImageMop2f.fromList(firstImageList);
MatOfPoint2f secondImageMop2f = new MatOfPoint2f();
secondImageMop2f.fromList(secondImageList);
// Focal length and principal point below are the values returned by
// Calib3d.calibrationMatrixValues(..) (see the calibration output at the end)
Mat essentialMat = org.opencv.calib3d.Calib3d.findEssentialMat(firstImageMop2f, secondImageMop2f,
        3.5714310113232735, new Point(3.1427346523449033, 2.383214663142159),
        org.opencv.calib3d.Calib3d.RANSAC, 0.999, 3);
decomposeEssential(essentialMat);
Decompose Essential
private void decomposeEssential(Mat essentialMat) {
    // decomposeEssentialMat yields two candidate rotations and a translation
    // determined only up to sign (four possible combinations in total)
    Mat firstRotation = new Mat();
    Mat secondRotation = new Mat();
    Mat translation = new Mat();
    org.opencv.calib3d.Calib3d.decomposeEssentialMat(essentialMat, firstRotation, secondRotation, translation);
    printRotationMatrix(firstRotation, "First rotation matrix");
    printRotationMatrix(secondRotation, "Second rotation matrix");
    printGeneralMatrix(translation, "Translation");
}
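Since decomposeEssentialMat leaves four possible (R, t) combinations, a sketch of how the physically valid one could be selected instead with Calib3d.recoverPose, which runs the cheirality check internally. This assumes the same focal length and principal point that were passed to findEssentialMat; the parameter names focalLength and principalPoint are placeholders of mine:

```java
// Sketch only: recoverPose triangulates the matched points and returns the
// single (R, t) pair that places them in front of both cameras.
// focalLength and principalPoint are placeholders for the same values
// passed to findEssentialMat above.
private void recoverPoseExample(Mat essentialMat, MatOfPoint2f firstImageMop2f,
        MatOfPoint2f secondImageMop2f, double focalLength, Point principalPoint) {
    Mat rotation = new Mat();
    Mat translation = new Mat();
    org.opencv.calib3d.Calib3d.recoverPose(essentialMat, firstImageMop2f, secondImageMop2f,
            rotation, translation, focalLength, principalPoint);
    printRotationMatrix(rotation, "Recovered rotation matrix");
    printGeneralMatrix(translation, "Recovered translation");
}
```

Comparing the recovered rotation against the two candidates from decomposeEssential would show which combination survives the cheirality check.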
Print rotation matrix
private void printRotationMatrix(Mat matrix, String description) {
    double x, y, z;
    double sy = Math.sqrt(matrix.get(0, 0)[0] * matrix.get(0, 0)[0]
            + matrix.get(1, 0)[0] * matrix.get(1, 0)[0]);
    boolean singular = sy < 1e-6;
    if (!singular) {
        System.out.println(description + " as a non-singular matrix");
        x = Math.atan2(matrix.get(2, 1)[0], matrix.get(2, 2)[0]); // roll
        y = Math.atan2(-matrix.get(2, 0)[0], sy); // pitch
        z = Math.atan2(matrix.get(1, 0)[0], matrix.get(0, 0)[0]); // yaw
    } else {
        System.out.println(description + " as a singular matrix");
        // Gimbal lock: the standard formula uses R(1,2) and R(1,1) here,
        // not R(2,1) and R(2,2), which are both near zero in this case
        x = Math.atan2(-matrix.get(1, 2)[0], matrix.get(1, 1)[0]); // roll
        y = Math.atan2(-matrix.get(2, 0)[0], sy); // pitch
        z = 0; // yaw is undefined at gimbal lock
    }
    System.out.println("Axes angles: ");
    System.out.println("x (roll): " + Math.toDegrees(x));
    System.out.println("y (pitch): " + Math.toDegrees(y));
    System.out.println("z (yaw): " + Math.toDegrees(z));
}
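Regarding Question 1, the angle retrieval can be sanity-checked independently of OpenCV: build a rotation matrix from known angles using the R = Rz * Ry * Rx (yaw-pitch-roll) convention that the extraction formulas above assume, then recover the angles and compare. A minimal sketch; the class and method names (EulerCheck, anglesFrom, rotation) and the test angles are mine:

```java
// Sanity check for the Euler-angle retrieval: round-trip known angles
// through R = Rz(yaw) * Ry(pitch) * Rx(roll) and back.
public class EulerCheck {
    // Returns {roll, pitch, yaw} in radians from a 3x3 rotation matrix,
    // mirroring the non-singular branch of printRotationMatrix.
    static double[] anglesFrom(double[][] r) {
        double sy = Math.sqrt(r[0][0] * r[0][0] + r[1][0] * r[1][0]);
        double x = Math.atan2(r[2][1], r[2][2]); // roll
        double y = Math.atan2(-r[2][0], sy);     // pitch
        double z = Math.atan2(r[1][0], r[0][0]); // yaw
        return new double[]{x, y, z};
    }

    // Builds R = Rz(yaw) * Ry(pitch) * Rx(roll).
    static double[][] rotation(double roll, double pitch, double yaw) {
        double cx = Math.cos(roll), sx = Math.sin(roll);
        double cy = Math.cos(pitch), sy = Math.sin(pitch);
        double cz = Math.cos(yaw), sz = Math.sin(yaw);
        return new double[][]{
                {cy * cz, cz * sx * sy - cx * sz, cx * cz * sy + sx * sz},
                {cy * sz, cx * cz + sx * sy * sz, cx * sy * sz - cz * sx},
                {-sy, cy * sx, cx * cy}
        };
    }

    public static void main(String[] args) {
        double[] in = {Math.toRadians(10), Math.toRadians(-20), Math.toRadians(30)};
        double[] out = anglesFrom(rotation(in[0], in[1], in[2]));
        for (int i = 0; i < 3; i++) {
            System.out.printf("in: %.6f out: %.6f%n", in[i], out[i]);
        }
    }
}
```

If the recovered angles match the inputs (away from pitch = ±90°), the extraction formulas themselves are fine, and any remaining error comes from the essential matrix rather than the angle retrieval.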
Camera matrix
740.9127543750064 0.0 651.9773670991048
0.0 736.5104868078901 377.12407856315485
0.0 0.0 1.0
Calib3d.calibrationMatrixValues(cameraMatrix, new Size(1280, 720), 6.17, 4.55, output parameters...)
FoVx: 81.64086856941297
FoVy: 52.09797490556325
Focal length: 3.5714310113232735
Principal point: x: 3.1427346523449033, y: 2.383214663142159
Aspect ratio: 0.9940583185521892
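Regarding Question 2, the reported field of view and the pixel focal length in the camera matrix can be cross-checked against each other via FoVx = 2 * atan(width / (2 * fx)). A small sketch with the values copied from the output above (FovCheck is a name I made up):

```java
// Cross-check: the horizontal FoV reported by calibrationMatrixValues
// should be consistent with fx from the camera matrix.
public class FovCheck {
    public static void main(String[] args) {
        double fx = 740.9127543750064;   // camera matrix entry [0][0], in pixels
        double width = 1280.0;           // image width in pixels
        double fovxDeg = Math.toDegrees(2.0 * Math.atan(width / (2.0 * fx)));
        System.out.println("FoVx from camera matrix: " + fovxDeg + " degrees");
        // calibrationMatrixValues reported FoVx: 81.64086856941297 degrees
    }
}
```

The two values agree, so the calibration output is internally consistent; a mismatch with the GoPro spec sheet is more likely due to the lens model (a GoPro's strong fisheye distortion is poorly captured by the plain pinhole model) than to this computation. Note also that calibrationMatrixValues reports the focal length and principal point in millimetres, derived from the 6.17 x 4.55 mm aperture you pass in, whereas the camera matrix stores them in pixels.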