
I am working with the MRS1104C and I am trying to map a rectangular box. I am using only 1 of the 4 layers. I have positioned the sensor perpendicular to the floor, and I get measurements of distance, angles, and quaternions from the IMU embedded in the sensor. In order to fully map the box, I rotate the sensor 200 degrees about the vertical axis to the floor (rotation 1) and also some degrees about the horizontal axis (rotation 2).

In order to transform to the global coordinate system, I use the quaternion-derived rotation matrix (Python):

import math

# Spherical (distance, theta, phi) -> local Cartesian coordinates
x = distance * math.cos(phi) * math.cos(theta)
y = distance * math.cos(phi) * math.sin(theta)
z = distance * math.sin(phi)

# Quaternion-derived rotation matrix from the IMU quaternion (qw, qx, qy, qz)
a11 = 1 - (2 * (qy**2 + qz**2))
a12 = 2 * (qx * qy + qw * qz)
a13 = 2 * (-qw * qy + qx * qz)

a21 = 2 * (qx * qy - qw * qz)
a22 = 1 - (2 * (qx**2 + qz**2))
a23 = 2 * (qy * qz + qw * qx)

a31 = 2 * (qx * qz + qw * qy)
a32 = 2 * (-qw * qx + qy * qz)
a33 = 1 - (2 * (qx**2 + qy**2))

# Local -> global Cartesian coordinates
x_final = a11 * x + a12 * y + a13 * z
y_final = a21 * x + a22 * y + a23 * z
z_final = a31 * x + a32 * y + a33 * z
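
As a sanity check, here is a minimal sketch (assuming SciPy is available; the quaternion and point values are made up for illustration) comparing the hand-written matrix against scipy.spatial.transform.Rotation. Note that the a11..a33 matrix is the transpose of the matrix SciPy builds from the same quaternion, i.e. it applies the inverse rotation; which direction is right depends on whether the IMU reports the sensor-to-world or the world-to-sensor orientation.

import numpy as np
from scipy.spatial.transform import Rotation

qw, qx, qy, qz = 0.9238795, 0.0, 0.0, 0.3826834  # hypothetical: 45 degrees about z
v = np.array([1.0, 0.0, 0.0])                    # hypothetical local point (x, y, z)

R = Rotation.from_quat([qx, qy, qz, qw]).as_matrix()  # SciPy expects scalar-last order
print(R @ v)    # standard sensor-to-world rotation
print(R.T @ v)  # the transpose: this is what the a11..a33 matrix applies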

The data for the quaternion-derived rotation matrix are obtained from the IMU of the sensor, so new values are received each time the sensor is moved. However, after the transformation, a different slope is observed on the same side of the box for the measurements from 0 to 137.5 degrees compared with the measurements from -137.5 to 0 degrees, as can be seen in the attached figure: Figure of the box from measurements

I believe that this should not be observed, because each new measurement takes the new quaternions into account, and thus the transformation for the same point on the same side of the box should give almost the same result.

1) Could you please give me some ideas about why this happens, and could you please give me some advice or guidance on how I can solve it?

2) Am I missing something?

3) Is the above methodology correct?

Please see the figures below:

Image with colors: blue is from -137.5 to 0 degrees and red is from 0 to 137.5 degrees

Other viewpoint

Dr. Quest
  • The image was a bit confusing for me. Is it possible you could colorize the different scans or sketch the issue? I did look up your sensor but did not see the accuracy of the IMU. https://www.sick.com/us/en/detection-and-ranging-solutions/3d-lidar-sensors/mrs1000/mrs1104c-111011/p/p495044. – Cary H Mar 24 '20 at 13:58
  • Thanks for the reply! I added two extra colored images. I did not find the IMU characteristics in the datasheet. – Dr. Quest Mar 24 '20 at 16:03
  • IMUs drift when they are stationary. I am still having a bit of trouble making out a box from the image, but from the twist of the exterior it looks like your IMU might be off a little. Can you point your sensor at a small object so that the entire thing is in your field of view? Then take a set of data, rotate your sensor 90 degrees in the XY (floor) plane, and grab another set. That should tell you if your IMU calibration is good. Are you using pyquaternion? – Cary H Mar 24 '20 at 16:40
  • I am using the above code. I have also used the pyquaternion library for the transformation; however, the results were the same (see the pyquaternion sketch after this thread). I do not know if my methodology is right. – Dr. Quest Mar 24 '20 at 16:43
  • Do you think this may be caused by bad IMU calibration? If my calibration were not good, would I observe this phenomenon? – Dr. Quest Mar 24 '20 at 16:46
  • I do not think there is a flaw in the section of the program that you have shared. This does look like a scan from inside a cube. The twist in the data may be due to a very small error in the IMU. You may be able to use software to register one point cloud to another (Open3D or something; see the Open3D sketch after this thread). You may be able to get something out of NumPy to reduce error due to floats (if there is any). It might not get much better if you are relying strictly on the IMU for global accuracy. This is a great-looking attempt if this is early development. – Cary H Mar 24 '20 at 19:32
  • The part of the program that I have shared is for coordinate transformation from spherical to the global Cartesian coordinate system. First, I transform from spherical (distance, theta, phi) to the local Cartesian coordinate system (x, y, z). Then, I use the quaternions from the IMU (qw, qx, qy, qz) to transform from the local coordinate system to the global one (x_final, y_final, z_final). Do you think that if the IMU data does not include error, this methodology is correct? If there is error in the IMU data, which algorithm do you propose to overcome these errors? – Dr. Quest Mar 25 '20 at 11:29
  • That is what I do also. The method is correct. The data looks like it does not have a bunch of scatter error (it is not fuzzy); the shapes are just 'tilted'. Can you possibly view your IMU data without moving the sensor to see the 'drift' of (qw, qx, qy, qz)? If you can get repeatable IMU data you should have a good shot at this. It looks like there is some error in the return in the areas where your sensor has a low angle of incidence, but that is expected (distortion). – Cary H Mar 26 '20 at 20:24
  • As a check, you could plot the data without the transformation. It should look the same, just rotated. – Cary H Mar 26 '20 at 20:26
  • https://arxiv.org/pdf/1909.06700.pdf – Cary H Apr 02 '20 at 16:32
  • If as input for qw, qx, qy, qz I use the actual movement of the sensor and not the data from the IMU, then I extract the cube correctly. I asked for the IMU technical data, and this is the IMU the sensor is equipped with (Bosch BHI160): https://www.bosch-sensortec.com/media/boschsensortec/downloads/smart_sensors_1/bhi160/bst-bhi160-fl000.pdf. Do you believe that the accuracy of this IMU is high? Do I need to use some algorithm to correct these errors (something like this: arxiv.org/pdf/1909.06700.pdf)? – Dr. Quest Apr 05 '20 at 17:33
  • Correct. +/- 2 degrees will not yield much better than what you are getting. You may need to perform cloud-to-cloud registration using some overlap. Open3D will register one cloud to another, and then you can just keep building onto that cloud. I have not tried the real-time code from https://github.com/PaulKrush/loam_livox. I will be trying this soon with the Livox unit if I can get one (China). – Cary H Apr 06 '20 at 20:47
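
Since pyquaternion comes up in the comments above, here is a minimal sketch of the same transformation with it (the quaternion and point values are again made up for illustration); note which of the two calls matches the hand-written a11..a33 matrix:

from pyquaternion import Quaternion

q = Quaternion(0.9238795, 0.0, 0.0, 0.3826834)  # hypothetical; pyquaternion is scalar-first (qw, qx, qy, qz)
v = [1.0, 0.0, 0.0]                             # hypothetical local point

print(q.rotate(v))          # standard rotation, R @ v
print(q.inverse.rotate(v))  # inverse rotation, R.T @ v -- matches the a11..a33 matrix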
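
And here is a minimal sketch of the cloud-to-cloud registration suggested in the last comment, assuming a recent Open3D (the o3d.pipelines namespace) and two hypothetical Nx3 NumPy arrays scan_a and scan_b holding overlapping scans; the 0.05 m correspondence distance is an illustrative guess to tune:

import numpy as np
import open3d as o3d

source = o3d.geometry.PointCloud()
source.points = o3d.utility.Vector3dVector(scan_a)  # scan_a: hypothetical Nx3 array
target = o3d.geometry.PointCloud()
target.points = o3d.utility.Vector3dVector(scan_b)  # scan_b: hypothetical Nx3 array

# Point-to-point ICP: find the rigid transform that aligns source onto target
result = o3d.pipelines.registration.registration_icp(
    source, target, 0.05, np.eye(4),
    o3d.pipelines.registration.TransformationEstimationPointToPoint())
source.transform(result.transformation)  # source is now registered onto target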

0 Answers