
I have an IMU sensor that gives me raw data such as orientation, angular velocity, and linear acceleration. I'm using ROS and running a Gazebo UUV simulation. I want to get linear velocity from the raw IMU data.

The naive integration method is as follows: the simplest approach to IMU state estimation is to integrate the measured data directly. We estimate the attitude by integrating the 3D angular velocity measured by the IMU; assuming the time step Δt is small, the attitude can be updated incrementally at each step.
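The naive scheme above can be sketched in a few lines of Python. This is only an illustration (the function name `integrate_imu`, the first-order attitude update, and the assumption that gravity has already been removed from the accelerometer data are mine, not from any particular package):

```python
import numpy as np

def integrate_imu(accels, gyros, dt):
    """Naively integrate body-frame IMU samples.

    accels: (N, 3) linear accelerations, gravity already removed
    gyros:  (N, 3) angular velocities
    dt:     sample period in seconds
    Returns the final attitude (rotation matrix) and linear velocity.
    """
    R = np.eye(3)      # attitude: body frame -> world frame
    v = np.zeros(3)    # linear velocity in the world frame
    for a, w in zip(accels, gyros):
        # Attitude update, first-order (small-angle) approximation:
        # R_{k+1} = R_k (I + [w]_x * dt)
        wx = np.array([[0.0, -w[2], w[1]],
                       [w[2], 0.0, -w[0]],
                       [-w[1], w[0], 0.0]])
        R = R @ (np.eye(3) + wx * dt)
        # Velocity update: rotate body-frame acceleration into the
        # world frame, then integrate. Any bias in `a` accumulates
        # linearly in v -- this is exactly the drift problem below.
        v = v + R @ a * dt
    return R, v
```

Every small bias or noise term in the accelerometer is summed into `v` at each step, which is why the estimate drifts without bound.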

If I integrate over time, error accumulates, and the estimate becomes inaccurate, for example when the robot makes turns. So I'm looking for methods (ROS packages, or frameworks outside ROS) or code that can correct that error.

Any help?

Bob9710

1 Answer


I would first recommend trying to fit your sensor data into an EKF or UKF node from the robot_localization package. It is the most widely used and most optimized pose-estimation package in the ROS ecosystem.

It can handle 3D sensor input, but you have to configure the parameters yourself (there are no real defaults; it is all configuration). Besides the configuration docs above, the GitHub repo has good examples of YAML parameter configurations (Ex.) (you'll want a separate parameter file rather than putting everything in the launch file) and example launch files (Ex.).

JWCS
  • That doesn't solve the integration problem. To get velocity you need to integrate the acceleration, and over time that error accumulates and becomes huge. So my problem is how to minimize that error. I've done some research, and there are numerical methods based on optimization functions and so on; one solution is https://robomechjournal.springeropen.com/articles/10.1186/s40648-018-0110-1#rightslink. I'm looking for a better solution in that direction, without additional sensors: the error will still be there, but mathematical methods can minimize it. – Bob9710 Sep 25 '21 at 08:55
  • @Bob9710 the point JWCS is trying to make is that to calculate velocity you shouldn't be integrating at all; it doesn't work, for the reasons you've seen. – BTables Sep 25 '21 at 17:24
  • So how should I do it then? – Bob9710 Sep 25 '21 at 18:17
  • OK, but my point about robot_localization (http://docs.ros.org/en/noetic/api/robot_localization/html/index.html) is that its motion model is for wheeled, more or less ground-based robots. My robot is an underwater fish-like UUV, and I think that package is not suitable for such robots. Do you see my point? – Bob9710 Sep 25 '21 at 18:40
  • @Bob9710 your perception of that package is incorrect. It was built with wheeled robots in mind, but that's not the *only* use case. Its main purpose is state estimation in 3D space, which is not specific to something with wheels. – BTables Sep 25 '21 at 22:33
  • So, using that package for my underwater robot with only an IMU as sensor input, I can get an accurate velocity estimate? Is that correct? – Bob9710 Sep 26 '21 at 05:01
  • My first goal is to get linear velocity. In a second step I will do sensor fusion of the IMU and, say, a barometer. So can the robot_localization package (http://docs.ros.org/en/noetic/api/robot_localization/html/index.html) give me linear velocity from the IMU alone? – Bob9710 Sep 26 '21 at 07:00
  • OK, when I use the package I get the error `Could not obtain transform from thrbot/imu_link to base_link. Error was "base_link" passed to lookupTransform argument target_frame does not exist.` My IMU topic is `/thrbot/imu`, so how do I set that up? – Bob9710 Sep 26 '21 at 09:12
  • @JWCS what about this discussion: https://answers.ros.org/question/232794/can-i-use-robot_localization-package-with-only-imu-data-if-i-couldhow-to-config-the-package/. Do I need to integrate the raw acceleration to obtain a rough velocity, put those values into the linear twist of a nav_msgs/Odometry message, and publish that as the odom sensor input required by robot_localization? – Bob9710 Sep 26 '21 at 17:00
  • Sorry for the delayed response, and thanks @BTables. You're asking what algorithms and computational tricks can minimize the accumulated error; whatever you might think of, this package has already considered it, implemented it, or had it submitted in a PR (or at least it comes as close to perfect as any competing option, especially compared with developing it yourself). This is the best you can do. It's most often tested on wheeled 2D robots, but it is also used by flying UAVs and underwater UUVs. With only an IMU, you will get as accurate a velocity _as possible_. – JWCS Sep 29 '21 at 19:53
  • As for your frame-transform question: it seems you're missing the correct frame transforms. Use debugging tools like [`rosrun tf view_frames`](http://wiki.ros.org/tf/Tutorials/Introduction%20to%20tf) to see what the issue is. – JWCS Sep 29 '21 at 19:55
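A common fix for that particular `lookupTransform` error is to publish a static transform between the two frames named in the message. This is only a sketch: it assumes the IMU is rigidly mounted, and the six zero offsets (x y z yaw pitch roll) are placeholders for the IMU's real mounting pose on the vehicle; the frame names come from the error message above.

```xml
<!-- Publish a fixed base_link -> thrbot/imu_link transform so that
     robot_localization's transform lookup can succeed.
     Replace the zeros with the IMU's actual mounting offset. -->
<node pkg="tf2_ros" type="static_transform_publisher"
      name="imu_static_tf"
      args="0 0 0 0 0 0 base_link thrbot/imu_link" />
```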
  • As for the discussion on answers.ros.org, or rather how to configure the localization node: if your only sensor is the IMU, then you don't have any twist, pose, or odom sensors, so remove the `odom0`, `pose0`, `twist0`, etc. parameters; you just have an `imu0`. From the IMU alone it will give you decent velocity estimates, though susceptible to drift, as you've noted. If you add a barometer/depth sensor, however, that is an absolute pose update along the Z axis in the map frame, which would greatly help reduce and correct drift. More directly to your question: you don't need to feed the outputs back in. – JWCS Sep 29 '21 at 20:00
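An IMU-only parameter file along those lines might look like the sketch below. The values are hypothetical: the topic name `/thrbot/imu` and frame names are taken from the comments above, and the exact flags should be checked against the robot_localization configuration docs.

```yaml
# ekf_localization_node parameters -- IMU-only sketch
frequency: 30
sensor_timeout: 0.1
two_d_mode: false            # full 3D motion for an underwater vehicle

odom_frame: odom
base_link_frame: base_link
world_frame: odom

# Only one sensor, the IMU: no odom0/pose0/twist0 entries at all.
imu0: /thrbot/imu
# Which fields to fuse, in the order
# [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az]:
imu0_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              true,  true,  true,
              true,  true,  true]
imu0_remove_gravitational_acceleration: true
```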