I am using the PX4 toolchain to build an autonomous UAV, running it in the Gazebo simulator with ROS and MAVROS. For now I need to work on path-finding algorithms, not sensors, so I would like a "ground truth" map of the Gazebo simulation. Ideally I would be able to feed it to the Octomap node of ROS.
All I have managed to do so far is connect a depth camera point cloud to the ROS octomap node and view it in RViz, as explained in this tutorial. I thought about placing depth camera sensors all around my world to approximate the ground truth with the octomap representation, but that looks complicated to set up and not very efficient.
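For reference, the depth camera to octomap connection is roughly a launch file like the one below (a sketch only: the cloud topic /camera/depth/points and the map frame are placeholders, not my exact setup):

```xml
<launch>
  <node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
    <!-- fixed frame and voxel resolution of the octomap -->
    <param name="frame_id" type="string" value="map"/>
    <param name="resolution" value="0.1"/>
    <!-- feed the depth camera PointCloud2 into octomap_server -->
    <remap from="cloud_in" to="/camera/depth/points"/>
  </node>
</launch>
```

The resulting map can then be displayed in RViz through the occupied_cells_vis_array marker topic.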
So is there a way to generate a point cloud representation of the current Gazebo world simulation and connect it to an Octomap?
Thank you very much for your help! :)