
Is there a library in Python or C++ that can estimate the normals of a point cloud in a consistent way? By consistent I mean that the orientation of the normals is globally preserved over the surface.

For example, when I use the Python Open3D package:

import open3d as o3d

downpcd.estimate_normals(search_param=o3d.geometry.KDTreeSearchParamHybrid(
    radius=4, max_nn=300))

I get inconsistent results, where some of the normals point inward while the rest point outward.

Many thanks.

[Image: estimated normals (black lines indicate outward-directed normals)]

Day_Dreamer

2 Answers


UPDATE: GOOD NEWS!

The tangent plane algorithm is now implemented in Open3D!
The source code and the documentation.

You can just call pcd.orient_normals_consistent_tangent_plane(k=15), where k is the number of nearest neighbors used to build the knn graph.
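
A minimal sketch of the whole pipeline, assuming a hypothetical input file cloud.ply and the same search parameters as in the question:

import open3d as o3d

# Hypothetical input file; any point cloud loader works here.
pcd = o3d.io.read_point_cloud("cloud.ply")

# Per-point normals first (their orientation is arbitrary at this stage).
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=4, max_nn=300))

# Propagate a consistent orientation over a 15-nearest-neighbor graph.
pcd.orient_normals_consistent_tangent_plane(k=15)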


Original answer:

Like Mark said, if your point cloud comes from multiple depth images, then you can call open3d.geometry.orient_normals_towards_camera_location(pcd, camera_loc) on each partial cloud before concatenating them together (assuming you're using the Python version of Open3D).
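
A sketch of that workflow, assuming two hypothetical partial scans with known camera positions (newer Open3D versions expose the call as a PointCloud method):

import numpy as np
import open3d as o3d

# Hypothetical partial scans and their (assumed known) camera positions.
pcd_a = o3d.io.read_point_cloud("scan_a.ply")
pcd_b = o3d.io.read_point_cloud("scan_b.ply")
cams = [np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]

for cloud, cam in zip([pcd_a, pcd_b], cams):
    cloud.estimate_normals()
    # Flip every normal so it points toward the camera that saw the point.
    cloud.orient_normals_towards_camera_location(camera_location=cam)

merged = pcd_a + pcd_b  # concatenate only after orienting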


However, if you don't have that information, you can use the tangent plane algorithm:

  1. Build a knn graph for your point cloud.
    The graph nodes are the points; two points are connected if one is among the other's k nearest neighbors.
  2. Assign a weight to each edge in the graph.
    The weight of edge (i, j) is 1 - |n_i · n_j|, so edges between points with nearly parallel normals are cheap.
  3. Compute the minimum spanning tree of the resulting graph.
  4. Root the tree at an initial node and traverse it in depth-first order, assigning each node an orientation consistent with that of its parent (flip the normal if its dot product with the parent's normal is negative).

Actually, the above algorithm comes from Section 3.3 of Hoppe et al.'s 1992 SIGGRAPH paper, Surface Reconstruction from Unorganized Points. The algorithm is also open sourced.
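
Outside of Open3D, the four steps can be sketched with NumPy/SciPy roughly like this. This is a simplified illustration, not the Open3D or Hoppe implementation; it assumes points of shape (N, 3) and unoriented unit normals of the same shape:

import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import minimum_spanning_tree, depth_first_order

def orient_normals(points, normals, k=15):
    n = len(points)
    # 1. knn graph: connect each point to its k nearest neighbors.
    _, idx = cKDTree(points).query(points, k=k + 1)
    rows = np.repeat(np.arange(n), k)
    cols = idx[:, 1:].ravel()
    # 2. Edge weight 1 - |n_i . n_j|: small when neighboring normals are
    #    nearly parallel (the epsilon keeps zero-weight edges in the graph).
    w = 1.0 - np.abs(np.sum(normals[rows] * normals[cols], axis=1)) + 1e-8
    graph = coo_matrix((w, (rows, cols)), shape=(n, n))
    # 3. Minimum spanning tree of the knn graph.
    mst = minimum_spanning_tree(graph)
    # 4. Depth-first traversal from an arbitrary root; flip each normal
    #    so that it agrees with its (already oriented) parent.
    order, parents = depth_first_order(mst, i_start=0, directed=False)
    oriented = normals.copy()
    for node in order[1:]:
        if np.dot(oriented[node], oriented[parents[node]]) < 0:
            oriented[node] = -oriented[node]
    return oriented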

AFAIK the algorithm does not guarantee a perfect orientation, but it should be good enough.

Jing Zhao

If you know the viewpoint from which each point was captured, it can be used to orient the normals. I assume that this is not the case, so given your point cloud, which looks watertight and uniformly sampled, mesh reconstruction is a promising route.

The PCL library offers many alternatives in its surface module; for the purpose of normal estimation, I would start with one of the simpler reconstruction methods. Although simple, they should be enough to produce a single coherent mesh.

Once you have a mesh, each triangle defines a normal (via the cross product of its edge vectors). It is important to note that a mesh isn't just a collection of independent faces: the faces are connected, and this connectivity enforces a coherent orientation across the mesh.

pcl::PolygonMesh is a "half-edge data structure". This means that every triangle face is defined by an ordered set of vertices, and that ordering determines the orientation: order of vertices => direction of the cross product => well-defined, unambiguous normals.
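
As a tiny illustration of how the vertex order fixes the normal direction (plain NumPy, made-up triangle):

import numpy as np

a = np.array([0.0, 0.0, 0.0])
b = np.array([1.0, 0.0, 0.0])
c = np.array([0.0, 1.0, 0.0])

n_ccw = np.cross(b - a, c - a)  # counter-clockwise winding -> [0, 0,  1]
n_cw  = np.cross(c - a, b - a)  # reversed winding          -> [0, 0, -1]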

You can either take the normals directly from the mesh (nearest-neighbor lookup), or compute a low-resolution mesh and use it only to orient the normals of the original cloud.
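
A sketch of that last idea in Python with Open3D (used here as a stand-in for PCL), assuming you already have a consistently oriented low-resolution mesh and a cloud with estimated but unoriented normals:

import numpy as np
import open3d as o3d

def orient_cloud_from_mesh(pcd, mesh):
    # Vertex normals follow the face winding, so they are consistently oriented.
    mesh.compute_vertex_normals()
    mesh_normals = np.asarray(mesh.vertex_normals)
    # KD-tree over the mesh vertices for nearest-neighbor lookup.
    tree = o3d.geometry.KDTreeFlann(o3d.geometry.PointCloud(mesh.vertices))
    normals = np.asarray(pcd.normals).copy()
    for i, p in enumerate(np.asarray(pcd.points)):
        _, idx, _ = tree.search_knn_vector_3d(p, 1)
        # Flip the point normal if it disagrees with the nearest vertex normal.
        if np.dot(normals[i], mesh_normals[idx[0]]) < 0:
            normals[i] = -normals[i]
    pcd.normals = o3d.utility.Vector3dVector(normals)
    return pcd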

Mark Loyman