
Given a point cloud, what's the best way to find the plane that best fits the points? I need something fairly accurate but also fast to compute.

I searched for nearest plane but couldn't find any related info.

I want to use this plane to snap the points onto it.

Joan Venge
  • You could start by fitting a line as described here: http://en.wikipedia.org/wiki/Linear_regression, which is the 2D counterpart of your objective – BlackBear Mar 19 '12 at 20:43
  • Do you mean one of the "primary" planes (XY, XZ, or YZ, with some Z, Y, or X offset, respectively), or the plane most closely described by the set of points? – David Ellis Mar 19 '12 at 20:44
  • @DavidEllis: the latter. – Joan Venge Mar 19 '12 at 20:45

2 Answers


Least squares regression would be my guess. It will give you the coefficients for the plane that minimizes the mean square error over all the points.

You aren't the first:

3D Least Squares Plane
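
For what it's worth, a minimal sketch of such a fit in Python/NumPy, assuming the plane is parameterized as z = a*x + b*y + c (this form can't represent planes parallel to the z axis; the PCA answer below handles arbitrary orientations). The function name is just illustrative:

import numpy as np

def fit_plane_lstsq(points):
    # points: (n, 3) array; fit z = a*x + b*y + c by least squares.
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    M = np.column_stack([x, y, np.ones(len(points))])   # one row [x, y, 1] per point
    coeffs, *_ = np.linalg.lstsq(M, z, rcond=None)
    a, b, c = coeffs
    return a, b, c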

duffymo
  • Do you know of any examples specific to this, or an example I can modify for my case? I've heard of least squares but never used it before. – Joan Venge Mar 19 '12 at 20:44

I think you could also do this using principal component analysis:

Compute the average of your points:

C = (0,0,0);
for each point Ri in your dataset,
  { C += Ri; }
C = C * 1.0 / npoints;

Compute the covariance matrix of your points:

A = zeros(3,3);
for each point Ri in your dataset,
  {
  D = Ri - C;
  A += D*D';  // outer product
  }

Compute the inverse of A, A_inv:

  A_inv = inv(A)

Perform power iterations by repeatedly applying A_inv to a random initial vector; this converges to the eigenvector of A with the smallest eigenvalue, which is the direction of least variance and therefore the plane normal N:

  N = random vector.
  for i=1:20 (or so)
    {
    N = A_inv*N;
    N = normalize(N);
    }

The offset from the origin to your plane is k = dot(N,C). The plane is then the set of all points R such that dot(N,R) = k.
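
Putting those steps together, here is a rough NumPy sketch. Note it reads the normal off an eigendecomposition of the 3x3 matrix A rather than using the explicit inverse and power iteration above, and the function names are just illustrative; it also includes the snapping step the question asks about:

import numpy as np

def fit_plane_pca(points):
    # points: (n, 3) array. Returns (N, k) with unit normal N and offset k, so dot(N, R) = k.
    C = points.mean(axis=0)                 # centroid of the cloud
    D = points - C                          # centered points
    A = D.T @ D                             # 3x3 scatter (covariance) matrix
    eigvals, eigvecs = np.linalg.eigh(A)    # eigenvalues in ascending order
    N = eigvecs[:, 0]                       # eigenvector of the smallest eigenvalue = plane normal
    return N, N @ C                         # k = dot(N, C)

def snap_to_plane(points, N, k):
    # Project each point onto the plane dot(N, R) = k.
    dist = points @ N - k                   # signed distance of each point to the plane
    return points - np.outer(dist, N)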

rchilton
  • Thanks, I am not sure how to calculate these 2 principal vectors though. Do you have any example? – Joan Venge Mar 19 '12 at 21:24
  • I will take a crack at it - overflow noob here. – rchilton Mar 19 '12 at 21:25
  • Thanks man, I will try your implementation in a moment. Although I am not sure of the last line. Power methods, etc? – Joan Venge Mar 19 '12 at 21:36
  • Thanks for the edits. Can you please also show how to get the most dominant eigenvectors? I haven't used them before. – Joan Venge Mar 19 '12 at 21:50
  • Based on an answer from the other thread that duffymo linked, it does make more sense to find N directly via inverse iteration than to compute P1, P2 and then look for their orthogonal complement. I think A is guaranteed to be symmetric positive definite (thus invertible) as long as you have 3 non-collinear points. – rchilton Mar 19 '12 at 22:08
  • Yeah, it makes a lot more sense. It's hard for me to see how this solution is better. Calculating the full inverse does not seem to be a good idea - it's likely to be overrun with roundoff errors. – duffymo Mar 19 '12 at 22:15
  • A is only 3x3, makes it a little better. You can always replace the explicit inverse calculation with any "inverse-revealing" decomposition (LU, for instance - backsolving by L then by U is equivalent to applying the inverse but generally cheaper and more accurate - can do the same kinds of tricks with QR, SVD). You have a good point though - if your points are truly well approximated by a plane, then A is close to a matrix of low rank. An eigen or SVD decomposition is the best way to "invert" A. I do think the "first" PCA algorithm (N=P1xP2) is more robust in the face of near singular A. – rchilton Mar 19 '12 at 22:27
  • Just one more comment: it's exactly the singularity of A that you are really trying to find/measure. The nullspace of A (if it's singular) or its least dominant eigen/singular vector (if it's close to singular) is the normal of the sought plane. The fact that PCA/SVD explicitly separates/orthogonalizes these spaces is (IMO) a good thing. – rchilton Mar 19 '12 at 22:31
  • Wouldn't A be 4x4? There's a constant plus one coefficient for each of x, y, and z. And yes, LU or anything else is likely to be better than inversion. – duffymo Mar 20 '12 at 01:07
  • A is indeed 3x3. There are 4 unknowns, but we just compute the normal (x,y,z) using PCA. The fourth unknown (the offset from the origin) is postprocessed. Also, it was too late to edit - that "A is invertible iff it was built from 3 non-collinear points" comment should instead be "4 non-coplanar points". – rchilton Mar 20 '12 at 12:17
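
As the last few comments suggest, in practice one would usually skip both the explicit inverse and the power iteration and read the normal straight off an SVD (or eigendecomposition) of the centered points. A rough NumPy sketch of that variant, with illustrative naming:

import numpy as np

def plane_normal_svd(points):
    # Normal of the best-fit plane via SVD of the centered points; no covariance matrix or inverse needed.
    D = points - points.mean(axis=0)
    _, _, Vt = np.linalg.svd(D, full_matrices=False)
    # Singular values come back in descending order, so the last row of Vt is the
    # direction of least variance, i.e. the (near-)nullspace of D: the plane normal.
    return Vt[-1]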