
I have a set of points (in the form x1,y1,z1 ... xn,yn,zn) obtained from a surface mesh. I want to find the best-fit 3D plane to these points by minimizing orthogonal distances. The x, y, z coordinates are independent; that is, I want to obtain the coefficients A, B, C, D of the plane equation Ax + By + Cz + D = 0.

What would be the algorithm to obtain A, B, C, D?

Note: a previous post discussed the best-fit plane in a least-squares sense, treating the z coordinate as a linear function of x and y. That is not my case.


2 Answers


From memory, this turns into an eigenvector problem. The distance from a point to your plane is proportional to Ax + By + Cz + D; one way to see this is to note that (A, B, C) is a normal to the plane. The constant D is a nuisance, but you can get rid of it by centring your data: subtract the mean of each coordinate so that everything has mean 0. The best-fitting plane then passes through the origin (equivalently, through the centroid of the original points).

You then want to minimise SUM_i (X_i . A)^2, where A is a 3-vector and X_i is the i-th (centred) data point. Of course you can make this arbitrarily small by multiplying A by a small scalar, so you minimise subject to the constraint ||A||^2 = 1, which also makes sense of the proportionality by making A a unit vector. Writing ' for transpose, (X_i . A)^2 = A' (X_i X_i') A, so you want to minimise A' (SUM_i X_i X_i') A. The minimiser is the eigenvector of the 3x3 matrix SUM_i X_i X_i' corresponding to the smallest eigenvalue.
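
Putting this together, a minimal sketch in numpy (assumed available; the function name and array layout are mine, not from the question):

    import numpy as np

    def fit_plane(points):
        """Fit Ax + By + Cz + D = 0 to an (n, 3) array of points,
        minimising orthogonal distances."""
        points = np.asarray(points, dtype=float)
        centroid = points.mean(axis=0)        # centre the data so it has mean 0
        centered = points - centroid
        M = centered.T @ centered             # 3x3 scatter matrix SUM_i X_i X_i'
        _, eigvecs = np.linalg.eigh(M)        # eigh returns eigenvalues ascending
        normal = eigvecs[:, 0]                # eigenvector of smallest eigenvalue
        A, B, C = normal                      # unit normal, so ||(A,B,C)|| = 1
        D = -normal.dot(centroid)             # plane passes through the centroid
        return A, B, C, D

    # Example: points close to the plane z = 0
    pts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0.01)]
    print(fit_plane(pts))                     # normal near +/-(0, 0, 1), D near 0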

One reason this isn't used more frequently in statistics is that the answer changes if you rescale the units along one coordinate axis without rescaling the other axes by the same amount.

Come to think of it, you can see all this worked out properly at http://en.wikipedia.org/wiki/Total_least_squares

mcdowella
  • Could you explain what (X_i . A)^2 = A' (X_i X_i') A means? – Nick Oct 04 '12 at 03:49
  • Have a look at the Wikipedia references as well, but here "." in X_i . A is the dot product (also known as the scalar product) and "'" is the matrix transpose. I am saying that the square of the dot product of two vectors equals what you get by forming the 3 x 3 outer-product matrix X_i X_i' (a column vector times a row vector), multiplying it by A on one side to get a vector, and then taking the dot product of that result with A. Having a matrix in the centre turns this into an eigenvalue problem. – mcdowella Oct 04 '12 at 05:25
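
A quick numeric check of this identity (a sketch; numpy assumed, names illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    X_i = rng.standard_normal(3)         # one data point as a 3-vector
    A = rng.standard_normal(3)           # a candidate normal vector

    lhs = np.dot(X_i, A) ** 2            # (X_i . A)^2
    rhs = A @ np.outer(X_i, X_i) @ A     # A' (X_i X_i') A
    print(np.isclose(lhs, rhs))          # True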

See "Least Squares Fitting of Data", section 2: "Linear Fitting of nD Points Using Orthogonal Regression".

As mcdowella mentioned, you'll need to solve a 3x3 eigensystem.
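
Equivalently, the normal can be taken from the SVD of the centred data matrix, which avoids forming the 3x3 scatter matrix explicitly; a minimal numpy sketch (names illustrative, not from the referenced paper):

    import numpy as np

    def fit_plane_svd(points):
        points = np.asarray(points, dtype=float)
        centroid = points.mean(axis=0)
        # The right singular vector for the smallest singular value of the
        # centred data matrix is the eigenvector of SUM_i X_i X_i' with the
        # smallest eigenvalue.
        _, _, vt = np.linalg.svd(points - centroid)
        normal = vt[-1]                  # singular values come in descending order
        return normal[0], normal[1], normal[2], -normal.dot(centroid)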

celion