
I'm trying to do Principal Component Analysis on the breast_cancer dataset using Python's sklearn, and I can't understand why the dot products between the eigenvectors (3 components) aren't zero.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA

X = load_breast_cancer().data
pca = PCA(n_components=3).fit(X)

frst = pca.components_[0, :]
scnd = pca.components_[1, :]
thrd = pca.components_[2, :]
orth1 = np.dot(frst, scnd)
orth2 = np.dot(scnd, thrd)
print(orth1)
print(orth2)

out:

0.0
1.52655665886e-16

Anastasiia
    Possible duplicate of [Is floating point math broken?](https://stackoverflow.com/questions/588004/is-floating-point-math-broken) – Sneftel Sep 16 '17 at 15:34

1 Answer


Floating-point arithmetic isn't always exact, since computers use a finite number of digits to represent numbers that may require infinitely many. 1.52655665886e-16 is on the order of machine epsilon (~2.22e-16 for float64), the upper bound on the relative error introduced by a single floating-point operation, so I'd count it as 0.
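This can be checked directly: NumPy exposes machine epsilon, and comparisons against zero should use a tolerance rather than exact equality. A minimal sketch:

```python
import numpy as np

# Machine epsilon for float64: the spacing between 1.0 and the
# next representable number, ~2.22e-16.
eps = np.finfo(np.float64).eps
print(eps)  # 2.220446049250313e-16

# The "nonzero" dot product from the question is at this scale,
# so a tolerance-based comparison treats it as zero.
dot = 1.52655665886e-16
print(np.isclose(dot, 0.0))  # True
```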

EDIT: You could also run into this issue if your matrix doesn't have distinct eigenvalues.
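The same rounding noise shows up in any orthogonal eigenvector computation, not just sklearn's PCA. A small self-contained sketch using NumPy's eigh (a stand-in here for what PCA does internally via SVD): the eigenvectors of a symmetric matrix are orthonormal, so their Gram matrix should equal the identity up to machine-epsilon-sized off-diagonal entries.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
S = A + A.T  # symmetric, so eigh returns orthonormal eigenvectors

w, V = np.linalg.eigh(S)
gram = V.T @ V  # ideally the 5x5 identity

# Off-diagonal entries are ~1e-16 rather than exactly 0,
# so compare with a tolerance instead of ==.
print(np.allclose(gram, np.eye(5)))  # True
```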

Skam