I have a 7x7 covariance matrix (represented as a numpy array).
import numpy as np

t = np.array(
[
[1.4, 0.3, 0.4, 0.8, 0.4, 0.9, 0.3],
[0.3, 1.3, 0.4, 2.3, 0.4, 2.4, 0.4],
[0.4, 0.4, 1.3, 2.8, 0.4, 1.0, 0.3],
[0.8, 2.3, 2.8, 9.5, 1.0, 7.0, 1.0],
[0.4, 0.4, 0.4, 1.0, 1.1, 1.2, 0.3],
[0.9, 2.4, 1.0, 7.0, 1.2, 7.7, 1.0],
[0.3, 0.4, 0.3, 1.0, 0.3, 1.0, 0.5],
],
dtype=np.float64,
)
I have checked that this matrix is symmetric:
np.allclose(t, t.T)
True
And np.linalg.svd returns valid, non-negative singular values (that check is shown after the output below). However, np.linalg.eigvalsh returns a negative eigenvalue:
min(np.linalg.eigvalsh(t))
-0.06473876145336957
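For reference, the singular-value check is along these lines (a minimal sketch; compute_uv=False asks np.linalg.svd for just the singular values):

s = np.linalg.svd(t, compute_uv=False)  # singular values only
print(s.min())  # smallest singular value; non-negative, as noted above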
This doesn't make much sense to me, as I have checked that the columns of the matrix are linearly independent (by computing the reduced row echelon form of the matrix).
import sympy
reduced_form, inds = sympy.Matrix(t).rref()  # inds lists the pivot columns; all 7 columns are pivots
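As an extra sanity check (not one of my original steps, just a sketch), the SVD-based np.linalg.matrix_rank should agree with the rref result:

# matrix_rank counts singular values above a tolerance, so it should
# confirm the full-rank conclusion from the rref above.
print(np.linalg.matrix_rank(t))  # I expect 7 here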
I have seen similar issues where people report eigvalsh returning negative eigenvalues for well-behaved matrices, but none of the suggestions have helped. Examples:
- Why is scipy's eigh returning unexpected negative eigenvalues?
- numpy.cov or numpy.linalg.eigvals gives wrong results
I was wondering if anyone has faced a similar issue with np.linalg.eigvalsh and has any recommendations on how to solve it?
Thank you so much.