I have a 7x7 covariance matrix (represented as a numpy array).

import numpy as np

t = np.array(
    [
        [1.4, 0.3, 0.4, 0.8, 0.4, 0.9, 0.3],
        [0.3, 1.3, 0.4, 2.3, 0.4, 2.4, 0.4],
        [0.4, 0.4, 1.3, 2.8, 0.4, 1.0, 0.3],
        [0.8, 2.3, 2.8, 9.5, 1.0, 7.0, 1.0],
        [0.4, 0.4, 0.4, 1.0, 1.1, 1.2, 0.3],
        [0.9, 2.4, 1.0, 7.0, 1.2, 7.7, 1.0],
        [0.3, 0.4, 0.3, 1.0, 0.3, 1.0, 0.5],
    ],
    dtype=np.float64,
)

I have checked that this matrix is symmetric.

np.allclose(t, t.T)
True

np.linalg.svd also returns valid non-negative singular values. However, np.linalg.eigvalsh returns a negative eigenvalue.

min(np.linalg.eigvalsh(t))
-0.06473876145336957

This doesn't make much sense to me, as I have also checked that the columns of the matrix are linearly independent (by computing the reduced row echelon form of the matrix).

import sympy
reduced_form, inds = sympy.Matrix(t).rref()

I have seen similar issues where people report eigvalsh returning a negative eigenvalue for a well-behaved matrix, but none of the suggestions have helped. Examples:

  1. Why is scipy's eigh returning unexpected negative eigenvalues?
  2. numpy.cov or numpy.linalg.eigvals gives wrong results

I was wondering if anyone has faced a similar issue with np.linalg.eigvalsh and has any recommendations on how to solve it?

Thank you so much.

vpy
  • Considering these previous questions, have you checked whether things match your expectations when you declare the matrix as complex128 instead of float64? If not, what happened? – Marcus Müller Apr 14 '23 at 06:01
  • @MarcusMüller I just re-ran the code snippets above with dtype=np.complex128. Unfortunately, I still get a negative eigenvalue; in fact, the eigenvalues are identical with np.float64 and np.complex128. – vpy Apr 14 '23 at 06:07
  • Note that this negative eigenvalue is greater than -0.1, so it is not that negative. And your "covariance" matrix is obviously rounded to 0.1 precision. In other words, with the precision you have chosen, -0.06 is positive... – chrslg Apr 14 '23 at 07:20

1 Answer

It's totally normal for symmetric matrices to have negative eigenvalues. A matrix being Hermitian only guarantees that its eigenvalues are real. It doesn't guarantee that the eigenvalues are positive.
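
This is also why the np.linalg.svd check doesn't tell you anything here: for a symmetric matrix, the singular values are the absolute values of the eigenvalues, so they are non-negative by construction. A minimal illustration:

import numpy as np

# Symmetric, but indefinite: eigenvalues are -1 and 3.
a = np.array([[1.0, 2.0],
              [2.0, 1.0]])

print(np.linalg.eigvalsh(a))               # [-1.  3.]
print(np.linalg.svd(a, compute_uv=False))  # [3.  1.]  (absolute values, sorted)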

It's not normal for a covariance matrix to have negative eigenvalues: a true covariance matrix is positive semidefinite by construction. So this isn't actually a covariance matrix, despite what you thought. Perhaps you rounded the entries of an actual covariance matrix? Rounding to one decimal place can easily push a small eigenvalue below zero.
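
If you need to use t as a covariance matrix anyway (for sampling, a Cholesky factorization, etc.), one common workaround is to project it back onto the positive semidefinite cone by clipping the negative eigenvalues. A sketch (nearest_psd is just an illustrative helper name, not a NumPy function):

import numpy as np

def nearest_psd(m, eps=0.0):
    # Symmetrize, clip negative eigenvalues to eps, and rebuild.
    # For symmetric input this is the Frobenius-norm projection onto
    # the positive semidefinite cone.
    sym = (m + m.T) / 2.0
    vals, vecs = np.linalg.eigh(sym)
    vals = np.clip(vals, eps, None)
    return vecs @ np.diag(vals) @ vecs.T

t_psd = nearest_psd(t)
print(np.linalg.eigvalsh(t_psd).min())  # now >= 0 (up to floating-point error)

Note that this changes the entries slightly, so whether it is acceptable depends on what you do with the matrix downstream.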

user2357112