
For my problem I am only interested in a few eigenstates (those with the smallest eigenvalues) of a sparse real symmetric matrix A. As far as I can see, Arpack uses a different (iterative) method and should be much faster than the full diagonalization from the LinearAlgebra package. Why is it much slower in my example?

using LinearAlgebra, SparseArrays, Arpack
A = sprand(5000, 4995, 0.01) # matrix with (at least) 5-dimensional nullspace
H = sparse(Hermitian(A * A'))
@time E, v = eigen(Matrix(H))
@time E, v = eigs(H, nev=10, which=:SM)

> 12.059152 seconds (27 allocations: 764.733 MiB, 0.72% gc time)
> 37.628222 seconds (680 allocations: 1.424 GiB, 0.47% gc time)
varantir

1 Answer


When computing the smallest eigenvalues with eigs, it is necessary to solve (H - λ*I)\x for some shift λ at each iteration of the algorithm. In our implementation of eigs this results in a sparse factorization of H - λ*I at each iteration, and that is costly. Furthermore, computing the smallest eigenvalues sometimes requires many more iterations than computing the largest ones, but I'd suspect the cost of the sparse factorizations is dominating here. You can get an idea of that cost by computing with which=:LM instead of :SM, since the former only requires matrix-vector multiplications.
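A quick way to see the difference is a sketch like the following. Note this is illustrative: timings will vary by machine, and the explicit sigma value is an assumption on my part (chosen slightly away from 0 because H is singular), not something stated above.

```julia
using LinearAlgebra, SparseArrays, Arpack

A = sprand(5000, 4995, 0.01)
H = sparse(Hermitian(A * A'))

# Largest-magnitude eigenvalues: only matrix-vector products, no factorization.
@time eigs(H, nev=10, which=:LM)

# Shift-and-invert around a small shift: eigs factorizes H - sigma*I once and
# reuses that factorization across iterations, then returns the eigenvalues
# closest to sigma, i.e. the smallest ones here.
@time eigs(H, nev=10, sigma=1e-6)
```

If the :LM run is fast while :SM is slow, that supports the factorization cost being the bottleneck, and the sigma variant is usually the cheaper route to the smallest eigenvalues.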

Andreas Noack