I'm currently doing FEM calculations in Java on very large square matrices, up to 1M x 1M. These are very sparse, though, with under 10M non-zero entries. I'm using ojAlgo with the SparseStore matrix implementation and I'm really happy with it so far. The problem is that when I solve the linear system at the end using LU.R064.make().solve(A, b), with SparseStore A and b, the solver internally converts the sparse matrices into dense ones, leading to huge memory costs and runtimes. Is there a more efficient way to use ojAlgo, or another library?
1 Answer
There are currently no sparse matrix decompositions in ojAlgo.
There are some iterative equation system solvers that work with sparse "equations". The selection is somewhat limited – Gauss-Seidel and conjugate gradient – and there are preconditions for when each can be used.
Have a look at this interface and its implementations:
org.ojalgo.matrix.task.iterative.IterativeSolverTask.SparseDelegate
Maybe that SparseStore can be replaced by a List<Equation>? Equation instances can be sparse, and you can feed that list directly to the solver.
Also note that, in addition to SparseStore, there are RowsSupplier and ColumnsSupplier, which also implement MatrixStore. Their rows/columns can be wrapped (no copying) to create Equation instances.

apete
- Doing it like this: ConjugateGradientSolver.R064.solve(A,b); takes up even more memory than the LU variant. I can't seem to replicate your approach. Would you be so kind as to provide an example? – Jakob Rainer May 24 '23 at 09:23
- There is no (factory) constant ConjugateGradientSolver.R064. You are in fact using SolverTask.R064, and that will not give you a ConjugateGradientSolver. You have to do new ConjugateGradientSolver(); – apete May 24 '23 at 18:11
- If possible, replace the SparseStore with a List. – apete May 24 '23 at 18:15
- Tip for others: be careful when using nonzeros().stream() – it still yields explicitly stored zeros. – Jakob Rainer Jun 01 '23 at 10:30
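A sketch of working around that tip, assuming the SparseStore.R064 factory from the question and that nonzeros() iterates the stored entries (an explicitly stored zero may still be reported):

```java
import org.ojalgo.matrix.store.SparseStore;

public class NonzeroFilterDemo {

    public static void main(String[] args) {

        SparseStore<Double> a = SparseStore.R064.make(3, 3);
        a.set(0, 0, 4.0);
        a.set(1, 1, 0.0); // an explicitly stored zero may still show up in nonzeros()
        a.set(2, 2, 2.0);

        // Filter out numerically zero entries before processing.
        long count = a.nonzeros().stream()
                .filter(nz -> nz.doubleValue() != 0.0)
                .count();

        System.out.println(count); // only the entries that are actually non-zero
    }
}
```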