5

I'm trying to solve a system of equations with a 1 million x 1 million square coefficient matrix and a 1-million-entry right-hand-side vector.

To do this, I'm using np.linalg.solve(matrix, answers), but it's taking a very long time.

Is there a way to speed it up?

Thanks @Chris, but that doesn't answer the question: I've also tried the SciPy module and it still takes a very long time to solve. I don't think my computer can hold that much data in RAM.

OK, for clarity: I've just found out that the matrix I'm trying to solve is a Hilbert matrix.

ortunoa
  • Does this answer your question? [Best performance method when solving 7000x7000 linear system with python](https://stackoverflow.com/questions/59460326/best-performance-method-when-solving-7000x7000-linear-system-with-python) – Chris Jun 28 '22 at 17:01
  • 1
    How much memory does your computer have? A 1Mx1M matrix should take a few terabytes of RAM. – Dima Chubarov Jun 28 '22 at 17:18
  • 16 GB. Is there any other place where NumPy can store the matrix as it solves it, besides RAM? – ortunoa Jun 28 '22 at 17:19
  • Try caching it. Or use numba since you are using numpy. If not, you can try cython or PyPy –  Jun 28 '22 at 17:28
  • 1
    Solving a system of linear equations requires fast access to the whole matrix, so storing the matrix on disk is usually not an option. Since every double-precision number occupies 8 bytes, your computer's memory could hold about a 40,000x40,000 matrix at most. For a 1Mx1M matrix you would probably want at least 12 TB on a single machine or in a cluster. – Dima Chubarov Jun 28 '22 at 17:39

2 Answers

3

Please reconsider the need for solving such a HUGE system unless your system is very sparse.

Indeed, it is barely possible even to store the input/output on a PC storage device: the dense input matrix takes 8 TB with double-precision values, the output will certainly also take a few TB, and that is not counting the temporary storage needed to compute the result (at least another 8 TB for a dense matrix). Sparse matrices can help a lot if your input matrix is almost entirely zeros, but you would need it to contain more than about 99.95% zeros for it to fit in your RAM.
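
For reference, here is a minimal sketch of the sparse route with SciPy, assuming you can build the matrix in a sparse format in the first place (the row/column/value arrays below are toy placeholders, not your data):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Toy-sized example of the sparse workflow; the same code shape applies to a large
# system, provided almost all entries are zero and only the non-zeros are stored.
n = 5
rows = np.array([0, 1, 2, 3, 4])            # row indices of the non-zero entries
cols = np.array([0, 1, 2, 3, 4])            # column indices of the non-zero entries
vals = np.array([2.0, 3.0, 4.0, 5.0, 6.0])  # the non-zero values themselves

A = sp.csr_matrix((vals, (rows, cols)), shape=(n, n))  # compressed sparse row storage
b = np.ones(n)

x = spla.spsolve(A, b)  # sparse direct solve: never materialises a dense n x n array
```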

Furthermore, the time complexity of solving a system is O(n m min(n,m)), so O(n^3) in your case (see: this post). This means on the order of a billion billion (10^18) operations. A basic mainstream processor does not exceed 0.5 TFlops; in fact, my relatively good i5-9600KF reaches 0.3 TFlops in the computationally intensive LINPACK benchmark. This means the computation would take about a month assuming it is bounded only by the speed of a mainstream processor.

In practice, solving a large system of equations is known to be memory bound, so it will be much slower, because RAM is a bottleneck in modern computers (see: memory wall). So on a mainstream PC this should take from several months to a year, assuming the computation can even be done in RAM, which, as said before, is not possible for a dense system of this size. Since high-end SSDs are about an order of magnitude slower than the RAM of a good PC, you should expect the computation to take several years. Not to mention that a 20 TB high-end SSD is very expensive, and it might be a good idea to consider power outages and OS failures over such a long computation time... Again, sparse matrices can help a lot, but note that solving sparse systems is known to be significantly slower than dense ones unless the number of non-zeros is pretty small.
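
To make the order of magnitude concrete, here is a small back-of-envelope calculation in Python (the 0.3 TFlops figure is the LINPACK number mentioned above; your hardware will differ):

```python
# Rough estimate only: assumes a dense double-precision matrix and a compute-bound LU solve.
n = 1_000_000
bytes_per_double = 8

matrix_tb = n * n * bytes_per_double / 1e12   # storage for the dense matrix alone
flops = (2 / 3) * n ** 3                      # classic LU factorisation cost, ~(2/3) n^3
seconds = flops / 0.3e12                      # at 0.3 TFlops sustained

print(f"matrix storage: {matrix_tb:.0f} TB")          # ~8 TB
print(f"best-case time: {seconds / 86400:.0f} days")  # ~26 days, ignoring memory/disk limits
```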

Such systems are solved on supercomputers (or at least large computing clusters), not regular PCs. This requires distributed computing and tools like MPI and distributed linear solvers. A whole field of research works on making them efficient at large scale.

Note that computing approximations can be faster, but the space problem has to be solved first...
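
As one illustration of the approximation route, here is a minimal matrix-free sketch using scipy.sparse.linalg.gmres with a LinearOperator, so the full matrix is never stored. The Hilbert-matrix matvec (entries 1/(i + j + 1)) is only an example based on the question's update, and the Hilbert matrix is so ill-conditioned that convergence is not guaranteed:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

n = 1000                 # small size for illustration
idx = np.arange(n)

def hilbert_matvec(v):
    # Compute H @ v row by row, generating the entries 1/(i + j + 1) on the fly,
    # so the n x n matrix is never materialised in memory.
    out = np.empty(n)
    for i in range(n):
        out[i] = np.dot(1.0 / (i + idx + 1.0), v)
    return out

H = LinearOperator((n, n), matvec=hilbert_matvec, dtype=np.float64)
b = np.ones(n)

x, info = gmres(H, b, atol=1e-10)   # info == 0 means the requested tolerance was reached
```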

Jérôme Richard
  • Thanks for your detailed response. I'm trying to solve this as part of a math/computer science problem, and I'm starting to come to terms with the fact that brute-forcing it might not be the way to go. – ortunoa Jun 28 '22 at 22:24
0

You could try the formula for the inverse of the Hilbert matrix given on Wikipedia: https://en.wikipedia.org/wiki/Hilbert_matrix

As noted in the answer by @jérôme-richard, you cannot store the whole inverse matrix in memory. Instead, you would have to write a function that generates one column of the inverse Hilbert matrix at a time, scales it by the corresponding entry of your right-hand-side vector, and accumulates the result. Doing that 1 million times (once per column) gives you the solution vector. Late answer, but I hope it still helps!
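
A minimal sketch of that column-at-a-time idea, using the explicit inverse formula from the linked Wikipedia page with exact integer/rational arithmetic (the toy size and the check vector are illustrative only; for n = 1,000,000 the binomial coefficients become astronomically large):

```python
from math import comb
from fractions import Fraction

# Inverse Hilbert matrix entries (1-based indices i, j, size n):
#   (H^-1)[i, j] = (-1)^(i+j) * (i+j-1) * C(n+i-1, n-j) * C(n+j-1, n-i) * C(i+j-2, i-1)^2

def solve_hilbert(b):
    """Return x = H^-1 @ b without storing H^-1, one column at a time."""
    n = len(b)
    x = [Fraction(0)] * n
    for j in range(1, n + 1):            # j-th column of H^-1, scaled by b[j]
        for i in range(1, n + 1):
            inv_ij = (-1) ** (i + j) * (i + j - 1) \
                * comb(n + i - 1, n - j) * comb(n + j - 1, n - i) \
                * comb(i + j - 2, i - 1) ** 2
            x[i - 1] += inv_ij * Fraction(b[j - 1])
    return x

# Toy check: the 3x3 Hilbert matrix times [1, 1, 1] is [11/6, 13/12, 47/60],
# so solving with that right-hand side should recover [1, 1, 1].
print(solve_hilbert([Fraction(11, 6), Fraction(13, 12), Fraction(47, 60)]))
```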

S. U.