
I created a Python script that relies heavily on matplotlib.pyplot.contour. I tested it on my computer and on a Google Cloud server, and it never used more than 500 MB of RAM. But when I moved it to a server with Slurm, it failed with an "Exceeded job memory limit" error under a 7000 MB RAM limit. The Python version was 3.7.3 everywhere, with matplotlib 3.1.0 and numpy 1.16.4. All of them were installed with conda, in the same week.

There is also a Python 3.6.1 installation on the Slurm server, with matplotlib 2.2.0 and numpy 1.14.2. It could run the script without the "Exceeded job memory limit" error, but it was much slower.

Here is a simple piece of code that reproduces the error under Slurm but runs fine everywhere else:

from matplotlib import pyplot as plt
import numpy as np

R = np.linspace(-5, 5, 1500)
U, V = np.meshgrid(R, R)

for r in R:
    # each call adds another ContourSet to the same implicit axes
    D = plt.contour(U, V, U**2 + V**2 + r**2, levels=[21, 23, 25, 27, 29])

On my PC it finished with about 818 MB of RAM usage, but Slurm killed it at 2473 MB (with a 2000 MB limit).
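One general mitigation (a sketch of a standard matplotlib pattern, not something taken from this thread, and untested against the Slurm-specific behaviour) is to draw into an explicit axes and detach each ContourSet's artists once their data has been read, so the figure does not accumulate 1500 sets of line collections:

from matplotlib import pyplot as plt
import numpy as np

R = np.linspace(-5, 5, 1500)
U, V = np.meshgrid(R, R)
Z0 = U**2 + V**2  # the r-independent part, computed once

fig, ax = plt.subplots()
for r in R:
    D = ax.contour(U, V, Z0 + r**2, levels=[21, 23, 25, 27, 29])
    # ... read the contour data here, e.g. from D.allsegs ...
    for coll in D.collections:
        # detach the line collections so they can be garbage-collected
        coll.remove()
plt.close(fig)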

Question: How could I solve this memory problem?

Botond
  • Does it help to set the matplotlib backend? i.e. https://stackoverflow.com/questions/4930524/how-can-i-set-the-backend-in-matplotlib-in-python – Robert Davy Aug 21 '19 at 23:11 (a sketch of this suggestion follows the comments)
  • @RobertDavy I tried a lot of backends, but it did not help. – Botond Aug 22 '19 at 06:33
  • @RobertDavy I deleted the miniconda3 folder, then downloaded and installed it again. I did everything the same way and it's working now. I don't understand it, but I'm happy with the result. – Botond Aug 23 '19 at 10:06
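
For reference, the backend change suggested in the first comment would look like the following (a non-interactive backend such as Agg, selected before pyplot is imported); the asker reported that it did not help:

import matplotlib
matplotlib.use("Agg")  # must happen before the pyplot import
from matplotlib import pyplot as plt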

0 Answers