Does it make sense to use numpy's memmap across multiple cores (MPI)?

I have a file on disk.

Can I create a separate memmap object on each core, and use it to read different slices from the file?

What about writing to it?

    Could you describe your application in a bit more detail? Also, there is a pretty helpful answer here: https://stackoverflow.com/questions/16149803/working-with-big-data-in-python-and-numpy-not-enough-ram-how-to-save-partial-r/16633274#16633274 – Jan Jun 20 '20 at 01:29

1 Answer


Q : "Does it make sense to use numpy's memmap across multiple cores (MPI)?"

Yes ( ... even without MPI, using just Python's native thread- or process-based forms of concurrent processing )

Q : "Can I create a separate memmap-object on each core, and use it to read different slices from the file?"

Yes.
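
As a minimal sketch of this read pattern (the file name, size, and dtype below are assumptions, not anything from the question): each worker process opens its *own* `numpy.memmap` in read-only mode and touches only its assigned slice, so the OS page cache is shared and no data is pickled between processes.

```python
import numpy as np
from multiprocessing import Pool

FNAME = "data.raw"   # hypothetical raw binary file on disk
N = 1_000_000        # assumed total number of elements
DTYPE = np.float64

def slice_sum(bounds):
    """Open a private memmap inside the worker and reduce one slice."""
    start, stop = bounds
    mm = np.memmap(FNAME, dtype=DTYPE, mode="r", shape=(N,))
    return float(mm[start:stop].sum())

if __name__ == "__main__":
    # Create the file once for the demo (in real use it already exists).
    np.memmap(FNAME, dtype=DTYPE, mode="w+", shape=(N,))[:] = 1.0

    # Four disjoint slices, one per worker process.
    bounds = [(i * N // 4, (i + 1) * N // 4) for i in range(4)]
    with Pool(4) as pool:
        partials = pool.map(slice_sum, bounds)
    print(sum(partials))  # 1000000.0
```

The same idea carries over to MPI: each rank computes its own `(start, stop)` from its rank number and maps the file independently.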

Q : "What about writing to it?"

The same ( sure, provided the file has been opened in a writeable mode ... )
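
A hedged sketch of concurrent writing (file name, shape, and dtype are again assumptions): each process maps the same file with `mode="r+"` and writes only to its own disjoint slice, so no locking is needed; overlapping writes would require explicit coordination. `flush()` pushes the dirty pages to disk.

```python
import numpy as np
from multiprocessing import Process

FNAME = "shared.bin"  # hypothetical shared file
N = 8
DTYPE = np.int64

def fill_slice(start, stop, value):
    """Map the same file writably and fill one disjoint slice."""
    mm = np.memmap(FNAME, dtype=DTYPE, mode="r+", shape=(N,))
    mm[start:stop] = value
    mm.flush()  # force the written pages out to the file

if __name__ == "__main__":
    # Pre-create the file, zero-filled.
    np.memmap(FNAME, dtype=DTYPE, mode="w+", shape=(N,))[:] = 0

    procs = [Process(target=fill_slice, args=(i * 2, (i + 1) * 2, i + 1))
             for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    print(np.memmap(FNAME, dtype=DTYPE, mode="r", shape=(N,))[:])
    # -> [1 1 2 2 3 3 4 4]
```

With MPI the pattern is identical per rank; for overlapping regions you would need MPI synchronization (or MPI-IO) rather than relying on the memmap alone.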
