I have a huge dictionary, about 5 GB, that I would like to keep in memory for fast access later.
I am fine with it consuming RAM the whole time if that speeds up the reads.
The general scheme I am looking for is:

script1.py

with open("some_really_big_file", 'rb') as handle:
    a = pickle.load(handle) --->Will take ~2 minutes

physical_memory_address = store_to_persistent_memory(a)

with open('memory_path', 'wb') as handle:
    pickle.dump(physical_memory_address , handle)

script2.py

import pickle

with open("memory_path", 'rb') as handle:
    mem_path = pickle.load(handle)

# load_persistent_memory is the counterpart call I am looking for
big_dict = load_persistent_memory(mem_path)  # ---> really fast
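
For illustration, here is a minimal sketch of what such a pair of calls could look like on top of multiprocessing.shared_memory (Python 3.8+). Names like "big_dict_shm" are placeholders, and whether the segment outlives script1 depends on the OS and Python's resource tracker. Note that this only keeps the serialized bytes resident, so script2 still pays the full pickle.loads cost:

# script1.py (sketch)
import pickle
from multiprocessing import shared_memory

with open("some_really_big_file", 'rb') as handle:
    a = pickle.load(handle)  # slow, ~2 minutes

payload = pickle.dumps(a, protocol=pickle.HIGHEST_PROTOCOL)
shm = shared_memory.SharedMemory(create=True, size=len(payload), name="big_dict_shm")
shm.buf[:len(payload)] = payload  # copy the serialized dict into the shared segment

with open('memory_path', 'wb') as handle:
    pickle.dump((shm.name, len(payload)), handle)  # the "address" is just the segment name + size

# script2.py (sketch)
import pickle
from multiprocessing import shared_memory

with open('memory_path', 'rb') as handle:
    name, size = pickle.load(handle)

shm = shared_memory.SharedMemory(name=name)  # attaching is fast
big_dict = pickle.loads(shm.buf[:size])      # deserializing ~5 GB is still slow
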
    Does this answer your question? [Possible to share in-memory data between 2 separate processes?](https://stackoverflow.com/questions/1268252/possible-to-share-in-memory-data-between-2-separate-processes) – johannesack Apr 23 '20 at 08:29
  • So basically the first script has to stay open, or does the mapping stay in memory as long as mmap.close() isn't called? – DsCpp Apr 23 '20 at 08:33
  • It might automatically be closed by the system, I really don't know. – johannesack Apr 23 '20 at 08:53
  • Well, mmap doesn't seem to be the solution to the problem, as the data is only virtually in memory; when the object is actually queried, the whole process of fetching it still occurs (see the sketch below). – DsCpp Apr 23 '20 at 14:13
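
For context on that last comment, here is a minimal sketch of the mmap-based idea from the linked question, assuming the pickled file is mapped directly. The mapping itself is cheap and paged in lazily, but pickle still has to walk and materialize the entire object, which is presumably the fetching being described:

import mmap
import pickle

with open("some_really_big_file", 'rb') as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)  # map the file; nothing is read yet
    a = pickle.loads(mm)  # still deserializes the full ~5 GB object graph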

0 Answers