I have a huge dictionary that takes up about 5 GB, and I would like to keep it in memory so that other scripts can read it quickly later. I am fine with it consuming RAM the whole time if that speeds up the reads.

The general scheme I am looking for is:
script1.py:

```python
import pickle

with open("some_really_big_file", "rb") as handle:
    a = pickle.load(handle)  # takes ~2 minutes

# hypothetical call: stash `a` in persistent memory, get back an address
physical_memory_address = store_to_persistent_memory(a)
with open("memory_path", "wb") as handle:
    pickle.dump(physical_memory_address, handle)
```
script2.py:

```python
import pickle

with open("memory_path", "rb") as handle:
    mem_path = pickle.load(handle)
d = load_persistent_memory(mem_path)  # hypothetical call: really fast
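```

For concreteness, here is a rough sketch of the kind of thing I have in mind, using `multiprocessing.shared_memory` (Python 3.8+) to keep the raw pickle bytes in RAM; the segment name `"big_dict"` is just an example, and I realize this only avoids the disk read, since `pickle.loads` still has to rebuild the whole dictionary:

```python
# rough sketch: keep the pickled bytes in a named shared-memory segment;
# "big_dict" is an example name, not an existing convention

# script1.py -- read from disk once, publish the bytes to shared memory
import pickle
from multiprocessing import shared_memory

with open("some_really_big_file", "rb") as handle:
    payload = handle.read()  # raw pickle bytes, ~5 GB

shm = shared_memory.SharedMemory(create=True, size=len(payload), name="big_dict")
shm.buf[:len(payload)] = payload  # copy into shared RAM
# script1 has to stay alive: on POSIX the segment can be cleaned up
# when the creating process exits

# script2.py -- attach by name, no disk I/O
import pickle
from multiprocessing import shared_memory

shm = shared_memory.SharedMemory(name="big_dict")
d = pickle.loads(shm.buf)  # still pays the full unpickling cost
```

Since the `pickle.loads` call would still dominate, what I am really after is a way to keep the already-deserialized dictionary alive between scripts, not just the bytes.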