Say I have a huge immutable dataset represented as a tuple. Let's say this dataset consumes most of the working memory, so it is impossible to copy it.
Is there a way in Python to share that tuple with other Python processes on the same machine, such that:
- the data does not need to be copied, neither in whole nor in part
- access to the data is fast and does not rely on IPC like sockets and pipes
- I don't have to represent the data as raw shared memory bytes, i.e. I can keep using it as a tuple
- the representation maintains immutability semantics, i.e. I can't accidentally overwrite the memory and ruin computations
- ideally it would be cross-platform, or at least work on Windows and Linux.