I have two code bases, one in Python and one in C++, and I want to share real-time data between them. I am trying to evaluate which option will work best for my specific use case:
- many small data updates from the C++ program to the Python program
- they both run on the same machine
- reliability is important
- low latency is nice to have
I can see a few options:
- One process writes to a flat file, the other process reads it. This is not scalable, it is slow, and it is prone to I/O errors.
- One process writes to a database, the other process reads it. This is more scalable and slightly less error-prone, but still very slow.
- Embed my Python program into the C++ one, or the other way round. I rejected that solution because both code bases are reasonably complex, and I preferred to keep them separate for maintainability reasons.
- Use sockets in both programs and send messages directly. This seems like a reasonable approach, but it does not take advantage of the fact that both programs run on the same machine (using localhost as the destination helps slightly, but it still feels cumbersome). There is a rough sketch of what I mean after this list.
- Use shared memory. So far this is the most satisfying solution I have found, but it has the drawback of being slightly more complex to implement (second sketch below).
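
For the socket option, this is roughly the Python receiving side I have in mind. It is only a sketch: the Unix domain socket path and the 4-byte length-prefix framing are placeholders I made up, and the C++ sender would have to follow the same conventions.

```python
# Sketch of the Python side receiving updates over a Unix domain socket.
# The socket path and the 4-byte little-endian length prefix are my own
# placeholders; the C++ sender would need to use the same framing.
import os
import socket
import struct

SOCK_PATH = "/tmp/cpp_to_py.sock"  # hypothetical path

def recv_exact(conn, n):
    """Read exactly n bytes, or raise if the sender closes early."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("sender closed the connection")
        buf += chunk
    return buf

def handle_update(payload: bytes):
    print("got", len(payload), "bytes")  # application logic goes here

def main():
    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as srv:
        srv.bind(SOCK_PATH)
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while True:
                # Each message: 4-byte length prefix, then the payload.
                (length,) = struct.unpack("<I", recv_exact(conn, 4))
                handle_update(recv_exact(conn, length))

if __name__ == "__main__":
    main()
```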
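For the shared-memory option, this is the kind of Python reader I am picturing, attached to a segment the C++ program would create with shm_open. The segment name, the layout (an 8-byte update counter followed by a fixed-size payload) and the polling loop are all assumptions for illustration; a real version would need a proper synchronisation scheme (seqlock, semaphore, ...), which is the extra complexity I mentioned.

```python
# Sketch of the Python side attaching to a POSIX shared-memory segment
# created by the C++ program (e.g. shm_open("/cpp_to_py", ...)). The name,
# the layout and the polling loop are assumptions for illustration only.
import struct
import time
from multiprocessing import shared_memory

SHM_NAME = "cpp_to_py"   # hypothetical name, must match the C++ side
PAYLOAD_SIZE = 64        # hypothetical fixed payload size in bytes

def handle_update(payload: bytes):
    print("update", payload[:8])  # application logic goes here

def main():
    # Attach to the existing segment; the C++ side is the creator/owner.
    shm = shared_memory.SharedMemory(name=SHM_NAME, create=False)
    try:
        last_seq = -1
        while True:
            # The C++ side is assumed to bump the counter after each write.
            (seq,) = struct.unpack_from("<Q", shm.buf, 0)
            if seq != last_seq:
                payload = bytes(shm.buf[8:8 + PAYLOAD_SIZE])
                last_seq = seq
                handle_update(payload)
            time.sleep(0.0005)  # crude polling; real signalling needed
    finally:
        shm.close()  # do not unlink here; the creator owns the segment

if __name__ == "__main__":
    main()
```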
Are there other solutions I should consider?