
I have two separate Python processes, p1 and p2, running on a Linux server. How can I read a dict of p1 from p2?

The two processes are independent, so I can't use a multiprocessing-based approach, and because of slow performance I don't want to use socket communication or a file-based approach. My Python version is 3.5.1.

V Y
  • http://stackoverflow.com/questions/1268252/python-possible-to-share-in-memory-data-between-2-separate-processes – Igor Jul 17 '16 at 17:25
  • By two separate processes, do you mean two different consoles are working, or two Python scripts in one console? If it is the first, I don't see another option but socket data transfer; if it is the second, you can store the dictionary in the folder where the scripts are (maybe as a text file) and read it from the other script. – Rockybilly Jul 17 '16 at 17:26
  • “How to share [...] among separate python processes” — “This is NOT about multiprocessing”. I'm confused. – spectras Jul 17 '16 at 17:30
  • I updated my question. The answers to questions/1268252/ explain why we can't do that, but what I want to know is how to do it. – V Y Jul 17 '16 at 17:39
  • @spectras , multiprocessing is about forked processes, my question is about independent processes. – V Y Jul 17 '16 at 18:08
  • @VY> So you have a process waiting for another to contact it? Well, that's what sockets are *for*. But perhaps you should give more background on what you are trying to accomplish as I sense you might have an [XY problem](http://meta.stackexchange.com/a/66378) here. – spectras Jul 17 '16 at 22:55
  • @VY> especially as the "performance" part is dubious. I mean, Unix domain sockets have a throughput of several million messages per second. If that's not enough, you're either using the wrong approach for your problem, or have so unusual a problem it's worth detailing. – spectras Jul 17 '16 at 23:00
  • @spectras, when the data size is small, socket performance is not so bad; however, if you compare reading 1G of data directly from memory with sending 1G of data over a socket, there is a huge difference in performance. This is not an XY problem; what I'm trying to solve is exactly in my question: "how to read a dict of p1 from p2?" – V Y Jul 17 '16 at 23:49
  • At the extreme, you can use a memory-mapped file (see [mmap](https://docs.python.org/3.0/library/mmap.html)). But keep in mind using it properly is pretty complex. It will just provide a stream of bytes, it's your job to make sense of it and sync your processes (with semaphores) so one does not overwrite the data the other is reading. But unless you're doing this as an exercise or a toy problem, there is definitely an architecture issue with what you want. How about using a proper store such as [redis](http://redis.io/) or [mongoDB](https://www.mongodb.com/)? – spectras Jul 18 '16 at 12:10
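
To make the mmap suggestion in the comment above concrete, here is a minimal sketch. The backing file name under /dev/shm and the 8-byte length header are made up for illustration, and it deliberately skips the locking (a semaphore or fcntl.flock) that real code would need so p2 never reads a half-written payload:

```python
# shared_dict_mmap.py -- minimal sketch of the mmap idea from the comment above.
# It backs the mapping with a file under /dev/shm (tmpfs, so reads stay in RAM),
# writes a pickled dict behind an 8-byte length header, then reads it back.
# In real use the writer section runs in p1 and the reader section in p2.
import mmap
import os
import pickle
import struct

PATH = "/dev/shm/p1_dict"      # hypothetical name for the shared backing file
SIZE = 16 * 1024 * 1024        # fixed mapping size; must be >= header + payload

# --- p1: write the dict ---
payload = pickle.dumps({"answer": 42}, protocol=pickle.HIGHEST_PROTOCOL)
fd = os.open(PATH, os.O_CREAT | os.O_RDWR)
os.ftruncate(fd, SIZE)
with mmap.mmap(fd, SIZE) as m:
    m[:8] = struct.pack("<Q", len(payload))      # length header
    m[8:8 + len(payload)] = payload              # pickled dict
os.close(fd)

# --- p2: read the dict (independent process, same file) ---
fd = os.open(PATH, os.O_RDONLY)
with mmap.mmap(fd, 0, prot=mmap.PROT_READ) as m:
    (length,) = struct.unpack("<Q", m[:8])
    shared = pickle.loads(m[8:8 + length])
os.close(fd)
print(shared)    # {'answer': 42}
```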

1 Answer


I think the only way of doing that is using IPC. You can do it with sockets or pipes, and for all these methods you have to serialise the dictionary with pickle or json. If the dictionary is big, that can take several seconds.
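
A minimal sketch of the socket-plus-pickle variant, assuming a Unix domain socket at a made-up path (/tmp/p1_dict.sock) and a placeholder dict; p1 runs the server and p2 the client:

```python
# p1_server.py -- p1 listens on a Unix domain socket and sends its pickled
# dict to any client that connects. Socket path and dict are placeholders.
import os
import pickle
import socket

SOCK_PATH = "/tmp/p1_dict.sock"
shared_dict = {"answer": 42, "items": [1, 2, 3]}

if os.path.exists(SOCK_PATH):
    os.unlink(SOCK_PATH)           # remove a stale socket file from a previous run

server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(SOCK_PATH)
server.listen(1)
while True:
    conn, _ = server.accept()
    with conn:
        conn.sendall(pickle.dumps(shared_dict, protocol=pickle.HIGHEST_PROTOCOL))
```

```python
# p2_client.py -- p2 connects, reads until p1 closes the connection,
# and rebuilds the dict.
import pickle
import socket

SOCK_PATH = "/tmp/p1_dict.sock"

client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
client.connect(SOCK_PATH)
chunks = []
while True:
    chunk = client.recv(65536)
    if not chunk:
        break
    chunks.append(chunk)
client.close()
d = pickle.loads(b"".join(chunks))
print(d)
```

The client just reads until the server closes the connection, so no length framing is needed for this one-shot transfer; a long-lived protocol would need it.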

If you don't want to do that, you need some kind of shared memory. The multiprocessing module allows that, but only with basic data types.
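
For reference, a minimal sketch of that shared-memory mechanism: multiprocessing.Value and multiprocessing.Array hold basic C-typed data, but they only work between a parent and the processes it spawns, so they don't cover the independent-process case in the question:

```python
# shared_value_sketch.py -- illustrates multiprocessing's shared memory for
# basic data types (a single int and an array of doubles). The values shown
# are placeholders.
import ctypes
import multiprocessing as mp

def worker(counter, numbers):
    with counter.get_lock():
        counter.value += 1          # shared int, protected by its built-in lock
    numbers[0] = 3.14               # shared array of doubles

if __name__ == "__main__":
    counter = mp.Value(ctypes.c_int, 0)        # one shared int
    numbers = mp.Array(ctypes.c_double, 5)     # five shared doubles
    p = mp.Process(target=worker, args=(counter, numbers))
    p.start()
    p.join()
    print(counter.value, list(numbers))        # 1 [3.14, 0.0, 0.0, 0.0, 0.0]
```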

edgarstack
  • I just updated my question: the two processes are independent, so I can't use a multiprocessing-based approach, and because of slow performance I don't want to use socket communication or a file-based approach. – V Y Jul 17 '16 at 17:40
  • I was facing that problem several weeks ago. The only way of doing that is through shared memory; however, I did not find anything for dictionaries. – edgarstack Jul 17 '16 at 17:41
  • Shared memory can only be used between forked processes; how can it be used between independent processes? – V Y Jul 17 '16 at 18:06