
I know that similar questions occasionally appear, e.g.

communication between 2 programs in python or Communication between two python scripts

but my problem is much simpler. I have two Python processes running continuously on the same computer, and the slave process occasionally needs three float numbers from the master process.

Not being a programming expert, my idea would be to reserve three float slots somewhere in memory, which the master process would constantly update, and the slave process would simply read those memory slots whenever it needed the information. So essentially the master is talking all the time and the slave is listening only when it needs information.

Is that possible to do without much fuss and effort? If possible, keep the answers simple, as I am not a programming expert.

Pygmalion
  • have you considered using sockets to communicate between the two processes? – James Kent Apr 09 '15 at 11:06
  • @JamesKent I didn't, can you please provide some good link for starters? (I came across it, but it seemed to be aimed at communication over the Internet rather than between processes on the same computer.) – Pygmalion Apr 09 '15 at 11:15
  • http://www.tutorialspoint.com/python/python_networking.htm is a reasonable resource for socket programming. As a general rule you would use sockets between machines or over the internet, but they can easily be used for inter-process communication as well, especially if there is a dedicated master process that hosts the port and the other processes connect to it (a rough sketch follows below). Depending on the IP/port you choose, you can do one-to-one, one-to-many with individual data streams, or one-to-many with the same data going to all listeners via multicast, depending on what you want to achieve. – James Kent Apr 10 '15 at 13:48
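
To make that concrete, here is a rough sketch of the socket approach James Kent describes, with the master hosting a port and handing out the latest values on each connection; the address 127.0.0.1:50007 and the struct message format are arbitrary choices for illustration, not something from the discussion above.

# master.py -- serves the current three floats to any client that connects
import socket
import struct

x, y, z = 1.0, 2.0, 3.0   # in the real master these come from the measurement loop

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 50007))
server.listen(1)

while True:
    conn, _ = server.accept()
    with conn:
        conn.sendall(struct.pack("!ddd", x, y, z))   # three doubles, 24 bytes

# slave.py -- connects only when it actually needs the numbers
import socket
import struct

with socket.create_connection(("127.0.0.1", 50007)) as s:
    data = b""
    while len(data) < 24:
        chunk = s.recv(24 - len(data))
        if not chunk:
            raise ConnectionError("connection closed early")
        data += chunk
x, y, z = struct.unpack("!ddd", data)
print(x, y, z)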

3 Answers


It depends on the level of programming expert you're not, how quickly you need these numbers to be updated, and what OS you're on.

All that being said, the "easy" way is simply to write them to a file (or files) that the slave process monitors for changes and reads whenever they're updated.
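
A rough sketch of that idea, assuming a plain text file values.txt and a 50 ms polling interval (both arbitrary choices): the master rewrites the file atomically, and the slave re-reads it only when the modification time changes.

# master side -- rewrite the file atomically so the slave never sees a half-written line
import os
import tempfile

def write_values(x, y, z, path="values.txt"):
    fd, tmp = tempfile.mkstemp(dir=".")
    with os.fdopen(fd, "w") as f:
        f.write("{} {} {}".format(x, y, z))
    os.replace(tmp, path)   # atomic rename over the old file

# slave side -- poll the modification time and re-read only when it changes
import os
import time

last_mtime = 0
while True:
    time.sleep(0.05)
    mtime = os.path.getmtime("values.txt")
    if mtime != last_mtime:
        last_mtime = mtime
        with open("values.txt") as f:
            x, y, z = map(float, f.read().split())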

Nick Bastin
  • If possible, it should be OS independent, but currently I am developing it for Windows. The idea of a disk file occurred to me too, but since updates should happen on a scale of 0.1 s, I would rather avoid involving the hard disk, for both speed and reliability. – Pygmalion Apr 09 '15 at 10:57
  • @Pygmalion: 100ms is trivial on any platform for a filesystem to accomplish. This file, when written, will be cached in memory by the VFS layer by any modern OS (and by "modern" I mean "written in the last 10 years"), and the update time will be set properly. If you want to get updates every 100ms then just set your slave to poll every 50ms and you will be more than fine on any modern system. Use memory if you need sub-ms timings. – Nick Bastin Apr 09 '15 at 11:25
  • +1 for the explanation. Still, does this mean that my disk would be constantly spinning (and become less responsive to other requests, as well as use more energy)? – Pygmalion Apr 09 '15 at 11:29
  • @Pygmalion: The filesystem layer in your operating system will force this file into memory (buffer) and will not keep spinning the drive if all you are doing is reading it. If you're constantly writing it then that will be pushed back to the underlying disk reasonably quickly and cause spinning (but not delay or bad responsiveness in general). You can also use a tool like [ImDisk](http://www.ltr-data.se/opencode.html/#ImDisk) to create a ram disk to place this file on. – Nick Bastin Apr 10 '15 at 09:02

Could the two processes (foo and bar below) not simply be threads of a master process?

import time
import threading

# Shared state: three floats, guarded by a lock
x = 0.0
y = 0.0
z = 0.0
lock = threading.RLock()

def foo():
    # "Slave": wakes up periodically and reads the shared values
    while True:
        time.sleep(10)
        with lock:
            print(x, y, z)

def bar():
    # "Master": updates the shared values every couple of seconds
    global x, y, z
    while True:
        time.sleep(2)
        with lock:
            x += 0.5
            y += 0.6
            z += 0.7

# Daemon threads are killed as soon as the main thread exits, so keep it alive
fooThread = threading.Thread(target=foo)
fooThread.daemon = True
barThread = threading.Thread(target=bar)
barThread.daemon = True

barThread.start()
fooThread.start()

try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    pass

The 'lock' may not be required, but is good practice in multithreaded programming.

georgevanburgh
  • Tempting idea, but no. To be more specific, there is one program ("master") for measuring and managing temperature, and other programs ("slaves") for measuring other things. Each of those programs is hundreds of lines long, and they are just too different to be part of a single process. – Pygmalion Apr 09 '15 at 11:14
  • The problem sounds like an ideal candidate for multithreading, with an individual thread for each 'measurer'. Assuming each of your polling services is a class, it should be trivial to spin up a thread for each one. If you're set on keeping them separate, the problem becomes much harder. The whole point of separate processes is to maintain memory isolation: one cannot simply (or easily) access another process's memory space. I'd personally look into [Flask](http://flask.pocoo.org/) to create a RESTful endpoint for each service (a rough sketch follows below), or the answers to the IPC questions you originally posted. – georgevanburgh Apr 09 '15 at 11:29
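
If you did go the Flask route mentioned in that comment, a bare-bones sketch might look like the following; the /values route, the port and the variable names are placeholders made up for illustration.

# master side -- expose the three floats over HTTP with Flask
from flask import Flask, jsonify

app = Flask(__name__)

# These would be kept up to date by the master's own measurement code
x, y, z = 0.0, 0.0, 0.0

@app.route("/values")
def values():
    return jsonify(x=x, y=y, z=z)

if __name__ == "__main__":
    app.run(port=5000)

The slave could then fetch the numbers only when it needs them, e.g. with requests.get("http://localhost:5000/values").json().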

In the end I used the RPyC library (https://rpyc.readthedocs.org/en/latest/). Easy to use and powerful.
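
For reference, a minimal sketch of how that can look with RPyC; the service class, the method name and port 18861 are placeholders of mine, not taken from the actual program.

# master.py -- exposes the three floats as an RPyC service
import rpyc
from rpyc.utils.server import ThreadedServer

x, y, z = 0.0, 0.0, 0.0   # kept up to date by the master's own loop

class ValuesService(rpyc.Service):
    def exposed_get_values(self):
        return x, y, z

if __name__ == "__main__":
    ThreadedServer(ValuesService, port=18861).start()

# slave.py -- connects only when it needs the numbers
import rpyc

conn = rpyc.connect("localhost", 18861)
x, y, z = conn.root.get_values()
conn.close()
print(x, y, z)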

Pygmalion