
I have a Python-based service daemon which is doing a lot of multiplexed I/O (select).

From another script (also Python) I want to query this service daemon about status/information and/or control the processing (e.g. pause it, shut it down, change some parameters, etc).

What is the best way to send control messages ("from now on you process like this!") and query processed data ("what was the result of that?") using Python?

I read somewhere that named pipes might work, but I don't know much about them, especially in Python, or whether there are any better alternatives.

Both the background service daemon AND the frontend will be programmed by me, so all options are open :)

I am using Linux.

agnsaft

2 Answers


Pipes and named pipes are a good solution for communicating between different processes. A pipe works like a shared memory buffer, but with an interface that mimics a simple file on each of its two ends. One process writes data at one end of the pipe, and another reads that data at the other end.

Named pipes are similar, except that the pipe is associated with a real file on your computer.


In Python, named pipe files are created with the os.mkfifo call (it returns None, so there is nothing useful to assign):

os.mkfifo(filename)

The child and parent then open this pipe as a file:

out = os.open(filename, os.O_WRONLY)
fin = open(filename, 'r')  # 'in' is a reserved word in Python, so use another name

To write

os.write(out, 'xxxx')

To read

line = fin.readline()
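
Putting the pieces together: a minimal runnable sketch (Python 2, to match the snippets above; the FIFO path and the 'pause' message are placeholder choices, not fixed names) in which a parent process sends a control message through a named pipe to a forked child:

import os

FIFO = '/tmp/service_ctrl'  # hypothetical path for the named pipe
if not os.path.exists(FIFO):
    os.mkfifo(FIFO)

pid = os.fork()
if pid == 0:
    # Child: opening the FIFO for reading blocks until a writer appears.
    fin = open(FIFO, 'r')
    line = fin.readline()
    print 'child received:', line.strip()
    fin.close()
    os._exit(0)
else:
    # Parent: opening for writing blocks until the reader is there.
    out = os.open(FIFO, os.O_WRONLY)
    os.write(out, 'pause\n')
    os.close(out)
    os.waitpid(pid, 0)
    os.unlink(FIFO)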

Edit: You may want to read more on "IPC and Python".

pyfunc
  • Thanks mate. Is this the preferred way for doing the task I described above? – agnsaft Sep 27 '10 at 20:03
  • 1
    @invictus: You could send a lot of custom messages this way which the process can handle. But for shutting down etc it is better to use signals with signal handlers in the process. – pyfunc Sep 27 '10 at 20:08
  • If I need multiple frontends communicating with the service at the same time, I would need multiple named pipes as well in order to "address" the query results correctly? – agnsaft Sep 28 '10 at 05:19
  • 1
    @invictus : Yes it does support two-way read and write. The setting depends on what you want to achieve with it. If you had multiple frontend communicating with the service, I would suggest a socket based solution, where a server could listen, connect and respond to command messages – pyfunc Sep 28 '10 at 19:03
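
To illustrate the signal suggestion in the comments above, here is a minimal sketch (the handler names and the pause flag are illustrative choices, not from the answer) of a daemon that shuts down cleanly on SIGTERM and toggles pausing on SIGUSR1:

import signal
import time

running = True
paused = False

def handle_term(signum, frame):
    # Ask the main loop to exit at its next check.
    global running
    running = False

def handle_usr1(signum, frame):
    # Toggle pause/resume of processing.
    global paused
    paused = not paused

signal.signal(signal.SIGTERM, handle_term)
signal.signal(signal.SIGUSR1, handle_usr1)

while running:
    if not paused:
        pass  # do one unit of work here
    time.sleep(0.1)

The frontend can then control the daemon with os.kill(daemon_pid, signal.SIGUSR1), where daemon_pid is however you record the daemon's process id.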

The best way to do IPC in Python is to use a message queue, as below.

The server process, server.py (run this before running client.py and interact.py):

from multiprocessing.managers import BaseManager
import Queue

queue1 = Queue.Queue()  # carries the user's input lines from the interactor to the client
queue2 = Queue.Queue()  # carries the client's output back to the interactor

class QueueManager(BaseManager):
    pass

# Expose both queues over the network as proxy objects.
QueueManager.register('get_queue1', callable=lambda: queue1)
QueueManager.register('get_queue2', callable=lambda: queue2)

m = QueueManager(address=('', 50000), authkey='abracadabra')
s = m.get_server()
s.serve_forever()

The interactor, interact.py, which handles the user's I/O:

from multiprocessing.managers import BaseManager
import threading
import sys

class QueueManager(BaseManager):
    pass

QueueManager.register('get_queue1')
QueueManager.register('get_queue2')
m = QueueManager(address=('localhost', 50000), authkey='abracadabra')
m.connect()
queue1 = m.get_queue1()
queue2 = m.get_queue2()

def read():
    # Print whatever output the client pushes onto queue2.
    while True:
        sys.stdout.write(queue2.get())

def write():
    # Forward each line the user types onto queue1 for the client.
    while True:
        queue1.put(sys.stdin.readline())

threads = []

threadr = threading.Thread(target=read)
threadr.start()
threads.append(threadr)

threadw = threading.Thread(target=write)
threadw.start()
threads.append(threadw)

for thread in threads:
    thread.join()

The client program, client.py:

from multiprocessing.managers import BaseManager
import sys

class QueueManager(BaseManager):
    pass

QueueManager.register('get_queue1')
QueueManager.register('get_queue2')
m = QueueManager(address=('localhost', 50000), authkey='abracadabra')
m.connect()
queue1 = m.get_queue1()
queue2 = m.get_queue2()

class RedirectOutput:
    # Anything the program prints is pushed onto queue2 instead of stdout.
    def __init__(self, stdout):
        self.stdout = stdout
    def write(self, s):
        queue2.put(s)

class RedirectInput:
    # Anything the program reads comes from queue1 instead of stdin.
    def __init__(self, stdin):
        self.stdin = stdin
    def readline(self):
        return queue1.get()

# Redirect standard output and input through the queues.
sys.stdout = RedirectOutput(sys.stdout)
sys.stdin = RedirectInput(sys.stdin)

# The test program, which takes input and produces output.
text = raw_input("Enter text: ")
print "You have entered:", text

def loop():
    while True:
        line = raw_input("Enter 'exit' to end or anything else to continue: ")
        print line
        if 'exit' in line:
            break

loop()
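
Assuming the three files are saved under the names used above, start server.py first, then client.py, then interact.py, each in its own terminal: lines typed into the interactor terminal feed the client's raw_input calls via queue1, and everything the client prints travels back over queue2 to appear in the interactor terminal.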

This can be used to communicate between two processes on a network or on the same machine. Remember that the interactor and server processes will not terminate until you kill them manually.
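
If you port this to Python 3 (the answer is written for Python 2), the Queue module is renamed queue, authkey must be a bytes object, and in client.py raw_input becomes input and print needs parentheses. A minimal sketch of the server under those assumptions:

from multiprocessing.managers import BaseManager
import queue

queue1 = queue.Queue()
queue2 = queue.Queue()

class QueueManager(BaseManager):
    pass

QueueManager.register('get_queue1', callable=lambda: queue1)
QueueManager.register('get_queue2', callable=lambda: queue2)

# authkey must be bytes in Python 3
m = QueueManager(address=('', 50000), authkey=b'abracadabra')
s = m.get_server()
s.serve_forever()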

Mr. A
  • 1
    This looks useful (curious if you would still set it up this way w/ Python 3 today), but to be clear, as I understand it the server process exists only to define the queues, the inter-actor process is the service process doing background work, and the client is providing user interaction control of the inter-actor, is that right? – Mike Lippert Jan 20 '18 at 19:37