
If I have a python script running (with full Tkinter GUI and everything) and I want to pass the live data it is gathering (stored internally in arrays and such) to another python script, what would be the best way of doing that?

I cannot simply import script A into script B as it will create a new instance of script A, rather than accessing any variables in the already running script A.

The only way I can think of is to have script A write to a file and have script B read the data from that file. This is less than ideal, however, as something bad might happen if script B tries to read the file while script A is still writing to it, and I am also looking for much faster communication between the two programs.

EDIT: Here are the examples as requested. I am aware of why this doesn't work, but it is the basic premise of what needs to be achieved. My source code is very long and unfortunately confidential, so it would not help here. In summary, script A is running Tkinter and gathering data, while script B is views.py as part of Django, but I'm hoping this can be achieved in plain Python.

Script A

import time

i = 0

def return_data():
    return i

if __name__ == "__main__":
    while True:
        i = i + 1
        print(i)
        time.sleep(.01)

Script B

import time
from scriptA import return_data

if __name__ == '__main__':
    while True:
        print(return_data())  # from script A
        time.sleep(1)
– Jordan Gleeson
  • You should be able to import one module into the other, instantiate a single instance (using a singleton if necessary) and then assign attributes/values to this instance so you can read from it as needed in the secondary script. – JacobIRR May 09 '17 at 04:32
  • If the scripts aren't too long or sensitive, it would help to see the source code – JacobIRR May 09 '17 at 04:33
  • Perhaps you can use a file socket? That seems an option for streaming data. –  May 09 '17 at 04:36
  • The question is too vague. "pass the live data to another script" could mean many different things. How are you passing it? Over a socket? Via a RESTful interface? As command-line arguments? Do you pass the data once when starting the second program, or is the data constantly updated as it changes? Please show a [Minimal, Complete, and Verifiable example](http://stackoverflow.com/help/mcve) – Bryan Oakley May 09 '17 at 11:49

5 Answers


You can use the multiprocessing module to implement a Pipe between the two modules. You can then start one of the modules as a Process and use the Pipe to communicate with it. The best part about using pipes is that you can also pass Python objects such as dicts and lists through them.

Ex: mp2.py:

from multiprocessing import Process, Pipe
from mp1 import f

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print(parent_conn.recv())   # prints "Hello"
    p.join()                    # wait for the child process to finish

mp1.py:

def f(child_conn):
    msg = "Hello"
    child_conn.send(msg)   # send a Python object through the pipe
    child_conn.close()
– Akshay Apte
  • This seems to be the way to go; however, why doesn't the following work? If you add this code to the end of mp1.py: `i = 0 def g(): print i if __name__ == "__main__": while True: i = i + 1 g() ` Why doesn't running mp2.py return the current i? – Jordan Gleeson May 09 '17 at 05:02
  • Didn't quite understand your question there. However, if you want to call the function g(), you need to specify it in `p = Process(target=g, args=(child_conn,))` – Akshay Apte May 09 '17 at 08:34
  • See the examples in the edit; if you insert your code into them, mp2 will return 0 rather than whatever i is at in mp1. – Jordan Gleeson May 09 '17 at 22:28
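
The reason mp2 prints 0 rather than the live i is that each process gets its own copy of the module's globals, so the current value has to be sent through the pipe explicitly. A minimal single-file sketch of that idea (hypothetical; in the real case count() would live in script A and the __main__ block in script B):

# pipe_counter.py — hypothetical adaptation of the asker's counter example
import time
from multiprocessing import Pipe, Process

def count(child_conn):
    # plays the role of script A: keeps updating i and pushes every value
    i = 0
    while True:
        i = i + 1
        child_conn.send(i)   # send the live value through the pipe
        time.sleep(.01)

if __name__ == "__main__":
    # plays the role of script B: receives live values instead of importing a global
    parent_conn, child_conn = Pipe()
    p = Process(target=count, args=(child_conn,))
    p.daemon = True                    # let the counter die with this script
    p.start()
    latest = parent_conn.recv()        # block until the first value arrives
    while True:
        while parent_conn.poll():      # drain everything sent since last check
            latest = parent_conn.recv()
        print(latest)                  # the current value of i in the child
        time.sleep(1)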

If you want to read and modify shared data between two scripts that run separately, a good solution is to take advantage of the Python multiprocessing module and use a Pipe() or a Queue() (see the differences between them here). This way you can synchronise the scripts and avoid problems regarding concurrency and global variables (such as what happens if both scripts want to modify a variable at the same time).

As Akshay Apte said in his answer, the best part about using pipes/queues is that you can pass Python objects through them.

Also, there are methods to avoid waiting for data if none has been passed yet (queue.empty() and pipeConn.poll()).
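
For the pipe side, the non-blocking check might look like this (a minimal sketch, assuming parent_conn is one end of a Pipe()):

from multiprocessing import Pipe

parent_conn, child_conn = Pipe()
child_conn.send("hello")      # pretend the other process sent something

if parent_conn.poll():        # True if data is waiting; returns immediately
    msg = parent_conn.recv()  # now guaranteed not to block
    print(msg)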

See an example using Queue() below:

# main.py
from multiprocessing import Process, Queue
from stage1 import Stage1
from stage2 import Stage2


s1 = Stage1()
s2 = Stage2()

# S1 to S2 communication
queueS1 = Queue()  # s1.stage1() writes to queueS1

# S2 to S1 communication
queueS2 = Queue()  # s2.stage2() writes to queueS2

# start s2 as another process
s2_proc = Process(target=s2.stage2, args=(queueS1, queueS2))
s2_proc.daemon = True
s2_proc.start()     # launch the stage2 process

s1.stage1(queueS1, queueS2)  # start sending stuff from s1 to s2
s2_proc.join()  # wait till the s2 daemon finishes

# stage1.py
import time
import random

class Stage1:

    def stage1(self, queueS1, queueS2):
        print("stage1")
        lala = []
        lis = [1, 2, 3, 4, 5]
        for i in range(len(lis)):
            # to avoid unnecessary waiting
            if not queueS2.empty():
                msg = queueS2.get()    # get msg from s2
                print("! ! ! stage1 RECEIVED from s2:", msg)
                lala = [6, 7, 8]  # now that a msg was received, further msgs will be different
            time.sleep(1)  # work
            random.shuffle(lis)
            queueS1.put(lis + lala)
        queueS1.put('s1 is DONE')

# stage2.py
import time

class Stage2:

    def stage2(self, queueS1, queueS2):
        print("stage2")
        while True:
            msg = queueS1.get()    # wait till there is a msg from s1
            print("- - - stage2 RECEIVED from s1:", msg)
            if msg == 's1 is DONE':  # the sentinel must match the string stage1 sends exactly
                break  # ends loop
            time.sleep(1)  # work
            queueS2.put("update lists")

EDIT: I just found that you can use queue.get(False) to avoid blocking when receiving data; it raises the queue.Empty exception if nothing is there, so there is no need to check first whether the queue is empty. This is not possible if you use pipes.
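
A minimal sketch of that non-blocking get (the Empty exception lives in the standard queue module):

from multiprocessing import Queue
import queue  # provides the Empty exception raised by get(False)

q = Queue()
try:
    msg = q.get(False)  # non-blocking; raises queue.Empty if no data is waiting
except queue.Empty:
    msg = None          # nothing has arrived yet, so carry on without waiting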

– onofricamila

You could use the pickle module to pass data between two Python programs through a file.

import pickle

def storeData():
    # initializing data to be stored in db
    employee1 = {'key': 'Engineer', 'name': 'Harrison',
                 'age': 21, 'pay': 40000}
    employee2 = {'key': 'LeadDeveloper', 'name': 'Jack',
                 'age': 50, 'pay': 50000}

    # database
    db = {}
    db['employee1'] = employee1
    db['employee2'] = employee2

    # It's important to use binary mode; 'wb' overwrites the previous
    # snapshot so the reader always loads the latest one
    dbfile = open('examplePickle', 'wb')

    # source, destination
    pickle.dump(db, dbfile)
    dbfile.close()

def loadData():
    # for reading, binary mode is also important
    dbfile = open('examplePickle', 'rb')
    db = pickle.load(dbfile)
    for key in db:
        print(key, '=>', db[key])
    dbfile.close()
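
Note that this shares the read-while-writing race the question worries about unless access is synchronised (see the file-locking answer below). A minimal, hypothetical driver, assuming script A calls the first function and script B the second:

if __name__ == '__main__':
    storeData()  # in the writing script (script A)
    loadData()   # in the reading script (script B)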
– Magnus Melwin

I solved the same problem using the Shared Memory Dict library, a very simple dict implementation built on top of multiprocessing.shared_memory.

Source1.py

from shared_memory_dict import SharedMemoryDict
from time import sleep

smd_config = SharedMemoryDict(name='config', size=1024)

if __name__ == "__main__":
    smd_config["status"] = True

    while True:
        smd_config["status"] = not smd_config["status"]
        sleep(1)

Source2.py

from shared_memory_dict import SharedMemoryDict
from time import sleep

smd_config = SharedMemoryDict(name='config', size=1024)

if __name__ == "__main__":
    while True:
        print(smd_config["status"])
        sleep(1)
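
Note that this is a third-party package (it should be installable with pip install shared-memory-dict, assuming the PyPI name matches), and the multiprocessing.shared_memory module it builds on requires Python 3.8 or newer.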
– Andre Sampaio

If anyone is still looking:

import fcntl  # note: fcntl is available on Unix only

def write_to_file(filename, content):
    with open(filename, 'w') as file:
        fcntl.flock(file, fcntl.LOCK_EX)  # lock the file for writing
        file.write(content)
        fcntl.flock(file, fcntl.LOCK_UN)  # unlock the file

write_to_file('myfile.txt', 'This is a message')

import fcntl
import time

def read_from_file(filename):
    while True:
        try:
            with open(filename, 'r') as file:
                fcntl.flock(file, fcntl.LOCK_SH) # Locks the file for reading (shared lock)
                content = file.read()
                fcntl.flock(file, fcntl.LOCK_UN) # Unlocks the file
                return content
        except IOError:
            print("File is being written to, waiting...")
            time.sleep(1)

content = read_from_file('myfile.txt')
print(content)

You can add a flag line to the txt file to synchronise reading and writing, if you require the file to be updated before the other script reads it. For example, the flag is False when the file has not been updated since the last read: whenever information is read from the file the flag is changed from True to False, and whenever information is written to the file the flag is changed from False to True, as in the sketch below.
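
A minimal sketch of that flag protocol, assuming the flag lives on the first line of the file ("1" = updated, "0" = already read) and the file already exists:

import fcntl  # Unix-only, as in the snippets above

FLAG_UPDATED = "1"   # set by the writer: fresh content available
FLAG_READ = "0"      # set by the reader: content already consumed

def write_update(filename, content):
    # writer side: store the content with the flag marked as updated
    with open(filename, 'w') as f:
        fcntl.flock(f, fcntl.LOCK_EX)
        f.write(FLAG_UPDATED + "\n" + content)
        fcntl.flock(f, fcntl.LOCK_UN)

def read_if_updated(filename):
    # reader side: return the content only if it is new, and clear the flag
    with open(filename, 'r+') as f:
        fcntl.flock(f, fcntl.LOCK_EX)
        lines = f.read().splitlines()
        if not lines or lines[0] != FLAG_UPDATED:
            fcntl.flock(f, fcntl.LOCK_UN)
            return None  # nothing new since the last read
        content = "\n".join(lines[1:])
        f.seek(0)
        f.truncate()
        f.write(FLAG_READ + "\n" + content)  # mark the content as consumed
        fcntl.flock(f, fcntl.LOCK_UN)
        return content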

I have something else as well, but it requires the array to always be the same size; I use it for sharing frames between many running scripts.

– Dean