
I am trying to develop an application for the Raspberry Pi 3. In fact, two applications.

Application 1 will do a simple GPIO read: count the number of times a particular GPIO input goes high and log it to a file. Application 2 will show the GPIO status on screen. I want Application 1 to run continuously and log data, which is why I have separated it from the UI-based application.

Now I want Application 2 to get the GPIO status from Application 1. The data will be pin1State, pin1HighCount, pin2State, pin2HighCount, all integers. Application 2 should read this data from Application 1 and display it in a PyQt5-based UI screen.

I tried to follow this example: IPC shared memory across Python scripts in separate Docker containers. But I found that it is a dictionary-based data exchange, and it is not continuous or near real time: the dictionary is first populated and only then loaded in server.py. I am not able to find much information on this method elsewhere. I like the localhost socket approach without using files (temporary or otherwise), but I am not able to get continuous data.

Also, is it possible to use a list or even individual integer variables instead of a dictionary? I am worried that continuously updating (appending to) the dictionary might create a memory overload if the script runs for long durations.
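To make it concrete, something like this fixed-size layout is all I need on the Application 1 side (the update function and the sample pin values are just placeholders for the real GPIO reads):

pin1State = 0
pin1HighCount = 0
pin2State = 0
pin2HighCount = 0

def update_from_gpio(new_pin1, new_pin2):
    # Count rising edges only: a pin that was low and is now high.
    global pin1State, pin1HighCount, pin2State, pin2HighCount
    if new_pin1 == 1 and pin1State == 0:
        pin1HighCount += 1
    if new_pin2 == 1 and pin2State == 0:
        pin2HighCount += 1
    pin1State = new_pin1
    pin2State = new_pin2

The same four variables are overwritten in place, so memory use stays constant however long it runs.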
My code is as follows.

Server.py
from multiprocessing.managers import SyncManager
import multiprocessing

patch_dict = {}


def load_patch_dict():
    # Fill the dictionary once, before the manager starts.
    # The data the client sees is frozen at this point.
    input("Press key to continue \n")
    for i in range(2000):
        patch_dict[i] = i
        print("Patch Print ", i, " . ", patch_dict)


def get_patch_dict():
    return patch_dict


class MyManager(SyncManager):
    pass


if __name__ == "__main__":
    load_patch_dict()
    port_num = 5000
    MyManager.register("patch_dict", get_patch_dict)
    manager = MyManager(("127.0.0.1", port_num), authkey=b"password")
    # Set the authkey explicitly because it doesn't get set properly
    # when we initialize MyManager
    multiprocessing.current_process().authkey = b"password"

    manager.start()

    input("Press any key to kill server".center(50, "-"))
    manager.shutdown()


Client.py

from multiprocessing.managers import SyncManager
import multiprocessing

class MyManager(SyncManager):
    pass

if __name__ == "__main__":
    port_num = 5000
    MyManager.register("patch_dict")
    manager = MyManager(("127.0.0.1", port_num), authkey=b"password")
    multiprocessing.current_process().authkey = b"password"
    manager.connect()

    # patch_dict is a proxy to the dictionary living in the server process
    patch_dict = manager.patch_dict()
    keys = list(patch_dict.keys())

    input("Press key to continue \n")
    for key in keys:
        # each get() goes over the socket to the server
        value = patch_dict.get(key)
        print("This is ", value)
        

1 Answer


OK, the goal was to share data between two Python scripts. I abandoned the solution above, which was taken from another question, and took a different approach: using memcached.

I first installed python-memcached using pip. Then I used the following code.

Program A

import time
import memcache

def run_loop():
    # Connect to the local memcached daemon (it must already be running)
    client = memcache.Client([('127.0.0.1', 5000)])
    input("Press key to continue \n")
    for i in range(20000):
        sendvalue = "It Is " + str(i)
        client.set('Value', sendvalue)  # overwrite the same key every cycle
        print("sent value ", sendvalue)
        time.sleep(0.005)

if __name__ == "__main__":
    run_loop()

Program B

import time
import memcache

if __name__ == "__main__":
    # Connect to the same memcached daemon the writer uses
    client = memcache.Client([('127.0.0.1', 5000)])
    input("Press key to continue \n")
    while True:
        ret_value = client.get('Value')
        if ret_value is not None:
            print("Value returned is ", ret_value)
        time.sleep(0.005)  # small pause so the loop does not spin flat out

IMPORTANT *** START memcached first; these scripts only talk to the cache, they do not start the daemon. I am using Linux to run these programs; I am not sure about a Windows environment.

As far as my testing on the Linux machine goes, the data is shared properly. Of course, I still have to test moving lists and other data exchange, which I think should not be a problem; a sketch of what I mean is below. I also tried the approach from How to share variables across scripts in python? but ran into some issues. And of course, memcached has to be running. This really helped: https://realpython.com/python-memcache-efficient-caching/
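For example (just a sketch, not tested on the Pi yet): python-memcached pickles non-string values transparently, so the four GPIO integers could travel as a single list under one key. The key name 'gpio' and the sample values are my placeholders:

import memcache

client = memcache.Client([('127.0.0.1', 5000)])

# Writer side: pack the four integers into one list under one key
pin1State, pin1HighCount, pin2State, pin2HighCount = 1, 42, 0, 7  # sample values
client.set('gpio', [pin1State, pin1HighCount, pin2State, pin2HighCount])

# Reader side: the value comes back as a Python list again
pins = client.get('gpio')
if pins is not None:
    pin1State, pin1HighCount, pin2State, pin2HighCount = pins
    print(pins)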

In fact, we can also set an expiry on the cached value. That is great.
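A sketch of what I mean, reusing the 'Value' key from above: with python-memcached, set() takes an expiry time in seconds.

import time
import memcache

client = memcache.Client([('127.0.0.1', 5000)])

# Keep the value for at most 3 seconds; after that, get() returns None
client.set('Value', 'expires soon', time=3)
print(client.get('Value'))  # prints 'expires soon'
time.sleep(4)
print(client.get('Value'))  # prints None, the entry has expired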
I am still learning, so correct me if I am wrong somewhere. Thanks.