
I have a file, file1, with code that increments the value of shared_var once a second:

file1.py

import time
from multiprocessing import Value, Lock, Process

def update_shared_variable(shared_var, lock):
    for i in range(100):
        with lock:
            shared_var.value = i  # Update the value of the shared variable
        time.sleep(1)  
        print("Updated value in file1:", shared_var.value)

if __name__ == '__main__':
    shared_var = Value('i', 0) 
    lock = Lock()  

    update_process = Process(target=update_shared_variable, args=(shared_var, lock))
    update_process.start()

    update_process.join()

Result:

Updated value in file1: 0
Updated value in file1: 1
Updated value in file1: 2
Updated value in file1: 3
Updated value in file1: 4
Updated value in file1: 5

In file2 I'm trying to get the current shared_var value from file1 while it's running:

file2.py

import time
from multiprocessing import Value, Process

def get_current_value(shared_var):
    while True:
        current_value = shared_var.value
        print("Current value of x from file1:", current_value)
        time.sleep(2)  

if __name__ == '__main__':
    shared_var = Value('i', 0) 

    # Create a process to get the current value of x
    get_value_process = Process(target=get_current_value, args=(shared_var,))
    get_value_process.start()

    get_value_process.join()

But I get this result:

Current value of x from file1: 0
Current value of x from file1: 0
Current value of x from file1: 0
Current value of x from file1: 0

Please tell me, what is the error here? And how can I get the current shared_var value from file1 in file2 (while file1 is running)?

Albert Einstein

1 Answer


You cannot access a variable across separate Python processes using Python's multiprocessing library. By separate processes, I mean processes started with separate python commands, as opposed to child/sub-processes started from a single Python process.

When you call process.start(), a child process is created (using fork, spawn, or forkserver) from the main Python process (called the parent process, which was started from the command line). If you start multiple processes from that main process, you will be able to share variables and other state, because the multiprocessing library automatically handles communication between these so-called child processes (see this article, which describes how multiprocessing works in Python).

When you start two scripts with two distinct python command invocations, you simply get two distinct main Python processes that cannot communicate through the built-in multiprocessing library, because neither process is managed by it. If you want to solve this, you have to use the network or another library (e.g. ZeroMQ) to write your own inter-process communication (see this post).
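For illustration, here is a minimal sketch of what that could look like with ZeroMQ (the pyzmq package): file1 publishes every new value over a local TCP port and file2 subscribes to it. The file names, port 5555, and the PUB/SUB pattern are arbitrary choices for the sketch, not something required by your code.

file1_pub.py (sketch)

import time
import zmq

context = zmq.Context()
publisher = context.socket(zmq.PUB)
publisher.bind("tcp://127.0.0.1:5555")  # port 5555 is an arbitrary choice

for i in range(100):
    publisher.send_string(str(i))       # publish the new value
    print("Published value:", i)
    time.sleep(1)

file2_sub.py (sketch)

import zmq

context = zmq.Context()
subscriber = context.socket(zmq.SUB)
subscriber.connect("tcp://127.0.0.1:5555")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to every message

while True:
    value = int(subscriber.recv_string())        # blocks until a value arrives
    print("Current value from file1:", value)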

In your case, you start two distinct processes and expect communication between them, but shared_var in file1 is not related in any way to shared_var in file2. Using the network, you could send the value of shared_var from one script to the other, but you have to handle it manually.
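If you would rather not add a dependency, "handling it manually" can be done with the standard library's socket module. This is only a sketch, assuming both scripts run on the same machine; the file names and port 6000 are made up for the example. file1 serves its current value from a tiny TCP server, and file2 connects every couple of seconds to read it.

file1_server.py (sketch)

import socket
import threading
import time
from multiprocessing import Value

def serve_value(shared_var):
    # Tiny TCP server: send the current value to every client that connects
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", 6000))  # port 6000 is an arbitrary choice
    server.listen()
    while True:
        conn, _ = server.accept()
        with conn:
            conn.sendall(str(shared_var.value).encode())

if __name__ == '__main__':
    shared_var = Value('i', 0)
    threading.Thread(target=serve_value, args=(shared_var,), daemon=True).start()
    for i in range(100):
        shared_var.value = i
        print("Updated value in file1:", shared_var.value)
        time.sleep(1)

file2_client.py (sketch)

import socket
import time

while True:
    with socket.create_connection(("127.0.0.1", 6000)) as conn:
        current_value = int(conn.recv(1024).decode())
    print("Current value of x from file1:", current_value)
    time.sleep(2)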

Another solution to your problem would be to use a single main process: create a single script, spawn both processes from it, and pass shared_var to each, as follows:

import time
from multiprocessing import Value, Lock, Process

def get_current_value(shared_var):
    while True:
        current_value = shared_var.value
        print("Current value of x from file1:", current_value)
        time.sleep(2)  

def update_shared_variable(shared_var, lock):
    for i in range(100):
        with lock:
            shared_var.value = i  # Update the value of the shared variable
        time.sleep(1)  
        print("Updated value in file1:", shared_var.value)

if __name__ == '__main__':
    shared_var = Value('i', 0) 
    lock = Lock()  

    update_process = Process(target=update_shared_variable, args=(shared_var, lock))
    update_process.start()

    another_process = Process(target=get_current_value, args=(shared_var,))
    another_process.start()

    update_process.join()
    another_process.join()
ftorre
  • Thank you very much for your detailed response! I tried the zeromq library and I got exactly what I wanted, thanks! – Frank Cowperwood Jul 14 '23 at 14:39
  • Executing `shared_var = Value('i', 0)` is the same as `shared_var = Value('i', 0, lock=True)`. That is, a lock is created for concurrent access, and that lock can be retrieved with `shared_var.get_lock()`. So creating a separate `Lock` instance is superfluous. Moreover, you do not need to use a lock at all in this code, since assigning a value such as 1 to the shared variable is an *atomic* operation even if you had multiple processes updating the variable, which you don't. You would need a lock if you had instead `shared_var.value += 1` being executed by multiple processes. (More..) – Booboo Jul 15 '23 at 11:36
  • So execute instead `shared_var = Value('i', 0, lock=False)` and do not use any locking. – Booboo Jul 15 '23 at 11:37
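To illustrate the comments above, here is a small sketch (not part of the original answer) of the case where a lock really is needed, using the lock that `Value('i', 0)` already creates internally:

from multiprocessing import Process, Value

def increment(shared_var):
    for _ in range(1000):
        # += is a read-modify-write, so it is not atomic across processes;
        # guard it with the lock built into Value (lock=True by default)
        with shared_var.get_lock():
            shared_var.value += 1

if __name__ == '__main__':
    shared_var = Value('i', 0)
    workers = [Process(target=increment, args=(shared_var,)) for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(shared_var.value)  # reliably 4000 because of get_lock()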