tl;dr: passing a lock to ProcessPoolExecutor workers makes them not run at all.
I have a JSON file that I want to update asynchronously using ProcessPoolExecutor. To avoid a race condition, I want the workers to share a common lock. After trying both methods from this answer, and also trying this (which is the same but updated), the process pool still doesn't behave as intended.
The code:
def addToImageVector(path: str, lock):
    # print to check whether the worker is running at all
    print('hi guy')
    # a real example of how I use the lock
    data: dict = None
    with lock:
        with open('path\\amounts.json', 'r') as f:
            data = json.load(fp=f)
    ...

# inside another function, which is effectively main
for action in ACTIONS:
    lock = m.Lock()
    NUMBER_OF_CPUS = cpu_count()
    # use up to half of the available CPUs
    USE_UP_TO = 0.5
    with ProcessPoolExecutor(max_workers=np.uint16(NUMBER_OF_CPUS * USE_UP_TO)) as executor:
        for file in os.listdir(normpath(VID_PATH + action)):
            # this gets printed
            print(colored(normpath(VID_PATH + action + file), 'green'))
            # this doesn't work
            executor.submit(addToImageVector, (normpath(VID_PATH + action + file), lock))
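One thing I noticed while debugging: submit() does not raise a worker's exception in the parent; the error stays hidden until the returned Future is inspected. Here is a minimal standalone sketch (hypothetical worker and file name, not my real code) showing how such a hidden error can be surfaced with Future.result():

```python
from concurrent.futures import ProcessPoolExecutor

def worker(path, lock):
    # expects TWO positional arguments
    return path

def surface_error():
    with ProcessPoolExecutor(max_workers=2) as executor:
        # passing a tuple supplies ONE positional argument, not two
        future = executor.submit(worker, ('a.mp4', None))
        try:
            # result() re-raises the worker's exception in the parent
            future.result()
            return None
        except TypeError as exc:
            return str(exc)

if __name__ == '__main__':
    print(surface_error())
```

Without the result() call, the worker's TypeError is simply never seen, which matches the "nothing happens" symptom.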
The output:
path\boost\b1.mp4
path\boost\b2.mp4
path\boost\b3.mp4
path\boost\b4.mp4
path\boost\b5.mp4
...
path\click\c4.mp4
path\click\c5.mp4
path\click\c6.mp4
path\click\c7.mp4
...
path\click\c9.mp4
path\upgrade\u1.mp4
path\upgrade\u7.mp4
path\upgrade\u8.mp4
path\upgrade\u9.mp4
No 'hi guy's appear in the output.
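For comparison, here is a minimal standalone sketch (hypothetical worker and file names, not my real code) where sharing a Manager lock with pool workers does work for me when each argument is passed separately to submit() rather than bundled in a tuple:

```python
from concurrent.futures import ProcessPoolExecutor
from multiprocessing import Manager

def use_lock(path, lock):
    # Manager lock proxies are picklable, so workers can acquire them
    with lock:
        return 'hi guy ' + path

def run():
    with Manager() as m:
        lock = m.Lock()
        with ProcessPoolExecutor(max_workers=2) as executor:
            # arguments are unpacked: submit(fn, arg1, arg2)
            futures = [executor.submit(use_lock, p, lock)
                       for p in ('b1.mp4', 'b2.mp4')]
            # result() also re-raises any worker exception
            return [f.result() for f in futures]

if __name__ == '__main__':
    print(run())
```

I'm not certain this is the only difference from my real code, but in this form every worker runs and prints.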