
I want to open 3 PowerShell windows and run the same code in all 3 of them, just with different files.
Since they'll have exactly the same logic, each of them will try to access the others' files to check if there's anything written there.
Process1 has client1.txt, Process2 has client2.txt and Process3 has client3.txt.
Here's some code showing what process 3 should check before choosing which problem to work on:

import os
import time

while True:
    f = r'C:\Users\Files'
    i = {}
    i['word'] = 'problem X'  # just an example, I'll have a whole list
    if os.path.exists(f):
        try:
            # renaming the path to itself fails if another process has it open
            os.rename(f, f)
            print('Access on file "' + f + '" is available!')
            file1 = open(r'C:\Users\Files\client1.txt', 'r+')
            file2 = open(r'C:\Users\Files\client2.txt', 'r+')
            if file1.read() == i['word']:
                print("A process is already working on this problem, check another")
            elif file2.read() == i['word']:
                print("A process is already working on this problem, check another")
            else:
                print("Found a new problem to work on")
                file3 = open(r'C:\Users\Files\client3.txt', 'r+')
                file3.write(i['word'])
                file3.close()
            file1.close()
            file2.close()
        except OSError as e:
            print('Access-error on file "' + f + '"!\n' + str(e))
            time.sleep(5)

What I tried to represent through the code is: I only want a process to start a problem if the others aren't already working on it. Since they all have the same logic, they'll try to solve the same problems (I have lots that need solving), and they might reach the same one at around the same time as the program goes on with the while True.
When a process finishes its problem, it'll delete the contents of its file and pick a new problem, then write the problem it is working on in its own file for the others to check later.

Just to make it a bit clearer: let's say process1 finds a problem to work on first ('AAA') while they all have empty txt files. It'll check txt2 and txt3, see that neither equals 'AAA', then write it to its own file and close it.
I want process2, which might get there a second later, to read both txt1 and txt3 and see that 'AAA' is already being worked on. It'll take the next problem in the list and check again, see that 'BBB' is free, and write that to its own file.
When it ends, it deletes the string from the txt and starts looking for another one.

There's also the problem of both processes trying to check the files at the same time. Maybe there's a way to make a process time.sleep() if another process is using the file and then try again a bit later?
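Something like this sleep-and-retry helper is what I have in mind (the function name and retry counts are just placeholders):

```python
import time


def read_when_available(path, retries=5, delay=5):
    """Try to read a file, sleeping and retrying if another process has it open."""
    for _ in range(retries):
        try:
            with open(path, 'r') as fh:
                return fh.read()
        except OSError:
            # another process is using the file; wait and try again
            time.sleep(delay)
    raise OSError('Could not access ' + path)
```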

Tax
  • Not a duplicate, but possibly relevant: https://stackoverflow.com/questions/30407352/how-to-prevent-a-race-condition-when-multiple-processes-attempt-to-write-to-and – paisanco Oct 08 '17 at 15:58
  • @paisanco I only understand the general idea, as I'm new to Python, but it's really interesting. Do you think the lock they mention blocks the kind of processes I want too, or just threading/multiprocessing and stuff that have a 'link'? – Tax Oct 08 '17 at 16:06

1 Answer


Ideally you would use Python's multiprocessing module to start each process. You could have one process watch for new files and feed them to a queue that your worker processes read from. This solution would eliminate the need for any file locking since only one process is ever looking for new files.
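A minimal sketch of that queue-based setup (the worker here just tags each problem as solved, standing in for your real solving step):

```python
import multiprocessing


def worker(task_queue, done_queue):
    # Each worker pulls problems off the shared queue; no file locking is
    # needed because only the queue hands out work.
    while True:
        problem = task_queue.get()
        if problem is None:  # sentinel: no more work
            break
        done_queue.put('solved ' + problem)  # stand-in for real solving


def run(problems, n_workers=3):
    task_queue = multiprocessing.Queue()
    done_queue = multiprocessing.Queue()
    for problem in problems:
        task_queue.put(problem)
    for _ in range(n_workers):
        task_queue.put(None)  # one sentinel per worker
    workers = [multiprocessing.Process(target=worker,
                                       args=(task_queue, done_queue))
               for _ in range(n_workers)]
    for p in workers:
        p.start()
    for p in workers:
        p.join()
    return [done_queue.get() for _ in problems]


if __name__ == '__main__':
    print(run(['AAA', 'BBB', 'CCC']))
```

Each problem is handed to exactly one worker, which is what the file-checking scheme in the question is trying to approximate.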

If you must start the processes via PowerShell though, you'll have to use locking of some sort to get the processes working together. True file locking is difficult to do in a cross-platform way, but you can fake it by using a move. For instance, you could attempt to move the file (a move is atomic on most operating systems, assuming you're moving to a location on the same filesystem). If the move fails, you know another process must have gotten to it first, or something went terribly wrong with the OS. You can then do all your processing on the moved file. For example:

import os
import shutil

filename = "C:\\path\\to\\filename.txt"
_filename = os.path.basename(filename) + ".wip"
wip_filename = os.path.join(os.path.dirname(filename), _filename)

try:
    shutil.move(filename, wip_filename)
except OSError:
    # another process moved it first
    pass
else:
    do_work(wip_filename)
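Putting it in a loop, each worker could poll the shared folder and claim whichever file it wins the rename on (`claim_next` and the folder layout are illustrative assumptions, not part of the question):

```python
import glob
import os
import shutil


def claim_next(folder):
    # Try to claim any unclaimed .txt file by renaming it to *.wip.
    # The rename acts as the lock: only one process can succeed per file.
    for filename in glob.glob(os.path.join(folder, '*.txt')):
        wip_filename = filename + '.wip'
        try:
            shutil.move(filename, wip_filename)
        except OSError:
            continue  # another process moved it first
        return wip_filename
    return None  # nothing left to claim
```

A worker would call `claim_next` in its `while True` loop, process the returned `.wip` file, then delete it and try to claim another.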
Matt Hardcastle
  • You actually explained some of my own code to me, which I didn't understand before. I guess it can't be helped, I'll stop hiding away in fear from multiprocessing and queue. Thanks – Tax Oct 09 '17 at 13:11