I am implementing a simple application that consists of multiple Python scripts running concurrently. Two of them run at the same time: one parses data and one looks up data in the database. For design reasons, part of the data is generated by the parser and stored not in the database but in JSON files. In my parser I save the data like this:
with open('a-1-test.json', 'w') as outfile:
    json.dump(lookup_table, outfile)
    # no explicit close() needed; the with block closes the file
The parser runs in a loop until a certain condition is met. Meanwhile, other scripts call the look-up script to get data from the database (the data that the parser saves). When they do, the look-up script first has to check the lookup table in the JSON file to determine which data it needs to fetch:
while trigger:
    time.sleep(10)
    # re-read the lookup table that the parser keeps writing
    with open('a-1-test.json', 'r') as data_file:
        data = json.load(data_file)
    for i in data.keys():
        print i, len(data[i])
This works for a while, but then I get two kinds of errors: one saying the JSON document was not found, and ValueError: Unterminated string starting at (...). I guess this is because there are no concurrency measures in place when two different scripts access the same file. I know the first error happens because I open the file with 'w' in the parser, which deletes the existing file and creates a new one, so in the meantime the lookup script isn't able to see the file.
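One idea I had is to avoid rewriting the file in place at all: write to a temporary file first and then rename it over the old one, so the lookup script only ever sees either the complete old file or the complete new file. A minimal sketch of what I mean (save_table is just a helper name I made up, and I am assuming a POSIX filesystem, where os.rename atomically replaces an existing target):

import json
import os
import tempfile

def save_table(table, path):
    # Write to a temporary file in the same directory, then rename it
    # over the target, so readers never see a missing or half-written file.
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix='.tmp')
    try:
        with os.fdopen(fd, 'w') as tmp_file:
            json.dump(table, tmp_file)
        os.rename(tmp_path, path)  # atomic on POSIX filesystems
    except Exception:
        os.remove(tmp_path)  # clean up the temporary file on failure
        raise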
I wonder what the best way to do this in Python is. Is there a way to put a lock on the file while it is being written and unlock it when finished, so the lookup script can read it?
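For example, would something like fcntl.flock with a separate lock file work here? A sketch of what I am imagining (fcntl is POSIX-only as far as I know, and the a-1-test.lock name is just something I made up):

import fcntl
import json

LOCK_PATH = 'a-1-test.lock'  # hypothetical sidecar lock file, next to the JSON

# Parser side: take an exclusive lock before truncating and rewriting.
with open(LOCK_PATH, 'w') as lock_file:
    fcntl.flock(lock_file, fcntl.LOCK_EX)  # blocks until no one else holds the lock
    with open('a-1-test.json', 'w') as outfile:
        json.dump(lookup_table, outfile)
# the lock is released when lock_file is closed

# Lookup side: take a shared lock, so readers don't block each other.
with open(LOCK_PATH, 'w') as lock_file:
    fcntl.flock(lock_file, fcntl.LOCK_SH)
    with open('a-1-test.json', 'r') as data_file:
        data = json.load(data_file)

My thinking with the separate lock file is that opening the JSON with 'w' truncates it immediately, before any lock on the data file itself could be taken, so the lock has to live somewhere else.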
Thank you in advance