
I'm working with two Python 3 scripts and one file data.json.

One process sometimes (over-)writes the file:

import json

with open('data.json', 'w') as outfile:
    json.dump(data, outfile)
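One idea I had for the writing side is to make the write atomic: write to a temporary file in the same directory, then rename it over data.json. As I understand it, os.replace() is atomic on POSIX, so a reader should only ever see the old complete file or the new complete file. A sketch of what I mean (write_atomic is just a name I made up):

```python
import json
import os
import tempfile

def write_atomic(data, path='data.json'):
    # Write to a temp file in the same directory, then rename it over
    # the target. os.replace() is atomic on POSIX, so a concurrent
    # reader sees either the old file or the new one, never a partial.
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(path) or '.')
    try:
        with os.fdopen(fd, 'w') as outfile:
            json.dump(data, outfile)
        os.replace(tmp_path, path)
    except BaseException:
        os.unlink(tmp_path)  # clean up the temp file on failure
        raise

write_atomic({'key': 'value'})
```

The temp file has to live on the same filesystem as the target, otherwise the rename is not atomic, which is why I put it in the same directory.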

The other sometimes reads the file:

import json

with open('data.json') as json_data:
    data = json.load(json_data)

Is there a risk that, if one process reads the file while the other is writing it, the reading process will receive broken data? How can I handle such a case? I would like the reading process to simply sys.exit("Can't read right now"), because it will be restarted soon after finishing anyway.

Or is there no concern at all, because Python will automatically block until the file is no longer being written to?

The processes run on a 64 bit Linux system with quad-core CPU.

qubodup
