
I have a Python Flask API served by uWSGI behind Nginx in production. uWSGI is configured with 4 worker processes and 1 thread per worker: processes = 4, threads = 1.
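
For reference, the relevant part of my uWSGI configuration looks roughly like this (an ini-style file; only the worker settings shown are the real values, everything else is omitted):

[uwsgi]
; 4 worker processes, 1 thread each
processes = 4
threads = 1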

When I issue an API POST, the handler writes and reads a JSON file at different points in my code.

When I issue multiple API requests in parallel, they get distributed across these 4 uWSGI workers as expected, and the workers process the requests in parallel. Sometimes this causes json ValueErrors, because multiple processes are reading and writing the same file at once.

How do I overcome this? The concurrency here does not come from multiprocessing inside my API, so using Python's multiprocessing.Lock around my JSON-update code will not fix the problem.

I want the JSON file to be updated by one worker at a time. Is there a way to share locks across uWSGI workers?
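
For context, my read-modify-write pattern looks roughly like this (a minimal sketch; the file name state.json and the helper function are illustrative, not my actual code):

import json

def update_state(key, value):
    # Read the whole file, modify it, and write it back.
    # With 4 workers doing this concurrently, one worker can read
    # while another is mid-write, and json raises a ValueError
    # on the partially written contents.
    with open('state.json') as f:
        data = json.load(f)
    data[key] = value
    with open('state.json', 'w') as f:
        json.dump(data, f)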

  • When you say "writes and reads" a json file, do you mean to disk? It might be beneficial to readers, if it makes sense, to provide example code where you are reading and writing, to give better context. – SteveJ Dec 12 '17 at 18:36

1 Answer


Each of these processes has a different PID, so why not use the PID as the file name for the JSON reading and writing? That way each worker reads and writes its own file, and they never collide:

import os

# Each uWSGI worker process has its own PID, so this name is unique per worker.
pid = os.getpid()
f = open(str(pid) + '.json', 'w+')

Unsolicited advice: I would highly recommend not using file reads and writes as the data-exchange mechanism between your processes. Try passing values between functions/classes, or put them in a caching layer such as Redis (slower than in-process data, but reading and writing files is the slowest option of all).
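
A rough sketch of the Redis approach, assuming the redis-py client and a Redis server on localhost (the key name api_state is made up):

import json
import redis

r = redis.Redis(host='localhost', port=6379)

def save_state(data):
    # A Redis SET is atomic, so concurrent workers cannot
    # interleave their writes the way file writes can.
    r.set('api_state', json.dumps(data))

def load_state():
    raw = r.get('api_state')
    return json.loads(raw) if raw else {}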

harshil9968