I have a script that I just converted into a Flask app. The script is in Python 3 and is over 1200 lines, so I can't post it here. The script accesses multiple servers (100+) via threading and then returns the data to the user as a spreadsheet. When multiple users run it at the same time, each user's spreadsheet contains their data as well as data from other runs started within the same two minutes or so.
My question is: could this be happening because my data dictionaries are global?
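As a toy reproduction of what I suspect (made-up names, no Flask or threading needed): in a long-lived process, a module-level dict keeps accumulating every caller's entries, so a later "spreadsheet" picks up an earlier caller's data:

```python
# toy reproduction: a module-level dict is shared by every call
DATA_DIC = {}

def main_function(user, servers):
    # simulate the worker functions filling the global dict
    for s in servers:
        DATA_DIC[s] = f"{user}:{s}"
    # the "spreadsheet" is built from whatever the dict holds right now
    return dict(DATA_DIC)

first = main_function("alice", ["srv1"])
second = main_function("bob", ["srv2"])
# second still contains alice's srv1 entry as well as bob's srv2 entry
```

Under a cron job this never shows up because each run is a fresh process; under Flask the process stays alive between requests.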
Example code: FLASK APP

    from flask import Flask, request, send_file
    from server_script import MAIN_FUNCTION

    app = Flask(__name__)

    @app.route('/', methods=['POST'])
    def index():
        formdata = request.form  # user submits form data
        # call the function from the other script and pass the form data to it
        spreadsheet = MAIN_FUNCTION(formdata)
        return send_file(spreadsheet)  # send the spreadsheet to the user
Example code: SERVER APP

    import xxx

    DATA_DIC_1 = {}
    DATA_DIC_2 = {}

    def WORK_FUNCTION_1(var1, var2, var3):
        ...  # do things - add results to DATA_DIC_1

    def WORK_FUNCTION_2(var1, var2, var3, var4):
        ...  # do things - add results to DATA_DIC_2

    def MAIN_FUNCTION(formdata):
        # take the data passed in from the Flask app
        WORK_FUNCTION_1(var1, var2, var3)
        WORK_FUNCTION_2(var1, var2, var3, var4)
        # using data from DATA_DIC_1 and DATA_DIC_2, create the spreadsheet
        return spreadsheet  # back to the Flask app
I think the issue is that the two DATA_DICs are defined at module level rather than inside the MAIN_FUNCTION that calls the other functions, causing the data to be shared between concurrent users.
I know this is not much to go on, but does anyone agree with this assessment? I am new to Flask, as I am typically a backend guy setting up cron jobs and whatnot. The app works fine as long as people are not running it at the same time.
If I move the DATA_DICs into MAIN_FUNCTION, would that fix this? I thought of this after I got home, so I can't test it; I just want a second opinion.
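Here is a minimal sketch of what I mean by moving the dicts into MAIN_FUNCTION (all names are made up for illustration; the real workers do much more). Each call gets its own dict, the threads write into it through a lock, and nothing lives at module level, so concurrent callers can't see each other's data:

```python
import threading

def work_function_1(server, data_dic_1, lock):
    # each worker writes into the dict it was handed, not a global
    with lock:
        data_dic_1[server] = f"result for {server}"

def main_function(servers):
    data_dic_1 = {}          # local to this call: each request gets a fresh dict
    lock = threading.Lock()  # guard concurrent writes from this call's threads
    threads = [threading.Thread(target=work_function_1,
                                args=(s, data_dic_1, lock))
               for s in servers]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return data_dic_1        # build the spreadsheet from this dict only

print(main_function(["srv1", "srv2"]))
```

With this shape there is no need for flask.g or a multiprocessing Manager for the simple case, since the state never outlives the call.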
EDIT: So do I need flask.g or a multiprocessing Manager? I don't see how this is directly related to the other linked question.