I currently have a function with 5 arguments that I call from a for loop. Starting from a list of countries, it iterates over a list of sites (over 20), performs some operations, and then runs a bigger function.
My code is as follows:
sitelist = ['zzz', 'xxx', 'yyy']
for site in sitelist:
    startime, endtime = createtime(timer, site)
    excludedlistpaths = excludedlist(site)
    finallist = []
    bigfunc(site, startime, endtime, excludedlistpaths)
My list of sites keeps growing, so I'm looking into multithreading/multiprocessing, but I'm getting stuck because my function has multiple arguments that change on every iteration. My first thought was to use something like:
from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor(max_workers=4) as executor:
    for site in sitelist:
        executor.submit(bigfunc, site, startime, endtime, excludedlistpaths)
but it failed, and map seems to have the same problem in this case. Is there any way to create multiple processes/threads so my function runs for several sites concurrently? Every call of my function is completely independent of the others, so the only thing I'm after here is speed.
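To make the question concrete, here is a minimal sketch of the kind of thing I imagine, assuming createtime, excludedlist, bigfunc, timer and sitelist are defined as above (run_site is just a hypothetical wrapper name I made up):

from concurrent.futures import ThreadPoolExecutor

def run_site(site):
    # hypothetical wrapper: compute the per-site arguments inside the
    # worker so every submitted task gets its own values
    startime, endtime = createtime(timer, site)
    excludedlistpaths = excludedlist(site)
    return bigfunc(site, startime, endtime, excludedlistpaths)

with ThreadPoolExecutor(max_workers=4) as executor:
    futures = [executor.submit(run_site, site) for site in sitelist]
    for future in futures:
        future.result()  # re-raises any exception from the worker

The idea would be that each task computes its own startime/endtime/excludedlistpaths instead of reusing whatever values the outer loop happened to leave behind, but I'm not sure whether this is the right approach.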