
The following script builds a list of the desired payloads (later rendered into a PDF) for a single site:

import json
import multiprocessing as mp
from datetime import datetime

import pandas as pd
import requests
from pytz import timezone

def process_dates(dates, serviceAreaId, serviceAreaName, siteId, siteName): 
  model_results = []
  # for loop to process dates
  return model_results

def process_service_area(site):
  pdf_results = []
  siteId = site["id"]
  siteName = site["name"]
  siteTimeZone = site["time_zone"]
  service_area_response = requests.request("GET", f"{lrs_url}/serviceAreas/active?siteId={siteId}", headers={}, data={})

  if service_area_response.status_code != 200:
    sendMessageToSlack(f"Failed to retrieve service areas for site {siteId}.")
    return pdf_results

  service_area_json_response = json.loads(service_area_response.content)

  tz = timezone(siteTimeZone)

  startdate = datetime.now(tz)
  dates = pd.date_range(start=startdate, periods=7).to_pydatetime().tolist()

  date_pool = mp.Pool(mp.cpu_count())
  
  for service_area in service_area_json_response:
    serviceAreaName = service_area["name"]
    serviceAreaId = service_area["id"]
    active = service_area["active"]
    #Run comparison
    date_pool_async_results = date_pool.apply_async(process_dates, args = (dates, serviceAreaId, serviceAreaName, siteId, siteName))

  date_pool.close()
  date_pool.join()

  for r in date_pool_async_results.get():
    pdf_results.append(r)
  return pdf_results

def process_all_sites():
  sites_response = requests.request("GET",f"{lrs_url}/sites/active", headers={}, data={})
  sites_json_response = json.loads(sites_response.content)
  pdf_results = []

  for site in sites_json_response:
    pdf_results += process_service_area(site)
    break
    # service_area_pool = mp.Pool(2)
    # service_area_pool_async_results = service_area_pool.apply_async(process_service_area, args = (site))
  
  # service_area_pool.close()
  # service_area_pool.join()
  
  # for r in service_area_pool_async_results.get():
    # pdf_results.append(r)
  return pdf_results

results = process_all_sites()
create_pdf(results)

However, when the script is run for multiple sites, that is, when the break is commented out and the lines from service_area_pool = mp.Pool(2) through pdf_results.append(r) are uncommented, I receive an UnboundLocalError: local variable 'date_pool_async_results' referenced before assignment.

Why is that the case? It seems that date_pool_async_results is rebound on every iteration of the loop, so it never accumulates more than a single payload.
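To narrow it down, here is a stripped-down sketch (hypothetical names, just mimicking the shape of the loop above) that reproduces the same error: the name is only bound if the loop body runs at least once, and each iteration rebinds it, so only the last result survives:

```python
def collect(items):
    # 'last' is rebound on every iteration, and is only bound at all
    # if the loop body executes at least once
    for item in items:
        last = item
    return last  # raises UnboundLocalError when items is empty

print(collect([1, 2, 3]))  # only the final item survives -> 3

try:
    collect([])
except UnboundLocalError as e:
    print("empty input:", e)
```

This matches both symptoms: a site whose service-area list is empty never binds the variable, and a site with several service areas keeps only the result of the final apply_async call.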

I've gone through the following resources; however, they don't seem to work as a solution, as I am trying to append to a list in my case.

I've also tried the following, but received an UnboundLocalError: local variable 'pdf_results' referenced before assignment as well.

global pdf_results
pdf_results = []

def process_all_sites():
  # retrieve payload
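As I understand it (a minimal sketch with hypothetical function names, not my actual code), that attempt fails because a module-level global statement has no effect inside other functions; any function that assigns to pdf_results must itself declare global pdf_results, otherwise the assignment makes the name local at compile time:

```python
pdf_results = []

def append_without_global():
    # the augmented assignment marks 'pdf_results' as local to this
    # function, so reading it first raises UnboundLocalError
    pdf_results += ["payload"]

def append_with_global():
    global pdf_results  # the declaration must live inside the assigning function
    pdf_results += ["payload"]

append_with_global()
print(pdf_results)  # ['payload']

try:
    append_without_global()
except UnboundLocalError as e:
    print(e)
```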

Please do let me know if further context is required.
