
I have a fixed number of API endpoints (four, for example) that all do the same thing. I would like to multiprocess a loop that asynchronously sends data to the endpoints, but I want to make sure that each endpoint only sees one request at a time. Is there a way to do this with Python's multiprocessing module? Perhaps it would look something like:

import pandas as pd
from multiprocessing import Pool


def process_data(data, endpoint):
    ...

endpoint_lst = [
   'https://endpoint1',
   'https://endpoint2',
   'https://endpoint3',
   'https://endpoint4'
]

df = pd.read_csv('datafile.csv')

with Pool(processes=len(endpoint_lst)) as p:
    # starmap expects an iterable of (data, endpoint) argument tuples,
    # so pair one slice of the DataFrame with each endpoint.
    pieces = (df.iloc[i::len(endpoint_lst)] for i in range(len(endpoint_lst)))
    p.starmap(process_data, zip(pieces, endpoint_lst))
  
asked by user8675309 · edited by martineau
  • If it's Python, I would use pool.map like below: https://stackoverflow.com/questions/5442910/how-to-use-multiprocessing-pool-map-with-multiple-arguments HTH – Siva Abbannagari Jan 24 '21 at 20:30
  • @SivaAbbannagari, thanks for the response. It's not clear to me from the example that you provided how to ensure that each endpoint only sees one request at a time. – user8675309 Jan 24 '21 at 22:25
  • Why do you need multiprocessing for that? Seems like an I/O-bottleneck, so multithreading should do just fine. – couka Jan 24 '21 at 22:57
  • @couka fair point. However, I'm still stuck at the crux of the problem. Whether it's multiprocessing or multithreading, how do I ensure that each endpoint only sees one request at a time and is not sent another request until it sends back a response payload? – user8675309 Jan 25 '21 at 01:24
  • Not sure what the issue is... If you have one process/thread per API endpoint, how would a single endpoint receive more than one request at a time? – couka Jan 25 '21 at 07:29
  • Could you provide an example in Python? Thanks. – user8675309 Jan 25 '21 at 13:25
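
To make the one-worker-per-endpoint idea from the comments concrete, here is a minimal sketch (the endpoint URLs and `send_request` are placeholders for the real HTTP call, e.g. `requests.post`). Because each endpoint is owned by exactly one thread, and that thread blocks until `send_request` returns, an endpoint can never receive a second request before the first one completes:

```python
import queue
import threading

ENDPOINTS = [
    'https://endpoint1',
    'https://endpoint2',
    'https://endpoint3',
    'https://endpoint4',
]


def send_request(endpoint, item):
    # Placeholder for the real call, e.g. requests.post(endpoint, json=item).
    return (endpoint, item)


def worker(endpoint, work_queue, results):
    # Each endpoint is served by exactly one thread, so at most one
    # request is ever in flight against it at a time.
    while True:
        item = work_queue.get()
        if item is None:           # sentinel: no more work
            work_queue.task_done()
            break
        results.append(send_request(endpoint, item))
        work_queue.task_done()


work_queue = queue.Queue()
results = []                       # list.append is thread-safe in CPython
threads = [
    threading.Thread(target=worker, args=(ep, work_queue, results))
    for ep in ENDPOINTS
]
for t in threads:
    t.start()

for item in range(20):             # stand-in for the rows of the DataFrame
    work_queue.put(item)
for _ in threads:
    work_queue.put(None)           # one sentinel per worker

for t in threads:
    t.join()

print(len(results))                # 20: every item handled exactly once
```

The same shape works with `multiprocessing` (swap in `multiprocessing.Process`, `multiprocessing.JoinableQueue`, and a `Manager().list()` for the results), but as noted above this workload is I/O-bound, so threads are usually enough.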

0 Answers