I can successfully post each record from a CSV file one by one. However, I'd like to implement multiprocessing so the script handles large data files more efficiently in the future. Here is the working sequential version:
    import csv
    import requests
    from itertools import groupby

    ENDPOINT_URL = 'https://example.com'
    headers = {'Api-key': '123abc'}

    with open("student.csv", "r") as csv_ledger:
        r = csv.DictReader(csv_ledger)
        data = [dict(d) for d in r]

    groups = {}
    for k, g in groupby(data, lambda r: r['name']):
        # My data mapping
        # For loop to post each record
        post_api = requests.post(ENDPOINT_URL, json=groups, headers=headers)
Is there an easy way to do multiprocessing for these API requests?
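To illustrate the kind of thing I'm after, here is a rough sketch using concurrent.futures (thread-based, since the work is I/O-bound; build_payload is a hypothetical stand-in for my data-mapping step, which I've left out above):

    import csv
    import requests
    from concurrent.futures import ThreadPoolExecutor
    from itertools import groupby

    ENDPOINT_URL = 'https://example.com'
    headers = {'Api-key': '123abc'}

    def build_payload(group):
        # Hypothetical placeholder for my actual data mapping;
        # here it simply collects the rows of one group into a list.
        return list(group)

    def post_group(payload):
        # Post a single group's payload and report the HTTP status.
        resp = requests.post(ENDPOINT_URL, json=payload, headers=headers)
        return resp.status_code

    with open("student.csv", "r") as csv_ledger:
        data = [dict(d) for d in csv.DictReader(csv_ledger)]

    payloads = [build_payload(g) for _, g in groupby(data, lambda r: r['name'])]

    # Run up to 8 posts concurrently; threads are enough for I/O-bound requests.
    with ThreadPoolExecutor(max_workers=8) as pool:
        for status in pool.map(post_group, payloads):
            print(status)

I went with threads rather than multiprocessing in this sketch because the script spends its time waiting on the network, not on CPU, but I'm open to either.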
Update: I tried using grequests, but the data I post comes out null:
    import grequests

    rs = (grequests.post(u, json=groups, headers=headers) for u in ENDPOINT_URL)
    responses = grequests.map(rs)
    print(responses)
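For comparison, my understanding of grequests is that map expects one request object per payload, and iterating over ENDPOINT_URL itself iterates the characters of the URL string, which I suspect is what nulls out my posts. Something like this may be closer (reusing the hypothetical payloads list from the sketch above):

    import grequests

    # One request per group payload, all aimed at the single endpoint.
    reqs = (grequests.post(ENDPOINT_URL, json=p, headers=headers) for p in payloads)

    # Returns a list of Response objects (None for requests that failed).
    responses = grequests.map(reqs)
    print(responses)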