I call an API using requests
to get JSON data and store it in one variable like this:
r = requests.post(url=endpoint, data=payload, headers=headers, params=parameter)
all = json.loads(r.text)
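For reference, the parsed response is basically a list of objects; the values below are made up just to show the shape I am working with:

# Assumed shape of the parsed response (values are made up for illustration)
all = [
    {"id": 1, "num": 42, "type": 3},
    {"id": 2, "num": 17, "type": 1},
    # ... roughly 4-5 thousand of these per request
]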
Then I use a loop to insert the data row by row, like this:
for row in all:
    sql = "INSERT INTO t VALUES ({},{},{});".format(row['id'], row['num'], row['type'])
    cur.execute(sql)
The real table has more columns, not just the 3 in this example.
This code works totally fine, but my question is: is there any other way to insert JSON data into the table? I need to insert around 4-5 thousand rows per request, which takes a very long time (compared with copy_expert on a CSV file) since it inserts row by row. Is there a way to do this without a loop, or anything else that might make the insert faster?
I am using a PostgreSQL database with Python here.
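For reference, the copy_expert path I am comparing against looks roughly like this (just a sketch, not my exact code; the connection string, file name, and column names are placeholders):

# Loading a CSV file with a single COPY (placeholder connection and names)
import psycopg2

conn = psycopg2.connect("dbname=mydb user=me host=localhost")
cur = conn.cursor()
with open("data.csv") as f:
    # One COPY statement loads the whole file at once instead of one INSERT per row
    cur.copy_expert("COPY t (id, num, type) FROM STDIN WITH (FORMAT csv)", f)
conn.commit()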