I have this huge Python list which I collected through web scraping. The list is composed of approximately 2,900 sublists.

stockDetailsDict = [
    ("AAPL", 8.82, 86122300.0, 8.8, 9.18, 9.17),
    ("BRK", 4.83, 3286500.0, 4.83, 5.06, 5.06),
    ("JNJ", 727.0, 127470.0, 727.0, 742.0, 740.0),
    ("DIS", 9.52, 12358700.0, 9.36, 9.73, 9.5),
    ...,
]

The structure of each sublist is the stock code (ticker), stockCurrentPrice, stockVolume, stockDayLow, stockDayHigh, and stockOpen, respectively.

While the structure of my database is this:

+------------+-------------+-----+------+------+-------+
| Stock Code | Stock Price | Low | High | Open | Close |
+------------+-------------+-----+------+------+-------+
|            |             |     |      |      |       |
|            |             |     |      |      |       |
+------------+-------------+-----+------+------+-------+

My Goal:

I'm trying to save this list to my PostgreSQL table (currentDay).

My Issue (EDITED):

I can easily access each sublist and store them one by one using a loop, but since the list is so large (2,900 sublists), it takes several minutes to finish.

Is there a way I can do this in a single shot to minimize the time spent?

It was pointed out to me in the comment section that executemany is the right way to do this, so I drafted the following code to do the job:

cursor = connection.cursor()

cursor.executemany(
    "INSERT INTO public.stockScrapper_currentday "
    "(ticker, stockCurrentPrice, stockVolume, stockDayLow, stockDayHigh, stockOpen) "
    "VALUES (%s, %s, %s, %s, %s, %s);",
    stockDetailsDict,
)

But it keeps returning this error:

relation "public.stockscrapper_currentday" does not exist LINE 1: INSERT INTO public.stockScrapper_currentday (ticker, stockCur...

Any idea what's wrong?

  • see https://stackoverflow.com/questions/8134602/psycopg2-insert-multiple-rows-with-one-query – balderman Aug 12 '21 at 11:17
  • 1
  • Before you enter your loop, make sure that you turn autocommit off, then "manually" commit when you exit the loop. That would typically optimise this kind of situation. If you already have autocommit off, then I can only guess that your system is constrained by available RAM, CPU speed, or code implementation issues. Also, ~3,000 keys isn't a lot! – Aug 12 '21 at 12:17
  • I'm thinking that I should change the dictionary to a simple list. Maybe it's the better approach? – Ben Daggers Aug 12 '21 at 12:20
  • From your error, it sounds like the target table doesn't exist. – larsks Aug 12 '21 at 13:46
  • yup! solved it already! Thanks! – Ben Daggers Aug 12 '21 at 14:18

0 Answers