
I have this data for a table, and I have to insert it into the table using SQLAlchemy. I would like to insert a batch of rows from a rows list (right now I have three rows, but I will have around 50,000), like a bulk insert in Elasticsearch but using Postgres. Could you give me an idea or a code example? Thanks in advance for your help.

fields = ["NAME", "MACHINE", "PRE", "DATE", "COMP"]
rows = [
    ["ECO", "A07", 6, "2016-03-07", "Card"],
    ["LIN", "A01", 1, "2016-03-07", "Reader"],
    ["SUM", "A02", 2, "2016-03-07", "Scanner"],
]
Joan Triay
  • Have you tried something out? – Chen A. Nov 16 '17 at 08:58
  • Hi, I added the rows one by one, but I couldn't find any way to do it in a batch – Joan Triay Nov 16 '17 at 09:04
  • I ran into the same issue while working with Amazon Redshift and psycopg. I don't know about SQLAlchemy, but in my case I ended up using two approaches: constructing one huge SQL query to insert all rows at once, or uploading the data as CSV through the Amazon Redshift SSH interface. – aleosd Nov 16 '17 at 09:19
  • Possible duplicate of [How can I use psycopg2.extras in sqlalchemy?](https://stackoverflow.com/questions/43317376/how-can-i-use-psycopg2-extras-in-sqlalchemy), [SQLAlchemy, Psycopg2 and Postgresql COPY](https://stackoverflow.com/questions/13125236/sqlalchemy-psycopg2-and-postgresql-copy) – Ilja Everilä Nov 16 '17 at 10:03

1 Answer


You can use the SQLAlchemy Core API, which automatically performs an executemany-style bulk insert when you pass a list of dictionaries.

Example:

import sqlalchemy

engine = sqlalchemy.create_engine(db_url, echo=True)  # echo logs the SQL for testing
db = engine.connect()
meta = sqlalchemy.MetaData()
# Reflect the existing table definition from the database
foo_table = sqlalchemy.Table('foo', meta, autoload=True, autoload_with=engine)
foo_ins = foo_table.insert()

# fields, rows - as assigned in the question
xs = [{k: v for k, v in zip(fields, row)} for row in rows]

# Passing a list of dicts makes execute() issue one multi-row (executemany) insert
db.execute(foo_ins, xs)
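
For the ~50,000-row case, the comments (and the linked COPY question) point at PostgreSQL's COPY, which is usually faster than a multi-row INSERT. Here is a minimal sketch of that alternative, assuming the same engine, a target table named foo, and the column names from the question's fields list (all assumptions, adjust to your schema); it streams the rows through psycopg2's copy_expert() via SQLAlchemy's raw_connection():

import csv
import io

# Serialize the rows to an in-memory CSV stream
buf = io.StringIO()
csv.writer(buf).writerows(rows)
buf.seek(0)

# Unwrap the underlying psycopg2 connection and run COPY FROM STDIN
raw = engine.raw_connection()
try:
    with raw.cursor() as cur:
        cur.copy_expert(
            'COPY foo ("NAME", "MACHINE", "PRE", "DATE", "COMP") FROM STDIN WITH CSV',
            buf,
        )
    raw.commit()
finally:
    raw.close()

Either way, for very large batches it can help to insert in chunks (e.g. a few thousand rows per execute) to keep memory use and transaction size bounded.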
maxschlepzig