
I need to insert a large number of rows into a table with a large number of columns (>150).

These rows contain timestamps as strings, which I want to insert into timestamp fields in PostgreSQL.

Due to the large number of columns, I want to use a statement like

    cur1.execute("""INSERT INTO ttab VALUES(%s)""" % (row))

because the column order is the same and I can avoid naming all the columns here.

But with the dates as strings in the row, the insert fails:

    LINE 1: ...f7bc2050ea4', 'SAR', '', 0, '4', 'role updated.', '07-12-20 ...
    psycopg2.errors.InvalidDatetimeFormat: invalid input syntax for type timestamp: "07-12-20 01.18.05"

The same happens when the date is converted to a datetime object first: it does not get evaluated, and the insert fails with an error about an unknown column 'datetime......'
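
As far as I can tell, that is because the `%` operator simply splices the Python repr of the tuple into the SQL string, so the datetime object arrives as literal text (a small demonstration with made-up values):

    from datetime import datetime

    row = ('SAR', '', 0, datetime(2020, 12, 7, 1, 18, 5))
    print("INSERT INTO ttab VALUES%s" % (row,))
    # INSERT INTO ttab VALUES('SAR', '', 0, datetime.datetime(2020, 12, 7, 1, 18, 5))
    # -> PostgreSQL then trips over the bare "datetime.datetime(...)".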

Is there a way to get the date fields into a suitable format for use in the SQL values?

An output handler (from cx_Oracle) is available.
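
To illustrate the direction I am after, here is a rough sketch (not working code from my script): the position of the timestamp column, the assumption that "07-12-20 01.18.05" means DD-MM-YY HH.MM.SS, and building one placeholder per value are all guesses, but it keeps the column names out of the statement and lets psycopg2 adapt the datetime itself.

    from datetime import datetime

    TS_COL = 6                       # assumed position of the timestamp column in the row
    TS_FORMAT = "%d-%m-%y %H.%M.%S"  # assumed to match strings like "07-12-20 01.18.05"

    def prepare_row(row):
        """Return the row as a list with the timestamp string replaced by a datetime."""
        values = list(row)
        values[TS_COL] = datetime.strptime(values[TS_COL], TS_FORMAT)
        return values

    # One %s per column, so the columns never have to be named; the values are
    # passed separately and psycopg2 does the quoting and datetime adaptation.
    placeholders = ", ".join(["%s"] * len(row))
    cur1.execute("INSERT INTO ttab VALUES (" + placeholders + ")", prepare_row(row))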

tmoe
  • Hi, interesting: you could load them into a staging table where the date column is a string, then use `insert into ... select` to transform the source value into a date (a rough sketch follows below these comments). – IronMan Mar 15 '21 at 17:30
  • Please include data to illustrate what you are attempting to insert. Also, you should be parameterizing your queries, [well known](https://stackoverflow.com/questions/1633332/how-to-put-parameterized-sql-query-into-variable-and-then-execute-in-python) in industry beyond Python and Postgres and [well documented](https://www.psycopg.org/docs/usage.html) in psycopg2. – Parfait Mar 15 '21 at 20:47
  • By the way, the modulo operator, `%`, for string parameterization is now [de-emphasized](https://stackoverflow.com/a/13452357/1422451) in Python 3 (not deprecated *yet*). – Parfait Mar 15 '21 at 20:49
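
For completeness, a rough sketch of the staging-table route from the first comment; `ttab_stage`, `updated_at`, the surrounding column names, and the `to_timestamp` format are made-up placeholders, not names from the actual schema:

    # Stage the raw rows with the problem column as plain text.
    cur1.execute("CREATE TABLE ttab_stage (LIKE ttab)")
    cur1.execute("ALTER TABLE ttab_stage ALTER COLUMN updated_at TYPE text")

    # ...bulk-load the rows, timestamps still as strings, into ttab_stage...

    # Copy into the real table, converting the string on the way; col1, col2, ...
    # stand for the remaining columns of ttab.
    cur1.execute("""
        INSERT INTO ttab
        SELECT col1, col2, to_timestamp(updated_at, 'DD-MM-YY HH24.MI.SS')
        FROM ttab_stage
    """)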

0 Answers