
I'm trying to speed up to_sql() by using fast_executemany, but I'm getting an error:

TypeError: Invalid argument(s) 'fast_executemany' sent to create_engine(), using configuration PGDialect_psycopg2/QueuePool/Engine. Please check that the keyword arguments are appropriate for this combination of components.

I've been referencing this previous question, Speeding up pandas.DataFrame.to_sql with fast_executemany of pyODBC.

from sqlalchemy import create_engine

database_url = 'postgresql://{user}:{password}@{host}:5432/{database_name}'.format(
    user=user,
    host=host,
    password=password,
    database_name=database_name,
)
engine = create_engine(database_url, echo=False, fast_executemany=True)
df.to_sql('parquet', con=engine, if_exists='replace')

This code executes if I remove the fast_executemany argument, but it takes quite a long time. I'm using Python 3.7.
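For reference, fast_executemany is specific to the mssql+pyodbc dialect, so the PostgreSQL dialect rejects it. A driver-agnostic way to batch the inserts is the method='multi' option of to_sql, sketched here (bulk_to_sql is an illustrative helper name, and the chunksize value is just an example):

```python
import pandas as pd
from sqlalchemy import create_engine

def bulk_to_sql(df, engine, table_name):
    # method='multi' packs many rows into each INSERT statement,
    # which is usually faster than one INSERT per row.
    # chunksize bounds the number of rows (and bind parameters) per statement.
    df.to_sql(table_name, con=engine, if_exists='replace',
              index=False, method='multi', chunksize=1000)

# Usage (works with any SQLAlchemy engine, not just PostgreSQL):
# engine = create_engine(database_url, echo=False)
# bulk_to_sql(df, engine, 'parquet')
```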

Tim
    I think fast_executemany is just for Pyodbc driver https://docs.sqlalchemy.org/en/13/dialects/mssql.html#mssql-pyodbc-fastexecutemany – GiovaniSalazar Nov 28 '19 at 04:45
  • You are correct, I'm trying to speed up the upload with to_sql() but this doesn't work. I think I found the answer here, https://stackoverflow.com/questions/23103962/how-to-write-dataframe-to-postgres-table – Tim Dec 01 '19 at 14:57
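Following up on the linked answer: for PostgreSQL specifically, to_sql accepts a method= callable, and a COPY-based callable is typically much faster than row-by-row INSERTs. A minimal sketch of that approach (psql_insert_copy is an illustrative name, and it assumes the underlying DBAPI driver is psycopg2):

```python
import csv
import io

def psql_insert_copy(table, conn, keys, data_iter):
    """to_sql 'method' callable that loads rows via PostgreSQL COPY ... FROM STDIN.

    table: pandas.io.sql.SQLTable; conn: SQLAlchemy connection;
    keys: column names; data_iter: iterable of row tuples.
    Assumes the DBAPI driver is psycopg2 (for cursor.copy_expert).
    """
    dbapi_conn = conn.connection  # raw DBAPI (psycopg2) connection
    with dbapi_conn.cursor() as cur:
        # Serialize the rows to an in-memory CSV buffer.
        buf = io.StringIO()
        csv.writer(buf).writerows(data_iter)
        buf.seek(0)
        columns = ', '.join('"{}"'.format(k) for k in keys)
        table_name = ('{}.{}'.format(table.schema, table.name)
                      if table.schema else table.name)
        sql = 'COPY {} ({}) FROM STDIN WITH CSV'.format(table_name, columns)
        cur.copy_expert(sql=sql, file=buf)

# Usage:
# df.to_sql('parquet', con=engine, if_exists='replace',
#           index=False, method=psql_insert_copy)
```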

0 Answers