
Does anyone know of some Python package or function that can upload a Pandas DataFrame (or simply a .csv) to a PostgreSQL table, even if the table doesn't yet exist?

(i.e. it runs a CREATE TABLE with the appropriate column names and column types, based on a mapping between the Python data types and their closest equivalents in PostgreSQL)

In R, I use the ROracle package, which provides a dbWriteTable function that does what I've described above. (see docs here)

Tommy O'Dell

2 Answers


Since pandas 0.14, the sql functions also support PostgreSQL (via SQLAlchemy, so all database flavors supported by SQLAlchemy work). So you can simply use to_sql to write a pandas DataFrame to a PostgreSQL database:

import pandas as pd
from sqlalchemy import create_engine
import psycopg2  # the driver SQLAlchemy uses for postgresql:// URLs

engine = create_engine('postgresql://scott:tiger@localhost:5432/mydatabase')

df.to_sql("table_name", engine)

See the docs: http://pandas.pydata.org/pandas-docs/stable/io.html#sql-queries
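To address the .csv part of the question: to_sql issues the CREATE TABLE for you when the table doesn't exist, and the if_exists parameter controls what happens on a rerun ('fail', 'replace', or 'append'). A minimal sketch of the round trip — the table name is a placeholder, the CSV is inlined as a stand-in for your file, and an in-memory SQLite engine is used here so the snippet runs without a server, but the same calls work with the postgresql:// URL above:

```python
import io
import pandas as pd
from sqlalchemy import create_engine

# Stand-in for pd.read_csv("yourfile.csv"); any DataFrame behaves the same way.
csv_data = io.StringIO("id,name,price\n1,apple,0.5\n2,banana,0.25\n")
df = pd.read_csv(csv_data)

# For PostgreSQL this would be e.g.
#   engine = create_engine('postgresql://scott:tiger@localhost:5432/mydatabase')
engine = create_engine("sqlite://")  # in-memory DB so the example is self-contained

# Creates the table (with inferred column types) since it doesn't exist yet;
# if_exists='replace' drops and recreates it on subsequent runs.
df.to_sql("prices", engine, index=False, if_exists="replace")

# Read it back to confirm the round trip.
print(pd.read_sql("SELECT * FROM prices", engine))
```

Pass index=False if you don't want the DataFrame's index stored as an extra column.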

If you have an older version of pandas (< 0.14), see this question: How to write DataFrame to postgres table?

joris

There is a gist that attempts this: https://gist.github.com/catawbasam/3164289 Not sure how well it works.

user531525