I am hosting a Postgres instance locally on my machine, but I have been unsuccessful in importing larger .csv files (1.0–1.9 GB) into the database using the methods recommended in the docs and on SO.
- When I try to create the table using the pgAdmin 4 GUI and then COPY FROM the csv, I get the error message "could not open file "C:\Users\username\myfile.csv" for reading: Permission denied".
- Per the suggestion in this post, I copied the file to C:\Windows\Temp\myfile.csv, but received the same error. In both cases, I am unclear why the local instance is unable to read from my hard drive. The documentation states that the file's location must be given from the server's perspective, but I have provided the full path and the exception is still raised.
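  If I read the COPY docs correctly, the distinction I may be tripping over is which process opens the file. A sketch of the two forms as I understand them (table name hypothetical):

  ```sql
  -- Server-side COPY: the file is opened by the PostgreSQL *server*
  -- process (the service account on Windows), so that account needs
  -- read access to the path -- presumably why I get "Permission denied":
  COPY table_name FROM 'C:\Users\username\myfile.csv' WITH (FORMAT csv, HEADER);

  -- Client-side \copy (psql meta-command): the file is read by psql
  -- under my own Windows account, so no server-side permissions apply:
  -- \copy table_name from 'C:\Users\username\myfile.csv' with (format csv, header)
  ```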
- Using
\copy table_name from 'C:\Windows\Temp\myfile.csv';
from the psql command line returns ERROR: relation "table_name" does not exist. I am more proficient with SQL Server and a newcomer to PostgreSQL, so I tried
\copy 'C:\Users\Temp\all_pbp.csv' from stdin;
under the impression that I had perhaps misunderstood the syntax in the docs, but it raised a syntax error, as expected. Using
\copy all_pbp from 'C:\Users\username\myfile.csv' stdin;
also raises a syntax error. Appending
with delimiter ',' csv header encoding 'windows-1251';
does nothing to resolve the "relation "table_name" does not exist" message. The only responses I can find to this error simply state that one must be careful with casing.
- Including the schema name in quotation marks to preserve casing from psql raises ERROR: schema "schema_name" does not exist.
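  For reference, this is the form I believe I should be using, after confirming the exact names psql can actually see (schema and table names below are hypothetical placeholders, not my real ones):

  ```sql
  -- List every table with its schema, to confirm the exact
  -- (case-sensitive) names pgAdmin actually created:
  \dt *.*

  -- If pgAdmin created the table as "Table_Name" in schema "public",
  -- mixed-case identifiers must be double-quoted exactly:
  \copy public."Table_Name" from 'C:\Users\username\myfile.csv' with (format csv, header, encoding 'windows-1251')
  ```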
- Creating the table from a CREATE script generated in pgAdmin and then using the Import/Export dialog also fails; the only message is "Failed (error code 1)".
- Using Python (pandas + SQLAlchemy) also fails, with the code below:
myfile = pd.read_csv("C:\\Users\\username\\myfile.csv")
myfile.to_sql('myfile', engine)
after creating a Postgres engine with SQLAlchemy as follows:
engine = create_engine('postgresql://postgres:password@localhost:port/schema')
The to_sql call runs for a long time but never completes and never returns an error message.
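A chunked variant I am considering trying, based on my reading of the pandas docs, so that the 1–2 GB file is never held in memory at once (connection string, path, and table name are hypothetical placeholders; this is a sketch, not something I have verified against my data):

```python
import pandas as pd
from sqlalchemy import create_engine

def load_csv_in_chunks(engine, csv_path, table_name, chunksize=100_000):
    """Stream a large CSV into the database in fixed-size chunks,
    so pandas never loads the whole file into memory at once."""
    for chunk in pd.read_csv(csv_path, chunksize=chunksize):
        # method="multi" batches many rows into each INSERT statement
        chunk.to_sql(table_name, engine, if_exists="append",
                     index=False, method="multi")

# Example usage (hypothetical credentials -- note the final path
# component of the URL is the *database* name, not a schema):
# engine = create_engine("postgresql://postgres:password@localhost:5432/mydb")
# load_csv_in_chunks(engine, r"C:\Users\username\myfile.csv", "myfile")
```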
TL;DR: I need help getting one of the recommended methods of importing a csv from my hard drive into a local Postgres database to behave as expected with large csv files.