
I am hosting a Postgres instance locally on my machine, but I have been unsuccessful in importing larger .csv files (1.0–1.9 GB) into the database using the methods available in the docs and on SO.

  • When I try to create the table using the pgAdmin 4 GUI and then `COPY FROM` the csv, I get the error message `could not open file "C:\Users\username\myfile.csv" for reading: Permission denied`.
  • Per the suggestion in this post, I copied the file to C:\Windows\Temp\myfile.csv, but received the same error. In both cases I am unclear why the local instance is unable to read from my hard drive. The documentation states that the file's location must be clear from the client's perspective, but I have provided the full path and the exception is still raised. (See the COPY vs `\copy` sketch after this list.)
  • Using `\copy table_name from 'C:\\Windows\Temp\myfile.csv';` from the psql command line returns `ERROR: relation "table_name" does not exist`. I am more proficient with SQL Server and am a newcomer to PostgreSQL, so I tried `\copy 'C:\Users\Temp\all_pbp.csv' from stdin;` under the impression that perhaps I had misunderstood the syntax in the docs, but it raised a syntax error, as expected. `\copy all_pbp from 'C:\Users\username\myfile.csv' stdin;` also raises a syntax error. Appending `with delimiter ',' csv header encoding 'windows-1251';` does nothing to get rid of the `relation "table_name" does not exist` message. The only responses I can find to this error simply state that one must be certain of the casing. (See the `\copy` sketch after this list for the form I now believe is correct.)
  • Including the schema name in double quotes to preserve casing from psql raises `ERROR: schema "schema_name" does not exist`. (See the connection checks after this list.)
  • Creating the table from a CREATE script generated in pgAdmin and then using the Import/Export dialog also fails. The only message is `Failed (error code 1)`.
  • Python fails as well. After creating a Postgres engine with SQLAlchemy via `engine = create_engine('postgresql://postgres:password@localhost:port/schema')`, I run `myfile = pd.read_csv("C:\\Users\\username\\myfile.csv")` followed by `myfile.to_sql('myfile', engine)`. The `to_sql` call runs for a long time but never completes and never returns an error message.
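
For concreteness, here is my current understanding of the difference between the two commands, as a minimal sketch; I am assuming the table is named all_pbp and already exists in the connected database:

```
-- Server-side COPY: the file is opened by the PostgreSQL server process,
-- so the OS account the service runs as (not my own Windows user) needs
-- read permission on the file. This would explain the "Permission denied" error.
COPY all_pbp FROM 'C:\Windows\Temp\myfile.csv' WITH (FORMAT csv, HEADER);

-- Client-side \copy: a psql meta-command that must fit on one line.
-- psql itself reads the file as my Windows user and streams it to the server.
\copy all_pbp from 'C:\Users\username\myfile.csv' with (format csv, header)
```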
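
Likewise, the single `\copy` line I now believe the docs intend, with the delimiter, header, and encoding options attached (again assuming the table all_pbp exists and I am connected to the database that contains it):

```
\copy all_pbp from 'C:\Users\username\myfile.csv' with (format csv, header, delimiter ',', encoding 'windows-1251')
```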
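
Finally, the checks I would run from psql to confirm I am actually connected to the right database and schema, since `relation "table_name" does not exist` suggests I may not be:

```
-- show the current connection (database, user, host, port)
\conninfo

-- which database and schema name resolution currently uses
SELECT current_database(), current_schema();

-- list all schemas, and tables across all schemas
\dn
\dt *.*
```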

TL;DR I need help getting one of the recommended methods of importing a csv from my hard drive into a local Postgres database to behave as expected with large csv files.

Greg
  • When you call `\copy <table> from <file>` from psql, are you sure that you are connected to the right database? Does `select * from <table>;` work? Which database version are you using? – CC. Sep 22 '20 at 21:11
  • `COPY` works from the perspective of the Postgres server, so the file must be accessible by the user the server is running as. `\copy` runs from the client's perspective, so the file needs to be accessible by the client user. The table name is the one you are having an issue with, so it is the part that should be quoted. While you are working this out, I would create a smaller sample `CSV` file to run against. – Adrian Klaver Sep 22 '20 at 22:02
  • Your problems are simple file permission errors and naming problems. Use table names that consist only of lower case characters, underscores and digits. Make sure you connect to the correct database. – Laurenz Albe Sep 23 '20 at 06:18
  • @CC I am using version 12.4. To your point, however, the only query that executes successfully from psql is `select version();` @Adrian Klaver thank you for the suggestions. I have tried putting quotes around both the table name and the schema, but it appears that CC is correct that I am not connected to the database. I will research mistakes in connecting to the database. When launching psql, I provided the credentials I used when creating the database, so there must be something I'm missing that is preventing me from actually connecting to the database. – Greg Sep 23 '20 at 12:58
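
To illustrate the quoting rule from the comments with a minimal sketch (the mixed-case table name `All_PBP` here is hypothetical):

```
-- Unquoted identifiers are folded to lower case, so both of these
-- look up a table named all_pbp:
SELECT count(*) FROM All_PBP;
SELECT count(*) FROM all_pbp;

-- Only a double-quoted identifier preserves case, and a table created
-- with quotes must be quoted on every later reference:
SELECT count(*) FROM "All_PBP";
\copy "All_PBP" from 'C:\Users\username\myfile.csv' with (format csv, header)
```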

0 Answers