
How can I load a large csv.gz file into PostgreSQL without first unzipping it to a csv file? I tried a named pipe (mkfifo pipelinename), but it didn't work for me. Is there any other solution to this issue?

I have tried loading it from my local machine into PostgreSQL using the following command: zcat file.csv.gz | psql -U username -d database;

Result: out of memory

Need: I want to load a large csv.gz file (around 15+ GB) from CentOS into a PostgreSQL database.
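
Piping zcat straight into psql makes psql read the decompressed CSV as if it were SQL, which is most likely where the out-of-memory error comes from. A streaming load needs a \copy command that reads the data from psql's standard input; a rough sketch, assuming the target table (here called target_table) already exists with matching columns and the file has a header row:

zcat file.csv.gz | psql -U username -d database -c "\copy target_table FROM pstdin WITH (FORMAT csv, HEADER true, DELIMITER ',')"

pstdin tells \copy to read from psql's own standard input, so the rows are streamed to the server instead of the whole file being held in memory.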

kavipriya M
https://stackoverflow.com/a/41741644/2235885 (`didn't work for me` is not a very useful description. What exactly did you try? What went wrong? What was the error message?) – joop Sep 04 '18 at 14:26

2 Answers


Note that this should also work from inside psql:

\copy TABLE_NAME FROM PROGRAM 'gzip -dc FILENAME.csv.gz' DELIMITER ',' CSV HEADER NULL ''
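
The same thing can be run non-interactively from the shell, for example (username, database, TABLE_NAME and FILENAME.csv.gz are placeholders, and the target table must already exist):

psql -U username -d database -c "\copy TABLE_NAME FROM PROGRAM 'gzip -dc FILENAME.csv.gz' DELIMITER ',' CSV HEADER NULL ''"

Because \copy streams the decompressed output of gzip -dc to the server, the file never has to be fully unpacked on disk or held in memory.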

Vincent

Just to share my simple example using zcat instead of gzip, since it is less typing. I am using zcat to expand the gzipped file:

\copy tmptable from program 'zcat O1variant.tab.gz' with (format csv, delimiter E'\t', header TRUE)
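
Note that \copy ... FROM PROGRAM runs zcat on the client machine. If the file lives on the database server itself and you connect as a superuser (or a member of pg_execute_server_program), the server-side COPY variant should also work; a sketch, with the path as a placeholder:

COPY tmptable FROM PROGRAM 'zcat /path/on/server/O1variant.tab.gz' WITH (FORMAT csv, DELIMITER E'\t', HEADER true);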
Kemin Zhou