How can I load a large csv.gz file into PostgreSQL without first unzipping it to a plain .csv file? I tried the named-pipe approach (mkfifo pipename), but it did not work for me. Is there another way to solve this?
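By the named-pipe approach I mean something along these lines; the pipe path, the target table mytable, and the CSV options are placeholders rather than my exact command:

mkfifo /tmp/csv_pipe                 # create the named pipe
zcat file.csv.gz > /tmp/csv_pipe &   # decompress into the pipe in the background
psql -U username -d database -c "\copy mytable FROM '/tmp/csv_pipe' WITH (FORMAT csv, HEADER)"
rm /tmp/csv_pipe                     # clean up the pipe afterwards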
I also tried loading the file directly from my local machine into PostgreSQL with the following command:
zcat file.csv.gz | psql -U username -d database
Result: the command fails with an "out of memory" error.
Need: I want to load a large csv.gz file (around 15+ GB) from a CentOS machine into a PostgreSQL database.
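In other words, I am looking for a single streaming command along these lines that can handle a 15+ GB compressed file without running out of memory; the table name mytable and the CSV options below are placeholders for my actual schema:

# stream the decompressed CSV straight into a client-side COPY, without a temporary file
zcat file.csv.gz | psql -U username -d database -c "\copy mytable FROM STDIN WITH (FORMAT csv, HEADER)"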