I have a PostgreSQL script that automatically imports CSV files into my database. The script can detect and remove duplicate records and do a proper upsert, but it still can't handle everything. The CSV files are exported from other systems, which append extra information at the beginning and end of the file, e.g.:

Total Count: 2956
Avg Time: 13ms

Column1, Column2, Column3
...      ...      ... 

What I want to do is skip those initial rows, as well as any rows at the bottom of the file. Is there any way I can do this in PostgreSQL, via COPY or any other route? Can I, for instance, call operating system commands from PostgreSQL?

Dimitris

1 Answer

On Linux, use tail to skip the leading lines and head to drop the trailing line, then pipe the result to your script:

tail -n +3 file.csv | head -n -1 | psql -f my_script.sql my_database

Then your script will copy from STDIN:

copy my_table from STDIN;
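A minimal end-to-end sketch of the cropping step (file names, line counts, and the table name are placeholders; adjust tail/head offsets to match how many junk lines your export actually has; note that head -n -1 is a GNU coreutils extension):

```shell
#!/bin/sh
# Hypothetical sample export: two metadata lines at the top,
# then the real CSV header and data, then a trailing summary line.
cat > export.csv <<'EOF'
Total Count: 2
Avg Time: 13ms
Column1,Column2,Column3
a,1,x
b,2,y
Summary: done
EOF

# tail -n +3 starts output at line 3 (dropping the two metadata lines);
# head -n -1 (GNU) drops the last line (the trailing summary).
tail -n +3 export.csv | head -n -1 > clean.csv
cat clean.csv

# The cleaned stream could then be fed to the import script, e.g.:
#   tail -n +3 export.csv | head -n -1 | psql -f my_script.sql my_database
```

Running this leaves clean.csv containing only the header row and the two data rows, which is what COPY ... FROM STDIN expects to receive.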
Clodoaldo Neto
  • Sorry, I don't have much experience with PostgreSQL. Are the tail and head commands invoked via the PostgreSQL console? How can I set this up? – Dimitris Apr 16 '13 at 13:52
  • @Dimitris. Via the Linux command line. I guess Windows will have its counterparts. – Clodoaldo Neto Apr 16 '13 at 13:54
  • Unfortunately this is on a Windows machine. That's what the client has. So I guess I will need to look for the equivalent in Windows – Dimitris Apr 16 '13 at 14:06