I have a CSV with 90 columns that I need to import as a table into my PostgreSQL database (and there are several more CSV files with large numbers of columns to which I would like to apply the same method). My aim is to avoid manually declaring 90 separate columns in a CREATE TABLE statement.

The column names in the table should remain the same as the headers in the CSV, and every column should be imported as a numeric data type with two decimal places.

So far, the only program I've come across that does this is pgfutter, which I have installed successfully. However, the database I am connecting to is a remote one on AWS, and it is unclear where to input the connection details. Also, after installing, I get an error when requesting help info:

$ ./pgfutter --help
-bash: ./pgfutter: Permission denied

Could anyone suggest a workaround in pgfutter, or another method to automatically import a CSV file with straightforward numeric columns into PostgreSQL?

iskandarblue
  • I'd check the permissions on `pgfutter` and make sure that it's executable. It sounds like that's your blocker. Something like `ls -l pgfutter` should show you the permissions (on the left). The `chmod` command can be used to change permissions (see `man chmod`). – jmelesky Feb 27 '17 at 19:36
  • "*I have a csv with 90 columns that I need to import as a table to my pgsql database*" A 90 column SQL table is a very poorly designed table and will be a nightmare to work with. You should do some work to translate the CSV data into something more sensible. Maybe ask about that instead. – Schwern Feb 27 '17 at 20:24
  • Why is it automatically a poorly designed table and not simply a table with many different variables? – iskandarblue Feb 27 '17 at 20:30
  • Columns in a table aren't "variables". They are attributes of the entity that the table represents. Very often a table with 90 columns indicates that the table is not properly normalized and in fact combines information from different entities. But I do agree that this discussion isn't relevant for the question. –  Feb 27 '17 at 21:52
  • [How to generate a schema from a CSV for a PostgreSQL Copy](http://stackoverflow.com/questions/35243432/how-to-generate-a-schema-from-a-csv-for-a-postgresql-copy/38688898#38688898) – klin Feb 28 '17 at 11:14

1 Answer

It is simple to write a shell script that constructs a CREATE TABLE statement from the first line of a CSV file.

Here is my solution:

#!/bin/bash

# makes a CREATE TABLE statement out of the first line of a CSV file

# usage: mktab <tabname> <CSV file>

if [ -z "$2" -o -n "$3" ]; then
    echo "usage: mktab <tabname> <CSV file>" 1>&2
    exit 1
fi

# split the header line on commas
IFS=,
first=1

echo -n "CREATE TABLE \"$1\" ("
# read only the header line; strip a carriage return in case the
# CSV has DOS line endings
for col in $(head -n 1 "$2" | tr -d '\r'); do
    if [ $first -eq 1 ]; then
        first=0
    else
        echo -n ', '
    fi
    echo -n "\"$col\" numeric(10,2)"
done
echo ');'

exit 0
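To sketch how this could be used end to end (the table name, sample file and AWS host below are placeholders, and the inlined `sed` pipeline just reproduces the same header-to-DDL transformation for illustration):

```shell
# create a tiny sample CSV as a stand-in for the real 90-column file
printf 'height,weight,age\n1.80,75.5,31\n' > sample.csv

# with the script above saved as mktab, you would run:
#   chmod +x mktab            # fixes the "Permission denied" error
#   ./mktab mytable sample.csv > create.sql
# the same transformation, inlined with sed for illustration:
head -n 1 sample.csv | tr -d '\r' |
  sed -e 's/^/CREATE TABLE "mytable" ("/' \
      -e 's/,/" numeric(10,2), "/g' \
      -e 's/$/" numeric(10,2));/' > create.sql
cat create.sql

# then run the DDL and load the data on the remote AWS instance;
# host, user and database names are placeholders:
#   psql -h mydb.xxxx.rds.amazonaws.com -U myuser -d mydb -f create.sql
#   psql -h mydb.xxxx.rds.amazonaws.com -U myuser -d mydb \
#        -c "\copy mytable FROM 'sample.csv' WITH (FORMAT csv, HEADER true)"
```

Connection details go on the `psql` command line (or in `PGHOST`/`PGUSER`/`PGPASSWORD` environment variables), which sidesteps the question of where pgfutter takes them. Note that `\copy` with `HEADER true` skips the header row when loading.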
Laurenz Albe