
In my Java application I import some CSV files in PostgreSQL 10.3 with PgAdmin4 (JDBC).

private static void importDB(String path, String type) {
    try {
        Class.forName(DRIVER_CLASS_NAME);
        Connection c = DriverManager.getConnection(DB_URL, USER, PASS);
        Statement stmt = c.createStatement();
        // COPY runs server-side: the file at 'path' must be readable by the PostgreSQL server
        String sql = "COPY " + type + " FROM '" + path + "' CSV HEADER";
        stmt.executeUpdate(sql);
        stmt.close();
        c.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

Now, if I modify some non-key values in the CSV file and try to re-import it, there is an obvious primary-key conflict, since the file is essentially another copy of the one I imported first. Assuming the tables have a composite primary key (id, sat), how can I check these keys against the existing rows and overwrite the non-key columns when they match?

    I don't know how big is your CSV file, but in case it's small enough for creating `INSERT` statements, I would definitely use `UPSERT` - of course, if you're using PostgreSQL 9.5+. `UPSERT`s do exactly what you need ;-) – Jim Jones Mar 16 '18 at 15:39
  • The biggest one is 90 MB and I had some issues using INSERT instead of COPY to import, like 10 minutes just for this file... So I can't use UPSERT, I guess, if it's like INSERT in terms of efficiency. – oiac412245 Mar 17 '18 at 17:56
  • If the issue is that jdbc is too slow, see https://stackoverflow.com/q/758945 – Nathan Hughes Mar 18 '18 at 02:00
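One way to keep COPY's bulk-load speed while still overwriting existing rows (as the comments suggest with UPSERT) is to COPY into a temporary staging table and then merge into the target with `INSERT ... ON CONFLICT DO UPDATE` (PostgreSQL 9.5+). A minimal sketch that builds the statement sequence; the target table `sat_data` and the non-key column `value` are hypothetical stand-ins for the real schema:

```java
import java.util.List;

public class UpsertSql {
    // Build the SQL for a staging-table upsert: bulk-load the CSV via
    // COPY, then merge into the target table, overwriting non-key
    // columns whenever the (id, sat) primary key already exists.
    // Table/column names here ("staging", "value") are placeholders.
    public static List<String> buildUpsert(String table, String path) {
        return List.of(
            // Temp table with the same columns as the target
            "CREATE TEMP TABLE staging (LIKE " + table + ")",
            // Fast bulk load, exactly like the original importDB()
            "COPY staging FROM '" + path + "' CSV HEADER",
            // Merge: insert new rows, update non-key columns on conflict
            "INSERT INTO " + table + " SELECT * FROM staging "
                + "ON CONFLICT (id, sat) DO UPDATE SET value = EXCLUDED.value",
            "DROP TABLE staging");
    }
}
```

Each statement would be run in order with `stmt.executeUpdate(...)`, ideally inside one transaction. Note that server-side COPY requires the file to be readable by the PostgreSQL server process; if the file lives on the client, pgJDBC's `CopyManager` can stream it over the connection instead.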

0 Answers