
I'm trying to load CSV data using SQLServerBulkCSVFileRecord, but it seems this class doesn't allow for enclosed fields (stripping the quotes) or for supplying a value to be treated as NULL.

Any suggestions on how to get this working? Seems like such basic functionality needed for loading a CSV file.

Code I'm currently using:

import java.sql.ResultSet;
import java.sql.ResultSetMetaData;

import com.microsoft.sqlserver.jdbc.SQLServerBulkCopy;
import com.microsoft.sqlserver.jdbc.SQLServerBulkCSVFileRecord;

final SQLServerBulkCopy copymanager = new SQLServerBulkCopy(connection);
final SQLServerBulkCSVFileRecord csv = new SQLServerBulkCSVFileRecord(datafile.toString(), true);

// Derive the column metadata from the destination table itself
final ResultSet resultSet = connection.createStatement().executeQuery(
        String.format("select * from [%s].[%s]", getSchema(), tableName));
final ResultSetMetaData meta = resultSet.getMetaData();
for (int i = 1; i <= meta.getColumnCount(); i++) {
    csv.addColumnMetadata(i, meta.getColumnName(i), meta.getColumnType(i),
            meta.getPrecision(i), meta.getScale(i));
}

copymanager.setDestinationTableName(tableName);
copymanager.writeToServer(csv);
ferdyh

1 Answer


Interestingly, the source for SQLServerBulkCSVFileRecord has a comment:

    Both BCP and BULK INSERT considers double quotes as part of the data and
    throws error if any data (say "10") is to be inserted into an numeric
    column. Our implementation does the same.

That means you will have to parse each row in your own code to unquote quoted values.
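As a sketch of that parsing step, a small helper like the one below could strip one level of enclosing double quotes from a field and collapse doubled quotes, before the value is handed to the bulk copy. The class and method names here are my own, not part of the JDBC driver:

```java
// Hypothetical helper (not part of mssql-jdbc): unquotes a single CSV field.
public class CsvUnquote {
    public static String unquote(String field) {
        if (field != null && field.length() >= 2
                && field.startsWith("\"") && field.endsWith("\"")) {
            // Drop the enclosing quotes and turn escaped "" into "
            return field.substring(1, field.length() - 1).replace("\"\"", "\"");
        }
        return field; // unquoted fields pass through unchanged
    }
}
```

You would apply this to every field of every row, which is exactly the work the driver's CSV reader declines to do.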

As far as supplying a value for NULL, you can do it in your code or supply a DEFAULT value in the table. Your choice.
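If you handle it in code, the mapping could look like this minimal sketch. The "NULL" marker is an assumption, pick whatever token your CSV export actually uses; again, the class and method are hypothetical, not driver API:

```java
// Hypothetical preprocessing step: map a chosen marker (here the literal
// string "NULL", or an empty field) to a Java null before bulk copy.
public class CsvNulls {
    public static String toNull(String field) {
        if (field == null || field.isEmpty() || field.equals("NULL")) {
            return null;
        }
        return field;
    }
}
```

A Java null returned from a custom bulk-record implementation would then end up as SQL NULL in the destination column.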

user1443098
  • Well, thanks, but it's still ugly... For example, the PostgreSQL driver has an option to define the enclosing character (optionally) and can define a string value for null. – ferdyh Aug 16 '18 at 13:28
  • https://stackoverflow.com/questions/1405038/reading-a-csv-file-in-net mentions using a .NET class to parse csv, it might help you here – user1443098 Aug 16 '18 at 13:37
  • I ended up using the SqlBulkCopy class with an InputStream and rewriting the original file without quotes and the unused separator. – ferdyh Aug 17 '18 at 07:18
  • I now ended up extending SQLServerBulkCSVFileRecord and building the backend on FastCSV... Seems more seamless; I can share part of the code if anyone is interested. – ferdyh Aug 20 '18 at 10:05