We are using Spring Batch to sync records from one DB to another.

The target table definition is like this:

CREATE TABLE entity_name (
    entity_id varchar(255) NOT NULL,
    value varchar(255) NOT NULL,
    source varchar(255) NULL,
    type enum('type1', 'type2','type3') NULL,
    timestamp timestamp NULL,
    status varchar(50) DEFAULT 'approved',
    CONSTRAINT entity_name_pkey PRIMARY KEY (entity_id, value)
);

The writer is instantiated by doing:

String command = "insert into entity_name(entity_id, value, source, type, status, timestamp) values(:entityId, :value, :source, :type, :status, :timestamp)";

JdbcBatchItemWriter<EntityName> entityWriter = new DefaultItemWriter<EntityName>(dataSource, jdbcTemplate, command, excludedTypes).getWriter();

However, based on the JdbcBatchItemWriter debug logs, the timestamp column seems to be getting removed from the statement:

08:48:16.528 [main] DEBUG org.springframework.batch.item.database.JdbcBatchItemWriter - Executing batch with 2 items.
08:48:16.529 [main] DEBUG org.springframework.jdbc.core.JdbcTemplate - Executing SQL batch update [insert into entity_name(entity_id, value, source, type, status ) values(?, ?, ?, ?, ?)]

I know we probably shouldn't be using a reserved word as a column name, but that's how it is at the moment. Is this expected behaviour in JdbcTemplate, and is there any way to work around it?
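For reference, here is a minimal sketch of how I would expect the same statement to be wired up with the stock JdbcBatchItemWriterBuilder (Spring Batch 4.x), instead of our DefaultItemWriter wrapper. As far as I can tell, the writer substitutes named parameters with ? placeholders but otherwise passes the SQL through verbatim, so I wouldn't expect it to drop a column on its own. Class and method names other than the Spring Batch API are assumptions mirroring the snippet above:

import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;

public class EntityWriterConfig {

    public JdbcBatchItemWriter<EntityName> entityWriter(DataSource dataSource) throws Exception {
        // Same statement as above, timestamp column included.
        String command = "insert into entity_name(entity_id, value, source, type, status, timestamp) "
                + "values(:entityId, :value, :source, :type, :status, :timestamp)";

        // beanMapped() resolves :entityId, :timestamp, etc. against the item's getters;
        // the SQL string itself is not rewritten.
        JdbcBatchItemWriter<EntityName> writer = new JdbcBatchItemWriterBuilder<EntityName>()
                .dataSource(dataSource)
                .sql(command)
                .beanMapped()
                .build();
        writer.afterPropertiesSet();
        return writer;
    }
}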

alegria

1 Answer

After debugging further, I figured out that there's a test configuration with another SQL command definition that omits the timestamp column.
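In other words, the writer and JdbcTemplate were behaving correctly; the test context was simply supplying a different statement. A hypothetical illustration of the kind of test override that can cause this (bean and class names here are assumptions, not the actual project code):

import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;

@TestConfiguration
public class EntityWriterTestConfig {

    @Bean
    @Primary
    public JdbcBatchItemWriter<EntityName> entityWriter(DataSource dataSource) {
        // This SQL omits the timestamp column, so the batch update logged by
        // JdbcTemplate also shows only five placeholders.
        String testCommand = "insert into entity_name(entity_id, value, source, type, status) "
                + "values(:entityId, :value, :source, :type, :status)";

        return new JdbcBatchItemWriterBuilder<EntityName>()
                .dataSource(dataSource)
                .sql(testCommand)
                .beanMapped()
                .build();
    }
}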

alegria