
I am using Logstash version 7.3.2 to extract data from SQL Server. It is throwing the below exception:

Exception when executing JDBC query {:exception=>#<... transition (daylight savings time 'gap'): 1942-09-01T00:00:00.000 (Asia/Kolkata)>}

I have specified the below query in Logstash:

select * 
from mytable 
where lastupdatetimestamp > :sql_last_value

Please suggest what I am missing here.

I am using the below config:

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://HOST:PORT;databaseName=DB_NAME;integratedSecurity=false"
    jdbc_user => "USERNAME"
    jdbc_password => "PASSWORD"
    jdbc_driver_library => "/home/user/LOGSTASH/mssql-jdbc-7.4.1.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM TABLE_NAME WHERE INCR_COLUMN > :sql_last_value"
    schedule => "*/60 * * * * *"
    use_column_value => true
    tracking_column => "INCR_COLUMN"
    tracking_column_type => "timestamp"
    record_last_run => true
    last_run_metadata_path => "/home/user/LOGSTASH/LAST_RUN/.mssql_USERS_logstash_jdbc_last_run"
    connection_retry_attempts => "1000"
    connection_retry_attempts_wait_time => "60"
    jdbc_default_timezone => "UTC"
  }
}
– Raj
  • That's not a SQL Server error. SQL Server doesn't have types with timezones. `datetimeoffset` contains a timezone offset, not a timezone name. This means the error is caused when logstash is trying to convert between different types that may or may not support offsets. What are the types of the `lastupdatetimestamp` field and the `:sql_last_value` parameter? If `lastupdatetimestamp` is a `datetimeoffset`, `sql_last_value` should have the same type too, to avoid conversion – Panagiotis Kanavos Dec 09 '19 at 08:22
  • In any case, SQL Server offers built-in [change tracking](https://learn.microsoft.com/en-us/sql/relational-databases/track-changes/about-change-tracking-sql-server?view=sql-server-ver15), which is far more efficient than querying an update timestamp. Update timestamps can't handle deletions, for example (see the change tracking sketch after these comments) – Panagiotis Kanavos Dec 09 '19 at 08:24
  • @PanagiotisKanavos Thanks for your response. In SQL Server, the lastupdatetimestamp field is of type datetime, and I have defined this field in Logstash as the timestamp type. – Raj Dec 09 '19 at 08:27
  • As for the error itself, I'd guess the code passed a `datetime` or `datetime2` to the query, forcing logstash to convert it to a local `datetimeoffset` value. If the library used for such conversions (typically tzdb) has no timezone rules for `Asia/Kolkata` for 1942, you may get an error. That name used to be `Asia/Calcutta` – Panagiotis Kanavos Dec 09 '19 at 08:27
  • Sometimes it works fine ... but once a week it throws the exception and Logstash does not save the last timestamp. So what changes should I make in Logstash? – Raj Dec 09 '19 at 08:33
  • You should use `datetimeoffset` instead - what if the client and server are in different timezones? Even better, use Change Tracking so you won't have to worry about timezones and triggers at all. There are newer types you can try too, eg `LocalDateTime` or `OffsetDateTime` – Panagiotis Kanavos Dec 09 '19 at 08:37
  • As per the logstash documentation, the tracking column type can only be numeric or timestamp. So is there any way I can change it to LocalDateTime or OffsetDateTime? – Raj Dec 09 '19 at 08:46
  • The *time* is simply invalid for that timezone. If you check [the timezone's history](https://www.timeanddate.com/time/zone/india/kolkata?year=1925) you'll see there was a DST change at exactly `1942-09-01 00:00` from +5:30 to +6:30. See [this elasticsearch issue too](https://www.google.com/search?q=Exception+when+executing+JDBC+query+%7B%3Aexception%3D%3E%23+transition+(daylight+savings+time+%27gap%27)&oq=Exception+when+executing+JDBC+query+%7B%3Aexception%3D%3E%23+transition+(daylight+savings+time+%27gap%27)&aqs=chrome..69i57.492j0j7&sourceid=chrome&ie=UTF-8). A sketch of the server-side cast workaround follows these comments – Panagiotis Kanavos Dec 09 '19 at 08:59
  • Please post the actual names and versions of the JDBC driver, Elastic, logstash and the code/configuration you used. Post the error's call stack too. Why was *September 1942* used as an update time too? There were no computers back then and this specific value for this particular timezone guarantees failures – Panagiotis Kanavos Dec 09 '19 at 09:02
  • @PanagiotisKanavos I have added the logstash config to the question. Please review the config. – Raj Dec 11 '19 at 07:57
  • @PanagiotisKanavos .. Can you please review the logstash config? – Raj Dec 13 '19 at 02:28
  • @PanagiotisKanavos Did you get the chance to review my logstash config? – Raj Dec 13 '19 at 05:39
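A minimal sketch of the change tracking approach suggested in the comments, assuming the question's `mytable` has a single-column primary key named `id` (the key name and retention settings are illustrative, not from the original post):

    -- Enable change tracking at the database level (retention values are illustrative).
    ALTER DATABASE DB_NAME
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON);

    -- Enable change tracking on the table being synced.
    ALTER TABLE dbo.mytable ENABLE CHANGE_TRACKING;

    -- On each sync, fetch everything changed since the last synced version.
    -- @last_sync_version plays the role :sql_last_value plays in the question's
    -- config, but it is a plain bigint, so no timezone conversion is involved.
    DECLARE @last_sync_version bigint = 0;
    SELECT ct.SYS_CHANGE_OPERATION, ct.id, t.*
    FROM CHANGETABLE(CHANGES dbo.mytable, @last_sync_version) AS ct
    LEFT JOIN dbo.mytable AS t ON t.id = ct.id;

Deleted rows come back with `SYS_CHANGE_OPERATION = 'D'` and no matching row from the join, which is exactly what an update-timestamp query cannot express.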
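As for the cast workaround mentioned above: keeping the comparison inside the statement lets SQL Server compare `datetime` values itself rather than handing the parameter to the client for local-time conversion. A sketch, assuming the 1942 values are bad sentinel data rather than meaningful update times (the pre-epoch guard is a hypothetical addition, not from the original post):

    SELECT *
    FROM mytable
    WHERE lastupdatetimestamp > CAST(:sql_last_value AS datetime)
      AND lastupdatetimestamp >= '1970-01-01';  -- hypothetical guard against pre-epoch sentinel rows

If those 1942 timestamps are meaningful, the guard would silently drop rows, so it should only be added after checking the data.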

1 Answer


You need to cast it to a timestamp:

WHERE lastupdatetimestamp > CAST(:sql_last_value AS datetime)
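In the question's config, that would look like the following (a sketch reusing the question's placeholder table and column names):

    statement => "SELECT * FROM TABLE_NAME WHERE INCR_COLUMN > CAST(:sql_last_value AS datetime)"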
– ArunasB