
I run into trouble when converting a Unix epoch timestamp to a normal datetime in Azure SQL Data Warehouse:

select 
    dateadd(s, convert(bigint, 2551564800), convert(datetime, '1-1-1970 00:00:00')) as bigint

Error:

Arithmetic overflow error converting expression to data type int.

The value 2551564800 corresponds to 2050-11-09.
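(For reference, a quick check outside SQL, sketched here in Python, confirms that date and shows why the overflow happens: the value exceeds the signed 32-bit maximum of 2,147,483,647 that `DATEADD` accepts.)

```python
from datetime import datetime, timezone

epoch_value = 2551564800

# The value maps to 2050-11-09...
print(datetime.fromtimestamp(epoch_value, tz=timezone.utc))  # 2050-11-09 00:00:00+00:00

# ...but exceeds the 32-bit int range that DATEADD's number argument allows
print(epoch_value > 2**31 - 1)  # True
```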

Any help will be appreciated

  • @Larnu, the question is about handling values greater than the max int value in the DATEADD function. The question you pointed to uses the DATEADD function; it is a different question. – Venkatesh R Jun 21 '19 at 09:12
  • Thanks for the suggestion, but I am running this code on Azure SQL Data Warehouse and DATEADD_BIG is not supported. Is there any other alternative? – Anbarasan Dhanushkodi Jun 21 '19 at 09:44
  • *"I am running this code on Azure SQL Data Warehouse"* Then tag it as such, as that determines whether this is a duplicate or not. – Thom A Jun 21 '19 at 09:53
  • Thanks, I have just edited my post as well. – Anbarasan Dhanushkodi Jun 21 '19 at 09:54
  • @AnbarasanDhanushkodi why use a unix timestamp instead of a proper date type? The timestamp should be converted to a `datetime2` during loading, avoiding the need for conversions afterwards. – Panagiotis Kanavos Jun 21 '19 at 09:55
  • @AnbarasanDhanushkodi and `'1-1-1970 00:00:00'` is *not* an ISO8601 date literal either, which means its value depends on the locale. You wouldn't even need that `convert` call if you used a literal in `YYYYMMDD` form. – Panagiotis Kanavos Jun 21 '19 at 09:57
  • Could you have epoch values greater than 5103129600? – Thom A Jun 21 '19 at 10:00
  • @AnbarasanDhanushkodi come to think of it, you wouldn't need `convert(bigint, 2551564800)` with a bigint *column* either. Where did this value come from? Did you try to pass a Unix timestamp as a *string* to the client? Why not pass a strongly typed *datetime2* or *datetime* parameter? You can probably get rid of the need for conversions if you explain what you actually try to do – Panagiotis Kanavos Jun 21 '19 at 10:00
  • Unfortunately T-SQL now has a `DATEDIFF_BIG` but not yet a `DATEADD_BIG` (or an option to accept `BIGINT` for the argument). You need a two-step approach: `select DATEADD(SECOND, 2551564800 % 3600, DATEADD(HOUR, 2551564800 / 3600, '19700101'))`. (2 billion hours is ~340K years, so this won't overflow for any representable value.) – Jeroen Mostert Jun 21 '19 at 10:02
  • @PanagiotisKanavos, the timestamp is coming from the source system; we cannot change that. I get your point on the format of the literal. The `convert(bigint, 2551564800)` was an attempt to see whether changing the default int to bigint would make DATEADD work. – Anbarasan Dhanushkodi Jun 21 '19 at 10:24
  • @AnbarasanDhanushkodi change your loading code to convert it to a `datetime2`. There's no reason to store that bigint in the database. That's the bug, not how `DATEADD` works – Panagiotis Kanavos Jun 21 '19 at 10:26
  • @Larnu thanks, at this point I don't see the data going beyond 2100. It is good to know the limitation of the function. – Anbarasan Dhanushkodi Jun 21 '19 at 10:26
  • @JeroenMostert your suggestion also works great, thanks. – Anbarasan Dhanushkodi Jun 21 '19 at 10:38
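(Jeroen Mostert's two-step split — whole hours first, then the leftover seconds — can be sanity-checked outside SQL; a minimal sketch of the arithmetic in Python:)

```python
INT_MAX = 2**31 - 1  # 2147483647, the limit DATEADD enforces on its number argument

epoch = 2551564800
hours, seconds = divmod(epoch, 3600)  # hours = epoch / 3600, seconds = epoch % 3600

# Both pieces fit in a 32-bit int, and they recombine to the original value
assert hours <= INT_MAX and seconds <= INT_MAX
assert hours * 3600 + seconds == epoch
print(hours, seconds)  # 708768 0
```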

1 Answer


If you don't have epoch values greater than 4294967293 (which allows dates up to 2106-02-07T06:28:13), this will work:

SELECT DATEADD(SECOND,2551564800 % 2147483647, DATEADD(SECOND,2147483647 * (V.Epoch / 2147483647),'19700101'))
FROM (VALUES(CONVERT(bigint,2551564800))) V(Epoch)
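(The arithmetic the query relies on — splitting the epoch into at most one INT-max-sized chunk plus a remainder, each within `int` range — can be sketched outside SQL in Python:)

```python
INT_MAX = 2147483647  # largest value DATEADD accepts for its number argument

def split_epoch(epoch):
    # Whole INT_MAX-sized block applied by the inner DATEADD...
    chunk = INT_MAX * (epoch // INT_MAX)
    # ...and the remainder applied by the outer DATEADD
    rest = epoch % INT_MAX
    return chunk, rest

chunk, rest = split_epoch(2551564800)
assert chunk + rest == 2551564800
assert rest < INT_MAX

# The inner DATEADD overflows once the chunk itself exceeds INT_MAX,
# i.e. for epoch values above 2 * INT_MAX - 1 = 4294967293
assert split_epoch(4294967293)[0] <= INT_MAX
assert split_epoch(4294967294)[0] > INT_MAX
```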
Thom A
  • I see why you used 2147483647 in the code; it is the max value of the int data type. Even though it has a limit of year 2106, for me it works. I will accept it as the solution. Thanks a bunch, and thanks to all the contributors. – Anbarasan Dhanushkodi Jun 21 '19 at 10:37