
I currently have a function that accepts a varbinary parameter, which lets me pass a timestamp in for comparison against other timestamps without casting. However, my comparisons are failing: the timestamp is stored as a bigint, converted to varbinary, and then passed to the function, but the cast produces only 4 bytes rather than the full 8-byte timestamp value, which causes the comparison to return false.

For example:

DECLARE @Test1 varbinary(16) = CAST(506693 as varbinary(16)) -- 0x0007BB45
DECLARE @Test2 timestamp = CAST(506693 as Timestamp) -- 0x000000000007BB45

When I compare @Test1 against a timestamp, checking whether the timestamp is greater than @Test1, it returns false; if I use @Test2 instead, it returns true (which is correct).

Is there a way to make the varbinary store the additional leading zeros?

  • I don't quite follow your question. But the solution is to `CAST` both values to `bigint` instead of `varbinary` for comparison. – Wagner DosAnjos Dec 15 '15 at 20:14
  • The `timestamp` datatype in SQL Server has **nothing at all** to do with an actual date and time. It's just a binary "counter", really (that's why it's been renamed `rowversion` in more recent versions of SQL Server) – marc_s Dec 15 '15 at 21:41

1 Answer


Hope this helps:

-- It makes no difference how large you declare the varbinary
DECLARE @Test8 varbinary(8) = CAST(506693 as varbinary(8));
DECLARE @Test64 varbinary(64) = CAST(506693 as varbinary(64));
SELECT sys.fn_varbintohexstr(@Test8);  --0x0007bb45
SELECT sys.fn_varbintohexstr(@Test64); --0x0007bb45

-- but it does make a difference what type you are casting
DECLARE @Test16 varbinary(16) = CAST(CAST(506693 AS bigint) as varbinary(16));
DECLARE @TestTS timestamp = CAST(506693 as Timestamp);
SELECT sys.fn_varbintohexstr(@Test16); --0x000000000007bb45
SELECT sys.fn_varbintohexstr(@TestTS); --0x000000000007bb45
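
To tie this back to the failing comparison in the question, here is a sketch of why the 4-byte value compares incorrectly and how the intermediate `CAST` to `bigint` fixes it. The variable names and the sample value `506694` are made up for illustration; `@CurrentTS` stands in for whatever timestamp/rowversion value the function compares against.

```sql
-- Hypothetical current timestamp value (one tick later than 506693)
DECLARE @CurrentTS timestamp = CAST(506694 AS timestamp);

-- 4-byte varbinary (cast directly from an int literal): in a binary
-- comparison the shorter value is zero-padded on the RIGHT, so
-- 0x0007BB45 compares as if it were 0x0007BB4500000000 and the
-- "greater than" test fails, as observed in the question.
DECLARE @Short varbinary(16) = CAST(506693 AS varbinary(16));
SELECT CASE WHEN @CurrentTS > @Short THEN 'true' ELSE 'false' END; -- false

-- 8-byte varbinary (cast via bigint first): the leading zero bytes
-- are preserved, the bytes line up, and the comparison succeeds.
DECLARE @Full varbinary(16) = CAST(CAST(506693 AS bigint) AS varbinary(16));
SELECT CASE WHEN @CurrentTS > @Full THEN 'true' ELSE 'false' END; -- true
```

Alternatively, as Wagner DosAnjos suggests in the comments, casting both sides to `bigint` before comparing sidesteps the byte-length issue entirely.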
Shnugo