I'm trying to understand what the bytes in the internal representation of a timestamp in my database actually mean. How are they combined to produce the more readable date?
I'm using the query below to get the data I need:
SELECT systimestamp
,DUMP (systimestamp)
,sessiontimezone
FROM dual;
The output of the above query is:
+-------------------------------------+-----------------------------------------------------------------+------------------+
| systimestamp | dump(systimestamp) | sessiontimezone |
+-------------------------------------+-----------------------------------------------------------------+------------------+
| 31-JUL-15 08.55.06.157047000 +00:00 | Typ=188 Len=20: 223,7,7,31,8,55,6,0,216,88,92,9,0,0,5,0,0,0,0,0 | Europe/Bucharest |
+-------------------------------------+-----------------------------------------------------------------+------------------+
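In case it helps, the byte boundaries are probably easier to line up when DUMP is asked for hexadecimal output (its optional second argument, 16 = hex); I'm only showing the query here, the decimal dump above is what I'm actually working from:
SELECT systimestamp
,DUMP (systimestamp, 16)
,sessiontimezone
FROM dual;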
I have found a few resources online explaining what the bytes mean (here), but the rules described there don't match my output.
For example, the first byte, 223, is not the century + 100, and so on.
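The only pattern I can spot myself (and this is just a guess on my part, not something I found documented) is that the first two bytes seem to read as a little-endian year, which at least matches the displayed date:
-- my guess: second byte * 256 + first byte = year
SELECT 7 * 256 + 223 AS guessed_year FROM dual;  -- returns 2015
but I don't know whether that is the intended reading, or what the trailing bytes (0,0,5,0,0,0,0,0) represent.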
The reason I'm digging into this is a problem I'm facing when comparing the values in a timestamp(3) column with systimestamp, and I'm trying to write a script to verify whether my issue (and its solution) is the same as the one explained here.
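For context, this is roughly the shape of the comparison that misbehaves (my_table and ts_col are placeholder names, not my real table and column; the real column is declared as timestamp(3)):
-- ts_col is a TIMESTAMP(3) column, while systimestamp is a TIMESTAMP WITH TIME ZONE,
-- so I assume some implicit conversion happens during the comparison
SELECT *
FROM my_table
WHERE ts_col > systimestamp;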
Any help is appreciated.