Does anybody know why the following happens with timestamp integers and the JavaScript Date object? If I convert the Unix timestamp -2500938981 to YYYY-MM-DD hh:mm:ss, I get 1890-09-30 22:43:59. For -2500938980 I would then expect 1890-09-30 22:44:00, but what I actually get is 1890-09-30 22:43:40.
var datetime1 = new Date((-2500938981) * 1000)
var datetime2 = new Date((-2500938980) * 1000)
// dt1 in YYYY-MM-DD hh:mm:ss is 1890-9-30 22:43:59
var dt1 = datetime1.getFullYear() + "-" + (datetime1.getUTCMonth() + 1) + "-" + datetime1.getUTCDate() +
" " + datetime1.getUTCHours() + ":" + datetime1.getUTCMinutes() + ":" + datetime1.getSeconds();
// dt2 in YYYY-MM-DD hh:mm:ss is 1890-9-30 22:43:40
var dt2 = datetime2.getFullYear() + "-" + (datetime2.getUTCMonth() + 1) + "-" + datetime2.getUTCDate() +
" " + datetime2.getUTCHours() + ":" + datetime2.getUTCMinutes() + ":" + datetime2.getSeconds();
console.log(dt1)
console.log(dt2)
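If it helps: checking the same two Date objects with toISOString(), which always formats in UTC, they do come out exactly one second apart, so the Date values themselves seem fine and the oddity appears only in my formatting code:

```javascript
// UTC-only check of the same two timestamps.
var d1 = new Date(-2500938981 * 1000);
var d2 = new Date(-2500938980 * 1000);

console.log(d1.toISOString()); // 1890-09-30T22:43:39.000Z
console.log(d2.toISOString()); // 1890-09-30T22:43:40.000Z
console.log(d2.getTime() - d1.getTime()); // 1000 (exactly one second)
```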
I also tested it here: https://www.w3schools.com/code/tryit.asp?filename=GK1UBRVKDS91
In addition, on https://www.freeformatter.com/epoch-timestamp-to-date-converter.html the following happens:
-2500938981 -> 9/30/1890, 11:59:59 PM
-2500938980 -> 9/30/1890, 11:43:40 PM
On https://www.epochconverter.com/ it works as expected:
-2500938981 -> GMT: Tuesday, September 30, 1890 10:43:39 PM
-2500938980 -> GMT: Tuesday, September 30, 1890 10:43:40 PM
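Since the converters disagree, I wondered whether the local UTC offset in effect in 1890 matters. A quick probe (the actual values depend on the machine's timezone database, so I can't give expected output; note getTimezoneOffset() is not guaranteed to be a whole number of minutes for dates before offsets were standardised):

```javascript
var d = new Date(-2500938981 * 1000);

// Offset between UTC and local time at that instant, in minutes;
// for 19th-century dates this can be fractional (local mean time).
console.log(d.getTimezoneOffset());

// Local vs UTC seconds side by side for the same instant.
console.log(d.getSeconds(), d.getUTCSeconds());
```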