I have a question regarding TIMESTAMP columns and epoch time.
Let's say I have a MySQL server whose time zone is set to +00:00 (i.e. GMT) and which has a table like:
CREATE TABLE `message` (
`mid` int(10) unsigned NOT NULL AUTO_INCREMENT,
`content` varchar(255) DEFAULT NULL,
`username` varchar(128) COLLATE utf8_bin NOT NULL,
`time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (`mid`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin;
in which messages are stored. I also have a JS file that takes the last message's timestamp from that table and compares it with the client's time, roughly like the sketch below. I've logged both with console.log, and here's my question: shouldn't the last message's timestamp always be less than the client's Unix timestamp?
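For context, the comparison boils down to something like this; the /api/last-message endpoint and the message.time field are just placeholders for however the value actually reaches the browser:

// Minimal sketch, assuming the `time` value arrives as a "YYYY-MM-DD HH:MM:SS" string
// (the endpoint name and the `time` field are hypothetical):
fetch('/api/last-message')
    .then(function (response) { return response.json(); })
    .then(function (message) {
        var lastMessage = Date.parse(message.time) / 1000; // seconds since the epoch
        var clientNow = Date.now() / 1000;                 // client clock, also in seconds
        console.log("Last message: " + lastMessage);
        console.log("My timestamp: " + clientNow);
    });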
I'm getting these results:
Last message: 1509580533
My timestamp: 1509570014.969
~~~
Last message: 1509580533
My timestamp: 1509570016.179
~~~
Last message: 1509580533
My timestamp: 1509570016.729
Here's my code:
var timestamp = Date.parse("2017-11-01 20:55:33") / 1000; // value taken from the `time` column, for example
console.log("Last message: " + timestamp);
console.log("My timestamp: " + Date.now() / 1000);
What's wrong with my code? Thanks!