In a project that uses JSON as an exchange format, we ran into the problem that millisecond timestamps in C++ and Java are completely different, even though both are described as a `long` primitive data type.
What standard does each language use, and why is there a difference?
As an example, `1407315600` is a C++ timestamp that refers to 06.08.2014 09:00:00 UTC, while in Java the same value comes out as an unreadable date.
We read timestamps in Java using `new Date(1407315600)`.
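For reference, here is a minimal sketch of how we read the value on the Java side (the class name `TimestampCheck` and the date format are just for illustration, assuming the value arrives as a plain `long` from the JSON payload):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class TimestampCheck {
    public static void main(String[] args) {
        // Value taken straight from the JSON payload produced by the C++ side.
        long timestamp = 1407315600L;

        SimpleDateFormat fmt = new SimpleDateFormat("dd.MM.yyyy HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));

        // java.util.Date interprets its constructor argument as milliseconds
        // since the Unix epoch, so this does not print 06.08.2014 09:00:00 UTC.
        System.out.println(fmt.format(new Date(timestamp)));
    }
}
```

With this value the formatted output lands in January 1970 instead of August 2014, which is the difference we are trying to understand.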