
In a project that uses JSON as an exchange format, we ran into the problem that millisecond timestamps in C++ and Java are completely different, even though both are described as a long primitive data type.

What standard does each language use, and why is there a difference?

As an example, 1407315600 is a C++ timestamp that refers to 06.08.2014 09:00:00 UTC, while in Java it is unreadable!

Reading timestamps in Java is done using new Date(1407315600).
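To make the symptom concrete, here is a minimal sketch of what that call actually produces (the class name TimestampDemo is only for this example):

import java.util.Date;

public class TimestampDemo {
    public static void main(String[] args) {
        // Date interprets its argument as milliseconds since the epoch,
        // so 1407315600 lands in mid-January 1970 rather than August 2014.
        System.out.println(new Date(1407315600L));
    }
}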

1 Answer

Try

new Date(1407315600L * 1000)

(the L suffix keeps the multiplication in long arithmetic, so the result does not overflow int).

Java's Date requires milliseconds; the C++ timestamp you have looks like it is in seconds.

In general, the C++ time_t functions give the time in seconds since the epoch.
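To make the conversion explicit, here is a minimal Java sketch, assuming the JSON field arrives as a plain long holding seconds (the class and variable names are only for this example):

import java.time.Instant;
import java.util.Date;

public class SecondsToDate {
    public static void main(String[] args) {
        long seconds = 1407315600L; // value received in the JSON, in seconds

        // java.util.Date expects milliseconds; seconds is a long, so the
        // multiplication stays in long arithmetic and does not overflow int.
        Date legacy = new Date(seconds * 1000);
        System.out.println(legacy); // printed in the JVM's default time zone

        // java.time (Java 8+) can consume the seconds value directly.
        Instant instant = Instant.ofEpochSecond(seconds);
        System.out.println(instant); // 2014-08-06T09:00:00Z
    }
}

Either form represents 06.08.2014 09:00:00 UTC; java.time is usually the more convenient target if it is available.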

To get the time in milliseconds in C++ (for comparison with Java), see the C++11 example below.

C++11

If C++11 is available on the platform, chrono::high_resolution_clock could be used to obtain a higher resolution (note: the clock may be an alias for one of the other clocks or an implementation-defined clock).

#include <iostream>
#include <chrono>

int main()
{
    using namespace std;
    using namespace std::chrono;

    // Milliseconds since the clock's epoch.
    milliseconds ms;
    ms = duration_cast<milliseconds>(high_resolution_clock::now().time_since_epoch());
    cout << ms.count() << endl;
}
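
For comparison, the Java side needs no conversion at all, since System.currentTimeMillis() already reports milliseconds since the Unix epoch; a minimal sketch:

public class MillisDemo {
    public static void main(String[] args) {
        // Milliseconds since the Unix epoch, the same unit java.util.Date expects.
        System.out.println(System.currentTimeMillis());
    }
}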