So I am porting code from PHP to Java. Part of the code has a time-sensitive hashing algorithm, where the current Unix timestamp acts as a seed. I am running into the problem that my ported code gives different answers in the two languages. I suspect it could be a slight difference in implementation somewhere.
I am wondering whether the problem is something rather obscure. In any case, any help would be greatly appreciated.
Here is my code in Java.
private static int generateB() {
    SimpleDateFormat sdf = new SimpleDateFormat("dd/MM/yyyy", Locale.US);
    Date start;
    Date now;
    try {
        start = sdf.parse("01/01/1970");
        now = new Date();
    } catch (ParseException e) {
        return 402890; // fallback value if the epoch date fails to parse
    }
    long diff = now.getTime() - start.getTime();
    long hours = diff / (60 * 60 * 1000) % 24;  // leftover hours within the current day
    long days = diff / (24 * 60 * 60 * 1000);   // whole days elapsed
    return (int) (hours + days * 24);           // total hours since 1970-01-01
}
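For debugging, a quick check of the raw millisecond values that feed the subtraction looks like this (just a diagnostic sketch, not part of the hashing code):

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class EpochCheck {
    public static void main(String[] args) throws ParseException {
        // Same parsing as generateB(): if start.getTime() is not 0,
        // the parsed "01/01/1970" does not land exactly on the Unix epoch.
        SimpleDateFormat sdf = new SimpleDateFormat("dd/MM/yyyy", Locale.US);
        Date start = sdf.parse("01/01/1970");
        System.out.println("start.getTime() = " + start.getTime());
        System.out.println("now.getTime()   = " + new Date().getTime());
    }
}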
And here is the PHP code:
/**
 * Generate "b" parameter
 * The number of hours elapsed since the 1st of January 1970
 *
 * @return int
 */
private function generateB()
{
    $start = new \DateTime('1970-01-01');
    $now = new \DateTime('now');
    $diff = $now->diff($start);

    // Whole days converted to hours, plus the leftover hours of the partial day.
    return $diff->h + ($diff->days * 24);
}
Yet they return different results, off by 3 hours to be exact.
At the time of this post, the PHP returns 403472 while the Java returns 403475. Also, if I use System.currentTimeMillis() instead, I get 403480.
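For reference, the System.currentTimeMillis() figure above comes from a one-liner along these lines (a minimal sketch):

public class HoursSinceEpoch {
    public static void main(String[] args) {
        // Milliseconds since the Unix epoch divided by the milliseconds in one hour.
        long hours = System.currentTimeMillis() / (60L * 60L * 1000L);
        System.out.println(hours); // printed 403480 at the time of this post
    }
}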
So my question is: why is there any difference at all? Subtracting 3 would solve my immediate problem, but I am curious, for future reference, why the difference exists in the first place. Note: I am running the PHP via PHP Sandbox for testing purposes.