There is this JavaScript function that I'm trying to rewrite in Java:
function normalizeHash(encondindRound2) {
    if (encondindRound2 < 0) {
        encondindRound2 = (encondindRound2 & 0x7fffffff) + 0x80000000;
    }
    return encondindRound2 % 1E6;
}
My Java adaptation:
public long normalizeHash(long encondindRound2) {
    if (encondindRound2 < 0) {
        encondindRound2 = (((int) encondindRound2) & 0x7fffffff) + 0x80000000;
    }
    return (((int) encondindRound2) % 1_000_000);
}
When I pass -1954896768, the JavaScript version returns 70528, while the Java version returns -896768. I'm not sure why. The difference seems to start inside the if block: after the if, the JavaScript function has encondindRound2 = 2340070528, while in Java encondindRound2 = -1954896768.
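To make the divergence easier to reproduce without the repls, here is a minimal standalone Java snippet (the class name and the driver are just for illustration) that prints the intermediate values I'm seeing:

public class NormalizeHashDemo {
    public static void main(String[] args) {
        long input = -1954896768L;

        // Low 31 bits of the input; both languages start from this same value
        long masked = ((int) input) & 0x7fffffff;
        System.out.println(masked);   // prints 192586880

        // Same arithmetic as inside the if block of my Java adaptation
        long afterIf = (((int) input) & 0x7fffffff) + 0x80000000;
        System.out.println(afterIf);  // prints -1954896768, but JavaScript has 2340070528 at this point

        System.out.println(afterIf % 1_000_000); // prints -896768
    }
}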
I made these repls to show it online:
JavaScript: https://repl.it/repls/NumbGuiltyHack
Java: https://repl.it/repls/ClumsyQualifiedProblem
EDIT: Changing the Java function to this
public long normalizeHash(long encondindRound2) {
    if (encondindRound2 < 0) {
        encondindRound2 = (encondindRound2 & 0x7fffffff) + 0x80000000;
    }
    return (encondindRound2 % 1_000_000);
}
doesn't seem to affect the result; it still returns -896768.
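For completeness, here is the edited function wrapped in a runnable class (the class name and the main method are only there for the example), which still prints -896768 rather than the 70528 I get from the JavaScript version:

public class NormalizeHashEdited {

    // Edited Java version, without the int casts
    public long normalizeHash(long encondindRound2) {
        if (encondindRound2 < 0) {
            encondindRound2 = (encondindRound2 & 0x7fffffff) + 0x80000000;
        }
        return (encondindRound2 % 1_000_000);
    }

    public static void main(String[] args) {
        NormalizeHashEdited demo = new NormalizeHashEdited();
        System.out.println(demo.normalizeHash(-1954896768L)); // prints -896768, expected 70528
    }
}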