There is this JavaScript function that I'm trying to rewrite in Java:

function normalizeHash(encondindRound2) {
    if (encondindRound2 < 0) {
        encondindRound2 = (encondindRound2 & 0x7fffffff) + 0x80000000;
    }
    return encondindRound2 % 1E6;
}

My Java adaptation:

public long normalizeHash(long encondindRound2) {
    if (encondindRound2 < 0) {
        encondindRound2 = (((int) encondindRound2) & 0x7fffffff) + 0x80000000;
    }
    return (((int) encondindRound2) % 1_000_000);
}

When I pass -1954896768, the JavaScript version returns 70528, while the Java version returns -896768. I'm not sure why. The difference seems to start inside the if: in the JavaScript function, after the if, `encondindRound2 = 2340070528`, while in Java `encondindRound2 = -1954896768`.

I made these repls to show it online:

Javascript: https://repl.it/repls/NumbGuiltyHack

Java: https://repl.it/repls/ClumsyQualifiedProblem

EDIT: Changing the Java function to this

public long normalizeHash(long encondindRound2) {
    if (encondindRound2 < 0) {
        encondindRound2 = (encondindRound2 & 0x7fffffff) + 0x80000000;
    }
    return (encondindRound2 % 1_000_000);
}

doesn't seem to affect the result - it's still -896768
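
For reference, here is a minimal standalone reproduction (a hypothetical `Repro` class, not part of the repls above) that prints the value after the if in both variants:

public class Repro {
    public static void main(String[] args) {
        long v = -1954896768L;
        // original version, with the (int) cast
        long withCast = (((int) v) & 0x7fffffff) + 0x80000000;
        // edited version, without the cast
        long noCast = (v & 0x7fffffff) + 0x80000000;
        System.out.println(withCast); // -1954896768, not the expected 2340070528
        System.out.println(noCast);   // -1954896768 as well
    }
}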

parsecer

  • Why are you casting `encondindRound2` to an `int` in the Java code? Since it's defined as a `long`, you're going to potentially lose precision if you cast it to a narrower type. – Jordan Jan 30 '20 at 21:02
  • @Jordan Because there is a bitwise operation performed on it inside the if. JavaScript, although it stores numbers as 64-bit floats, converts them to 32-bit integers when doing bitwise operations (see the sketch after these comments). I had this issue with another piece of code before, where bitwise operations on Java's `long` gave different results because of overflow. – parsecer Jan 30 '20 at 21:05
  • Removing the `(int)` cast from the `return` line doesn't change the Java result; it remains `-896768`. – parsecer Jan 30 '20 at 21:06
  • It could be that the negative number has the same bits as the JavaScript number? – NomadMaker Jan 30 '20 at 21:08
  • Ah, I found the issue. When you tack on `... + 0x80000000`, Java converts the value to an int, because `0x80000000` is considered to be an int literal. Change that number to `0x80000000L`. – Jordan Jan 30 '20 at 21:10
  • @Jordan Wow, you are a wizard! It worked! Please post your answer and I'll accept it. – parsecer Jan 30 '20 at 21:12
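
A short illustration of that 32-bit coercion, assuming Java's `(int)` cast of a `long` as the analogue of JavaScript's internal ToInt32 conversion (the `CoercionDemo` class is hypothetical):

public class CoercionDemo {
    public static void main(String[] args) {
        double jsNumber = 2340070528.0; // what the JavaScript version holds after the if
        // JavaScript's bitwise operators coerce their operands to 32-bit ints;
        // casting the equivalent long to int in Java keeps the same low 32 bits.
        System.out.println((int) (long) jsNumber); // -1954896768
    }
}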

2 Answers

In Java, `0x80000000` is outside the range of a 32-bit int, so the literal wraps around to -2147483648.

In JavaScript, `0x80000000` is well inside the range of a 64-bit double, so it remains 2147483648.

Obviously, adding -2147483648 vs adding 2147483648 results in a very large discrepancy.

You can either use the long literal `0x80000000L` in Java, or coerce your JS number into a 32-bit int with `(0x80000000|0)`, depending on which behavior you want.
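
A quick check of the two literals (a `LiteralDemo` sketch, not part of the original answer):

    public class LiteralDemo {
        public static void main(String[] args) {
            System.out.println(0x80000000);                      // -2147483648 (int literal, sign bit set)
            System.out.println(0x80000000L);                     // 2147483648 (long literal)
            System.out.println(0x80000000 == Integer.MIN_VALUE); // true
        }
    }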

that other guy

Try this. You need to use `long` literals when doing the conversion.

    public static long normalizeHash(long encondindRound2) {
        if (encondindRound2 < 0) {
            encondindRound2 = (encondindRound2 & 0x7fffffffL) + 0x80000000L;
        }
        return encondindRound2 % 1_000_000;
    }

But there is another point you should be aware of: in both Java and JavaScript, `%` is a remainder operator, not a true mathematical modulo, so a negative dividend produces a negative result. That is exactly why the unfixed code returned `-896768`. Check out this post here for more information.
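
A small illustration (assuming Java 8+ for `Math.floorMod`; the `ModDemo` class is hypothetical):

    public class ModDemo {
        public static void main(String[] args) {
            // Java's % is a remainder: the result takes the sign of the dividend.
            System.out.println(-1954896768L % 1_000_000L);               // -896768
            // Math.floorMod returns a non-negative result for a positive divisor.
            System.out.println(Math.floorMod(-1954896768L, 1_000_000L)); // 103232
        }
    }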

WJS