In Python 2.7, the following code returns the correct result (-18027917):
from __future__ import print_function
def twos_comp(val, bits):
    # if the sign bit of a bits-wide field is set, subtract 2**bits
    if (val & (1 << (bits - 1))) != 0:
        val -= 1 << bits
    return val
valBinary = "110111011001110101001110011"
print(twos_comp(int(valBinary, 2), len(valBinary)))
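(For reference, int(valBinary, 2) is 116189811, and 116189811 - 2**27 = -18027917, so that is the two's-complement value I expect for a 27-bit field.)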
In JavaScript (Node.js), the following code returns an incorrect result (1995238003):
function toTwosComplement(val, bits) {
    if ((val & (1 << (bits - 1))) != 0) {
        val -= (val - 1) << bits;
    }
    return val;
}
var valBinary = "110111011001110101001110011"; // same as the Python example
console.log(toTwosComplement(parseInt(valBinary, 2), valBinary.length));
Evidently there is something different about the behaviour of the bitwise operators (or int / parseInt), but I haven't been able to see what it is.
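In case it helps narrow things down, here is a small Node.js sketch of the one difference I already know about: JavaScript's bitwise operators coerce their operands to signed 32-bit integers, whereas Python ints are arbitrary precision. (The variable name probeBinary is just a throwaway for this sketch, not part of my actual code.)
var probeBinary = "110111011001110101001110011"; // same 27-bit string as above
console.log(parseInt(probeBinary, 2)); // 116189811 -- matches Python's int(probeBinary, 2)
console.log(1 << 26);                  // 67108864  -- the sign-bit mask for 27 bits, fits in 32 bits
console.log(1 << 27);                  // 134217728 -- also fits comfortably in 32 bits
console.log(1 << 31);                  // -2147483648 -- the sign bit of the 32-bit range
console.log(1 << 32);                  // 1 -- shift counts are taken modulo 32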