I have some code I previously wrote in C# that I am trying to port to JS, and I have hit a strange issue while narrowing a JS Number down to the UInt32 range. If I implement the same code in C# or Object Pascal (languages which have native UInt32 types), I get a different result. Can someone please show me how to get results in JS that are consistent with the C# code?

For simplicity's sake, I am using hard-coded constants to show the problem.

The result of my code is:

JS: 2444377276

C# and Object Pascal: 2444377275

function modulo(a, b) {
    // Floored-division modulo, so the result always takes b's sign
    return a - Math.floor(a / b) * b;
}

function ToInteger(x) {
    // Truncate toward zero
    x = Number(x);
    return x < 0 ? Math.ceil(x) : Math.floor(x);
}

function ToUint32(x) {
    // Wrap into [0, 2^32), mirroring the ECMAScript ToUint32 operation
    return modulo(ToInteger(x), Math.pow(2, 32));
}

let a = (1207988537 * 16777619);
console.log(a >>> 0);
let uu = new Uint32Array([a]);
console.log(uu[0]);
console.log(ToUint32(a));
// only works from ES2020 upwards, but produces the same result as the JS solutions above
//console.log(BigInt.asUintN(32, BigInt(a)));
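One detail I noticed while narrowing this down: the exact product does not fit in a double's 53-bit mantissa, so `a` is already rounded before any of the conversions above ever run. A quick check (assuming `Number.isSafeInteger`, ES2015+), which also shows that `Math.imul` reproduces the C# result because it multiplies as 32-bit integers:

```javascript
const a = 1207988537 * 16777619;

// The exact product needs about 55 bits, but a JS Number keeps only 53
// significant bits, so `a` is rounded before any wrapping happens.
console.log(Number.isSafeInteger(a)); // false

// Math.imul multiplies with 32-bit wraparound, like C#'s unchecked int
// multiplication; >>> 0 reinterprets the signed result as unsigned.
console.log(Math.imul(1207988537, 16777619) >>> 0); // 2444377275
```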

C# code

using System;
 
public class Test
{
    public static void Main()
    {
        // your code goes here
        unchecked
        {
            uint a = (uint)(1207988537 * 16777619); 
            Console.WriteLine(a);
        }
 
    }
}

Object Pascal code

{$MODE DELPHI}
program ideone;
var
    a: UInt32;
begin
    (* your code goes here *)
     a := UInt32(1207988537 * 16777619); 
    Writeln(a);
end.
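For completeness: the BigInt route can also match the C#/Pascal output, but only if the multiplication itself is done in BigInt. Converting the already-rounded Number with `BigInt(a)`, as in the commented-out line in my JS snippet, bakes the precision loss in. A sketch (ES2020+):

```javascript
// Multiply as BigInts so the full product is exact, then truncate to 32 bits.
const exact = 1207988537n * 16777619n;
console.log(BigInt.asUintN(32, exact)); // 2444377275n
```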