
I was solving a DP question involving Catalan numbers using JavaScript. I was getting the wrong answer for n = 35 (the 35th Catalan number), so I decided to code it in C++. I wrote similar code in C++ and, to my surprise, it passed all test cases. Why am I getting different results for similar code in C++ and JS?

Problem link: https://www.interviewbit.com/problems/intersecting-chords-in-a-circle/

JS code (Rhino 1.7.7)

var arr = new Array(36);
arr[0] = 1;
arr[1] = 1;
arr[2] = 2;
for(var i = 3; i <= 35; i++) {
    arr[i] = 0;
    for(var j = 0; j < i; j++) {
        arr[i] += arr[j] * arr[i-j-1];
        arr[i] = arr[i] % 1000000007;
    }
}
print(arr[35]);

C++

#include <iostream>
using namespace std;

int main() {
    long int arr[36];
    arr[0] = 1;
    arr[1] = 1;
    arr[2] = 2;
    for(int i = 3; i <= 35; i++) {
        arr[i] = 0;
        for(int j = 0; j < i; j++) {
            arr[i] += arr[j] * arr[i-j-1];
            arr[i] = arr[i] % 1000000007;
        }
    }
    cout << arr[35];
    return 0;
}

For input: 35
Expected output: 93302951
Output in JavaScript: 93302952
Output in C++: 93302951

Sushant Ahuja

2 Answers


That's because JavaScript uses 64-bit IEEE 754 floating point numbers for all of its numbers. A double can represent integers exactly only up to 2^53 - 1; here the inner loop multiplies two residues that can each approach 10^9, so intermediate products can reach roughly 10^18 and get rounded before the modulo is ever taken. If you change your C++ code to use the equivalent type (double), you get the same wrong result:

#include <iostream>
#include <cmath>

int main()
{
    double arr[36];
    arr[0] = 1;
    arr[1] = 1;
    arr[2] = 2;
    for(int i = 3; i <= 35; i++) {
        arr[i] = 0;
        for(int j = 0; j < i; j++) {
            arr[i] += arr[j] * arr[i-j-1];
            arr[i] = std::fmod(arr[i], 1000000007);
        }
    }
    std::cout << (int)arr[35];
    return 0;
}

prints:

93302952
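
The precision loss is easy to demonstrate directly. A double represents every integer exactly only up to Number.MAX_SAFE_INTEGER (2^53 - 1 = 9007199254740991), while the recurrence multiplies two residues that can each approach 10^9. A minimal sketch in modern JavaScript (Node or any current engine; the operand values below are made up for illustration):

const a = 999999999; // hypothetical residues just below the modulus
const b = 999999998;
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991
console.log(a * b);                   // 999999997000000000 (rounded: exceeds 2^53)
console.log(BigInt(a) * BigInt(b));   // 999999997000000002n (exact)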

Max Vollmer
  • You could add an example in JS using `BigInt` that will yield the proper result. I just tested it out and got `93302951n` – ndugger Jul 16 '19 at 18:42
  • @ndugger My Javascript is too rusty, but I highly encourage you to post an answer that complements mine with a working Javascript example :) – Max Vollmer Jul 16 '19 at 18:50

Max's answer is spot on, so consider this a follow-up. JavaScript now has arbitrarily large integers under the name BigInt. You can create BigInt literals by adding an n suffix to your number literals. So a working JS version, equivalent to the C++ one, would be as follows:

const array = new Array(36);

array[0] = 1n;
array[1] = 1n;
array[2] = 2n;

for (let i = 3; i <= 35; i++) {
    array[i] = 0n;

    for (let j = 0; j < i; j++) {
        array[i] += array[j] * array[i-j-1];
        array[i] = array[i] % 1000000007n;
    }
}

console.log(array[35]);

This yields 93302951n.
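
One gotcha when converting: BigInt and Number values can't be mixed in arithmetic (doing so throws a TypeError), which is why the modulus above is written 1000000007n rather than 1000000007. For example:

console.log(1n % 1000000007n); // 1n
// 1n % 1000000007 would throw: TypeError: Cannot mix BigInt and other types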

ndugger
  • Two notes after consulting the docs: 1. BigInt isn't 64-bit; it supports arbitrarily large integers. 2. BigInt isn't part of the language standard yet. It probably will be, but it's still in the draft stage. It's not supported by Edge (who would have thought ^^) or Safari. – Max Vollmer Jul 16 '19 at 19:04
  • Edge is going to be Chrome come this fall, and Safari is always off doing their own thing (they have implemented it in their preview branch, though, so it's on its way). I think it's very safe to say that it's standard at this point. As for it not being 64-bit, you're right; I got my wires crossed from researching JS DataViews recently: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView/setBigInt64 – ndugger Jul 16 '19 at 19:08