Given the C code below:
int nSum = 0;
// pNumber is 9109190866037
int nDigits = strlen(pNumber);
int nParity = (nDigits - 1) % 2;
char cDigit[2] = "\0";
for (int i = nDigits; i > 0; i--)
{
    cDigit[0] = pNumber[i - 1];
    int nDigit = atoi(cDigit);
    if (nParity == i % 2) {
        nDigit = nDigit * 2;
    }
    nSum += nDigit / 10;   // tens digit of nDigit (integer division in C)
    nSum += nDigit % 10;   // units digit of nDigit
    printf("NUMBER: %d\n", nSum);
}
Outputs:
NUMBER: 13
NUMBER: 13
NUMBER: 16
NUMBER: 22
NUMBER: 29
NUMBER: 29
NUMBER: 38
NUMBER: 39
NUMBER: 48
NUMBER: 48
NUMBER: 50
NUMBER: 59
NUMBER: 59
And this JavaScript code (written in TypeScript, so there is some typing involved as well, though it is mostly inferred):
let nSum = 0;
let nDigits = partialIdNumber.length;   // partialIdNumber holds the same number as pNumber above
let nParity = (nDigits - 1) % 2;
let cDigit = "\0";
for (let i = nDigits; i > 0; i--) {
    cDigit = partialIdNumber[i - 1];
    let nDigit = parseInt(cDigit);
    if (nParity == i % 2) {
        nDigit = nDigit * 2;
    }
    nSum += nDigit / 10;   // meant to add the tens digit, as in the C version
    nSum += nDigit % 10;   // units digit of nDigit
    console.log("NUMBER: %d", nSum);
}
Outputs:
NUMBER: 14.3
NUMBER: 14.3
NUMBER: 17.5
NUMBER: 24.1
NUMBER: 31.700000000000003
NUMBER: 31.700000000000003
NUMBER: 41.5
NUMBER: 42.6
NUMBER: 52.4
NUMBER: 52.4
NUMBER: 54.6
NUMBER: 64.5
NOTE: Both implementations are line-for-line the same; only the language differs.
The C code produces the expected results, but the JavaScript does not.
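To narrow it down, here is a minimal check of where I suspect the two runs first diverge (the value 14 is just an arbitrary example of a doubled digit, not taken from the runs above):

const nDigit = 14;            // e.g. the digit 7 after doubling
console.log(nDigit / 10);     // 1.4 ("/" never truncates in JavaScript/TypeScript)
console.log(nDigit % 10);     // 4 (same result as C)
// In C, nDigit / 10 with int operands evaluates to 1.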
Questions
- What assumptions does JavaScript make in order to produce that output?
- What part of my JavaScript code would I have to change to produce the desired output? (My current guess is sketched below.)
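My current guess at the minimal change is sketched below. This is hypothetical: the names mirror the loop above, and I am not sure it is the right fix or why it would be needed.

let nSum = 0;
let nDigit = 14;                     // example: a doubled digit
nSum += Math.trunc(nDigit / 10);     // 1, truncated like C's nDigit / 10
nSum += nDigit % 10;                 // 4
console.log(nSum);                   // 5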