
When I try to add two integers together, the result just stays at the original value. It's difficult to explain, so here's my code:

var levelRewardsID = parseInt(resultData[0].levelRewardsHighestID)
var levelRewardsIDIncrease = Math.floor(Math.random()*9)+1
var newLevelRewardsID = levelRewardsID+levelRewardsIDIncrease
console.log(levelRewardsID)
console.log(levelRewardsIDIncrease)
console.log(newLevelRewardsID)
if (!isNaN(levelRewardsID)) {
  console.log("not NAN")
}
if (!isNaN(levelRewardsIDIncrease)) {
  console.log("not NAN 2")
}
if (!isNaN(newLevelRewardsID)) {
   console.log("not NAN 3")
}

Console:

89819672607051330000
6
89819672607051330000
not NAN
not NAN 2
not NAN 3

So as you can see, everything is an integer, so that can't be the issue. Yet when I run var newLevelRewardsID = levelRewardsID+levelRewardsIDIncrease, the output is the same as levelRewardsID... I'm not sure what I did wrong, but if anyone knows, do let me know. Thanks!

Note: I'm using Node.js version 9.1.0 (the latest as of posting this), if that helps.

APixel Visuals

1 Answer


I think you've hit JavaScript's integer precision limit. Numbers are IEEE 754 doubles, so integers are only exact up to 2^53 = 9007199254740992, and 89819672607051330000 is far larger than that. At that magnitude the gap between adjacent representable values is 16384, so adding 6 just rounds back to the same number.
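Here's a quick sketch (plain Node, nothing extra needed) showing why your isNaN checks can't catch this: the value is a perfectly valid number, just not an exact one.

// 2^53 - 1 is the largest integer a Number can always represent exactly
console.log(Number.MAX_SAFE_INTEGER) // 9007199254740991

// Above that, adding a small amount rounds back to the nearest
// representable value, which here is the original number:
var big = 89819672607051330000
console.log(big + 6)         // 89819672607051330000 (the +6 is rounded away)
console.log(big + 6 === big) // true

// isNaN won't flag this, but Number.isSafeInteger will:
console.log(Number.isSafeInteger(big)) // false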

See What is JavaScript's highest integer value that a number can go to without losing precision?
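If you need exact arithmetic at this scale, one workaround (a sketch; note that BigInt requires Node 10.4+, so not the 9.1.0 you're on) is to use BigInt, which keeps full integer precision:

const levelRewardsID = BigInt("89819672607051330000")
const levelRewardsIDIncrease = BigInt(Math.floor(Math.random() * 9) + 1)
const newLevelRewardsID = levelRewardsID + levelRewardsIDIncrease
console.log(newLevelRewardsID.toString()) // exact, e.g. "89819672607051330006"

Otherwise, keep the ID as a string end to end instead of running it through parseInt, since that conversion is likely where the precision is first lost.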

vanessa