
I have a section of code as follows:

var percentageTemp = ((parseInt(score) / parseInt(guesses)) * 100);
var percentage = percentageTemp.toFixed(2);

score and guesses are both set to 0 at the start of the code, and are altered as the user plays the game. The code works as it should, except if I leave the score variable at 0, I get a NaN output. Why am I getting the NaN output when score = 0?

Toby Cannon

2 Answers


The code works as it should, except if I leave the score variable at 0, I get a NaN output.

You shouldn't get NaN in that case. With score at 0 and guesses nonzero, the result is fine (examples from the Node.js REPL):

> var score = 0; var guesses = 1; var percentageTemp =((parseInt(score)/parseInt(guesses))*100); var percentage= percentageTemp.toFixed(2); console.log(percentage)
0.00

If, on the other hand, you leave guesses at 0 then:

> var score = 0; var guesses = 0; var percentageTemp =((parseInt(score)/parseInt(guesses))*100); var percentage= percentageTemp.toFixed(2); console.log(percentage)
NaN

… because 0 / 0 evaluates to NaN in JavaScript. There is no meaningful result for dividing zero by zero, so the division can't produce a number.
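One way to avoid the NaN is to guard the division, so the percentage is only computed when at least one guess has been made. The fallback value of "0.00" here is an assumption; pick whatever makes sense for your game:

```javascript
var score = 0;
var guesses = 0;

// Only divide when guesses is nonzero; otherwise fall back to "0.00"
// (the fallback is a hypothetical choice, not from the original code).
var percentage = guesses > 0
  ? ((score / guesses) * 100).toFixed(2)
  : (0).toFixed(2);

console.log(percentage); // "0.00"
```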

Quentin

I'd advise passing a radix: parseInt(score, 10). Without one, a string starting with "0x" is parsed as hexadecimal (and some older engines treated a leading zero as octal). Also note that JavaScript's handling of decimal numbers is not exact. Try putting (0.1 + 0.2) into your console and you'll see an unusual value returned: 0.30000000000000004
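Both points above can be seen directly in the console. This sketch shows the radix behaviour and the floating-point rounding issue:

```javascript
// Without a radix, a "0x" prefix is parsed as hexadecimal;
// with radix 10, parsing stops at the first non-decimal character.
console.log(parseInt("0x10"));       // 16
console.log(parseInt("0x10", 10));   // 0

// Binary floating point cannot represent 0.1 or 0.2 exactly,
// so the sum carries a tiny error. toFixed rounds it for display.
console.log(0.1 + 0.2);              // 0.30000000000000004
console.log((0.1 + 0.2).toFixed(2)); // "0.30"
```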

Steve Tomlin