I just started learning Java through w3schools, and one of the methods covered was Math.random(). They had an example using it, and I am not quite sure why they wrote Math.random() * 101. Why did they use 101 instead of 100?
Thank you
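The example in question is presumably something along these lines (the cast to int is inferred from the answers below, and the variable name randomNum is just a placeholder):

```java
public class RandomExample {
    public static void main(String[] args) {
        // Presumed shape of the tutorial snippet: a random integer
        // from 0 to 100, both ends inclusive.
        int randomNum = (int) (Math.random() * 101);
        System.out.println(randomNum);
    }
}
```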
From the Oracle documentation:
Returns a double value with a positive sign, greater than or equal to 0.0 and less than 1.0.
The expression Math.random() * 101 therefore evaluates to some floating-point number in the range 0 (inclusive) to 101 (exclusive). Since casting to an integer truncates instead of rounding, this yields integers in the range 0 to 100, inclusive, with a (roughly) even distribution.
If it were Math.random() * 100, the number 100 could never be generated. There are 101 values from 0 to 100 inclusive, so you need to multiply by 101.
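A quick sketch of that reasoning (my own illustration, not from the tutorial): generate many values and track the extremes that actually occur.

```java
public class RandomRangeDemo {
    public static void main(String[] args) {
        // Math.random() returns a double in [0.0, 1.0), so the product
        // lies in [0.0, 101.0); the int cast drops the fraction,
        // giving integers 0 through 100 inclusive.
        int min = Integer.MAX_VALUE;
        int max = Integer.MIN_VALUE;
        for (int i = 0; i < 1_000_000; i++) {
            int n = (int) (Math.random() * 101);
            min = Math.min(min, n);
            max = Math.max(max, n);
        }
        System.out.println("smallest seen: " + min + ", largest seen: " + max);
        // Typically prints: smallest seen: 0, largest seen: 100
    }
}
```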
The random method generates random numbers from 0 (inclusive) up to, but never reaching, 1.0, i.e. at most 0.9999999.... If you multiply by 101 you get values from 0 up to at most 100.9999999.... Even if the largest possible value is generated, (int)(100.9999999...) results in 100, since the fraction is dropped when cast to an int.
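As an aside, the same multiply-and-truncate idea generalizes to any inclusive range. The helper below is my own sketch rather than anything from the tutorial, and the standard library's ThreadLocalRandom.nextInt is shown as a more direct alternative.

```java
import java.util.concurrent.ThreadLocalRandom;

public class RandomInRange {
    // Random integer between min and max, both inclusive, using the
    // same multiply-and-truncate trick discussed above.
    static int randomBetween(int min, int max) {
        return min + (int) (Math.random() * (max - min + 1));
    }

    public static void main(String[] args) {
        // Equivalent to (int) (Math.random() * 101): a value in 0..100.
        System.out.println(randomBetween(0, 100));
        // Standard-library alternative: the upper bound is exclusive,
        // so 101 appears here as well.
        System.out.println(ThreadLocalRandom.current().nextInt(0, 101));
    }
}
```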