I tried to create a function in JavaScript that gives a random integer between two integers (both inclusive). Unfortunately, it never includes the first integer. The code is as follows:
```js
const randomEjector = (num1, num2) => { return Math.ceil(Math.random() * (num2 - num1)) + num1; };
```
Here is my reasoning, in terms of inequalities.
Let's consider: `Math.random()` outputs a number between 0 (inclusive) and 1 (exclusive). Let's say x = `Math.random()`; then x ∈ [0, 1).

So, 0 ≤ x < 1
0 * (num2 - num1) ≤ x * (num2 - num1) < 1 * (num2 - num1)

Again, x * (num2 - num1) ∈ [0, (num2 - num1)).
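Just to sanity-check this step, I logged the scaled value a few times (num1 = 3 and num2 = 4 are simply the values from my example at the end):

```js
// With num1 = 3 and num2 = 4, the scaled value should always fall in
// [0, num2 - num1), i.e. [0, 1).
const num1 = 3, num2 = 4;
for (let i = 0; i < 5; i++) {
  const x = Math.random();
  console.log(x * (num2 - num1)); // always >= 0 and < 1
}
```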
Now for the `Math.ceil()` part. `Math.ceil()` rounds its argument up to the nearest integer, so for (0, 1] it will return 1, for 0 it returns 0, and for the interval ((num2 - num1) - 1, (num2 - num1)) it will return (num2 - num1). So, after `Math.ceil()` is executed, we should be left with the interval [0, (num2 - num1)].
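To confirm my understanding of `Math.ceil()` on its own, I checked it on a few plain values:

```js
// Math.ceil() rounds its argument up to the nearest integer.
console.log(Math.ceil(0));      // 0
console.log(Math.ceil(0.0001)); // 1
console.log(Math.ceil(0.9999)); // 1
console.log(Math.ceil(1));      // 1
```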
Now let y = Math.ceil(x * (num2 - num1)); from the above, y ∈ [0, (num2 - num1)].

So, 0 ≤ y ≤ (num2 - num1)
0 + num1 ≤ y + num1 ≤ (num2 - num1) + num1

which gives num1 ≤ y + num1 ≤ num2. Let z = y + num1. So, z ∈ [num1, num2].
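Putting the whole chain together for a single call, this is how I traced the intermediate values (x, y, z are just the names from my working above, and num1 = 3, num2 = 4 are from my example):

```js
// Trace one call step by step.
const num1 = 3, num2 = 4;
const x = Math.random();                // x ∈ [0, 1)
const y = Math.ceil(x * (num2 - num1)); // what I expect to lie in [0, num2 - num1]
const z = y + num1;                     // what I expect to lie in [num1, num2]
console.log({ x, y, z });
```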
Then why is my function excluding the first term and behaving like (num1, num2]? Where is the problem? For instance, `console.log(randomEjector(3, 4))` always gives 4 and never 3.
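This is the kind of quick check I ran (the 1000 iterations are just an arbitrary count I picked):

```js
// Call the function many times and collect the distinct values it returns.
const seen = new Set();
for (let i = 0; i < 1000; i++) {
  seen.add(randomEjector(3, 4));
}
console.log([...seen]); // In my runs this only ever contains 4, never 3.
```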