I'm writing a program in Java, and I noticed that an expression in my code, `x / y > 0`, was evaluating to `false` when `y == 0`. I have since corrected this line in my code, but I'm curious: does anyone know why `x / y > 0` evaluates to `false` rather than causing the program to terminate?
- On my machine it throws `ArithmeticException`. – Suspended May 23 '15 at 22:19
- I'm using Eclipse, if that matters. – Rootbeer May 23 '15 at 22:19
- I'm using Eclipse too. – Suspended May 23 '15 at 22:20
- Can you post your code? This doesn't seem right. – t3dodson May 23 '15 at 22:21
- possible duplicate of [Why doesn't Java throw an Exception when dividing by 0.0?](http://stackoverflow.com/questions/2381544/why-doesnt-java-throw-an-exception-when-dividing-by-0-0) – Dan Getz May 23 '15 at 23:40
1 Answer
`x / y` will throw an `ArithmeticException` if both operands are integers and `y` is 0. If they are floating-point numbers, `x / y` equals `Infinity`, and `Infinity` is greater than 0!
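A minimal sketch of both cases (example values chosen for illustration): integer division by zero throws, while floating-point division by zero produces `Infinity`, which compares greater than zero.

```java
public class DivisionByZeroDemo {
    public static void main(String[] args) {
        // Integer case: dividing by an int zero throws at runtime.
        int xi = 5, yi = 0;
        try {
            int r = xi / yi;
            System.out.println(r); // never reached
        } catch (ArithmeticException e) {
            System.out.println("ArithmeticException: " + e.getMessage()); // "/ by zero"
        }

        // Floating-point case: no exception; the result is positive infinity.
        double xd = 5.0, yd = 0.0;
        double q = xd / yd;
        System.out.println(q);     // Infinity
        System.out.println(q > 0); // true
    }
}
```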

Ewan Mellor
- `Infinity` is greater than `0`, but that does not explain why `Infinity > 0 == false`. I suppose that not only were `x, y` floats and `y == 0`, but also `x < 0`. – Zereges May 23 '15 at 23:08