Whenever I try to divide two integers, I get odd results when the integers are bigger than 21474836. I thought it was a data-type limitation, but the maximum integer value is obviously much bigger: 2147483647.
As I said, this happens only when both of the integers are bigger than 21474836.
Working (because the integers are lower than 21474836):
(11474836 * 100) / 11474836 // returns 100
Not working:
(211474836 * 100) / 211474836 // returns 0, should be 100
(31474830 * 100) / 31474837 // returns -99, should be ~99
(40000000 * 100) / 41474837 // returns -7, should be ~96
See the live demo here: http://ideone.com/lAeneM
What is the problem?