In the languages I have tested, `-(x div y)` is not equal to `(-x) div y`; I have tested `//` in Python, `/` in Ruby, and `div` in Perl 6. C has a similar behavior.
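For example, a minimal Python session demonstrating the asymmetry (the same holds with Ruby's `/` and Perl 6's `div`):

```python
x, y = 1, 2
print(-(x // y))  # 0:  1 // 2 == 0, then negated
print((-x) // y)  # -1: -1 / 2 is -0.5, which floors to -1
```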
That behavior is usually according to spec, since `div` is usually defined as rounding down the result of the division. However, it does not make a lot of sense from an arithmetic point of view, since it makes `div` behave differently depending on the sign, and it causes confusion, such as this post on how it is done in Python.
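A quick Python check of that definition; `math.floor` is used only to make the "round down" reading explicit:

```python
import math

# "Round down" means toward negative infinity, not toward zero:
print(7 // 2, math.floor(7 / 2))    # 3 3
print(-7 // 2, math.floor(-7 / 2))  # -4 -4 (a truncating division would give -3)
```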
Is there some specific rationale behind this design decision, or is `div` just defined that way from scratch? Apparently Guido van Rossum uses a coherency argument in a blog post that explains how it is done in Python, but you can also have coherency if you choose to round up.
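To illustrate that last point, here is a small sketch: `divmod_ceil` is a hypothetical round-up counterpart to the built-in `divmod` (not part of any library), and both variants preserve the invariant `x == y * (x div y) + (x mod y)`; they differ only in the sign of the remainder:

```python
import math

def divmod_ceil(x, y):
    # Hypothetical round-up counterpart of the built-in divmod:
    # quotient rounds toward +infinity, remainder adjusts to match.
    q = math.ceil(x / y)  # float division; fine for small examples
    return q, x - y * q

for x, y in [(7, 2), (-7, 2), (7, -2)]:
    qf, rf = divmod(x, y)        # built-in: rounds down
    qc, rc = divmod_ceil(x, y)   # alternative: rounds up
    assert x == y * qf + rf and x == y * qc + rc  # both are coherent
    print((x, y), "floor:", (qf, rf), "ceil:", (qc, rc))
```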
(Inspired by this question by PMurias in the #perl6 IRC channel)