
My question here is conceptual.

Background
Donald Knuth once mentioned a case where his merge sort would occasionally fail. He later found that it was due to the maximum int value being exceeded while computing the midpoint of two integers.

The code had to be changed from this:

int mid = (left + right) / 2;

to this:

int mid = left + (right-left)/2;
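
To make the failure concrete, here is a minimal sketch (the index values are made up purely for illustration) showing how the first form wraps around to a negative number while the second stays correct:

int left = 1_500_000_000;
int right = 2_000_000_000;

int badMid  = (left + right) / 2;         // sum exceeds Integer.MAX_VALUE and wraps to a negative int
int goodMid = left + (right - left) / 2;  // stays within range

System.out.println(badMid);   // prints -397483648 (wrong, negative)
System.out.println(goodMid);  // prints 1750000000 (correct midpoint)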

My question:
I'm just curious why Java doesn't throw an overflow error instead of silently returning an unexpected result. It feels semantically wrong that the program deliberately continues with a wrong outcome.
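
For reference, this is the silent wrap-around behavior I mean (nothing beyond standard int arithmetic is assumed):

int max = Integer.MAX_VALUE;      // 2147483647
int wrapped = max + 1;            // no exception; two's-complement wrap-around
System.out.println(wrapped);      // prints -2147483648 (Integer.MIN_VALUE)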

One possible argument that I can think of here is in the context of bit operations: it may be too troublesome to raise an overflow error while doing binary operations.

But I would still argue that, if that's the case, Java should throw an overflow error when an arithmetic operation such as +, -, *, / is used, and ignore overflow only when dealing with binary operations such as >>, >>>, <<, etc. A sketch of the unsigned-shift trick follows below.
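
As a side note on the bit-operation angle: if overflow is left as silent wrap-around, the wrapped bits can even be reinterpreted to recover the right answer, which is what the unsigned-shift form of the midpoint does (again just a sketch with the same made-up indices as above):

int left = 1_500_000_000;
int right = 2_000_000_000;

int mid = (left + right) >>> 1;   // unsigned shift treats the wrapped sum as an unsigned value
System.out.println(mid);          // prints 1750000000, same as left + (right - left) / 2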

I'm not sure about other programming languages, but I suspect they behave similarly. Please correct me if I'm wrong.

So, any reasoning behind this?

EDIT:
As marked by Luiggi, this is a duplicate question. I've got my answers from the links given there.

Iwan Satria
  • Tradition. And many hardware architectures wouldn't make it easy to do. And C doesn't do it. (Only language I recall doing this was IBM's internal PL/S language, and it could bite you when you weren't careful.) – Hot Licks May 16 '14 at 03:40
  • More: http://stackoverflow.com/q/16085286/1065197 http://stackoverflow.com/q/103654/1065197 – Luiggi Mendoza May 16 '14 at 03:40
