int a1 = 65535;
char ch2 = (char) a1;
System.out.println("ASCII value corresponding to 65535 after being typecasted : " + ch2); // what does this print?

char ch3 = 65535;
System.out.println("ASCII value corresponding to 65535 : " + ch3); // and what does this print?
I quote from Herbert Schildt, Chapter 3: Data Types, Variables, and Arrays:
The range of a char is 0 to 65535. There are no negative chars. The standard set of characters known as ASCII still ranges from 0 to 127 as always, and the extended 8-bit character set, ISO-Latin-1, ranges from 0 to 255. Since Java is designed to allow programs to be written for worldwide use, it makes sense that it would use Unicode to represent characters. An integer can also be assigned to a char as long as it is within range.
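That last sentence is about the constant-expression rule: the compiler allows an implicit int-to-char narrowing only when it can prove at compile time that the value fits. Here is a minimal sketch (my own illustration, not from Schildt) of how constants and plain int variables are treated differently:

public class CharAssignment {
    public static void main(String[] args) {
        char ok = 65535;              // compiles: an int constant that fits in 0..65535
        // char bad = 65536;          // would NOT compile: constant out of char range
        final int constant = 97;
        char fromConstant = constant; // compiles: a final local initialized with a constant is itself a constant expression
        int variable = 97;
        // char fromVariable = variable; // would NOT compile: a plain variable is not a constant expression
        char cast = (char) variable;  // compiles: the explicit cast tells the compiler to accept the narrowing
        System.out.println(fromConstant + " " + cast); // prints "a a"
    }
}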
// char ch33 = 65536; // compilation error, of course, since 65536 is out of the char range (0 - 65535)
int a11 = 65536;
char ch22 = (char) a11;
System.out.println("ASCII value corresponding to 65536 after being typecasted : " + ch22); // non-printing character (a small square-like figure appears in the Eclipse console)
The question is: why is there no compilation error for the line char ch22 = (char) a11;, even though char ch33 = 65536; does not compile? One more thing: why was this not an issue when int a1 = 65535 was used, as in the first snippet?