
The following program prints unexpected output. Can anyone explain how the casts in this `Multicast` program work? I searched a lot and found this casting rule: "Sign extension is performed if the type of the original value is signed; zero extension if it is a char, regardless of the type to which it is being converted." I'm still unable to figure out what happens during the casts.

    public class Multicast 
    {
       public static void main(String[] args) 
       {
          System.out.println((int) (char) (byte) -1);
       }
    }

Output: 65535

  • Well, `char` is unsigned, so the cast from char to int doesn't sign extend. Thus you get the maximum value of a char. Is that what you meant? – markspace Apr 20 '16 at 18:19
  • @markspace Thanks for your reply. I found this explanation, but I'm not sure how it works: "Because byte is a signed type, sign extension occurs when converting the byte value –1 to a char. The resulting char value has all 16 bits set, so it is equal to 2^16 – 1, or 65,535. The cast from char to int is also a widening primitive conversion, so the rule tells us that zero extension is performed rather than sign extension. The resulting int value is 65535, which is just what the program prints." – Atul Chavan Apr 20 '16 at 18:26
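
To make the two conversions described in the comments concrete, here is a small sketch (the class name `CastSteps` is just illustrative) that prints each intermediate value in decimal and hex:

    public class CastSteps
    {
       public static void main(String[] args)
       {
          byte b = (byte) -1;   // 8 bits, all set: 0xFF, value -1
          char c = (char) b;    // sign extension to 16 bits: 0xFFFF, value 65535
          int i = (int) c;      // zero extension to 32 bits: 0x0000FFFF, value 65535
          System.out.printf("byte: %d (0x%02X)%n", b, b);
          System.out.printf("char: %d (0x%04X)%n", (int) c, (int) c);
          System.out.printf("int:  %d (0x%08X)%n", i, i);
       }
    }

This prints `byte: -1 (0xFF)`, `char: 65535 (0xFFFF)`, and `int: 65535 (0x0000FFFF)`, showing that the byte-to-char step fills the upper 8 bits with ones (sign extension) and the char-to-int step fills the upper 16 bits with zeros (zero extension).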
