0

Java uses 32 bits for the char type - so the max value is 65536.

But the following code gives me the result reported in the title.

public static void main(String[] args) {
    int a = 10000000;
    char b = 33;
    b = (char) a;
    System.out.println((int) b);
}
Max Vollmer
It's ok.
  • You are observing an overflow. Not sure what you expected, a crash? Java does not crash on overflows. – Zabuzard Aug 31 '19 at 23:20
  • What value did you expect to get, and *why* did you expect that? If you thought `65535`, then you should read up on how [Narrowing Primitive Conversion](https://docs.oracle.com/javase/specs/jls/se11/html/jls-5.html#jls-5.1.3) works in Java. – Andreas Sep 01 '19 at 02:34
  • I presume you meant 16 bits, giving 65536 possible values. 32 bits gives 4,294,967,296 possible values. – Jim Garrison Sep 01 '19 at 04:13

2 Answers

10

A char is 16 bits, not 32 bits.

65535 is the maximum value of a char, and 10000000 is greater than that, so you cannot store that value in a char.

10000000 in binary is 100110001001011010000000

Now, when casting that to char, all bits to the left of the 16 bits that "fit" are dropped, leaving you with 1001011010000000.

And binary 1001011010000000 in decimal is 38528.
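Dropping the high bits is the same as masking with 0xFFFF, so you can verify the result directly; a minimal sketch (the class name NarrowingDemo is mine):

```java
// Narrowing an int to char keeps only the low 16 bits,
// which is equivalent to masking with 0xFFFF.
public class NarrowingDemo {
    public static void main(String[] args) {
        int a = 10000000;          // binary: 100110001001011010000000
        char c = (char) a;         // bits above the low 16 are dropped
        int masked = a & 0xFFFF;   // explicit version of the same operation

        System.out.println((int) c);  // 38528
        System.out.println(masked);   // 38528
    }
}
```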

Max Vollmer
  • Thanks so much! I now understand why this code gives me this result! – It's ok. Aug 31 '19 at 23:31
  • I was expecting 65536 as a result. – It's ok. Aug 31 '19 at 23:32
  • I started today to learn Java, sorry for the stupid question :) – It's ok. Aug 31 '19 at 23:33
  • Not a stupid question, otherwise you'd have downvotes instead of upvotes. It's a valid and understandable beginner's question. It makes sense to assume that the value gets rounded down instead of binary bits being dropped. Keep on asking and learning, that's how we all started. – Max Vollmer Aug 31 '19 at 23:35
3

Java uses 32 bits for the char type

No, Java uses 16-bit chars.

so the max value is 65536.

Almost - in terms of the char's max value, it's 65535. However, the max value of a two's-complement 32-bit value is 2^31 - 1, which is 2147483647.
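Both limits are exposed as constants in the standard library, so they are easy to check; a quick sketch (the class name CharLimits is an assumption of mine):

```java
public class CharLimits {
    public static void main(String[] args) {
        System.out.println(Character.SIZE);            // 16 (bits in a char)
        System.out.println((int) Character.MAX_VALUE); // 65535
        System.out.println(Integer.MAX_VALUE);         // 2147483647, i.e. 2^31 - 1
    }
}
```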

But the following code gives me the result reported in the title.

int a = 10000000;
char b = 33;
b = (char)a;

Well, 10000000 is surely greater than 65535, isn't it? What did you expect when trying to fit that number into a char? What you got is an overflow.
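Concretely, the overflow means the char ends up holding the original value modulo 2^16 (65536); a small sketch to see this (the class name OverflowDemo is mine):

```java
public class OverflowDemo {
    public static void main(String[] args) {
        int a = 10000000;
        char b = (char) a;  // narrowing conversion: value is taken modulo 65536

        System.out.println((int) b);    // 38528
        System.out.println(a % 65536);  // 38528 as well
    }
}
```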

Fureeish