
I was surprised to see this code work. I thought that char and int were two distinct data types in Java and that I would have had to cast the char to an int for this to give the ASCII equivalent. Why does this work?

String s = "hello";
int x = s.charAt(1);
System.out.println(x);
Roger

2 Answers


A char can be automatically converted to an int. See JLS 5.1.2:

The following 19 specific conversions on primitive types are called the widening primitive conversions:

...

  • char to int, long, float, or double

...

A widening conversion of a signed integer value to an integral type T simply sign-extends the two's-complement representation of the integer value to fill the wider format. A widening conversion of a char to an integral type T zero-extends the representation of the char value to fill the wider format.

(emphasis added)
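
To make the quoted rule concrete, here is a minimal, self-contained sketch (the class name CharWidening is just a placeholder for illustration):

public class CharWidening {
    public static void main(String[] args) {
        String s = "hello";

        // charAt returns a char; assigning it to an int is a widening
        // primitive conversion, so no cast is required.
        int x = s.charAt(1);
        System.out.println(x);        // 101, the UTF-16 code unit for 'e'

        // char is an unsigned 16-bit type, so the widening zero-extends
        // even for the largest char value.
        char high = '\uFFFF';
        int widened = high;
        System.out.println(widened);  // 65535, not -1

        // Going the other way (int to char) is a narrowing conversion
        // and does require an explicit cast.
        int code = 101;
        char back = (char) code;
        System.out.println(back);     // e
    }
}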

yshavit

char and int are two distinct types, but this works because an int is wider than a char. That is, every value of char can be represented as an int, so no data is lost in the conversion.
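
A quick sketch of the ranges involved, using only the standard Character and Integer constants (the class name CharIntRange is just a placeholder):

public class CharIntRange {
    public static void main(String[] args) {
        // char is an unsigned 16-bit type: 0 through 65535.
        System.out.println((int) Character.MIN_VALUE); // 0
        System.out.println((int) Character.MAX_VALUE); // 65535

        // int is a signed 32-bit type, so it covers the entire char range.
        System.out.println(Integer.MIN_VALUE);         // -2147483648
        System.out.println(Integer.MAX_VALUE);         // 2147483647

        // Hence this assignment compiles without a cast and loses nothing.
        char c = 'e';
        int i = c;
        System.out.println(i);                         // 101
    }
}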

Kevin