By default, Java assumes you are defining an int value when you write an integer literal.
short x = 20; // Compiles: 20 is a compile-time constant that fits in a short, so Java narrows it implicitly
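Note that this implicit narrowing applies only to compile-time constants that actually fit in the target type. A quick sanity check (the variable names here are ours, for illustration):
short ok = 20;      // compiles: constant is within short's range (-32,768 to 32,767)
short bad = 100000; // DOES NOT COMPILE: constant is out of short's range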
But when we use x in an arithmetic expression, for example:
short y = x * 2; // DOES NOT COMPILE
We know the result of 20 * 2 is 40, which can easily fit into a short variable. What is going on?
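The short answer: Java promotes x to an int before performing the multiplication, so the expression x * 2 evaluates to an int, and an int cannot be assigned to a short without an explicit cast. A minimal sketch of the usual fix (the variable names are ours, for illustration):
short x = 20;
short y = (short) (x * 2); // compiles: the int result is explicitly cast back to short
System.out.println(y);     // prints 40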