Alexei Levenkov referenced the answer in his last comment, but some further explanation may help: assigning a literal to a variable is different from assigning a value from another variable (especially if the other variable is of another type, which implies an implicit or explicit conversion).
For example:
// The following statement defines a variable
// called 'integerValue' and assigns the
// 32 bit signed integer value forty-two to
// it because of the literal '42'.
int integerValue = 42;
// The next statement defines a variable
// called 'longValue' and assigns the
// 64 bit signed integer value forty-two to
// it because of the literal '42L'.
long longValue = 42L;
// The next statement defines a variable
// with the 16 bit signed integer value
// forty-two. The literal '42' is an int
// literal, but the compiler implicitly
// converts the constant to short because
// the value fits into the short range.
short shortValue1 = 42;
// There is no implicit conversion between
// int and short, therefore the next statement
// will not compile. Notice that there is no
// literal involved here - instead a value from
// another variable is assigned.
short shortValue2 = integerValue; // Error
// The following assignment does not compile
// either: '42L' is a long literal, and there
// is no implicit conversion from long to
// short, not even for constants.
short shortValue3 = 42L; // Error
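Both erroring assignments can be fixed with an explicit cast, which tells the compiler you accept the narrowing (and any potential overflow). As a minimal sketch, here is the same idea in Java, which shares these literal and conversion rules; the variable names simply reuse the ones from the snippet above:

```java
public class LiteralConversions {
    public static void main(String[] args) {
        int integerValue = 42;

        // An explicit cast narrows the int value to short;
        // the programmer takes responsibility for overflow.
        short shortValue2 = (short) integerValue;

        // The long literal '42L' also needs an explicit cast
        // down to short.
        short shortValue3 = (short) 42L;

        System.out.println(shortValue2 + " " + shortValue3);
    }
}
```

Note that the cast silently truncates values that do not fit: `(short) 70000`, for example, would not be 70000.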
The main point is that int and short use the same style for literals (a plain number, no suffix), and that is what might get people confused.