
Let's say I declare this variable:

long k = 1060606060000;

If I do that I get an error, because the number is obviously too large for an int. Why do I have to add the L at the end for the compiler to recognise that it is a long value, even though I obviously said I need k to be of type long?

Geddi

1 Answer


Because an integer literal is always interpreted as an int unless it is suffixed with l or L.

Without the suffix, you would be assigning an int literal to a variable of type long.

At compile time, the compiler examines the literal value first; if no l or L is appended, it interprets the literal as an int.

So if the number is larger than Integer.MAX_VALUE, the compiler reports an error.
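For illustration, a minimal sketch of the two cases (the extra variable m is made up for this example):

// long k = 1060606060000;  // compile error ("integer number too large"): the literal is read as int
long k = 1060606060000L;    // OK: the L suffix makes the literal a long
long m = 123;               // OK: 123 fits in int and is widened to long on assignment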

Mena
  • But I obviously declare my variable as long, because my number could exceed the range of int. Why doesn't the compiler just automatically cast to long then? – Geddi Nov 15 '16 at 14:09
  • Consider this as a two-step process. First the compiler examines the literal, then it applies the assignment of the literal to the reference. The compilation error (when not appended with `L` / `l`) takes place when the literal value is examined. – Mena Nov 15 '16 at 14:10
  • Ok, thanks. Is there a background idea to this? I feel like this isn't the most practical way, right? – Geddi Nov 15 '16 at 14:13
  • @Geddi not sure I understand. Appending the `l` *is* the way to declare a `long` when using a literal and assigning to a `long` or `Long` reference type, if the literal cannot be assigned to `int`. If the value fits within `int` range, then you can let the conversion do its work and not bother with the `l`. – Mena Nov 15 '16 at 14:16
  • What I am trying to say is that long is bigger than int and the other types below. So when I declare a variable of type long, all numbers that do not exceed 64 bits can fit in there. So why even bother with the type of the literal when it can fit in there anyway? Sorry if this sounds unreasonable, I am obviously a beginner. Just curious. – Geddi Nov 15 '16 at 14:22
  • @Geddi what you say makes sense, but before checking on the reference type you are assigning the literal value to, the compiler checks the literal itself. As integer literals default to `int`, the compiler complains because the literal value itself, interpreted as `int`, does not fit the `int` range (see the sketch after these comments). – Mena Nov 15 '16 at 14:28
  • Not sure why you are calling `long` a reference type, both in your answer and in the comments. `Long` (capital L) is a reference type, `long` is a primitive (value) type. Calling it a reference type is confusing and wrong. (Read chapter 4 of the JLS: "The types of the Java programming language are divided into two categories: primitive types and reference types.") – Erwin Bolwidt Nov 15 '16 at 14:32
  • @ErwinBolwidt you are right. Amending. – Mena Nov 15 '16 at 14:32
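A minimal sketch of the point made in the comments above: the error is reported for the literal itself, before the assignment target is even considered, so it shows up even when no long variable is involved (the method take is made up for this example):

static void take(long value) { }

take(1060606060000);   // compile error: the literal itself is rejected, despite the long parameter
take(1060606060000L);  // OK: with the suffix, the literal is a long from the start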