As claimed here, placing an `L` after an integer constant turns it into a `Long` object, whereas using `l` would supposedly turn it into its primitive counterpart `long`. But Oracle claims here that "An integer literal is of type `long` if it is suffixed with an ASCII letter `L` or `l`".
So is the former making things up, or is the latter lying to me?
And if Oracle is lying to me, would the memory and/or performance difference of `Long` vs. `long` ever actually matter, or even be detectable?
In other words, does the case of a Java `IntegerTypeSuffix` actually matter?
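For reference, here is a minimal snippet of the situation I mean (class and variable names are just for illustration). My assumption so far is that the suffix only fixes the literal's primitive type, and that a `Long` object only appears if the literal is assigned to a `Long` variable via autoboxing:

```java
public class SuffixCase {
    public static void main(String[] args) {
        long upper = 1234567890123L; // uppercase suffix
        long lower = 1234567890123l; // lowercase suffix; harder to read, same literal?
        // Boxing seems to come from the target type, not from the suffix's case:
        Long boxed = 1234567890123L; // autoboxed because the variable is Long
        System.out.println(upper == lower); // prints true
        System.out.println(boxed.longValue() == upper); // prints true
    }
}
```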