There are normally two special values: `Null` and `Undefined`. Some languages have both, some have only one and throw an error in the situations the other would cover, and some merge the two.
`Null` can be useful because it is a guaranteed value that indicates that something is wrong, or outside the domain/range of possible answers. Take Java for instance:
If you did not have `null`, what would a lookup in a `HashMap` return for a key that isn't in the map? If you returned an `Object`, how would you know whether that object meant nothing was there or was what was actually stored in the map? One workaround would be creating your own `NON_EXIST` constant object, but that's essentially the same thing `null` already is. Another workaround might be throwing an exception, but now you're looking at a significant performance cost for an everyday, non-exceptional situation.
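A minimal sketch of that lookup contract in Java (the map contents here are illustrative). Note that because `HashMap` also allows `null` as a stored value, `containsKey()` is what disambiguates "absent" from "mapped to null":

```java
import java.util.HashMap;
import java.util.Map;

public class LookupDemo {
    public static void main(String[] args) {
        Map<String, Integer> ages = new HashMap<>();
        ages.put("alice", 30);

        // get() returns null when a key is absent...
        Integer missing = ages.get("bob");
        System.out.println(missing == null);           // true

        // ...but null can also be a stored value, so containsKey()
        // is needed to separate "absent" from "mapped to null".
        ages.put("carol", null);
        System.out.println(ages.get("carol"));         // null
        System.out.println(ages.containsKey("carol")); // true
        System.out.println(ages.containsKey("bob"));   // false
    }
}
```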
Again, it's the idea of having a guaranteed, allowable value that you can use. It is always available to you, and it is always disjoint from the set of "real values" that would normally be returned from an operation you're performing. `Null` is therefore intentionally used to mean something special.
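For example, a lookup method can return `null` as the one value guaranteed never to collide with a real result. This is a hypothetical sketch; the `User` type and `findByName` method are made up for illustration:

```java
import java.util.List;

public class FindDemo {
    record User(String name) {}

    // null is disjoint from every real User, so it can safely
    // mean "no such user" without a special sentinel object.
    static User findByName(List<User> users, String name) {
        for (User u : users) {
            if (u.name().equals(name)) {
                return u;
            }
        }
        return null; // outside the set of real answers
    }

    public static void main(String[] args) {
        List<User> users = List.of(new User("alice"));

        User found = findByName(users, "bob");
        // The caller checks for the sentinel before using the result.
        if (found == null) {
            System.out.println("no such user");
        }
    }
}
```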
As I hinted before, there are also optimizations that can be done here, because `null` is a special value. In a language like Java, if an object is referenced solely by one variable and you set that variable to `null`, you remove that reference and allow Java's garbage collector to collect the now-unreferenced data. If you instead let that variable sit around forever, that chunk of memory, which may never be used again, will continue to hold resources. This is a contrived example, but it proves the point, and in some resource-intensive Java programs you will see these explicit assignments to `null`.
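A contrived sketch of that pattern, with arbitrary names and sizes (note that a modern JVM can often collect a dead local even without the explicit assignment; the explicit `null` matters most for long-lived fields and collections):

```java
public class NullForGc {
    public static void main(String[] args) {
        // Hold roughly 80 MB through a single reference.
        long[] bigBuffer = new long[10_000_000];
        process(bigBuffer);

        // Drop the only reference: the array becomes eligible for
        // garbage collection even though this method keeps running.
        bigBuffer = null;

        // System.gc() is only a hint to the JVM; shown for illustration.
        System.gc();

        doOtherLongRunningWork();
    }

    static void process(long[] data) {
        data[0] = 42;
    }

    static void doOtherLongRunningWork() {
        // ...work that never touches the big array again...
    }
}
```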