What is the difference between if ((typeof OA != 'undefined') && OA) and if (OA)? The former statement works; the latter quietly stops the execution of the current function. (Maybe a rookie question.) Thanks!
if (OA) throws a ReferenceError if OA was never declared; typeof OA != 'undefined' checks whether OA is defined first.
var OA;
if (OA) {
}

This works: OA is declared, so the test is safe; the block is simply skipped because OA is undefined and therefore falsy.

Without the declaration:

if (OA) {
}

This doesn't work: it throws ReferenceError: OA is not defined.
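Putting the two together, the guarded test from the question runs safely in both situations (a minimal sketch using the same placeholder name OA, with no declaration of OA anywhere in scope):

// OA is never declared in this script.
if (typeof OA != 'undefined' && OA) {
    console.log('OA is declared and truthy'); // never reached, but no error either
}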
typeof OA != 'undefined' && OA checks that it's defined before trying to access the variable's value.
The engine won't try to read the value of OA in the typeof case (typeof never throws for an undeclared identifier), whereas if (OA) does try to read it and throws if OA was never declared.
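A quick way to see that behavior (a minimal sketch; neverDeclared is a made-up identifier that is intentionally not declared anywhere):

console.log(typeof neverDeclared); // "undefined" - typeof is safe
try {
    if (neverDeclared) { }         // reading the value throws
} catch (e) {
    console.log(e.name);           // "ReferenceError"
}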
if ((typeof OA != 'undefined') && OA)

This first checks whether the variable OA is defined. If it is, OA is then coerced to a boolean and evaluated.

if (OA)

This assumes OA exists and immediately coerces it to a boolean and evaluates it.

The second example will throw a JavaScript exception (a ReferenceError) if the variable OA has never been declared; the first example avoids that.
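Note that the && OA part also filters out declared-but-falsy values, so the guarded test is stricter than a pure existence check (a small sketch; count is an illustrative name):

var count = 0;                             // declared, but falsy
if (typeof count != 'undefined' && count) {
    // not reached: count is defined, but 0 coerces to false
}
if (typeof count != 'undefined') {
    // reached: count is declared, even though its value is falsy
}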
See my answer here for more explanation of the multiple meanings of undefined in JavaScript.