In my opinion, JavaScript has quite a few weird quirks. Here is one of them:
var a;
!a // true, a is not set
a = null;
!a // true, a is not set
a = 1;
!a // false, a is set!
a = 0;
!a // true, a is not set!
All of these results I find quite reasonable, except for the case where a = 0; that one is just plain wrong to me. Is there any reasonable way of circumventing this issue without having to add bulk to my code?
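For context, the only workaround I can think of right now is spelling the check out explicitly, something like the sketch below (the isSet helper name is just something I made up for illustration). It works, but it is exactly the kind of bulk I would like to avoid:

// hypothetical helper: treats only undefined and null as "not set",
// so 0, '' and false still count as set
function isSet(value) {
    return value !== undefined && value !== null;
}

var a = 0;
!a;         // true  - the quirk I am complaining about
!isSet(a);  // false - a is set, even though it is 0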