
I don't know if I am asking a valid question or not, but I am curious why PHP, JavaScript, and Python interpret null differently.

JavaScript:

console.log(null === 0); // outputs false
console.log(null + 1);   // outputs 1; why, if null isn't equal to zero?

PHP

var_dump(null === 0);
// outputs boolean false

var_dump(null + 1);
// outputs int 1; why, if null is not equal to zero?

Python:

>>> None == 0
False

Since Python's None is a singleton object, addition to it gives the error:

TypeError: unsupported operand type(s) for +: 'NoneType' and 'int'

I'm curious how other languages interpret NULL, and what it should be. And if null isn't equal to zero, why does addition on it behave as if it were zero?

Dimag Kharab

1 Answer


You can look at the type comparison tables in the documentation of each of the 3 languages you listed to get some understanding.

Then you should check how null (or None) is cast to other types.

For PHP:

  • (int)null is 0
  • (string)null is '' (empty string)
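
A quick sketch of those coercions, runnable in any PHP 7+ interpreter (the exact var_dump output format can vary slightly between versions):

<?php
var_dump((int)null);    // int(0)       -- null coerces to 0 in numeric context
var_dump((string)null); // string(0) "" -- and to '' in string context
var_dump(null == 0);    // bool(true)   -- loose comparison coerces first
var_dump(null === 0);   // bool(false)  -- strict comparison also checks type
var_dump(null + 1);     // int(1)       -- arithmetic uses the numeric coercion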

For JS:

  • null + 0 is 0, null + 1 is 1, but parseInt(null) is NaN
  • null + '' is 'null' and String(null) is 'null'
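
A minimal sketch you can paste into a browser console or Node; the comments note which conversion each operation triggers:

console.log(Number(null));   // 0     -- numeric conversion (ToNumber) maps null to 0
console.log(null + 1);       // 1     -- addition takes the same numeric path
console.log(null == 0);      // false -- == has special-case rules for null/undefined
console.log(null >= 0);      // true  -- relational operators do convert via ToNumber
console.log(parseInt(null)); // NaN   -- parseInt stringifies first, then parses 'null'
console.log(String(null));   // 'null'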

I can't check it for Python right now, but you can see why it behaves like that in PHP and JS. The reasons behind these differences are probably more complicated.
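
For Python, a sketch of what an interactive session would show: None defines no implicit conversion to numbers or strings, so mixed-type operations raise a TypeError instead of silently coercing.

>>> None == 0        # by default, None compares equal only to None
False
>>> None is None     # identity is the idiomatic null check
True
>>> None + 1         # no numeric coercion is defined, so this raises
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unsupported operand type(s) for +: 'NoneType' and 'int'
>>> (None or 0) + 1  # a common idiom to default None to 0
1

That is the design difference in a nutshell: PHP and JS define implicit casts from null, while Python refuses to guess.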

Jan.J