I have a public field in a MonoBehaviour, declared as follows:
public GameObject whatever = null;
On some instances I assign an actual GameObject reference in the Inspector; on others I leave it blank. Basically, it's an optional field. Now, the craziest thing happens when I test it at runtime:
Debug.Log(whatever + " " + (whatever == null));
This prints: "null False". I have no idea how to proceed. All I want is to be able to null test it in code. (I have tried removing the = null as well, didn't make a difference). Unity seems to be creating some wrapper or something. Here is a full reduction that reproduces the problem:
using UnityEngine;

public class TestBehavior : MonoBehaviour
{
    public GameObject whatever;

    void Start()
    {
        // Direct comparison against the statically typed field.
        Debug.Log("logical: " + whatever + " " + (whatever == null)); // prints "logical: null True"
        Print(whatever);
    }

    public void Print<T>(T anObject)
    {
        // The exact same value, compared through a generic type parameter.
        Debug.Log("Now it gets nuts: " + anObject + " " + (anObject == null)); // prints "Now it gets nuts: null False"
    }
}
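To narrow it down, I also tried forcing the comparison back through a UnityEngine.Object-typed variable inside a generic method (a sketch; PrintViaUnityObject is just a test name I made up):

public void PrintViaUnityObject<T>(T anObject)
{
    // Box first: the compiler rejects 'as' directly on an
    // unconstrained type parameter (error CS0413).
    UnityEngine.Object unityObject = (object)anObject as UnityEngine.Object;
    // Compared through the UnityEngine.Object-typed variable,
    // the unassigned field reports True again.
    Debug.Log("via UnityEngine.Object: " + (unityObject == null)); // prints "via UnityEngine.Object: True"
}

So the result seems to depend on the static type used in the comparison rather than on the value itself, which only deepens my confusion about what Unity is doing here.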