I thought I understood SQL fairly well, but here is a simple example of NULL values introducing a bug that I didn't expect. Could someone explain why the first and second SELECT queries below yield different results?
DROP TABLE IF EXISTS #temp;

CREATE TABLE #temp
(
    ITEM int,
    RELATIONSHIP nvarchar(max) NULL
);

INSERT INTO #temp VALUES (1, '');
INSERT INTO #temp VALUES (2, NULL);
INSERT INTO #temp VALUES (3, 'Parent');
INSERT INTO #temp VALUES (4, 'Child');

-- Query 1: negated comparisons
SELECT
    ITEM, RELATIONSHIP,
    CAST(CASE
             WHEN (RELATIONSHIP != 'Parent' AND RELATIONSHIP != 'Child')
                 THEN 0
             ELSE 1
         END AS bit) family
FROM
    #temp;

-- Query 2: positive comparisons
SELECT
    ITEM, RELATIONSHIP,
    CAST(CASE
             WHEN (RELATIONSHIP = 'Parent' OR RELATIONSHIP = 'Child')
                 THEN 1
             ELSE 0
         END AS bit) family
FROM
    #temp;

-- Query 3: explicit NULL check
SELECT
    ITEM, RELATIONSHIP,
    CAST(CASE
             WHEN (RELATIONSHIP IS NULL OR (RELATIONSHIP != 'Parent' AND RELATIONSHIP != 'Child'))
                 THEN 0
             ELSE 1
         END AS bit) family
FROM
    #temp;
Query 1 result:

ITEM  RELATIONSHIP  family
1                   0
2     NULL          1
3     Parent        1
4     Child         1

Query 2 result:

ITEM  RELATIONSHIP  family
1                   0
2     NULL          0
3     Parent        1
4     Child         1

Query 3 result:

ITEM  RELATIONSHIP  family
1                   0
2     NULL          0
3     Parent        1
4     Child         1
Is this because all scalar comparisons with NULL values are always false, and (false AND false) is always false?
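To check that I'm describing the effect correctly, here is the smallest version of it I could write as a sketch (assuming SQL Server, since the script above uses a #temp table; the column aliases are just illustrative). Neither the negated nor the positive comparison against NULL seems to satisfy its WHEN branch, so both expressions fall through to ELSE:

-- Minimal sketch: a direct comparison against NULL never satisfies the WHEN branch,
-- whether written as != or =, so both CASE expressions take the ELSE path.
SELECT
    CASE WHEN NULL != 'Parent' THEN 'when' ELSE 'else' END AS negated_compare,
    CASE WHEN NULL =  'Parent' THEN 'when' ELSE 'else' END AS positive_compare;
-- Both columns come back as 'else'.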