I have a LINQ to SQL query that behaves (in my opinion) very strangely when I check for null values.
There is a record in the DB, as the last query (q3) shows, so why do the first two queries not return it?
// Check
Console.WriteLine(record.SomeID == null ? "Yes" : "No"); // This prints Yes
// LINQ
var q = (from t in mySQLTable
         where t.PKID == record.PKID &&
               t.SomeID == record.SomeID
         select t).FirstOrDefault();
// This returns null, i.e. no record is found
var q2 = (from t in mySQLTable
          where t.PKID == record.PKID &&
                t.SomeID == (record.SomeID == null ? null : record.SomeID)
          select t).FirstOrDefault();
// This also returns null, i.e. no record is found
var q3 = (from t in mySQLTable
          where t.PKID == record.PKID &&
                t.SomeID == null
          select t).FirstOrDefault();
// This returns the record
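For reference, the SQL that LINQ to SQL actually sends for the first query can be inspected with DataContext.GetCommand. A minimal sketch, assuming mySQLTable comes from a DataContext instance that I'll call db here (the name is just for illustration):

// Build the same query as q, without executing it yet.
var query = from t in mySQLTable
            where t.PKID == record.PKID &&
                  t.SomeID == record.SomeID
            select t;

// GetCommand returns the DbCommand that LINQ to SQL would execute.
System.Data.Common.DbCommand cmd = db.GetCommand(query);
Console.WriteLine(cmd.CommandText);

// Also print the parameter values, so the value bound for SomeID is visible.
foreach (System.Data.Common.DbParameter p in cmd.Parameters)
{
    Console.WriteLine("{0} = {1}", p.ParameterName, p.Value ?? "(null)");
}

Running this with record.SomeID == null should show whether the generated WHERE clause compares SomeID with = rather than IS NULL, which is the difference from q3.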