
I have a LINQ to SQL query that behaves (in my opinion) very strangely when I check for null values.

There is a record in the DB, as shown by the last query, but why do the first two queries not return it?

//Check
Console.WriteLine(record.SomeID == null ? "Yes" : "No"); // Prints Yes

//LINQ
var q = (from t in mySQLTable
         where t.PKID == record.PKID && 
         t.SomeID == record.SomeID
         select t).FirstOrDefault();
// This returns null, i.e. no record is found

var q2 = (from t in mySQLTable
          where t.PKID == record.PKID &&
          t.SomeID == (record.SomeID == null ? null : record.SomeID)
          select t).FirstOrDefault();
// This also returns null, i.e. no record is found

var q3 = (from t in mySQLTable
          where t.PKID == record.PKID &&
          t.SomeID == null
          select t).FirstOrDefault();
// This returns the record

1 Answer


You can overcome this issue with the query below. The reason your original queries fail is that LINQ to SQL translates t.SomeID == record.SomeID into a parameter comparison ([SomeID] = @p), and in SQL a comparison with NULL is never true; checking the local value first lets the provider emit an IS NULL test instead:

bool isNull = record.SomeID == null;

var q = (from t in mySQLTable
         where t.PKID == record.PKID
            && ((isNull && t.SomeID == null)
                || (!isNull && t.SomeID == record.SomeID))
         select t).FirstOrDefault();
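
A more compact variant of the same idea (a sketch, not from the original answer, relying on the provider evaluating the checks on the local record.SomeID value client-side, just as it does for the isNull flag above):

var q = (from t in mySQLTable
         where t.PKID == record.PKID
            && ((record.SomeID == null && t.SomeID == null)              // local value is null: emit an IS NULL test
                || (record.SomeID != null && t.SomeID == record.SomeID)) // otherwise compare the values
         select t).FirstOrDefault();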
Zein Makki
  • Yes, that is working. This can, however, become quite difficult to read (and a lot of work) if there are a number of IDs that you are checking. Maybe an extension method might be the answer if this is used frequently. Thanks. – Cameron Castillo Aug 08 '16 at 07:46
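
A minimal sketch of the extension-method idea mentioned in the comment, assuming SomeID is an int? column; the helper name WhereNullableEquals and the expression-tree plumbing are illustrative, not from the original thread, and may need adjusting for other value types or providers:

using System;
using System.Linq;
using System.Linq.Expressions;

public static class QueryableNullSafeExtensions
{
    // Builds either an "IS NULL" style predicate or an ordinary value comparison,
    // depending on whether the local value is null.
    public static IQueryable<T> WhereNullableEquals<T>(
        this IQueryable<T> source,
        Expression<Func<T, int?>> selector,
        int? value)
    {
        Expression right = value == null
            ? Expression.Constant(null, typeof(int?))
            : (Expression)Expression.Convert(Expression.Constant(value.Value), typeof(int?));

        var predicate = Expression.Lambda<Func<T, bool>>(
            Expression.Equal(selector.Body, right),
            selector.Parameters);

        return source.Where(predicate);
    }
}

// Usage:
// var q = mySQLTable
//     .Where(t => t.PKID == record.PKID)
//     .WhereNullableEquals(t => t.SomeID, record.SomeID)
//     .FirstOrDefault();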