There are some varbinary
table columns in my database that I need to test for equality against a byte array, all through Entity Framework and LINQ to Entities. With LINQ to Objects this is achieved easily via the SequenceEqual
extension method; unfortunately EF doesn't support it, so I rely on the ==
operator, which is correctly translated into a SQL query.
E.g.
byte[] deviceId = GetDeviceId();
Device device = _deviceRepository.QueryAll() // backed by a DbContext, returns IQueryable<Device>
.Include(d => d.RelatedEntity1)
.Include(d => d.RelatedEntity2)
.Include(d => d.RelatedEntityEtc)
.FirstOrDefault(d => d.DeviceUniqueIdentifier == deviceId && ...);
LoginUserOnDevice(device);
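For comparison, this is what I would have liked to write; it works fine under LINQ to Objects, but EF 6 cannot translate it and, as far as I can tell, throws a NotSupportedException at query execution time:
Device device = _deviceRepository.QueryAll()
    .FirstOrDefault(d => d.DeviceUniqueIdentifier.SequenceEqual(deviceId));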
The problem is that when unit testing the code that runs the LINQ to Entities query against a fake _deviceRepository
, for which I use an ordinary List<T>
turned into an IQueryable<T>
via .AsQueryable()
, the equality operator keeps its original semantics, i.e. reference equality, so the comparison obviously fails. This isn't strictly a unit-testing issue, of course; it occurs in any situation where the LINQ query is no longer executed against a SQL data source.
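A minimal sketch of the failure, assuming a trivially simple Device entity (hypothetical, just for illustration):
using System.Collections.Generic;
using System.Linq;

class Device { public byte[] DeviceUniqueIdentifier { get; set; } }

// ... inside a test method:
var devices = new List<Device>
{
    new Device { DeviceUniqueIdentifier = new byte[] { 1, 2, 3 } }
};
IQueryable<Device> queryable = devices.AsQueryable();

byte[] deviceId = { 1, 2, 3 }; // same content, different array instance
// LINQ to Objects compiles == on byte[] down to a reference comparison,
// so this returns null even though the contents are identical:
Device match = queryable.FirstOrDefault(d => d.DeviceUniqueIdentifier == deviceId);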
What would be the optimal approach here? Preferably one that leaves the query code unchanged, so it runs correctly both against a real database (using EF 6) and under unit tests. Materializing the entire collection into memory by calling .ToList()
right after the Include
s, or anything along those lines, is out of the question in my scenario.
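Just to make that constraint concrete, the workaround I'm ruling out looks like this; it pulls the entire table (plus all the included entities) into memory before filtering:
Device device = _deviceRepository.QueryAll()
    .Include(d => d.RelatedEntity1)
    .Include(d => d.RelatedEntity2)
    .Include(d => d.RelatedEntityEtc)
    .ToList() // materializes everything client-side; not acceptable here
    .FirstOrDefault(d => d.DeviceUniqueIdentifier.SequenceEqual(deviceId));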