I have a query written in LINQ to Entities:

db.Table<Operation>()
    .Where(x => x.Date >= dateStart)
    .Where(x => x.Date < dateEnd)
    .GroupBy(x => new
    {
        x.EntityId,
        x.EntityName,
        x.EntityToken
    })
    .Select(x => new EntityBrief
    {
        EntityId = x.Key.EntityId,
        EntityName = x.Key.EntityName,
        EntityToken = x.Key.EntityToken,
        Quantity = x.Count()
    })
    .OrderByDescending(x => x.Quantity)
    .Take(5)
    .ToList();

The problem is that it takes 4 seconds to execute in the application through EF. But when I take the generated SQL query (captured using Log) and run it directly on SQL Server, it takes 0 seconds. Is this a known problem?
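For reference, this is how the generated SQL can be captured; a minimal sketch, assuming `db` is an EF6 `DbContext` (the built-in log hook mentioned above):

// EF6: write every SQL command EF generates to the console.
db.Database.Log = Console.Write;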

cryss
  • Have you tried using a profiler to narrow it down? Without the `Take(5)` I would have suspected the object creation, but 5 objects should be really quick... – Christoph Fink Apr 29 '14 at 06:36
  • Could you recommend such a profiler? – cryss Apr 29 '14 at 06:59
  • We use dotTrace here. – Christoph Fink Apr 29 '14 at 06:59
  • When you fire the query from EF, is it the first query or do you run it multiple times and each time it takes 4 seconds? When you rebuild the application, the first time it is run, the model is rebuilt and it takes a few seconds to do this (dependent on the model size). Subsequent queries can be a lot faster. – Jon Bellamy Apr 29 '14 at 07:54
  • It takes about 4 seconds all the time. – cryss Apr 29 '14 at 08:06
  • @ChrisJ I'm not entirely sure what is happening then. However, you could consider making a StoredProcedure within SQL and then mapping to that in EF as a workaround? – Jon Bellamy Apr 29 '14 at 09:08
  • @JonBellamy Yes, I have to consider that option. – cryss Apr 29 '14 at 09:30
  • If you run SQL Profiler you'll be able to identify the exact query being run (which will probably match the log) and also how long it takes to run (0 seconds or 4 seconds). If the profiled query takes 0 seconds to run, the problem is in the C# code. Why don't you post the query that it generates? – Nick.Mc Apr 29 '14 at 10:16
  • @chrfin Unfortunately, dotTrace doesn't say much. It points out that ADO's ExecuteReader() method takes 100% of the execution time. Maybe the transaction is slowing down the execution somehow. – cryss Apr 29 '14 at 11:22

2 Answers


Firstly, try improving your query:

// Group by EntityId only, and sort/take before projecting,
// so only the five winning groups are materialized.
var entityBriefs =
    Table<Operation>().Where(x => x.Date >= dateStart && x.Date < dateEnd)
                      .GroupBy(x => x.EntityId)
                      .OrderByDescending(g => g.Count())
                      .Take(5)
                      .Select(g => new EntityBrief
                      {
                          EntityId = g.Key,
                          Quantity = g.Count()
                      });

var mapping = entityBriefs.ToDictionary(e => e.EntityId, e => e);

// Second query: load the name and token only for the five selected entities.
var entityInfo = Table<Operation>()
    .Where(o => mapping.Keys.Contains(o.EntityId))
    .ToList();

foreach (var entity in entityInfo)
{
    mapping[entity.EntityId].EntityName = entity.EntityName;
    mapping[entity.EntityId].EntityToken = entity.EntityToken;
}

You may also compile queries with CompiledQuery.Compile and reuse the compiled delegate for improved performance:

http://msdn.microsoft.com/en-us/library/bb399335%28v=vs.110%29.aspx
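A minimal sketch of a compiled query, assuming an `ObjectContext`-derived context (the `MyContext` type and its `Operations` set are hypothetical; `CompiledQuery` requires `ObjectContext` and does not work with a plain `DbContext`):

// Compiled once; the LINQ-to-SQL translation is cached and reused on every call.
static readonly Func<MyContext, DateTime, DateTime, IQueryable<EntityBrief>> TopEntities =
    CompiledQuery.Compile((MyContext ctx, DateTime dateStart, DateTime dateEnd) =>
        ctx.Operations
           .Where(x => x.Date >= dateStart && x.Date < dateEnd)
           .GroupBy(x => x.EntityId)
           .OrderByDescending(g => g.Count())
           .Take(5)
           .Select(g => new EntityBrief { EntityId = g.Key, Quantity = g.Count() }));

// Usage: var top5 = TopEntities(context, dateStart, dateEnd).ToList();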

Andrew
  • I don't think that having `x.Count()` twice is an improvement. – Gert Arnold Apr 29 '14 at 10:28
  • The second Count() is calculated within the 5 selected groups only; otherwise you have to project the entire dataset to EntityBrief. – Andrew Apr 29 '14 at 10:48
  • Compilation isn't necessary, because if I choose a shorter date range, the execution time is very short. So the problem (the difference in execution times) appears only when a lot of data has to be processed. – cryss Apr 29 '14 at 11:16
  • Try performing the sort and take operations prior to selecting the data. – Andrew Apr 29 '14 at 11:20
  • Yes, it speeds up the execution from 4 to 3.5 seconds, so there are 3.5 seconds more to optimize :) – cryss Apr 29 '14 at 11:24
  • In the first grouping, group by `x.EntityId` only. Then sort it and take 5 items, call `ToList()`, and select `new EntityBrief{EntityId = x.Key, Quantity = x.Count()}`. Then execute another query to get `EntityName` and `EntityToken` for your 5 records; it should be even faster. – Andrew Apr 29 '14 at 11:33
  • Thanks, but the answer doesn't solve my problem, which is the different query execution times in two different environments (.NET and SQL Management Studio). – cryss May 05 '14 at 10:13

The problem was database locking. I was using the wrong isolation level, so my queries were blocked under some circumstances. Now I use READ_COMMITTED_SNAPSHOT and the execution time looks good.
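For reference, read committed snapshot is switched on once per database; a minimal sketch, with the database name as a placeholder:

-- Requires exclusive access to the database while the option is changed.
ALTER DATABASE [YourDatabase] SET READ_COMMITTED_SNAPSHOT ON;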

cryss