
In my program I am copying data from my MySQL database, and in theory I am trimming the result set down through paging. It still seems to take a long time: 600-800 milliseconds for 30 rows of about 25 columns, with nothing very large in any cell.

Is this just the performance hit I have to endure for hitting a database to retrieve data? I'm updating this data frequently, and it is a different dataset per user, so I can't cache it or preload it into memory.

I would love some suggestions on where to go from here. I looked at making a separate stored procedure to do the total row count for my pagination, but the main hit seems to come when I copy the data from the database into a list in my program, which I then use to produce JSON for my webpage view.

I'm fairly new at this, and I think the responsiveness of my front-end page is starting to suffer from the time it takes to produce these results. 1.8-2.5 seconds for a page refresh seems too long for reloading a grid that displays 30 results.

Thank you in advance.

Edit: Additional Details:

All primary keys in the tables are indexed. Yes, I have profiled the EF query that takes the longest time; it doesn't seem to be doing anything more than a paged select, for example:

sql = "System.Collections.Generic.List`1[PeerTrader.usercollectionenhancedview].OrderBy(Param_0 => Param_0.CardName).Skip(90).Take(30)"

We can take the client side out of the equation because I'm measuring diagnostics on my controller, and the client side adds essentially no overhead to the presentation of the display; basically all of the time goes into preparing the JSON data to return to the client.

It would generally be a page of 30 rows (though the time it takes doesn't really change when I reduce it to 5) out of a stored procedure whose total result count would be about 20K.

The real hit comes at the point where I actually "read" data from the database: wherever I put the code that actually loads results from the database, that is the biggest hit. Running an equivalent paged query directly against the MySQL server, emulating the page call my server makes, returns a result in the order of 60 milliseconds.

Here is my controller action. In the example below I copy the entire dataset over to a List right after the filter, because it doesn't seem to matter whether I copy 5, 30, or 20K rows; the read operation costs about the same either way, and the List then lets me enumerate the dataset several times later in the controller.

public ActionResult GridData(string sidx, string sord, int page, int rows, bool _search, string filters)
{
    // ********** Diagnostics Start **********
    // Note: the stopwatch is never reset, so t1 and t2 below are cumulative.
    System.Diagnostics.Stopwatch sw = new System.Diagnostics.Stopwatch();
    sw.Start();
    // ********** Diagnostics End **********

    MembershipUser currentUser = Membership.GetUser(User.Identity.Name, true /* userIsOnline */);
    int currentUserID = (int)currentUser.ProviderUserKey;

    var serializer = new JavaScriptSerializer();
    // Filters f = (!_search || string.IsNullOrEmpty(filters)) ? null : serializer.Deserialize<Filters>(filters);
    Filters_IQueryable f = (!_search || string.IsNullOrEmpty(filters))
        ? null
        : serializer.Deserialize<Filters_IQueryable>(filters);

    IQueryable<usercollectionenhancedview> query = db.SingleUserCollectionView(currentUserID).AsQueryable();

    // ToList() materialises the ENTIRE result set here (this is the expensive read).
    var filteredQuery = (f == null ? query : f.FilterObjectSet(query)).ToList();

    // ********** Diagnostics Start **********
    sw.Stop(); long t1 = sw.ElapsedMilliseconds; sw.Start();
    System.Diagnostics.Debug.WriteLine("");
    System.Diagnostics.Debug.WriteLine("**************");
    System.Diagnostics.Debug.WriteLine("T1: " + t1);
    // ********** Diagnostics End **********

    int totalRecords = filteredQuery.Where(x => x.fk_IdUsers == currentUserID).Count();

    var pagedQuery = filteredQuery.AsQueryable()
                                  .Where(x => x.fk_IdUsers == currentUserID)
                                  .OrderBy(sidx + " " + sord)   // Dynamic LINQ string ordering
                                  .Skip((page - 1) * rows)
                                  .Take(rows);

    // ********** Diagnostics Start **********
    sw.Stop(); long t2 = sw.ElapsedMilliseconds; sw.Start();
    System.Diagnostics.Debug.WriteLine("T2: " + t2);
    // ********** Diagnostics End **********

    // ... (the rest of the action builds the JSON result from pagedQuery)
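
For comparison, here is the shape I have been experimenting with to keep the filtering and paging on the database instead of materialising everything first. This is only a sketch: the `db.usercollectionenhancedviews` entity set is a hypothetical mapping of the view, and it assumes `Filters_IQueryable.FilterObjectSet` composes the query rather than enumerating it.

    // Sketch only: compose the whole query so EF sends ONE paged SELECT to MySQL.
    // db.usercollectionenhancedviews is a hypothetical mapped entity set for the view.
    IQueryable<usercollectionenhancedview> baseQuery =
        db.usercollectionenhancedviews.Where(x => x.fk_IdUsers == currentUserID);

    if (f != null)
        baseQuery = f.FilterObjectSet(baseQuery);

    int totalRecords = baseQuery.Count();      // runs as SELECT COUNT(*) on the server

    var pageOfRows = baseQuery
        .OrderBy(sidx + " " + sord)            // Dynamic LINQ, as in the action above
        .Skip((page - 1) * rows)
        .Take(rows)
        .ToList();                             // materialises only one page of rows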

I hope I have added enough extra information for further comments; if not, just let me know. I do appreciate your looking into my problem.

Edit 2:

sql = "System.Data.Objects.ObjectResult1[PeerTrader.usercollectionenhancedview].OrderBy(Param_0 => Param_0.CardName).Skip(0).Take(30)"When I go looking on my DB, the actual call on the DB to do theorder by` is actually taking a fair bit of time, 300 ms on the actuall db itself, so it looks like I have tracked an expensive step down

  • It could be many reasons. You should post more details about your implementation: JavaScript code, C# code, and more details about the database (for example, whether you get one page with 30 rows from 100,000 total rows or from 1,000 total rows). Could you measure, for example, the performance of the server part and then the performance of the JavaScript code? – Oleg Feb 02 '13 at 20:27
  • Are you limiting results on database query or prior to JSON response? Got indexes on applicable attributes? Have you profiled EF query to see what it's doing? –  Feb 02 '13 at 20:27
  • @Oleg Please see the edit above where I add in extra details and add in answers to both queries. – Mark Feb 02 '13 at 20:50
  • @bmewsing Please see the edit above where I add in extra details and add in answers to both queries. – Mark Feb 02 '13 at 20:51
  • @Mark: It seems that you implemented access to the database in the wrong way. It seems that your code gets **all 20K rows from the database and then does the sorting and paging in C# code**. You wrote about the usage of a Stored Procedure, but you don't do this in your current code. You use a View whose code is not accessible to the C# code, I think. The usage of `IQueryable` instead of Entity Framework also seems suspect. Moreover, even if everything else works perfectly, it's important to know by which column you sort. If you have no index on the column by which the sorting is done, it could be slow. – Oleg Feb 03 '13 at 09:46
  • @Oleg I was forced into accessing data from the DB due to some weird query requirements, but you are right; I figured out last night that I was returning the whole dataset no matter where I was enumerating the results. I'm going to try now to build up the full query inside my program, or look to pass the dynamic search into the stored procedure. I got a 4-to-5-times speedup last night by making my stored procedure handle the paging, so I think I'm on the right path now. Thanks for taking a look. – Mark Feb 03 '13 at 16:40
  • @Mark: You are welcome! If the current performance is OK, you don't need to invest more to make the solution quicker. One can always make things better, of course, but one should do so only if one sees a bottleneck in that place. Best wishes! – Oleg Feb 03 '13 at 18:08
  • @Oleg On my production server the gridview is taking about 2.5 seconds to page where I am showing 30 records, and this to me seems a bit long and I want it to be as snappy as possible for my users. – Mark Feb 03 '13 at 18:18
  • @Mark: In that case **you should append your question with the JavaScript code (which creates the jqGrid) that you use**. Why do you speak about a "gridview"? Do you use jqGrid? If yes, in which version, and in combination with which versions of jQuery and jQuery UI? – Oleg Feb 03 '13 at 18:20
  • @Oleg The time delay is all server-side at the moment (which I am testing via diagnostics). I have the problem tracked down now and am testing building the query in the controller vs. a stored procedure which can handle dynamic querying. Thanks (yes, it's jqGrid). – Mark Feb 03 '13 at 18:28
  • @Mark: If the SELECT statement over the 20K rows has a native id, then you can implement paging in a very simple way, like I described [here](http://stackoverflow.com/a/8480127/315935) and [here](http://stackoverflow.com/a/9792723/315935). Probably it will help you to improve your STORED PROCEDURE. – Oleg Feb 03 '13 at 19:14
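
A minimal sketch of what those last comments point at: calling a paged stored procedure through EF 4's `ObjectContext.ExecuteStoreQuery`. The `GetUserCollectionPage` procedure and its parameter list are assumptions, standing in for whatever OFFSET/LIMIT or keyset paging the real procedure implements:

    // Sketch: let MySQL do the paging inside a (hypothetical) stored procedure
    // and pull back only a single page of rows.
    var pageOfRows = db.ExecuteStoreQuery<usercollectionenhancedview>(
        "CALL GetUserCollectionPage({0}, {1}, {2}, {3})",
        currentUserID,
        sidx + " " + sord,     // sort expression; must be whitelisted inside the procedure
        (page - 1) * rows,     // offset
        rows                   // page size
    ).ToList();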

1 Answer

var filteredQuery = (f == null ? query : f.FilterObjectSet(query)).ToList();

The line above looks suspect. See here:

The ToList(IEnumerable) method forces immediate query evaluation

There is no need to materialise thousands of results if all you need is a count and 30 items. The queries below should be rewritten to operate on the database, not on a materialised collection. The SQL generated by `Count()` and `Skip().Take()` should be good enough to restrict the results to only what you need, but you may want to check the actual query using `ObjectQuery.ToTraceString()`.

totalRecords = filteredQuery.Where(x => x.fk_IdUsers == currentUserID).Count();
pagedQuery = filteredQuery.AsQueryable().Where(x => x.fk_IdUsers == currentUserID)
                          .OrderBy(sidx + " " + sord).Skip((page - 1) * rows).Take(rows);
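
For example, a sketch of that rewrite; it assumes `query` is backed by a composable LINQ to Entities source (note that an `ObjectResult` wrapped with `AsQueryable()`, as in the question, would still execute in memory):

    // Compose on IQueryable so EF issues a COUNT and a single paged SELECT,
    // instead of pulling ~20K rows into memory first.
    var baseQuery = (f == null ? query : f.FilterObjectSet(query))
        .Where(x => x.fk_IdUsers == currentUserID);

    int totalRecords = baseQuery.Count();      // SELECT COUNT(*) on the server

    var pagedQuery = baseQuery
        .OrderBy(sidx + " " + sord)
        .Skip((page - 1) * rows)
        .Take(rows);

    // To inspect the SQL that will actually run:
    // ((System.Data.Objects.ObjectQuery)pagedQuery).ToTraceString()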

If you're still having trouble and it's a read-only situation, you could try using `.AsNoTracking()` and turning off any proxy generation.
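
For instance, a sketch against an EF 4.x `ObjectContext` (which the `ObjectResult` in the trace suggests); on a `DbContext` the equivalents would be `Configuration.ProxyCreationEnabled` and the `AsNoTracking()` extension method:

    // Read-only load: skip change tracking and proxy creation.
    db.ContextOptions.ProxyCreationEnabled = false;   // no change-tracking proxies

    // Hypothetical mapped set for the view; MergeOption.NoTracking is the
    // ObjectContext counterpart of AsNoTracking().
    var viewSet = db.CreateObjectSet<usercollectionenhancedview>();
    viewSet.MergeOption = System.Data.Objects.MergeOption.NoTracking;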

  • Yes, the `.ToList()` does force the query evaluation. I have played with the controller action, and even if I defer the evaluation to where I set the pagedQuery results, the evaluation has a similar time cost. I have even hard-coded in other values and deferred the query evaluation entirely to the end of the controller, where I would (in theory) only be evaluating a dataset of 5-30 rows, and it is basically the same time cost. – Mark Feb 02 '13 at 21:29
  • But that is part of why I asked this question. In my mind it should take less time to evaluate the query in pagedQuery, where I'm only returning 30 results, than to evaluate filteredQuery, which returns slightly over 20K results. – Mark Feb 02 '13 at 21:30
  • @Mark I'm surprised that there would be no difference between materialising the whole resultset and then filtering, versus just querying for what you need. –  Feb 02 '13 at 21:37
  • After the `var filteredQuery` line is run, the variable contains 20,029 records and has taken 351 ms (my computer at home has an SSD and is much faster than my prod server); the T2/T3 diagnostic times add a total of 2 ms. The biggest time sink is the obvious copy from the db. – Mark Feb 02 '13 at 21:38
  • I changed things over so that I produce the list based on the pagedQuery, and it took as much time (even more, actually: 419 ms). – Mark Feb 02 '13 at 21:42
  • `sql = "System.Data.Objects.ObjectResult`1[PeerTrader.usercollectionenhancedview].OrderBy(Param_0 => Param_0.CardName).Skip(0).Take(30)"` When I go looking on my DB, the actual call on the DB to do the `order by` is actually taking a fair bit of time, 300 ms on the actuall db itself, so it looks like I have tracked an expensive step down – Mark Feb 02 '13 at 21:48