In my program I am copying data from my MySQL database and, in theory, trimming the result set down through paging. It still seems to take a long time: 600-800 milliseconds for 30 rows with about 25 columns, with nothing very large in any cell.
Is this just the performance hit I have to endure by hitting a database to retrieve data? I'm updating this data frequently and it is a different dataset per user so I can't cache or preload it into memory.
I would love some suggestions on where to go from here. I looked at making a separate stored procedure to do the total row count for my pagination, but the main hit plainly seems to be when I have to copy the data up from the database into a list in order to produce JSON for my web page view.
I'm somewhat new at this, and I think the responsiveness of my front-end page is starting to suffer because of the time it takes to produce these results. 1.8-2.5 seconds for a page refresh seems too long for reloading a grid that displays 30 results.
Thank you in advance.
Edit: Additional Details:
All primary keys in the tables are indexed. Yes, I have profiled the EF query that takes the longest; it doesn't seem to be doing anything more than a paged select. For example:
sql = "System.Collections.Generic.List`1[PeerTrader.usercollectionenhancedview].OrderBy(Param_0 => Param_0.CardName).Skip(90).Take(30)"
We can take the client side out of the picture, because I'm measuring diagnostics in my controller and the client side adds no real overhead to presenting the display. Essentially, the time it takes to prepare the JSON data to return to the client is the same.
A page is generally 30 rows (though the time it takes doesn't really change when I reduce the row count to 5), drawn from a stored procedure whose total result count is about 20K.
The real hit is at the point where I actually "read" data from the database: wherever I place the code that actually loads results from the database, that appears to be the biggest hit. Running a paged query directly against the MySQL server, which should emulate the page call my server makes, returns a result on the order of 60 milliseconds.
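For reference, here is roughly how that stand-alone timing check looks when done from code. This is only a sketch: it assumes MySQL Connector/NET (MySql.Data), selects straight from the underlying usercollectionenhancedview instead of going through the SingleUserCollectionView procedure, and uses placeholder values for the connection string, user id, and page offset.

using System;
using System.Diagnostics;
using MySql.Data.MySqlClient;

class PagedQueryTiming
{
    static void Main()
    {
        // Placeholder connection string and user id -- substitute real values.
        const string connectionString = "server=localhost;database=peertrader;uid=user;pwd=secret";
        const int userId = 1;

        // Approximation of the paged call the controller should be making:
        // one user's rows, ordered by CardName, Skip(90).Take(30).
        const string sql =
            @"SELECT *
              FROM usercollectionenhancedview
              WHERE fk_IdUsers = @userId
              ORDER BY CardName
              LIMIT 90, 30";

        var sw = Stopwatch.StartNew();
        using (var conn = new MySqlConnection(connectionString))
        using (var cmd = new MySqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@userId", userId);
            conn.Open();

            int rowCount = 0;
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    rowCount++;   // touch every row so the read actually happens
                }
            }

            sw.Stop();
            Console.WriteLine("{0} rows read in {1} ms", rowCount, sw.ElapsedMilliseconds);
        }
    }
}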
Here is my controller action. In the example below I'm copying the entire dataset over to a List after the filter, because it doesn't seem to matter whether I copy 5, 30, or 20K rows (the hit in the read operation appears to be the same), and it lets me enumerate the dataset a number of times later in the controller.
public ActionResult GridData(string sidx, string sord, int page, int rows, bool _search, string filters)
{
    ///**********Diagnostics Start********
    System.Diagnostics.Stopwatch sw = new System.Diagnostics.Stopwatch();
    sw.Start();
    ///**********Diagnostics End********

    MembershipUser currentUser = Membership.GetUser(User.Identity.Name, true /* userIsOnline */);
    int currentUserID = (int)currentUser.ProviderUserKey;

    var serializer = new JavaScriptSerializer();
    //Filters f = (!_search || string.IsNullOrEmpty(filters)) ? null : serializer.Deserialize<Filters>(filters);
    Filters_IQueryable f = (!_search || string.IsNullOrEmpty(filters)) ? null : serializer.Deserialize<Filters_IQueryable>(filters);

    // The stored procedure result is materialized into memory here by the ToList call.
    IQueryable<usercollectionenhancedview> query = db.SingleUserCollectionView(currentUserID).AsQueryable();
    var filteredQuery = (f == null ? query : f.FilterObjectSet(query)).ToList();

    ///**********Diagnostics Start********
    sw.Stop(); long t1 = sw.ElapsedMilliseconds; sw.Start();
    System.Diagnostics.Debug.WriteLine("");
    System.Diagnostics.Debug.WriteLine("**************");
    System.Diagnostics.Debug.WriteLine("T1: " + t1);
    ///**********Diagnostics End********

    // Count and paging are applied to the in-memory list from this point on.
    int totalRecords = filteredQuery.Where(x => x.fk_IdUsers == currentUserID).Count();
    var pagedQuery = filteredQuery.AsQueryable()
                                  .Where(x => x.fk_IdUsers == currentUserID)
                                  .OrderBy(sidx + " " + sord)   // string-based ordering (Dynamic LINQ style)
                                  .Skip((page - 1) * rows)
                                  .Take(rows);

    ///**********Diagnostics Start********
    sw.Stop(); long t2 = sw.ElapsedMilliseconds; sw.Start();
    System.Diagnostics.Debug.WriteLine("T2: " + t2);
    ///**********Diagnostics End********
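The action finishes by shaping the page into the grid's JSON. The real code is trimmed above, but it amounts to something like the following sketch, a jqGrid-style total/page/records/rows object, with placeholder property names for the row key and cells:

    var jsonData = new
    {
        total = (int)Math.Ceiling((double)totalRecords / rows),
        page = page,
        records = totalRecords,
        rows = (from c in pagedQuery.AsEnumerable()
                select new
                {
                    id = c.pk_IdUserCollection,   // hypothetical key column; the real view has its own key
                    cell = new object[] { c.CardName /* , ...the remaining ~25 columns... */ }
                }).ToArray()
    };

    ///**********Diagnostics Start********
    sw.Stop(); long t3 = sw.ElapsedMilliseconds;
    System.Diagnostics.Debug.WriteLine("T3: " + t3);
    ///**********Diagnostics End********

    return Json(jsonData, JsonRequestBehavior.AllowGet);   // AllowGet only needed if the grid requests via GET
}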
I hope I have added enough extra information for further comments; if not, just let me know. I do appreciate your exploration of my problem.
Edit 2:
sql = "System.Data.Objects.ObjectResult
1[PeerTrader.usercollectionenhancedview].OrderBy(Param_0 => Param_0.CardName).Skip(0).Take(30)"When I go looking on my DB, the actual call on the DB to do the
order by` is actually taking a fair bit of time, 300 ms on the actuall db itself, so it looks like I have tracked an expensive step down