
There are more than 60,000 records in the table. I am using Microsoft SQL Server 2016 (RTM-GDR).

I plan to store the data in a DataTable property, fetch only the top 100 records from that DataTable at a time, and then delete those top 100 records so that each request processes faster.

Service Code

public DataTable Records { get; set; }

In Service Method

if (this.Records == null || this.Records.Count() == 0)
{
    // For the first time, add records to the `Records` data table.
}
else
{
    // The records are already there.
}
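
For illustration, here is a minimal sketch (not part of the original code) of what that take-100-then-remove step could look like against the DataTable. `BatchSize` and `TakeNextBatch` are hypothetical names, and it assumes `using System;` and `using System.Data;`:

private const int BatchSize = 100;

// Illustrative sketch only: copy out up to 100 rows, then remove them.
public DataTable TakeNextBatch()
{
    // Clone() copies the schema (columns) but none of the rows.
    DataTable batch = this.Records.Clone();

    int count = Math.Min(BatchSize, this.Records.Rows.Count);
    for (int i = 0; i < count; i++)
    {
        batch.ImportRow(this.Records.Rows[i]);
    }

    // Always remove index 0, because each removal shifts the remaining rows up.
    for (int i = 0; i < count; i++)
    {
        this.Records.Rows.RemoveAt(0);
    }

    return batch;
}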

Web API Code

[HttpGet]
public HttpResponseMessage GetReports()
{
  var tempReports = this.mService.GetReports();
  if (tempReports == null)
  {
    return ErrorResult(HttpStatusCode.NotFound);
  }

  return OK(tempReports);
}

Problem

`this.Records.Count()` is always 0 whenever I send a new request to fetch the data.

The data is successfully added to the Records data table, but it is not preserved between requests.

Is there something I have to do explicitly in the Web API to preserve the records for a particular user?

Bijay Koirala
  • Possible duplicate of [ASP.NET Web API session or something?](https://stackoverflow.com/questions/11478244/asp-net-web-api-session-or-something) – mjwills Feb 22 '19 at 08:44
  • @mjwills Is there any way of achieving this by storing it somewhere other than the session? – Bijay Koirala Feb 22 '19 at 08:47
  • You could also store it in a cache. Or a database. – mjwills Feb 22 '19 at 08:48
  • What is the actual question here? 60K rows is no data. Why process it in pages of 100 rows? What is the expected output, multiple reports of only 100 items each? Or should it be 1 report with all 60K rows? – Panagiotis Kanavos Feb 22 '19 at 09:25
  • If `GetReports()` is slow accessing just 60K rows the code should be inspected for inefficiencies, bad coding practices etc. DataTable has no `Count()` method either, getting the row count is done using `DataTable.Rows.Count` – Panagiotis Kanavos Feb 22 '19 at 09:30
  • @PanagiotisKanavos A report of all 60K rows. – Bijay Koirala Feb 22 '19 at 10:04
  • @BijayKoirala what's the question in that case? Whether you access the rows one by one or 100 by 100 you'll still have to process all 60K of them before responding. If you have a performance issue with `GetRecords()` post the code – Panagiotis Kanavos Feb 22 '19 at 10:07

1 Answer


First of all, Web API is stateless: each request is supposed to be completely independent of the previous one, and if you have chosen a stateless technology you should stay true to that design.
Regarding your performance concern, I guess from your code that you want something like pagination. Here is a sample showing how pagination is normally performed using Entity Framework:

var pagedList = yourDbContext.yourEntityDbSet
                             .OrderBy(x => x.Id)          // a stable ordering is required before Skip/Take
                             .Skip(numberOfRecordToSkip)  // rows to skip (the page offset)
                             .Take(numberOfRecordToTake)  // the page size
                             .ToList();

`numberOfRecordToSkip` and `numberOfRecordToTake` need to be sent with the request from the client side. Don't worry about performance: one database connection per request is not a big deal.
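
To make the client side concrete, here is a minimal sketch of a Web API action that accepts those two values from the query string. The `skip`/`take` parameter names and the `ReportsContext` type are assumptions, not from the original answer, and it assumes `using System.Linq;`:

// Hypothetical paged endpoint: GET api/reports?skip=0&take=100
[HttpGet]
public IHttpActionResult GetReports(int skip = 0, int take = 100)
{
    using (var db = new ReportsContext())
    {
        var page = db.Reports
                     .OrderBy(r => r.Id)  // stable order before Skip/Take
                     .Skip(skip)
                     .Take(take)
                     .ToList();

        return Ok(page);
    }
}

Each request then pulls exactly one page straight from SQL Server, so nothing has to be kept in memory between requests.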

Amir Molaei
  • We did the same thing, but performance is the issue. So we are planning to store the data in a cache, as per @mjwills's comment. – Bijay Koirala Feb 22 '19 at 10:06
  • @BijayKoirala you already did that. Your cache is called `Records` and its type is `DataTable`. That's essentially a strongly typed array with named columns and a Primary Key index. You don't need *another* cache. If your code is slow, it's the *code* that needs improvement – Panagiotis Kanavos Feb 22 '19 at 10:08
  • @PanagiotisKanavos Thanks. Let me check from our side for code optimization. If the issue still exists will update soon. – Bijay Koirala Feb 22 '19 at 10:19