
I have a table ABC with roughly 100,000 to 150,000 records. I want to fetch all of the records, with all columns, within 10 to 15 seconds. Is there any way to do this?

I have already tried using a view and a temporary table, but it takes 47 seconds to load.

-- Table variable matching the columns of table ABC
DECLARE @ABC TABLE(Business varchar(1000), Practice varchar(1000),
Location varchar(1000), ProviderName varchar(1000),
InsuranceName varchar(200), CustName varchar(100), ChartNo varchar(200),
ClaimId bigint, ClaimSubmissionDate varchar(100), DOS datetime,
Modifiers varchar(200), CPT varchar(200), CPTCat varchar(1000),
Unit varchar(200), chargeamount int, chargepostingdetailid bigint)
    Get faster hardware... – jarlh Apr 07 '16 at 08:56
  • Oh! Thanks, man! But apart from hardware, is there any other possibility? 30 seconds? – Krunal Patel Apr 07 '16 at 09:04
  • Show us the table definition, sample table data, etc. Note that it's probably an issue of transferring lots of data, which is a bit hard to speed up. (Buffer sizes etc. may improve things.) – jarlh Apr 07 '16 at 09:07
  • DECLARE ABC TABLE(Business varchar(1000),Practice varchar(1000),Location varchar(1000),ProviderName varchar(1000),InsuranceName varchar(200),CustName varchar(100),ChartNo varchar(200),ClaimId bigint,ClaimSubmissionDate varchar(100),DOS DateTime,Modifiers varchar(200),CPT varchar(200),CPTCat varchar(1000),Unit varchar(200),chargeamount int,chargepostingdetailid bigint) – Krunal Patel Apr 07 '16 at 09:10
  • Did you check the execution plan? Is parallelism being applied to your query? – Max Apr 07 '16 at 09:11
  • No, it is not using Parallelism. – Krunal Patel Apr 07 '16 at 09:14
  • Makes me wonder what a user would ever do with 150,000 records. Looks like you need to start aggregating data. – Carra Apr 07 '16 at 09:18
  • @KrunalPatel maybe you should try to enable parallelism for your query? You can configure it through the MAXDOP hint or the sp_configure procedure. – Max Apr 07 '16 at 09:36
  • As of now, MAXDOP is set to 2, and I have tried using the OPTION (MAXDOP 2) clause with the SELECT statement, but it's still not getting faster. – Krunal Patel Apr 07 '16 at 09:50

2 Answers


Well, since your question at this point is very vague... I would suggest splitting the query into parallel queries (assuming that the bottleneck is NOT the database).

Use Parallel.For in conjunction with ConcurrentBag.

Each "loop" is loading por example 5000 or so, "play" with the numbers and see if there is any difference.

Dryadwoods
  • Thanks for your reply. I agree with you about using parallelism, but I have to use this SQL statement for an SSRS report. That's the main problem: I have to transfer all of this data to SSRS, SSRS will do pagination over it, and then the report gets presented. – Krunal Patel Apr 07 '16 at 09:08
  • When I say parallel, I mean loading all 100,000 to 150,000 entries, but each query loads only a certain amount and then adds its result (like a paginated query) to the final list (the ConcurrentBag). When everything is loaded, pass the FULL list to your consumer object. – Dryadwoods Apr 07 '16 at 09:21
  1. Run your client on the server machine. You can use an RDP session if you need SQL Server Management Studio, or just run your application on the server.

  2. Use shared memory for the network protocol in your database connection (a connection-string sketch follows after this answer). See here: Fastest SQL Server protocol?

Y.B.
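As a minimal sketch of point 2, assuming the client is ADO.NET based and runs on the same machine as SQL Server: the lpc: prefix on the server name requests the Shared Memory protocol, and the database name here is a placeholder.

// Minimal sketch: request the Shared Memory protocol by prefixing the server name with "lpc:".
// This only applies when the client and SQL Server run on the same machine; the database name is a placeholder.
using System;
using System.Data.SqlClient;

class SharedMemoryConnectionDemo
{
    static void Main()
    {
        const string connectionString =
            "Data Source=lpc:(local);Initial Catalog=MyDb;Integrated Security=true";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT COUNT(*) FROM ABC", conn))
        {
            conn.Open();
            // A cheap round trip to confirm the connection works over shared memory.
            Console.WriteLine($"Rows in ABC: {cmd.ExecuteScalar()}");
        }
    }
}

Since this is just part of a standard connection string, the same lpc: prefix should also work in an SSRS data source connection string.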