We run a SaaS product. Most of our customers generate 8,000-10,000 rows of data for a six-month billing report, but one enterprise client generates about 300,000 rows, each with 30 columns. When they go to our reporting module and export their six-month report, our servers become unresponsive: CPU jumps to 80% and the whole process fails. Fetching the 300K rows from the SQL database (after joins etc.) takes 30 minutes, and writing those 300K rows with 30 columns to Excel takes another 3 hours. My worry is that just one client can bring the entire site down. We have 40 reports, and if they start going back six months and exporting each one, other users will have a terrible experience.
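For context, here is a stripped-down sketch of what our export path effectively does (the connection string, query, and file name are placeholders, and I'm using pyodbc/openpyxl here just to illustrate the shape of the problem, not our actual code):

```python
import pyodbc
from openpyxl import Workbook

# Placeholders -- not the real connection string or report query.
CONNECTION_STRING = "Driver={ODBC Driver 17 for SQL Server};Server=prod-sql;Database=app;..."
REPORT_QUERY = "SELECT ... FROM bills JOIN ..."  # the multi-join report query

conn = pyodbc.connect(CONNECTION_STRING)   # hits the production database
cursor = conn.cursor()
cursor.execute(REPORT_QUERY)               # ~30 minutes for the big client

rows = cursor.fetchall()                   # all 300,000 x 30 values held in memory at once

wb = Workbook()
ws = wb.active
ws.append([col[0] for col in cursor.description])  # header row from column names
for row in rows:
    ws.append(list(row))                   # ~3 hours of cell-by-cell Excel writing
wb.save("six_month_report.xlsx")
```

Everything runs synchronously against production, which is why one big export pegs the CPU for everyone.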

What is the best way to handle such large data exports in SaaS products?

Thanks

user1763470
  • Are you using Azure SQL DB? If so, what tier? If you are on a premium database, have you tried columnstore or In-Memory OLTP to increase performance? Do you have proper indexing? There is an index advisor in Azure SQL Database that can recommend missing indexes. – Jovan MSFT Oct 06 '16 at 19:04

1 Answer

Replicate your production database into an "end-user sandbox" database and serve your end users' reporting from there, where they can slow the DB to a grinding halt if they need to.

Such large exports shouldn't be done from a production system.
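In code terms the change is small: give the reporting/export path its own connection string that points at the replica instead of the primary. A rough sketch in Python with pyodbc (server names and connection strings are placeholders, and the routing function is just illustrative):

```python
import pyodbc

# Both connection strings are placeholders. The replica could be a
# log-shipped copy, an Azure SQL readable secondary, or a nightly
# restore -- whatever "sandbox mirror" you end up hosting.
PRIMARY_DSN = "Driver={ODBC Driver 17 for SQL Server};Server=prod-sql;Database=app;..."
REPLICA_DSN = "Driver={ODBC Driver 17 for SQL Server};Server=report-sql;Database=app;..."

def get_connection(workload: str) -> pyodbc.Connection:
    """Route report/export queries to the replica so a heavy export
    can only grind the sandbox to a halt, never production."""
    dsn = REPLICA_DSN if workload == "reporting" else PRIMARY_DSN
    return pyodbc.connect(dsn)

# All 40 report endpoints ask for a "reporting" connection;
# normal transactional traffic keeps using the primary.
conn = get_connection("reporting")
```

The export still takes as long as it takes, but it only competes for resources with other reports, not with the live application.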

tofro
  • You mean an end-user sandbox local on their network, or a hosted mirrored database? – user1763470 Oct 06 '16 at 17:03
  • Both would help. But since you're talking SaaS, and I assume reporting is included in your offered services, I'm afraid you'd be obliged to host the mirror. – tofro Oct 06 '16 at 17:19