
One of my applications has the following use case:

  • the user inputs some filters and conditions about orders (delivery date ranges, ...) to analyze
  • the application computes a lot of data and saves it to several support tables (potentially thousands of records for each analysis)
  • the application starts a report engine that uses the data from these tables
  • when exiting, the application deletes the computed records from the support tables

I'm currently analyzing how to enhance query performance by adding indexes/statistics to the support tables, and SQL Profiler suggests that I create 3-4 indexes and 20-25 statistics.

The records in the support tables are constantly created and removed: is it correct to create all these indexes/statistics, or is there a risk that they will quickly become outdated (with the only result being a constant overhead for maintaining the indexes/statistics)?

DB server: SQL Server 2005+
App language: C# .NET

Thanks in advance for any hints/suggestions!

Defkon1

1 Answer


First, this seems like a good situation for a data cube. Second, yes, you should update statistics once the support tables are populated, before running your queries. You should disable your indexes while inserting the data; the rebuild command will then bring your indexes and statistics up to date in one go. Profiler these days is usually quite good at these suggestions, but test the combinations to see what actually gives the best performance gains. For a look at open source cubes, see What are the open source tools and techniques to build a complete data warehouse platform?
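As a minimal sketch of that load cycle in T-SQL (the table name dbo.SupportTable and the index name IX_SupportTable_DeliveryDate are placeholders, not names from the question):

    -- Disable the nonclustered indexes before the bulk load
    -- (don't disable the clustered index: that makes the table inaccessible)
    ALTER INDEX IX_SupportTable_DeliveryDate ON dbo.SupportTable DISABLE;

    -- ... insert the computed analysis rows here ...

    -- REBUILD re-enables the disabled indexes and refreshes their
    -- index statistics in one go
    ALTER INDEX ALL ON dbo.SupportTable REBUILD;

    -- Optionally refresh column statistics too, before the report engine runs
    UPDATE STATISTICS dbo.SupportTable WITH FULLSCAN;

The same statements can be issued from the C# application via SqlCommand around the population step.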

Ian P
  • Support tables are actually built with data mart logic, with 4-5 dimension tables and a couple of fact tables (on which the report engine works). – Defkon1 Jun 27 '13 at 16:10