One of my applications has the following use-case:
- the user inputs filters and conditions about the orders to analyze (delivery date ranges, ...)
- the application computes a lot of data and saves it in several support tables (potentially thousands of records per analysis)
- the application starts a report engine that uses the data from these tables
- on exit, the application deletes the computed records from the support tables
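A minimal sketch of the lifecycle above, with illustrative table, column, and index names (not my real schema):

```sql
-- Support table and index are created once; only their CONTENTS
-- churn on every analysis run.
CREATE TABLE dbo.OrderAnalysis (
    AnalysisId   INT            NOT NULL,
    OrderId      INT            NOT NULL,
    DeliveryDate DATETIME       NOT NULL,
    Amount       DECIMAL(18, 2) NOT NULL
);

CREATE INDEX IX_OrderAnalysis_Analysis
    ON dbo.OrderAnalysis (AnalysisId, DeliveryDate);

-- Per run:
--   1. INSERT the computed rows for the current @AnalysisId
--   2. the report engine queries dbo.OrderAnalysis
--   3. clean up on exit:
-- DELETE FROM dbo.OrderAnalysis WHERE AnalysisId = @AnalysisId;
```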
I'm currently analyzing how to enhance query performance by adding indexes/statistics to the support tables, and SQL Profiler suggests that I create 3-4 indexes and 20-25 statistics.
Since the records in the support tables are constantly created and removed: is it correct to create all these indexes/statistics, or is there a risk that they will quickly become outdated (with the only result being constant overhead for maintaining the indexes/statistics)?
DB server: SQL Server 2005+; app language: C# .NET
Thanks in advance for any hints/suggestions!