
I inherited a SQL Server database used with a C# client application. I know enough to be dangerous but I'm not a programmer. I know the previous programmer liked to play around in the live database and as a result I have a number of what I think are orphaned tables.

The application itself isn't high-usage and can tolerate a table being unavailable for a short time. I'd like to start turning off or disabling these tables, through trial and error, to see whether they're still being used.

Is there a way to turn them off without completely removing the data so that I can re-enable them quickly if needed? Or is there a more transparent way to discover whether those tables are needed?

marc_s
Residualfail
  • Well, that's a pretty brute-force method of doing it; can't you just search the source code for the table name instead? How do you know you're testing every single function? If you *really* want to do this, just rename the table. – DavidG Jun 07 '17 at 15:00
  • You can just rename the table. Maybe append ".testing" or something to the name. – itsme86 Jun 07 '17 at 15:01
  • See [How can I tell if a database table is being accessed anymore? Want something like a “SELECT trigger”](https://stackoverflow.com/questions/2155594/how-can-i-tell-if-a-database-table-is-being-accessed-anymore-want-something-lik) – Alex K. Jun 07 '17 at 15:01
  • Are there clustered indexes on the tables? If yes, you can disable them. This effectively makes the table inaccessible. Then you can rebuild the clustered index, if needed. – Shaneis Jun 07 '17 at 15:04
  • We have also used a Database Audit to see if stuff is being used. – SteveB Jun 07 '17 at 15:22
  • @DavidG - That's the point of the question. I'm hoping to find a better way but my limited experience with SSMS doesn't help. – Residualfail Jun 07 '17 at 15:43

3 Answers


There is no easy way. The tables can be accessed through both stored procedures and direct SQL calls from your client application. A comprehensive approach would mean making each table unavailable in turn (renaming has been suggested in the comments) and then performing a full regression test of the client application; you might have to do this for each table in the database. The client application might also access tables conditionally, depending on external factors such as the logged-in user (and their privileges), the date, configuration settings, and so forth.

The SQL Server Profiler is a good tool to use, but it's not going to solve your problem all by itself because you still have to analyze what it captures.
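Before resorting to Profiler or renaming anything, there is a more transparent first pass: SQL Server records index (and therefore table) usage in a dynamic management view. A sketch of such a query; note the counters reset every time the SQL Server service restarts, so it is only meaningful after the instance has been up during normal application use:

```sql
-- Last read/write time per table in the current database, taken from
-- sys.dm_db_index_usage_stats. Tables with all-NULL timestamps have not
-- been touched since the last service restart.
SELECT
    t.name                  AS table_name,
    MAX(u.last_user_seek)   AS last_seek,
    MAX(u.last_user_scan)   AS last_scan,
    MAX(u.last_user_lookup) AS last_lookup,
    MAX(u.last_user_update) AS last_update
FROM sys.tables AS t
LEFT JOIN sys.dm_db_index_usage_stats AS u
       ON u.object_id   = t.object_id
      AND u.database_id = DB_ID()
GROUP BY t.name
ORDER BY t.name;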

Xavier J
  • The application does indeed use both stored procedures and SQL calls from the client. A goal of mine is to migrate all SQL queries to stored procedures for consistency and my personal sanity. But first I want to clean up the existing clutter. I'm positive renaming tables isn't the best approach but being small scale and limited usage, it's probably something I can live with. I have a gut feeling 3 tables are just dead but I'm not comfortable just flat out removing the tables and hoping for the best. – Residualfail Jun 07 '17 at 15:49

You could create a new database schema and transfer the suspect tables to that schema:

    ALTER SCHEMA new_schema TRANSFER dbo.your_table;

Then transfer them back again after testing.

https://learn.microsoft.com/en-us/sql/t-sql/statements/alter-schema-transact-sql
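A minimal sketch of the round trip, using a hypothetical `quarantine` schema and a placeholder table name. Any query that references the table (via `dbo.` or the caller's default schema) will fail with "invalid object name" while the table is parked, and the data stays exactly where it was:

```sql
-- One-time setup: a schema to park suspect tables in.
CREATE SCHEMA quarantine;
GO

-- Park the table; the application now gets an error if it touches it.
ALTER SCHEMA quarantine TRANSFER dbo.SuspectTable;
GO

-- If something breaks, move it straight back:
ALTER SCHEMA dbo TRANSFER quarantine.SuspectTable;
```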


You can change the permissions so that no one except you and the database owners has SELECT permission on the table. (You cannot prevent a member of db_owner or sysadmin from having full permissions on a table.)
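A sketch of the permissions approach, assuming the application connects as an ordinary login and using a placeholder table name. As noted above, db_owner and sysadmin members bypass the DENY:

```sql
-- Block ordinary users from the table without touching the data.
DENY SELECT, INSERT, UPDATE, DELETE ON dbo.SuspectTable TO public;

-- Undo it if the application turns out to need the table:
REVOKE SELECT, INSERT, UPDATE, DELETE ON dbo.SuspectTable FROM public;
```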

You can also encrypt the table (see "Encrypting database tables in SQL Server 2008"), in which case it is really locked down.

You can also use SQL Server Audit to see if anyone reads the data. Audit is a low-impact feature (included since SQL Server 2008), is easy to set up, and, unlike a trigger, can audit SELECT statements.
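A sketch of an audit that captures reads and writes on one suspect table. The audit name, file path, and table name are all placeholders; note that database audit specifications required Enterprise Edition before SQL Server 2016 SP1:

```sql
-- Server-level audit writing events to files on disk.
USE master;
CREATE SERVER AUDIT TableUsageAudit
    TO FILE (FILEPATH = 'C:\AuditLogs\');
ALTER SERVER AUDIT TableUsageAudit WITH (STATE = ON);
GO

-- Database-level specification: record who touches the suspect table.
USE YourDatabase;
CREATE DATABASE AUDIT SPECIFICATION SuspectTableSpec
    FOR SERVER AUDIT TableUsageAudit
    ADD (SELECT, INSERT, UPDATE, DELETE
         ON OBJECT::dbo.SuspectTable BY public)
    WITH (STATE = ON);
GO

-- Later, read back the captured events:
SELECT event_time, server_principal_name, statement
FROM sys.fn_get_audit_file('C:\AuditLogs\*.sqlaudit', DEFAULT, DEFAULT);
```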

benjamin moskovits