We have a table with 800M rows and 93 columns from which historical data needs to be removed; the rows to be removed make up more than 95% of the table. So far, attempts to delete in batches have failed, with estimates of at least one week to complete, and the existing indexes are the main cause of the slowness. Dropping the indexes and rebuilding them after the delete has also proved too slow in testing.
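For reference, the batched delete we tested was roughly the following shape (a minimal sketch assuming SQL Server; the table name, date column and cut-off are placeholders):

```sql
-- Rough shape of the batched delete that was tested (placeholder names).
-- Each iteration removes one batch of historical rows and commits
-- before continuing, to keep the transaction log manageable.
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (50000) FROM dbo.BigTable
    WHERE TradeDate < '20150101';        -- historical cut-off (placeholder)

    SET @rows = @@ROWCOUNT;
END;
```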
It was decided that we should create a new table containing only the data we need, then rename the old table as a backup and rename the new table to the original name.
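In outline, the copy-and-swap would look something like this (again a sketch assuming SQL Server; table, column and cut-off names are placeholders, and the new table is created beforehand from the generated script):

```sql
-- Copy only the rows we keep into the pre-created, identically
-- structured table, then swap the table names. Placeholder names;
-- assumes no IDENTITY column (otherwise SET IDENTITY_INSERT is needed).
INSERT INTO dbo.BigTable_new WITH (TABLOCK)
SELECT *
FROM   dbo.BigTable
WHERE  TradeDate >= '20150101';      -- rows to keep (placeholder predicate)

EXEC sp_rename 'dbo.BigTable',     'BigTable_backup';
EXEC sp_rename 'dbo.BigTable_new', 'BigTable';
```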
The problem is that there are at least a hundred keys, constraints and indexes.
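For a sense of scale, the dependent objects can be enumerated from the catalog views, along these lines (placeholder table name):

```sql
-- Count the objects that would need scripting and renaming:
-- PK/unique/foreign-key/check/default constraints plus plain indexes.
SELECT o.type_desc, COUNT(*) AS cnt
FROM   sys.objects AS o
WHERE  o.parent_object_id = OBJECT_ID('dbo.BigTable')
GROUP  BY o.type_desc

UNION ALL

SELECT 'INDEX (non-constraint)', COUNT(*)
FROM   sys.indexes AS i
WHERE  i.object_id = OBJECT_ID('dbo.BigTable')
  AND  i.is_primary_key = 0
  AND  i.is_unique_constraint = 0
  AND  i.type > 0;                   -- exclude the heap entry
```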
We are considering generating a script from Object Browser, renaming all the object names using find/replace, creating the identically structured table, populating it with the data we need, renaming the original table as a backup, and then renaming the new table and all of its objects to the names used by the original table.
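The final rename step would then be a long series of sp_rename calls, roughly of this shape (placeholder names; constraint names are unique per schema, so the backup table's constraints would have to be renamed out of the way first, whereas index names only have to be unique per table):

```sql
-- Shape of the final rename step (placeholder names; in practice there
-- would be on the order of a hundred of these statements).

-- Constraints are schema-scoped, so move the backup's constraint aside
-- before giving the new table's constraint the original name.
EXEC sp_rename 'dbo.PK_BigTable',     'PK_BigTable_backup', 'OBJECT';
EXEC sp_rename 'dbo.PK_BigTable_new', 'PK_BigTable',        'OBJECT';

-- Index names only need to be unique per table, so they rename directly
-- on the already-swapped table.
EXEC sp_rename 'dbo.BigTable.IX_TradeDate_new', 'IX_TradeDate', 'INDEX';
```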
This is a critical table on a live database, with only a few overnight hours to complete the process. So many manual renames make us nervous, as there can be no problems when the client begins trading.
What is the most reliable method? Is there an automated process to undertake this procedure? Any recommendations appreciated.