I have a SQL Server 2016 database consisting of 77 tables (we'll call it MainDB). These tables hold records of people and the items attributed to them. The application retrieves what it needs by the GUID of the personID or the itemID.
I need to query the entire database for a single person and place all of the relevant data into a new database (we'll call it MainDB_Backup).
Right now I've started by writing a SQL script in SSMS consisting of:
BEGIN TRY
    BEGIN TRANSACTION;

    DECLARE @PersonID uniqueidentifier;
    SELECT @PersonID = 'XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX';
    -- ... some more variables ...

    -- Copy the person row, then every related row, keyed on the same GUID.
    INSERT INTO MAINDB_BACKUP.dbo.Person
    SELECT * FROM MAINDB.dbo.Person
    WHERE MAINDB.dbo.Person.PersonID = @PersonID;

    INSERT INTO MAINDB_BACKUP.dbo.PersonInfo
    SELECT * FROM MAINDB.dbo.PersonInfo
    WHERE MAINDB.dbo.PersonInfo.PersonID = @PersonID;

    -- ... more INSERT INTO statements ...

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    DECLARE
        @ErrorMessage  NVARCHAR(4000),
        @ErrorSeverity INT,
        @ErrorState    INT;

    SELECT
        @ErrorMessage  = ERROR_MESSAGE(),
        @ErrorSeverity = ERROR_SEVERITY(),
        @ErrorState    = ERROR_STATE();

    -- Undo the partial copy first, then re-raise the original error.
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    RAISERROR (@ErrorMessage, @ErrorSeverity, @ErrorState);
END CATCH
To work out which tables to include, I've been using the SSMS feature (the View Dependencies dialog) that shows the tables that depend on a selected table and the tables the selected table depends on. The application itself does not write to all of the tables at the same time.
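For what it's worth, a catalog query along these lines (run inside MainDB, and assuming the relationships are actually declared as foreign keys) should list the tables that reference dbo.Person, which is roughly the information the dependency dialog gives me:

-- List the tables that reference dbo.Person through foreign keys.
SELECT
    OBJECT_SCHEMA_NAME(fk.parent_object_id) AS ReferencingSchema,
    OBJECT_NAME(fk.parent_object_id)        AS ReferencingTable,
    c.name                                  AS ReferencingColumn
FROM sys.foreign_keys AS fk
JOIN sys.foreign_key_columns AS fkc
    ON fkc.constraint_object_id = fk.object_id
JOIN sys.columns AS c
    ON c.object_id = fkc.parent_object_id
   AND c.column_id = fkc.parent_column_id
WHERE fk.referenced_object_id = OBJECT_ID(N'dbo.Person')
ORDER BY ReferencingTable, ReferencingColumn;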
My question is: is there a better way of doing this? Also, would having an individual INSERT statement for each table, as above, affect performance?
I have also considered turning this into a stored procedure, but for now I'm working with the script just to see whether I can successfully move the data over.
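For context, the stored procedure I have in mind would just be the same batch parameterized on the person's GUID, roughly like this sketch (the procedure name is made up and the table list is trimmed):

CREATE PROCEDURE dbo.CopyPersonToBackup
    @PersonID uniqueidentifier
AS
BEGIN
    SET NOCOUNT ON;
    SET XACT_ABORT ON;  -- ensure the transaction rolls back on most errors

    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO MAINDB_BACKUP.dbo.Person
        SELECT * FROM MAINDB.dbo.Person
        WHERE PersonID = @PersonID;

        INSERT INTO MAINDB_BACKUP.dbo.PersonInfo
        SELECT * FROM MAINDB.dbo.PersonInfo
        WHERE PersonID = @PersonID;

        -- ... remaining tables ...

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        THROW;  -- re-raise the original error to the caller
    END CATCH
END;

It would then be called with EXEC dbo.CopyPersonToBackup @PersonID = 'XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX';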