
I have a situation where I have a SQL Server 2016 database consisting of 77 tables (we'll call it MainDB). These tables hold records of a person and items attributed to them. The application retrieves what it needs by the GUID of the personID or the itemID.

I have to query the entire DB for a person and place all the relevant data into a new database (we'll call it MainDB_Backup).

Right now I've started by writing a SQL script in SSMS consisting of:

BEGIN TRY
    BEGIN TRANSACTION

    DECLARE @PersonID uniqueidentifier;
    SELECT @PersonID = 'XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX';

    ... some more variables ...

    INSERT INTO MAINDB_BACKUP.dbo.Person
        SELECT * FROM MAINDB.dbo.Person
        WHERE MAINDB.dbo.Person.PersonID = @PersonID

    INSERT INTO MAINDB_BACKUP.dbo.PersonInfo
        SELECT * FROM MAINDB.dbo.PersonInfo
        WHERE MAINDB.dbo.PersonInfo.PersonID = @PersonID

    ... more insert into statements ...

    COMMIT TRANSACTION

END TRY
BEGIN CATCH
    DECLARE 
        @ErrorMessage NVARCHAR(4000),
        @ErrorSeverity INT,
        @ErrorState INT;
    SELECT 
        @ErrorMessage = ERROR_MESSAGE(),
        @ErrorSeverity = ERROR_SEVERITY(),
        @ErrorState = ERROR_STATE();
    -- undo the partial copy before re-raising the error
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    RAISERROR (
        @ErrorMessage,
        @ErrorSeverity,
        @ErrorState
        );
END CATCH

I've been using the SSMS feature that shows, for a selected table, the tables that depend on it and the tables it depends on, so that I can build the queries correctly. The application itself does not write to all tables at the same time.
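
For example (a rough sketch, and it assumes foreign keys are actually declared back to dbo.Person), the same dependency information can be pulled with a catalog query instead of clicking through the dialog:

USE MAINDB;

-- List every table that has a foreign key referencing dbo.Person,
-- i.e. the tables SSMS reports as depending on Person.
SELECT
    OBJECT_SCHEMA_NAME(fk.parent_object_id) AS ReferencingSchema,
    OBJECT_NAME(fk.parent_object_id)        AS ReferencingTable,
    c.name                                  AS ReferencingColumn
FROM sys.foreign_keys AS fk
JOIN sys.foreign_key_columns AS fkc
    ON fkc.constraint_object_id = fk.object_id
JOIN sys.columns AS c
    ON c.object_id = fkc.parent_object_id
   AND c.column_id = fkc.parent_column_id
WHERE fk.referenced_object_id = OBJECT_ID('dbo.Person');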

My question is: is there a better way of doing this? Also, would having an individual INSERT statement for each table, as above, affect performance?

I have also considered using a stored procedure, but I'm working with this script first just to see if I can successfully move the data over.
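
For reference, a minimal sketch of what the parameterized stored-procedure version might look like once the table list is settled (the procedure name is made up; the body would contain the full set of INSERTs from the script above):

CREATE PROCEDURE dbo.CopyPersonToBackup   -- illustrative name only
    @PersonID uniqueidentifier
AS
BEGIN
    SET NOCOUNT ON;
    SET XACT_ABORT ON;   -- any error rolls the whole copy back

    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO MAINDB_BACKUP.dbo.Person
            SELECT * FROM MAINDB.dbo.Person
            WHERE PersonID = @PersonID;

        -- ... one INSERT per table, same as the script above ...

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        THROW;   -- re-raise the original error to the caller
    END CATCH
END;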

  • I would urge you to read this question and ask yourself if you think you could answer based only on the details provided. – Sean Lange Aug 02 '18 at 15:54
  • @SeanLange I am just asking if there is a best practice for querying a large database and copying some of that data into a new database. Should I have individual INSERT INTO statements or one giant query? – vee Aug 02 '18 at 15:58
  • Well...to insert into multiple tables requires multiple insert statements. There is no workaround for that. Your question is far too vague and too broad. – Sean Lange Aug 02 '18 at 16:00
  • you could possibly make dynamic SQL by querying INFORMATION_SCHEMA.COLUMNS for personid, maybe with a cursor, then crafting and executing INSERT statements – Cato Aug 02 '18 at 16:51
  • select distinct 'INSERT INTO MAINDB_BACKUP.dbo.' + table_name + ' SELECT * FROM MAINDB.dbo.' + table_name + ' WHERE PersonID = @PersonID' AS YourSQL from INFORMATION_SCHEMA.COLUMNS where COLUMN_NAME = 'personid' – Cato Aug 02 '18 at 16:56
  • @Cato I like that idea, I will give that a try and let you know – vee Aug 02 '18 at 16:58
  • In addition to @Cato's idea, if you are using foreign keys back to Person, you could [query INFORMATION_SCHEMA](https://stackoverflow.com/a/3907999/685760) tables for foreign key usage to find which other tables link back to Person. You could quite literally then create dynamic SQL to build your relevant insert statements. – Mr Moose Aug 02 '18 at 17:27
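
Expanding on Cato's comment above, a sketch of what the generated dynamic SQL could look like. It assumes every table holding person data exposes a PersonID column, lives in the dbo schema, and has an identical column list in MAINDB_BACKUP (since it relies on SELECT *):

DECLARE @PersonID uniqueidentifier = 'XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX';  -- the person's GUID
DECLARE @sql nvarchar(max) = N'';

-- Build one INSERT per base table that has a PersonID column.
SELECT @sql = @sql
    + N'INSERT INTO MAINDB_BACKUP.dbo.' + QUOTENAME(c.TABLE_NAME)
    + N' SELECT * FROM MAINDB.dbo.' + QUOTENAME(c.TABLE_NAME)
    + N' WHERE PersonID = @PersonID;' + CHAR(13) + CHAR(10)
FROM MAINDB.INFORMATION_SCHEMA.COLUMNS AS c
JOIN MAINDB.INFORMATION_SCHEMA.TABLES AS t
    ON t.TABLE_SCHEMA = c.TABLE_SCHEMA
   AND t.TABLE_NAME   = c.TABLE_NAME
WHERE c.COLUMN_NAME = 'PersonID'
  AND c.TABLE_SCHEMA = 'dbo'
  AND t.TABLE_TYPE = 'BASE TABLE';   -- skip views

PRINT @sql;  -- inspect the generated statements before running them

EXEC sys.sp_executesql
    @sql,
    N'@PersonID uniqueidentifier',
    @PersonID = @PersonID;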
