
I'm writing a SQL Server data migration script. The migration works fine when I run it from SQL Server Management Studio (SSMS), but it fails when I run it from Liquibase. I've investigated, and whenever I use either a BEGIN..END statement block or manual transaction control with BEGIN TRANSACTION and COMMIT TRANSACTION, the script works in SSMS but fails in Liquibase. Here is an example of code that works in Liquibase, without any statement blocks or transaction control:

DECLARE @DOC_COUNT int

-- Number of rows to be copied across
SELECT @DOC_COUNT = COUNT(*) FROM old_doc_table WHERE ITEM_TYPE = 'DOCUMENT'

-- Allow explicit values to be inserted into DOCUMENT's identity column
SET IDENTITY_INSERT DOCUMENT ON

-- Copy rows across in batches of 500 until all are present
WHILE ((SELECT COUNT(*) FROM DOCUMENT) < @DOC_COUNT)
INSERT INTO DOCUMENT (
    ID,
    VERSION,
    ...
    )
SELECT TOP 500
           ID,
           VERSION,
           ...
FROM old_doc_table odt
WHERE odt.ITEM_TYPE = 'DOCUMENT'
  AND odt.ID NOT IN (SELECT ID FROM DOCUMENT);

SET IDENTITY_INSERT DOCUMENT OFF

I suspect the data migration will take a while, so really I want to split this into separate transactions (a sketch follows this list). Viz:

  1. Get count of rows to be transferred in one transaction, which is committed.
  2. While loop, with each batch of 500 rows transferred in a transaction which is then committed.
  3. Set identity insert off in final transaction, then committed.
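
For illustration, a minimal sketch of what I'm attempting (column lists elided as in the script above; this is the shape of script that runs fine in SSMS but fails in Liquibase):

DECLARE @DOC_COUNT int

-- Step 1: get the row count (auto-committed on its own)
SELECT @DOC_COUNT = COUNT(*) FROM old_doc_table WHERE ITEM_TYPE = 'DOCUMENT'

SET IDENTITY_INSERT DOCUMENT ON

-- Step 2: copy rows in batches of 500, each batch committed separately
WHILE ((SELECT COUNT(*) FROM DOCUMENT) < @DOC_COUNT)
BEGIN
    BEGIN TRANSACTION
    INSERT INTO DOCUMENT (
        ID,
        VERSION,
        ...
        )
    SELECT TOP 500
           ID,
           VERSION,
           ...
    FROM old_doc_table odt
    WHERE odt.ITEM_TYPE = 'DOCUMENT'
      AND odt.ID NOT IN (SELECT ID FROM DOCUMENT)
    COMMIT TRANSACTION
END

-- Step 3: switch identity insert back off
SET IDENTITY_INSERT DOCUMENT OFF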

I can add the appropriate BEGIN TRANSACTION and COMMIT TRANSACTION calls and verify that the SQL runs fine in SSMS, but in Liquibase it fails, reporting a SQL error in the script. I'm running it in Liquibase like this:

<changeSet id="migrateDocumentData" author="proctorh">
    <sqlFile path="sql/migrateDocumentData.sql" relativeToChangelogFile="true"/>
</changeSet>

I'm aware that the Liquibase changeset has a runInTransaction boolean attribute, which I plan to set to false for this changeset, but until I get the syntax error resolved I can't even test that properly.
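To be clear, the changeset I plan to try would be something like this (assuming runInTransaction behaves as documented):

<changeSet id="migrateDocumentData" author="proctorh" runInTransaction="false">
    <sqlFile path="sql/migrateDocumentData.sql" relativeToChangelogFile="true"/>
</changeSet>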

Any ideas how to get this working properly?

Hedley
  • Did you try with `splitStatements="false"` for the `sqlFile` tag? – Nov 28 '18 at 15:56
  • Note that you are not "transferring" rows, you are copying them. Did you really mean transfer - as in insert some set of rows in new table and then delete those same rows from old table? In addition, 500 rows is a very TINY batch generally speaking. – SMor Nov 28 '18 at 16:17
  • And why do you think you need explicit transactions? I suggest you read Solomon's reply [here](https://stackoverflow.com/questions/35903375/how-to-update-large-table-with-millions-of-rows-in-sql-server). It seems your posted script is well suited to accomplish your goal and runs in liquibase - so why complicate things? – SMor Nov 28 '18 at 16:19
  • My concern is that I'll get a transaction timeout on a production server with a large number of rows and a transaction timeout set to perhaps 10 minutes. Are you thinking that even with millions of rows the insert select will be fast enough to transfer all rows within that time limit? – Hedley Nov 28 '18 at 16:33

0 Answers