We have a project that supports both SQL Server and Oracle as databases, and we use Liquibase to version the database schema. Sometimes we need to bring a backup of a customer's database into our own infrastructure to investigate issues and work on it. We currently have more than 1,000 changesets (and counting...) for database versioning, which take a lot of time to run.
The problem: when we restore a customer backup into our local environment, we have to clear the DATABASECHANGELOG table and re-run all the changesets to force Liquibase to calculate the correct checksums.
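Roughly, our current workaround looks like the sketch below (the connection details, database name, and changelog path are placeholders for our actual values):

```
# Sketch of our current workaround (connection details are placeholders):
# 1. Wipe Liquibase's tracking table in the restored copy:
sqlcmd -S localhost -d RestoredCustomerDb -Q "DELETE FROM DATABASECHANGELOG"

# 2. Replay all 1,000+ changesets so fresh checksums get recorded:
liquibase --changeLogFile=changelog.xml \
          --url="jdbc:sqlserver://localhost;databaseName=RestoredCustomerDb" \
          update
```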
We don't know exactly how Liquibase calculates the checksum, but we suppose it involves environment variables such as the database and instance name, which differ between the customer's environment and our own.
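For reference, this is how we inspect the checksums Liquibase stored (a sketch; the sqlcmd connection details are placeholders, while the table and column names are the standard Liquibase ones):

```
# Inspect the checksum Liquibase recorded for each changeset
# (DATABASECHANGELOG and its columns are standard Liquibase names):
sqlcmd -S localhost -d RestoredCustomerDb \
  -Q "SELECT ID, AUTHOR, FILENAME, MD5SUM FROM DATABASECHANGELOG ORDER BY ORDEREXECUTED"
```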
Question: how could we improve this process? Maybe by configuring how Liquibase calculates the checksum (for example, considering only the ID, author, and script), or by recalculating the checksums for our environment (sketched below). Clearing the DATABASECHANGELOG table and re-running all the changesets takes a lot of time and makes maintenance difficult.
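For example, we wondered whether Liquibase's clearCheckSums command could express the "recalculate" idea without a full replay; a sketch of what we imagine (connection details are placeholders, as above):

```
# Idea sketch: drop only the stored checksums, keeping the execution history...
liquibase --changeLogFile=changelog.xml \
          --url="jdbc:sqlserver://localhost;databaseName=RestoredCustomerDb" \
          clearCheckSums

# ...then let the next run recompute and store checksums for changesets that
# are already marked as executed, instead of re-running all of them:
liquibase --changeLogFile=changelog.xml \
          --url="jdbc:sqlserver://localhost;databaseName=RestoredCustomerDb" \
          update
```

Would something along these lines work, or is there a better-supported approach?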
Thank you.