Our current database is nearly 200 MB, but once the application goes live we expect it to grow considerably, perhaps to 20-30 GB.
We are planning to use a dacpac (generated by a database project in SSDT) to deploy to the production server. The initial deployment will create a number of tables and populate the lookup tables with a large amount of seed data.
However, my concern is with future deployments, when we will use a new dacpac to upgrade the existing database on the production server.
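For reference, we were planning to run the upgrade with SqlPackage roughly like this (just a sketch; the server, database, and dacpac names are placeholders):

```
:: Hypothetical example - server, database, and dacpac names are placeholders
SqlPackage.exe /Action:Publish ^
  /SourceFile:"MyDatabase.dacpac" ^
  /TargetServerName:"ProdServer" ^
  /TargetDatabaseName:"MyDatabase" ^
  /p:BlockOnPossibleDataLoss=True ^
  /p:BackupDatabaseBeforeChanges=True
```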
As I have no past experience with dacpac deployments, could anyone please advise on the following:
- Does deployment time depend on the volume of data, or only on the schema changes? For example, if the target database is 20-30 GB, roughly how long could an upgrade take?
- How can we version the database schema?
- Can the upgrade process be rolled back if anything goes wrong?
And lastly, is this approach better than the traditional way of manually writing SQL scripts to upgrade the database?
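For context, we understood that SqlPackage can also generate the upgrade script for review instead of applying the changes directly, roughly like this (again a sketch with placeholder names); would that be the recommended practice for production?

```
:: Hypothetical example - writes the upgrade script to a file instead of applying it
SqlPackage.exe /Action:Script ^
  /SourceFile:"MyDatabase.dacpac" ^
  /TargetServerName:"ProdServer" ^
  /TargetDatabaseName:"MyDatabase" ^
  /OutputPath:"Upgrade_MyDatabase.sql"
```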