
If you have a big database, it can be very slow to build, especially if it contains a lot of stored procedures and unit tests.

One option would be to have two SQL projects responsible for different aspects of the database, so that each could be built and deployed separately. In that case you would set the DropObjectsNotInSource option to false, so deploying one project does not drop the objects owned by the other.
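Something along these lines in each project's publish profile is what I have in mind (a sketch only; the database name and connection string are placeholders):

    <?xml version="1.0" encoding="utf-8"?>
    <!-- Sketch of a publish profile (e.g. Master.publish.xml). TargetDatabaseName and
         the connection string are placeholders. DropObjectsNotInSource=False keeps a
         deployment of this project from dropping objects that belong to the other project. -->
    <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
        <TargetDatabaseName>MyDatabase</TargetDatabaseName>
        <TargetConnectionString>Data Source=.;Integrated Security=True</TargetConnectionString>
        <DropObjectsNotInSource>False</DropObjectsNotInSource>
        <BlockOnPossibleDataLoss>True</BlockOnPossibleDataLoss>
      </PropertyGroup>
    </Project>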

Has anyone encountered this approach or used any other method of working with a large database in SSDT?

If you had multiple SQL projects, how would you split the concerns apart?

So far, when I put the SQL project in a pipeline, I have a huge build that takes hours, including the unit tests.

Diego
  • If you have different schemas it may work that way, or big sets of code/functionality that are independent of each other. But if any of the tables/SPs from one project overlap with or are used in the other, you could/would get errors trying to run them independently. E.g. one of your projects may create tables while the other has SPs that reference those tables, so if a table is not created before the SP that uses it, you could get errors. – Brad Feb 10 '23 at 20:29

1 Answer


I've done this, and I'm currently working on another, much smaller project that's broken into 5 DB projects and will produce 2 databases. I'm actually here because I forgot how to do it, but as I recall it was done with project variables. There was a website that showed how to do it; if I find it I'll share it.

The reason we do it is that it's easier to break the database into reusable parts that other DB projects can consume: build once and share.
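Roughly, the shared project gets built to a dacpac once, and the consuming projects take a database reference to that dacpac. A sketch (the "Shared" name and path are placeholders):

    <!-- Sketch of a dacpac reference inside a consuming .sqlproj; the "Shared"
         project name and the path are placeholders for wherever the shared
         project's build output lives. -->
    <ItemGroup>
      <ArtifactReference Include="..\Shared\bin\Release\Shared.dacpac">
        <HintPath>..\Shared\bin\Release\Shared.dacpac</HintPath>
        <SuppressMissingDependenciesErrors>False</SuppressMissingDependenciesErrors>
      </ArtifactReference>
    </ItemGroup>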

SQLEagle
  • OK, I figured it out. I have a Master and a Common DB. In the Master DB project, add a database reference to the Common DB using the Common DB project, and for Database location select "Same database". Build and deploy, and the objects from the Common DB get published in the Master DB along with all the objects in the Master DB. – SQLEagle Feb 14 '23 at 00:04
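In .sqlproj terms, that "same database" reference roughly corresponds to a project reference with no database/server SQLCMD variable, and the publish profile typically needs "Include composite objects" turned on so the Common objects are deployed together with Master. A sketch using the Master/Common names from the comment above:

    <!-- Master.sqlproj (sketch): a "Same database" reference to the Common project.
         Leaving out the database/server SQLCMD variables is what marks the reference
         as part of the same database. -->
    <ItemGroup>
      <ProjectReference Include="..\Common\Common.sqlproj">
        <Name>Common</Name>
        <Private>True</Private>
        <SuppressMissingDependenciesErrors>False</SuppressMissingDependenciesErrors>
      </ProjectReference>
    </ItemGroup>

    <!-- Master.publish.xml (sketch): publish the referenced project's objects along
         with Master's own objects. -->
    <IncludeCompositeObjects>True</IncludeCompositeObjects>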