I have recently started a new job where almost everything is done via stored procedures. We currently have a "deployment" folder under source control that contains timestamped folders of database change scripts (we actually have three folders under that: one for table modification scripts, one for views and functions, and one for sprocs), as well as a "Next" folder holding the changes we are currently working on (it gets renamed when we do a deployment). We have three databases: a local copy on each developer's workstation that is accessible only to that developer, a shared development DB, and the live production DB.
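For illustration, the layout looks roughly like this (the folder names and exact nesting here are approximate):

    deployment/
        2024-01-15/               <- one timestamped folder per past deployment
            TableChanges/
            ViewsAndFunctions/
            StoredProcedures/
        Next/                     <- changes currently in progress, renamed at deployment time
            TableChanges/
            ViewsAndFunctions/
            StoredProcedures/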
This means we create and commit the .SQL files and then have to run them manually on an almost daily basis: as new scripts are added to source control, practically every time we pull an update we need to check what has changed in those folders and run the new scripts against our local copy of the DB. We then have to do the same thing on the dev and prod servers when we deploy. On top of that, the databases are named differently on each server to avoid accidentally running a change script in the wrong environment (this strikes me as odd, since typically the instance is named differently but the database itself keeps the same name across all servers), so we cannot include USE statements in the scripts. The whole process seems very inefficient.
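Concretely, applying a single change script to each environment ends up looking something like this from the command line (the server, database, and script names below are invented purely for illustration; the point is that the target database has to be supplied from outside the script):

    :: local workstation copy
    sqlcmd -S localhost  -E -d MyAppDb_Local -i Next\StoredProcedures\usp_GetOrders.sql

    :: development server, at deployment time
    sqlcmd -S DevServer  -E -d MyAppDb_Dev   -i 2024-01-15\StoredProcedures\usp_GetOrders.sql

    :: production server, at deployment time
    sqlcmd -S ProdServer -E -d MyAppDb_Prod  -i 2024-01-15\StoredProcedures\usp_GetOrders.sql

Multiply that by every script in the folder and every developer's local copy, and it adds up quickly.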
Is there a best-practice way of handling this kind of thing that I could suggest we start using instead?