3

I have an ASP.NET 4.0 web site with a SQL Server 2008 database. I want to deploy dependent changes to both the web site and the database at the same time while keeping the site running. My normal procedure is to deploy the web site changes first, and then deploy the database changes while the web site recompiles. This works only if I am fast enough to get the database changes out before the first request finishes compiling.

I don't want any down time on the web site.

EDIT: I can't purchase any new hardware or software.

Is there a better way?

EDIT: Note: My web sites do not use persistent information such as session state, so recompiling the application does not cause any problems for me.

Carter Medlin
  • 11,857
  • 5
  • 62
  • 68

6 Answers

3

There is really no good way to prevent IIS from restarting when you release changes to your application. It's possible in theory, but the changes required would outweigh the benefits of doing it.

I think you should be looking to keep downtime to an absolute minimum rather than eliminating it altogether. Nobody likes downtime, but it's a necessary evil in a lot of cases.

For your database changes, there are tools which can make the process of updating your database much easier. I would suggest taking a look at SQL Compare and SQL Data Compare from Red Gate. These tools will allow you to compare schemas and data, and synchronize databases in a matter of seconds. I've been using both tools for several years now, and they really are fantastic time-savers.

SQL Compare:

http://www.red-gate.com/products/sql-development/sql-compare/

SQL Data Compare:

http://www.red-gate.com/products/sql-development/sql-data-compare/

James Johnson
  • 45,496
  • 8
  • 73
  • 110
  • I don't have a problem with IIS restarting. I'm not using any persistent data like session in the application because I want the ability to keep my apps alive during deployments. – Carter Medlin Aug 30 '11 at 20:37
  • Alright, well that's good. So the main concern is database changes, and I would advise taking a look at the tools I suggested. They won't eliminate downtime, but they will help to reduce downtime to a matter of minutes. – James Johnson Aug 30 '11 at 20:45
  • I'm using Visual Studio 2010 database projects. It does a good job of creating change scripts. In most cases my database change script executes faster than my .NET code compiles. In the rare case that it doesn't, I'm looking for a solution. – Carter Medlin Aug 30 '11 at 21:21
1

You are going to have to make your database changes backwards compatible and deploy them first. Then you can keep deploying the way you do now without worrying about the timing.
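
For example (a rough sketch; the table, column, and procedure names are just illustrative), a backwards compatible change adds new, nullable or defaulted objects instead of altering or dropping existing ones, so the script can safely run before the new site code goes out:

-- Hypothetical example: add a nullable column and a new procedure.
-- Code that never references these objects keeps working unchanged.
ALTER TABLE dbo.Customer
    ADD PreferredName VARCHAR(100) NULL;
GO

CREATE PROCEDURE dbo.p_Customer_GetPreferredName
    @CustomerId INT
AS
    SELECT PreferredName
    FROM dbo.Customer
    WHERE CustomerId = @CustomerId;
GO

Drops of obsolete columns or procedures can then wait for a later release, once nothing deployed still references them.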

rick schott
  • 21,012
  • 5
  • 52
  • 81
  • I make changes backward compatible when possible. New objects can be deployed without risk, drops can be delayed. I'd rather not burden the developers with having to keep their code backward compatible if I don't have to. My current method works as long as the database change script runs faster than my ASP.NET code compiles. – Carter Medlin Aug 30 '11 at 21:28
1

I like @rick schott's idea about making db changes backward compatible, but I think you'll ultimately need some kind of cluster/farm/garden. If you have clustered web servers and a clustered db server, you can take one web server and one db server out of the pool, deploy to those, and test the app there. Then put them back in the pool, take the other ones out, update them, and put them back.

RyanW
  • 5,338
  • 4
  • 46
  • 58
  • You are correct, that is what you want to do in the long run given you have $ for the hardware and personnel to manage. – rick schott Aug 30 '11 at 20:56
1

How coupled is your code to the database? Does it look like this in some places:

dt.Rows[0]["CustomerId"];

ORMs are great for mitigating database changes, but that may not be a viable option for you. Instead, outside of major database changes, I would try to write code that works with both the existing schema and the new schema. For example, make your code less dependent on whether a column exists: if an int column is missing, default to zero; if a varchar column is missing, default to an empty string.
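
Something like this hypothetical helper (the method and column names are mine, not from the question) shows the idea in C#, so the same code runs against both the old and the new schema:

using System;
using System.Data;

static class SchemaTolerantReader
{
    // Read a column if it exists and has a value; otherwise fall back to a default.
    public static T GetValueOrDefault<T>(DataRow row, string columnName, T defaultValue)
    {
        if (!row.Table.Columns.Contains(columnName) || row.IsNull(columnName))
            return defaultValue;
        return (T)Convert.ChangeType(row[columnName], typeof(T));
    }
}

// Usage against the snippet above:
// int customerId = SchemaTolerantReader.GetValueOrDefault(dt.Rows[0], "CustomerId", 0);
// string name = SchemaTolerantReader.GetValueOrDefault(dt.Rows[0], "PreferredName", "");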

"All problems in computer science can be solved by another level of indirection." Butler Lampson

Kris Krause
  • 7,304
  • 2
  • 23
  • 26
  • Tightly coupled. Database changes break the application. I avoid database defaults because I want the application to fail when an expected value is null to avoid confusion during development. – Carter Medlin Aug 30 '11 at 21:02
1

If you are making stored procedure changes, perhaps you can give new parameters default values as you add them. This lets you deploy the database code before the web site publishes.

Example of current production stored procedure...

ALTER PROCEDURE [dbo].[p_Stored_Proc_Name]
    @Some_Id INT
AS

Adding a new parameter to stored procedure...

ALTER PROCEDURE [dbo].[p_Stored_Proc_Name]
    @Some_Id INT,
    @New_Parameter INT = 0
AS

You need to make sure that your stored proc code handles @New_Parameter = 0 the way you'd like it to.

proudgeekdad
  • 3,424
  • 6
  • 42
  • 40
  • This is a good way of ensuring backward compatibility, but I'm not comfortable altering code that has passed testing to make special deployment versions. I'd rather not burden the developers with having to keep their code backward compatible either. – Carter Medlin Aug 30 '11 at 21:17
-2

Automate the process!

I would write a batch (a rough sketch follows the steps):

  1. Copy an App_Offline.htm file to the webserver
  2. Deploy ASP.NET app
  3. Deploy database changes (can be started in parallel with step 2)
  4. Remove or rename App_Offline.htm
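
A minimal sketch of such a batch (the paths, server names, and the use of msdeploy and sqlcmd are my assumptions, not part of the answer):

REM Hypothetical deployment batch; adjust paths and server names.
REM 1. Take the site offline.
copy App_Offline.htm \\webserver\wwwroot\App_Offline.htm

REM 2. Deploy the ASP.NET app and 3. run the database change script
REM    (the sqlcmd call could be started in parallel, e.g. with START).
msdeploy -verb:sync -source:contentPath="C:\build\MySite" -dest:contentPath="\\webserver\wwwroot"
sqlcmd -S dbserver -d MyDatabase -i db-changes.sql

REM 4. Bring the site back online.
del \\webserver\wwwroot\App_Offline.htm
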
Jan
  • 15,802
  • 5
  • 35
  • 59
  • "without downtime" "while keeping the site running" – Eonasdan Aug 30 '11 at 20:26
  • Sadly.... not an answer given that the poster SPECIFICALLY does not want ANY down time on the website. – TomTom Aug 30 '11 at 20:26
  • @TomTom: I guess it's impossible without "any" downtime, given that the app can't run with undeployed database changes and the fact that changing the database takes some time. – Jan Aug 30 '11 at 20:31
  • Ah, no, it is doable, but it requires VERY special programming, on the database side as well, with multiple replicated databases at the same time. Generally not worth it, but hey, I am not here to argue specific requirements. – TomTom Aug 31 '11 at 04:40