Bytes | Software Development & Data Engineering Community

Production release methodology

Hi,

Our DBA group is debating production release methodology. We always
perfect our deployment package (typically a script) against the
database in a Staging environment, before executing the package
against Production. One side argues that the safest approach is to
create a script containing all schema changes, data changes, stored
procs, functions, etc. Run that script against Staging until it is
error-free. Then, when you run it against Production on Release Night
you know it will also be error-free, since Staging is a copy of
Production. However, any changes to the deployment, such as updates
to procs (and there are always a bunch in the pre-deployment period),
must be manually folded back into the script.

The other side wants to take advantage of the .NET environment. We
keep all procs, views, and functions in a Release folder on VSS,
execute a Get Latest Version (Recursive) from VisualStudio, and run
against Staging/Production. Any changes to the procs are
automatically propagated from VSS to VS, and there is no manual
editing of the script. You still need a script for table and data
changes. The "script everything" side of the debate feels this method
is not as reliable, especially since when views are nested, you will
get compile errors and must re-run the deployment several times until
all the views compile. The "script" side feels that any errors are
unacceptable in a Production environment.

What is the opinion out there?

Thanks!
Jul 20 '05 #1

"Theodore Feldman" <th**************@fmglobal.com> wrote in message
news:26**************************@posting.google.com...

[original post quoted in full; snipped]


One other possible approach is to prepare your Staging database with the
latest tables and other objects, taken from VSS, then use a DB comparison
tool to generate a sync script for deployment to production. This can save a
lot of time in deployment, and also makes it much harder to 'forget' any
minor or last-minute changes. You would still need a script for data
modifications, of course. I use the Red Gate tools in more or less that way,
and they work well:

http://www.red-gate.com/sql/summary.htm

But there are plenty of other ways to deploy code, so it's probably going to
come down to whatever works best for your team, given the tools and
knowledge you have available.

Simon
Jul 20 '05 #2
"Theodore Feldman" <th**************@fmglobal.com> wrote in message
news:26**************************@posting.google.com...

[original post quoted in full; snipped]

The database is still central and will always be so. The application
environment changes every season: it might have been VB last year, but
this year it's .NET. This is history.

It is forever a compromise under current methodologies, but my opinion
is that applications should be written with more and more of the code
outside the application and in the database environment.

In the end, there will only be the database. The application
environment external to the RDBMS is a redundancy, and I would not
keep fuelling that redundancy unless you don't have any other options.


Pete Brown
Falls Creek
Oz


Jul 20 '05 #3
We keep all SQL scripts under source control and build release packages
based on only labeled versions. This provides more control over what gets
promoted. We use a custom .Net program to run the scripts, control the
execution sequence and provide enhanced logging. One can achieve similar
results with OSQL.

I suggest you treat database implementation script development like you do
code - as an iterative build process. If you find errors, correct the
source-controlled scripts, create another release package build, and try
again against a fresh staging database until you get it right. This
discipline ensures that exactly what was blessed in your QA/UAT
environment is what gets promoted.
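A bare-bones version of such a runner can be sketched in a few lines (the manifest file names are illustrative, and the osql command line is one plausible way to invoke each script - not Dan's actual tool):

```python
import subprocess


def run_release(manifest, server, database, dry_run=False):
    """Run scripts in manifest order, stop on first failure, log each step."""
    log = []
    for script in manifest:
        # osql: -E trusted connection, -S server, -d database, -i input file,
        # -b abort the batch and return a nonzero exit code on a SQL error.
        cmd = ["osql", "-E", "-S", server, "-d", database, "-i", script, "-b"]
        log.append(f"running {script}")
        if dry_run:
            continue
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            log.append(f"FAILED {script}")
            break  # controlled sequence: never run past a failed script
    return log


# The manifest order IS the controlled execution sequence.
manifest = ["001_schema.sql", "002_data.sql", "003_procs.sql"]
print(run_release(manifest, "PRODSQL01", "Orders", dry_run=True))
# ['running 001_schema.sql', 'running 002_data.sql', 'running 003_procs.sql']
```

Building the release package from labeled versions only, as described above, then feeding that fixed manifest to a runner like this is what makes the staging rehearsal and the production run identical.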

Also, check out http://www.abaris.se/abaperls/index.html for a public domain
configuration management toolset. Even if you choose not to use it, it may
give you some ideas for automating the CM process.

--
Hope this helps.

Dan Guzman
SQL Server MVP
"Theodore Feldman" <th**************@fmglobal.com> wrote in message
news:26**************************@posting.google.com...

[original post quoted in full; snipped]

Jul 20 '05 #4
th**************@fmglobal.com (Theodore Feldman) wrote in message
news:<26**************************@posting.google.com>...

[original post quoted in full; snipped]


We use www.red-gate.com to generate a script that would update
production to the current version of the staging database. This only
applies to the schema (and some static data tables).

The rest, i.e. procs, views, functions, etc., comes direct from
SourceSafe. If there are dependencies, we either use alphabetical
ordering (the default) or a custom script that applies inner
functions/views etc. first.
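The nested-view problem can also be handled without a hand-maintained ordering: keep retrying the objects that failed until a pass makes no progress (a simple fixpoint loop). In this sketch the apply function is a stand-in for actually executing each object's script; the view names and dependencies are invented for illustration.

```python
def deploy_until_stable(scripts, apply):
    """Retry failing scripts each pass; stop when a pass fixes nothing."""
    pending = list(scripts)
    while pending:
        failed = [s for s in pending if not apply(s)]
        if len(failed) == len(pending):
            # No progress this pass: these are genuine errors, not ordering.
            return failed
        pending = failed
    return []


# Example: OuterView depends on InnerView, so a naive first pass fails it;
# BadView references an object that never exists, so it can never compile.
created = set()
deps = {"OuterView": {"InnerView"}, "InnerView": set(), "BadView": {"Missing"}}


def apply(name):
    if deps[name] <= created:  # all dependencies already exist
        created.add(name)
        return True
    return False


print(deploy_until_stable(["BadView", "OuterView", "InnerView"], apply))
# ['BadView'] -- only the genuinely broken object remains after retries
```

Run automatically, this converges in at most as many passes as the deepest nesting level, and anything still failing at the end is a real error rather than an ordering artifact - which answers the "errors are unacceptable" objection.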

We have also set up a batch process that does a trial upgrade from
staging to production every night, to pick up errors as soon as
possible.

To me it sounds like your DBAs are trying to keep themselves in a job.
Manually creating these scripts is a time-consuming, laborious and
uninteresting process. We used this process at a previous role and
changed it.
Jul 20 '05 #5
[posted and mailed, please reply in news]

Theodore Feldman (th**************@fmglobal.com) writes:
[original post quoted in full; snipped]


I just finished another posting on CM issues, see the topic "Script
Question", which gives an outline of how we work in our shop. Here
I will just supplement with your specific issues.

Basically, we go the script way, but we have a tool which is smart enough
to permit regeneration of the script as well as manual tweaks. This gives
the best of both worlds.

A completely manually maintained script is a fragile solution, which is
open to casual mistakes, and risky.

If I did not have a tool at hand, I would probably start from your
.NET alternative, and then find solutions to problems like nested views.

I recall that there is a stored procedure builder that comes with the
SQL Server 2000 Resource Kit. I have never gotten around to evaluating it,
but I know it works from SourceSafe.

--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #6

This thread has been closed and replies have been disabled. Please start a new discussion.
