Hello.
I am developing an application (VBA & ODBC, to be exact) which
periodically calls stored procedures in IBM DB2. A few of the
procedures must execute with isolation level RR (ANSI
"Serializable"), not the default, which is CS (ANSI "Read
Committed"). The procedure language is SQL.
According to the documentation, I can adjust the procedures' *run*-time
isolation level only by setting the *compile*-time, dataserver-wide option
DB2_SQLROUTINE_PREPOPTS="ISOLATION RR" and restarting the server.
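For clarity, the documented procedure I am describing looks roughly like this from the instance command line (a sketch only; the procedure file name is hypothetical, and exact behavior may vary by DB2 version):

```shell
# Set the instance-wide precompile options for SQL routines.
# This affects EVERY SQL procedure compiled afterwards, for all developers.
db2set DB2_SQLROUTINE_PREPOPTS="ISOLATION RR"

# The registry variable is only picked up after an instance restart:
db2stop
db2start

# Re-create the procedure so it is recompiled under the new options
# (my_procedure.sql is a hypothetical script using @ as statement terminator):
db2 -td@ -f my_procedure.sql
```

Note that the `db2set` value applies to the whole instance, not to one connection or one procedure, which is exactly the source of my problem below.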
But development is not a single pass of writing and compiling code. I
have to design/code/compile/test the procedures many times before the
code stabilizes, and the isolation levels of these procedures go every
which way.
Consequently, I have to restart the dataserver almost every time I change
a line of SQL code, don't I? 6-10 restarts per hour... This is a
nightmare...
Besides me, there are other developers using this dataserver. How should
we "serialize" our compilations? I can picture the schedule for
compiling the SQL stored procedures:
- 10:00-11:00 - "ISOLATION RR"
- 11:00-12:00 - "ISOLATION CS"
.... What an incomparable degree of parallelism, to be sure!
I have some practical questions:
1) Is there a way to develop SQL stored procedures with different
transaction isolation levels without continuously restarting the
dataserver?
2) Is there a way to deploy these procedures on the production
dataserver without interrupting service?
I have some academic questions:
1) Why does a stored procedure not obey the transaction isolation
level of the calling unit of work?
2) Why can't I adjust this level after the procedure is created?
3) Why is this level adjusted dataserver-wide?
--
Thank you in advance, Konstantin Andreev.