Hi,
I'm having serious issues with our user-defined functions. They get
very slow after a while. The functions have several IF-branches
which check the input parameters; exactly one branch is executed
per call.
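Simplified, the functions follow this pattern (table and function names invented here for illustration):

```sql
-- Hypothetical sketch of the pattern: one IF-branch per parameter
-- combination, and exactly one branch runs per call.
CREATE FUNCTION dbo.fn_GetData (@Mode int, @CustomerId int)
RETURNS @Result TABLE (Id int, Value int)
AS
BEGIN
    IF @Mode = 1
        -- small result set for this branch
        INSERT INTO @Result
        SELECT Id, Value FROM dbo.SmallTable WHERE CustomerId = @CustomerId;
    ELSE IF @Mode = 2
        -- potentially very large result set for this branch
        INSERT INTO @Result
        SELECT Id, Value FROM dbo.LargeTable WHERE CustomerId = @CustomerId;
    RETURN;
END
```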
Normally, the execution time of my functions is something like 300 ms.
After a while, though, it climbs to 20 seconds (!!).
The interesting thing is: if I just ALTER the function, e.g. by
adding a blank somewhere, it is fast again. What the hell is going
on here?
To me it seems that SQL Server 2005 is optimizing the function
incorrectly. It is executed against very different result sets (some
are extremely small, some very large). I can imagine that for one case
a hash join is better and for the other a nested loops join, so maybe
it sticks to one plan after a while? The problem is that I cannot
really prove it, because I cannot force the behavior: the slowdown
occurs suddenly and won't go away until I perform the "add a blank"
trick.
The server is an 8-processor DB cluster with 12 GB of RAM, with all
service packs installed (SP2 for SQL Server 2005).
Any hints on what to do?
René