Bytes | Developer Community

Same query - different execution plans??

Hi,

We are trying to solve a real puzzle. We have a stored procedure that
exhibits *drastically* different execution times depending on how it's
executed.

When run from QA, it can take as little as 3 seconds. When it is
called from an Excel VBA application, it can take up to 180 seconds.
Although, at other times, it can take as little as 20 seconds from
Excel.

Here's a little background. The 180 second response time *usually*
occurs after a data load into a table that is referenced by the stored
procedure.

A check of DBCC show_statistics shows that the statistics DO get
updated after a large amount of data is loaded into the table.

*** So, my first question is, does the updated statistics force a
recompile of the stored procedure?

Next, we checked syscacheobjects to see what was going on with the
execution plan for this stored procedure. What I expected to see was
ONE execution plan for the stored procedure.

This is not the case at all. What is happening is that TWO separate
COMPILED PLANs are being created, depending on whether the sp is run
from QA or from Excel.

In addition, there are several EXECUTABLE PLANs that correspond to the
two COMPILED PLANs. Depending on *where* the sp is run, the usecount
increases for the various EXECUTABLE PLANS.
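For reference, this is roughly how we inspect the cache (the procedure name here is hypothetical):

```sql
-- On SQL 2000, list every cached object for the procedure; the two
-- COMPILED PLAN rows show up here, each with its own usecounts.
SELECT cacheobjtype, objtype, usecounts
FROM master.dbo.syscacheobjects
WHERE sql LIKE '%MyProc%'
ORDER BY cacheobjtype;
```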

To me, this does not make any sense! Why are there *multiple* compile
and executable plans for the SAME sp?

One theory we have is that we need to call the sp with the dbo
qualifier, i.e. EXEC dbo.sp

Has anyone seen this? I just want to get to the bottom of this and
find out why sometimes the query takes 180 seconds and other times
only takes 3 seconds!!

Please help.

Thanks much

Jul 16 '07 #1
sqlgirl (to*******@gmail.com) writes:
This is not the case at all. What is happening is that TWO separate
COMPILED PLANs are being created, depending on whether the sp is run
from QA or from Excel.
Yes, that is actually to be expected. :-)

Here is the scoop: certain SET options affect the query plan so much
that different combinations of SET options yield different cache
entries. When you connect from Excel, you use an API that turns on
a number of SET options. In QA you can set up under Options->
Connection Properties how to connect. But the QA default includes
SET ARITHABORT ON, which is not set when you connect from a client
application.

That explains why you see different plans. It does not explain why
you get such a difference in performance. It may just be bad luck, but
there is also a real possibility: somewhere there is an indexed view
or an indexed computed column. For these indexes, six SET options
must be ON and a seventh must be OFF. And one of the six is SET
ARITHABORT ON.

Try to get your VBA application to issue SET ARITHABORT ON and see
what happens.
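A minimal sketch of the batch the VBA application could send (the procedure name is hypothetical); the setting stays in effect for the rest of the connection:

```sql
-- Run once after connecting, before calling the procedure, so the
-- session's SET options match the QA defaults:
SET ARITHABORT ON;
EXEC dbo.MyProc;
```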

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Jul 16 '07 #2
Hi, thanks for the information. Is ARITHABORT the only setting we need
to be concerned with?

We ran a test. Cleared out the cache entry for our sp by altering it.
Verified that it was no longer listed in syscacheobjects.

Next, we ran the query from QA, it added an entry to syscacheobjects.
Then, we ran it from excel. It still creates a second entry in
syscacheobjects.

Which still leaves me scratching my head.
Jul 16 '07 #3
sqlgirl (to*******@gmail.com) writes:
Hi, thanks for the information. Is ARITHABORT the only setting we need
to be concerned with?

We ran a test. Cleared out the cache entry for our sp by altering it.
Verified that it was no longer listed in syscacheobjects.

Next, we ran the query from QA, it added an entry to syscacheobjects.
Then, we ran it from excel. It still creates a second entry in
syscacheobjects.
And by then you had added SET ARITHABORT ON to the VBA code?

There are indeed other options that matter, but ARITHABORT is typically
the tricky one.

You can look at the column setopts in syscacheobjects to see if there is
any difference. Unfortunately, Books Online does not specify which bit
corresponds to which setting. (I assume, since you are looking in
syscacheobjects, that you are on SQL 2000. In SQL 2005 this information
is available in a more digestible form.)
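A sketch for comparing the two cache entries (the procedure name is hypothetical); even without decoding the individual bits, differing setopts values tell you the plans were compiled under different SET options:

```sql
SELECT cacheobjtype, usecounts, setopts
FROM master.dbo.syscacheobjects
WHERE sql LIKE '%MyProc%';
-- Two COMPILED PLAN rows with different setopts values confirm that
-- the two clients connect with different SET options in effect.
```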

You can also examine the SET options by watching the connections in
Profiler: keep an eye on ExistingConnection and Audit Login, as well
as StmtCompleted.
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Jul 17 '07 #4
It is not explicitly stated, so I'll ask:

Does each environment call the stored procedures with the exact same
parameters?

Does each environment call the stored procedure after a set of data is
loaded?

Do you have the execution plan for each scenario?

If you explicitly update statistics for the system after the new data
is loaded, does the execution time improve?
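That last test could look something like this (table and procedure names hypothetical):

```sql
-- After the data load, refresh statistics and force a fresh plan:
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;
EXEC sp_recompile 'dbo.MyProc';  -- marks the procedure for recompilation
```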

Jul 17 '07 #5
In addition to the other responses: when using stored procedures,
parameter sniffing can influence the query plan. So if two execution
plans are cached (because of differing connection settings), the
parameter values of the first call under each setting determine that
plan. If those values differ, the plans can differ.
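If sniffing turns out to be the culprit, one common workaround on SQL 2000 is to force a recompile on every call, trading compile time for a plan that fits the current parameter values (procedure, table, and column names here are hypothetical):

```sql
CREATE PROCEDURE dbo.MyProc
    @StartDate datetime
WITH RECOMPILE  -- plan is rebuilt on each call, never cached
AS
SELECT COUNT(*)
FROM dbo.MyTable
WHERE LoadDate >= @StartDate;
```

Alternatively, copy the parameters into local variables inside the procedure, so the optimizer falls back on average-density estimates instead of the sniffed values.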

Gert-Jan
Jul 17 '07 #6