Stored Procedure vs SQL huge difference in execution time

mas
I have a Stored Procedure (SP) that creates the data required for a
report that I show on a web page. The SP does all the work and just
returns a result set that I dump into an ASP.NET DataGrid. The SP
takes a product area and a start and end date as parameters.

Here are the basics of the SP.

1. Create a temp table to store the report results; all columns that
will be needed are created at this point.
2. Select products and general product data into the temp table.
3. Create a cursor that loops through all the products in the temp
table, running a more complex query for each individual product.
4. The results of that query are updated in the temp table for the
cursor's current product.
5. A complex "totals" query is run and its results are inserted into
the temp table as the last 3 rows.

In all, we are talking about 120 rows in the temp table, with 8 columns
that are mostly numbers.
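
In outline, the SP looks something like this (all table, column, and
parameter names here are illustrative, not the real ones):

CREATE TABLE #Report (
    ProductID   int,
    ProductName varchar(100),
    UnitsSold   int,
    Revenue     money
    -- ... plus the remaining report columns
)

-- step 2: seed the temp table with general product data
INSERT INTO #Report (ProductID, ProductName)
SELECT ProductID, ProductName
FROM Products
WHERE ProductArea = @ProductArea

-- steps 3-4: per-product cursor, writing results back to the temp table
DECLARE @pid int
DECLARE prod_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT ProductID FROM #Report
OPEN prod_cur
FETCH NEXT FROM prod_cur INTO @pid
WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE #Report
    SET UnitsSold = (SELECT SUM(Quantity)
                     FROM Sales
                     WHERE ProductID = @pid
                       AND SaleDate BETWEEN @StartDate AND @EndDate)
    WHERE ProductID = @pid
    FETCH NEXT FROM prod_cur INTO @pid
END
CLOSE prod_cur
DEALLOCATE prod_cur

-- step 5: totals appended as the last rows
INSERT INTO #Report (ProductName, UnitsSold)
SELECT 'TOTAL', SUM(UnitsSold) FROM #Report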

I originally wrote this report SP about a month ago and it worked fine;
it ran in about 10-20 seconds depending on server traffic and the amount
of data in the temp table. For the example I'm running, there are 120
products.

Just yesterday the SP started timing out, and when I ran it manually
from Query Analyzer (QA) (exec SP_NAME ...) with the same parameters it
was getting in the code, it took 6 minutes to complete. I was floored.
I immediately copied the SQL out of the SP, pasted it into another QA
window, changed the variables to hard-coded values, and ran it. It
completed in 10 seconds.

I'm really confused now. I ran a Profiler trace on the two when I ran
them again. The SQL code in QA executed again in ~10 seconds with 65,000
reads. When the SP finished some 6 minutes later, it had completed with
the right results, but it needed 150,000,000 reads to do its job.

How can the exact same SQL code produce such different results (time,
disk reads) depending on whether it's in an SP or just run from QA, yet
still give me the exact same output? Both reports look correct and have
the same number of rows.

I asked my sys admin if he had changed anything, and he said no.

I've been reading about recompiles and temp table indexes and all kinds
of other stuff that could possibly be affecting it, but I've gotten
nowhere.

Any ideas are appreciated.

Jul 23 '05 #1
Try dropping and recompiling the procedure. It may be that the query plan
stored for the procedure was based on table statistics that have now become
obsolete.
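
For example, something along these lines (the procedure and table names
here are placeholders, not from your post):

EXEC sp_recompile 'dbo.ProductReport'   -- next execution builds a fresh plan

UPDATE STATISTICS dbo.Products          -- refresh the stats the old plan used
UPDATE STATISTICS dbo.Sales

-- or recreate the procedure so it compiles a new plan on every call:
-- CREATE PROCEDURE dbo.ProductReport ... WITH RECOMPILE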
Joe Weinstein at BEA

Jul 23 '05 #2
mas
Tried that already; dropped and recreated the same procedure and also
created a whole new procedure with a different name - still get the same
results.

Jul 23 '05 #3
Can we take a look at the SP?

Jul 23 '05 #4
On 3 Jun 2005 11:07:33 -0700, mas wrote:
> Tried that already; dropped and recreated the same procedure and also
> created a whole new procedure with a different name - still get the same
> results.
Then try comparing the execution plans, updating the index statistics, or
adding an index. This part of your post:
> I immediately copied the SQL out of the SP, pasted it into another QA
> window, changed the variables to hard-coded values, and ran it. It
> completed in 10 seconds.


makes me think very strongly that there is a vital difference in the
execution plans. When hard-coded values are handed to the query optimizer,
it can make very intelligent decisions about which index to use. But a
stored procedure's plan is compiled ahead of time, when the parameter
values are not yet known, and the optimizer sometimes does not make the
right decisions.
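
One way to verify this is to capture the actual plan for both runs in QA
and compare them. A sketch, with a made-up procedure name and parameter
values:

SET STATISTICS PROFILE ON
GO
EXEC dbo.ProductReport @ProductArea = 1,
     @StartDate = '20050501', @EndDate = '20050531'
GO
-- then run the pasted batch with hard-coded values in the same window and
-- compare the operator trees and Rows/Executes estimates in the two outputs
SET STATISTICS PROFILE OFF
GO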

This can happen especially if you have too many optional parameters in the
query.
For instance

SELECT * FROM TBL1
INNER JOIN TBL2 ON TBL1.K=TBL2.K
INNER JOIN TBL3 ON TBL1.M=TBL3.M
WHERE (TBL1.F1 = @X OR @X IS NULL)
AND (TBL2.F2 = @Y OR @Y IS NULL)
AND (TBL3.F3 = @Z OR @Z IS NULL)

Even if you have clustered indexes on F1, F2, F3, the optimizer might not
use them. Instead it will do a table scan on all three tables, because it
knows the "OR @X IS NULL" subclause might match every row if @X gets passed
in as null.

This kind of query is best broken up into separate queries:

IF @X IS NOT NULL
    SELECT * FROM TBL1 .... WHERE TBL1.F1 = @X
ELSE IF @Y IS NOT NULL
    SELECT * FROM TBL1 .... WHERE TBL2.F2 = @Y
ELSE IF @Z IS NOT NULL
    SELECT * FROM TBL1 ... WHERE TBL3.F3 = @Z

Precisely this kind of thing bit me very hard in the butt last year. The
users wanted to be able to search the member table on any combination of
name, memberID, or address fields. I had to get them to pare it down to
the three or four most commonly used kinds of queries, and separate it
that way.
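
If users really do need arbitrary combinations, another option is to build
only the predicates that apply and run the statement with sp_executesql,
so each distinct WHERE shape gets its own cached plan. A sketch, assuming
int parameters:

DECLARE @sql nvarchar(4000)
SET @sql = N'SELECT * FROM TBL1
    INNER JOIN TBL2 ON TBL1.K = TBL2.K
    INNER JOIN TBL3 ON TBL1.M = TBL3.M
    WHERE 1 = 1'
IF @X IS NOT NULL SET @sql = @sql + N' AND TBL1.F1 = @X'
IF @Y IS NOT NULL SET @sql = @sql + N' AND TBL2.F2 = @Y'
IF @Z IS NOT NULL SET @sql = @sql + N' AND TBL3.F3 = @Z'
EXEC sp_executesql @sql, N'@X int, @Y int, @Z int', @X, @Y, @Z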

I'm sure that there will be other replies to this thread that will explain
it even better, but I hope this helps.
Jul 23 '05 #5
mas (bm*******@gmail.com) writes:
> 1. Create a temp table to store the report results; all columns that
> will be needed are created at this point.
> 2. Select products and general product data into the temp table.
> 3. Create a cursor that loops through all the products in the temp
> table, running a more complex query for each individual product.
> 4. The results of that query are updated in the temp table for the
> cursor's current product.
> 5. A complex "totals" query is run and its results are inserted into
> the temp table as the last 3 rows.
>
> In all, we are talking about 120 rows in the temp table, with 8 columns
> that are mostly numbers.
>
> I originally wrote this report SP about a month ago and it worked fine;
> it ran in about 10-20 seconds depending on server traffic and the amount
> of data in the temp table. For the example I'm running, there are 120
> products.
10-20 seconds for the volumes you indicate is unreasonably high. You
should be able to slash this to subsecond execution time - that is,
assuming you can rip out all the cursors and replace everything with
set-based processing.
> Just yesterday the SP started timing out, and when I ran it manually
> from Query Analyzer (QA) (exec SP_NAME ...) with the same parameters it
> was getting in the code, it took 6 minutes to complete. I was floored.
> I immediately copied the SQL out of the SP, pasted it into another QA
> window, changed the variables to hard-coded values, and ran it. It
> completed in 10 seconds.
...
> How can the exact same SQL code produce such different results (time,
> disk reads) depending on whether it's in an SP or just run from QA, yet
> still give me the exact same output? Both reports look correct and have
> the same number of rows.


If the stored procedure uses variables and QA uses hard-coded constants,
there is a huge difference when it comes to performance. With constants,
the optimizer knows exactly which values you have. With variables, the
optimizer has no idea and has to make standard assumptions. Less accurate
information => less accurate estimates => a less optimal execution plan.
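
A quick way to see the difference, using a made-up table:

-- Constant: the optimizer can read the column's statistics histogram
SELECT COUNT(*) FROM Sales WHERE SaleDate >= '20050601'

-- Variable: the value is unknown when the plan is compiled, so the
-- optimizer falls back to a standard guess
DECLARE @d datetime
SET @d = '20050601'
SELECT COUNT(*) FROM Sales WHERE SaleDate >= @d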

But the real problem is the cursor. Iterative processing in SQL is
something you should try to avoid.
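
For instance, the per-product cursor described above could collapse into a
single set-based UPDATE of the temp table. A sketch, with hypothetical
names:

UPDATE t
SET t.UnitsSold = s.UnitsSold,
    t.Revenue   = s.Revenue
FROM #Report AS t
JOIN (SELECT ProductID,
             SUM(Quantity)         AS UnitsSold,
             SUM(Quantity * Price) AS Revenue
      FROM Sales
      WHERE SaleDate BETWEEN @StartDate AND @EndDate
      GROUP BY ProductID) AS s
  ON s.ProductID = t.ProductID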

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 23 '05 #6
