Bytes | Software Development & Data Engineering Community

Stored Procedure vs SQL huge difference in execution time

mas
I have a Stored Procedure (SP) that creates the data required for a
report that I show on a web page. The SP does all the work and just
returns a result set that I dump into an ASP.NET DataGrid. The SP
takes a product area and a start and end date as parameters.

Here are the basics of the SP.

1. Create a temp table to store the report results; all columns that
will be needed are created at this point.
2. Select products and general product data into the temp table.
3. Create a cursor that loops through all the products in the temp
table, running a more complex query for each individual product.
4. The results of that query are updated in the temp table for the
cursor's current product.
5. A complex "totals" query is run and its results are inserted into
the temp table as the last 3 rows.

In all we are talking about 120 rows in the temp table, with 8 columns
that are mostly numeric.
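For readers following along, the five steps above can be sketched roughly like this. This is a hypothetical skeleton, not the poster's actual procedure; every table, column, and parameter name here is invented:

```sql
-- Hypothetical skeleton of the report procedure described above.
CREATE PROCEDURE dbo.ProductAreaReport
    @ProductArea int,
    @StartDate   datetime,
    @EndDate     datetime
AS
BEGIN
    -- 1. Temp table with every column the report will need.
    CREATE TABLE #Report (
        ProductID   int PRIMARY KEY,
        ProductName varchar(100),
        Qty         int,
        Amount      money
        -- ... remaining report columns
    );

    -- 2. Seed it with the products and general product data.
    INSERT INTO #Report (ProductID, ProductName)
    SELECT ProductID, ProductName
    FROM dbo.Products
    WHERE ProductArea = @ProductArea;

    -- 3/4. Cursor over the products, updating one row at a time.
    DECLARE @ID int;
    DECLARE c CURSOR LOCAL FAST_FORWARD FOR
        SELECT ProductID FROM #Report;
    OPEN c;
    FETCH NEXT FROM c INTO @ID;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        UPDATE #Report
        SET Qty = 0   -- placeholder for the "more complex query" per product
        WHERE ProductID = @ID;
        FETCH NEXT FROM c INTO @ID;
    END
    CLOSE c;
    DEALLOCATE c;

    -- 5. Totals rows appended last, then return everything.
    SELECT * FROM #Report;
END
```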

I originally wrote this report SP about a month ago and it worked fine,
running in about 10-20 seconds depending on server traffic and the
amount of data in the temp table. For the example I'm running, there
are 120 products.

Just yesterday the SP started timing out, and when I ran the SP
manually from Query Analyzer (QA) (exec SP_NAME ... ) with the same
parameters it was getting in the code, it took 6 minutes to complete.
I was floored. I immediately copied the SQL out of the SP, pasted it
into another QA window, changed the variables to hard-coded values,
and ran it. It completed in 10 seconds.

I'm really confused now. I ran Profiler on the two when I ran them
again. The SQL code in QA executed again in ~10 seconds with 65,000
reads. When the SP finished some 6 minutes later, it had completed with
the right results, but it needed 150,000,000 reads to do its job.

How can the exact same SQL code produce such different results (time,
disk reads) depending on whether it's in an SP or just run from QA, yet
still give me the exact same output? The reports both look correct and
have the same number of rows.

I asked my sys admin if he had changed anything, and he said no.

I've been reading about recompiles, temp table indexes, and all kinds
of other things that could possibly be affecting it, but I have gotten
nowhere.

Any ideas are appreciated.

Jul 23 '05 #1
Try dropping and recompiling the procedure. It may be that the query plan
stored for the procedure was based on table statistics that have now become
obsolete.
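In SQL Server 2000 that can be done without editing the procedure text at all. The procedure and table names below are placeholders:

```sql
-- Mark the procedure so its cached plan is discarded and rebuilt
-- on the next execution.
EXEC sp_recompile 'dbo.ProductAreaReport';

-- Refresh the statistics the optimizer bases that plan on.
UPDATE STATISTICS dbo.Products;

-- Or refresh statistics for every table in the database.
EXEC sp_updatestats;
```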
Joe Weinstein at BEA

mas wrote:
I have a Stored Procedure (SP) that creates the data required for a
report that I show on a web page. The SP does all the work and just
returns back a results set that I dump in an ASP.NET DataGrid. The SP
takes a product area and a start and end date as parameters.

Here are the basics of the SP.

1. Create temp table to store report results, all columns are created
that will be needed at this point.
2. Select products and general product data into the temp table.
3. Create a cursor that loops through all the products in the temp
table, running a more complex query with each individual product.
4. The results of that query are updated on the temp table based on the
current product of the cursor.
5. A complex "totals" query is run and the results from that are
inserted into the temp table as the last 3 rows.

In all we are talking about 120 rows in the temp table with 8 columns
that are mostly numbers.

I originally wrote this report SP about a month ago and it worked fine,
ran in about 10 - 20 seconds based on server traffic and amount of
data in the temp table. For the example I'm running there are the
120 products.

Just yesterday the SP started timing out, and when I ran the SP
manually from Query Analyzer (QA) (exec SP_NAME ... ) with the same
parameters it was getting in the code, it took 6 minutes to complete.
I was floored. I immediately copied the SQL out of the SP, pasted it
into another QA window, changed the variables to hard-coded values,
and ran it. It completed in 10 seconds.

I'm really confused now. I ran Profiler on the two when I ran them
again. The SQL code in QA executed again in ~10 seconds with 65,000
reads. When the SP finished some 6 minutes later, it had completed with
the right results, but it needed 150,000,000 reads to do its job.

How can the exact same SQL code produce such different results (time,
disk reads) depending on whether it's in an SP or just run from QA, yet
still give me the exact same output? The reports both look correct and
have the same number of rows.

I asked my Sys Admin if he had done anything to anything and he said
no.

I've been reading about recompiles and temp table indexes and all
kinds of other stuff that could possibly be affecting it but have
gotten nowhere.

Any ideas are appreciated.


Jul 23 '05 #2
mas
Tried that already, dropped and recreated the same procedure and also
created a whole new procedure with different name - still get the same
results.

Jul 23 '05 #3
Can we take a look at the SP?

Jul 23 '05 #4
On 3 Jun 2005 11:07:33 -0700, mas wrote:
Tried that already, dropped and recreated the same procedure and also
created a whole new procedure with different name - still get the same
results.
Then try comparing the execution plans, updating the index statistics,
or adding an index. This part of your post
I immediately copied the SQL out of the SP and pasted
into another QA window, changed the variables to be hard coded values
and ran it. It completed in 10 seconds.


makes me think very strongly that there is a vital difference in the
execution plans. When hard-coded values are handed to the query
optimizer, it can make very intelligent decisions about which index to
use. But a stored procedure is compiled ahead of time, before the
parameter values of any given call are known, and the optimizer
sometimes does not make the right decisions.
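One blunt fix, when the right plan really does depend on the parameter values, is to tell SQL Server never to cache a plan for the procedure and to optimize each call with its actual values, at the cost of a compile on every execution. A sketch, reusing the hypothetical procedure name from earlier in the thread:

```sql
CREATE PROCEDURE dbo.ProductAreaReport
    @ProductArea int,
    @StartDate   datetime,
    @EndDate     datetime
WITH RECOMPILE   -- build a fresh plan for every call's parameter values
AS
BEGIN
    -- procedure body unchanged; the WITH RECOMPILE option is the
    -- only difference from the original definition
    SELECT 1;    -- placeholder body
END
```

This trades plan-cache reuse for plan quality, which is usually a good trade for a report that runs in seconds rather than thousands of times per minute.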

This can happen especially if you have several optional parameters in the query.
For instance

SELECT * FROM TBL1
INNER JOIN TBL2 ON TBL1.K=TBL2.K
INNER JOIN TBL3 ON TBL1.M=TBL3.M
WHERE (TBL1.F1 = @X OR @X IS NULL)
AND (TBL2.F2 = @Y OR @Y IS NULL)
AND (TBL3.F3 = @Z OR @Z IS NULL)

Even if you have indexes on F1, F2, and F3, the optimizer might not
use them. Instead it will do a table scan on all three tables, because
it knows the "OR @X IS NULL" subclause might match every row if @X is
passed in as NULL.

This kind of query is best broken up into separate queries:

IF @X IS NOT NULL
    SELECT * FROM TBL1 .... WHERE TBL1.F1 = @X
ELSE IF @Y IS NOT NULL
    SELECT * FROM TBL1 .... WHERE TBL2.F2 = @Y
ELSE IF @Z IS NOT NULL
    SELECT * FROM TBL1 ... WHERE TBL3.F3 = @Z

Precisely this kind of thing bit me very hard in the butt last year. The
users wanted to be able to search the member table on any combination of
name, memberID, or address fields. I had to get them to pare it down to
three or four kinds of most commonly used queries, and separate it that
way.
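Another way to handle optional search criteria without writing one branch per combination is to build the WHERE clause dynamically and run it through sp_executesql, so each distinct shape of the query gets its own plan. This is a sketch; the Members table, its columns, and the @Name/@MemberID/@Address parameters are all invented for illustration:

```sql
DECLARE @sql nvarchar(2000);
SET @sql = N'SELECT MemberID, LastName, Address
             FROM dbo.Members WHERE 1 = 1';

-- Append only the conditions the caller actually supplied.
IF @Name IS NOT NULL
    SET @sql = @sql + N' AND LastName = @Name';
IF @MemberID IS NOT NULL
    SET @sql = @sql + N' AND MemberID = @MemberID';
IF @Address IS NOT NULL
    SET @sql = @sql + N' AND Address LIKE @Address + ''%''';

-- Values are still passed as parameters, not concatenated in,
-- so each WHERE-clause shape gets its own cached, sniffable plan.
EXEC sp_executesql @sql,
     N'@Name varchar(50), @MemberID int, @Address varchar(100)',
     @Name, @MemberID, @Address;
```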

I'm sure there will be other replies to this thread that explain it
even better, but I hope this helps.
Jul 23 '05 #5
mas (bm*******@gmail.com) writes:
1. Create temp table to store report results, all columns are created
that will be needed at this point.
2. Select products and general product data into the temp table.
3. Create a cursor that loops through all the products in the temp
table, running a more complex query with each individual product.
4. The results of that query are updated on the temp table based on
the current product of the cursor.
5. A complex "totals" query is run and the results from that are
inserted into the temp table as the last 3 rows.

In all we are talking about 120 rows in the temp table with 8 columns
that are mostly numbers.

I originally wrote this report SP about a month ago and it worked fine,
ran in about 10 - 20 seconds based on server traffic and amount of
data in the temp table. For the example I'm running there are the
120 products.
10-20 seconds for the volumes you indicate is unreasonably high. You
should be able to slash this to subsecond execution time. Assuming,
that is, that you can rip out all the cursors and replace everything
with set-based processing.
Just yesterday the SP started timing out, and when I ran the SP
manually from Query Analyzer (QA) (exec SP_NAME ... ) with the same
parameters it was getting in the code, it took 6 minutes to complete.
I was floored. I immediately copied the SQL out of the SP, pasted it
into another QA window, changed the variables to hard-coded values,
and ran it. It completed in 10 seconds.
...
How can the exact same SQL code produce such different results (time,
disk reads) depending on whether it's in an SP or just run from QA, yet
still give me the exact same output? The reports both look correct and
have the same number of rows.


If the stored procedure uses variables, and QA uses hard-coded
constants, there is a huge difference when it comes to performance.
With constants, the optimizer knows exactly which values you have.
With variables, the optimizer has no idea and has to make standard
assumptions. Less accurate information => less accurate
estimates => less optimal execution plan.

But the real problem is the cursor. Iterative processing in SQL is
something you should try to avoid.
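The cursor in steps 3-4 of the original post, for instance, can usually be collapsed into a single correlated UPDATE. A sketch, assuming the procedure's @StartDate/@EndDate parameters and an invented SalesDetail table:

```sql
-- One set-based UPDATE replaces the per-product cursor loop:
-- the per-product aggregate is computed once for all products
-- and joined back to the temp table in a single statement.
UPDATE r
SET    r.Qty    = s.Qty,
       r.Amount = s.Amount
FROM   #Report AS r
JOIN  (SELECT ProductID,
              SUM(Quantity)  AS Qty,
              SUM(LineTotal) AS Amount
       FROM   dbo.SalesDetail
       WHERE  SaleDate BETWEEN @StartDate AND @EndDate
       GROUP BY ProductID) AS s
       ON s.ProductID = r.ProductID;
```

With 120 products this turns 120 round trips over the detail data into one pass, which is where the subsecond estimate above comes from.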

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 23 '05 #6

This thread has been closed and replies have been disabled. Please start a new discussion.
