Bytes | Software Development & Data Engineering Community

Stored procedure performance mystery

My application fetches a batch of data through a web service and writes 1000
entities per batch to a SQL Server 2000 database. Each batch touches 4
tables. On average, the following numbers of SQL commands are executed
per batch:
Table #1: always 1
Table #2: 5
Table #3: 5
Table #4: 3

The problem is that the performance slows down with every batch. Below is an
excerpt from my log file:

2004-12-15 12:00:01 Starting job... (RAM usage: 6,38 mb)

2004-12-15 12:00:39 data fetch time: 00:00:28 (RAM usage: 23,04 mb)
2004-12-15 12:00:39 Total data fetch time: 00:00:37 (RAM usage: 23,04 mb)
2004-12-15 12:00:39 Inserting/updating 1000 entities...
2004-12-15 12:01:20 Write SQL time: 00:00:40

2004-12-15 12:01:49 data fetch time: 00:00:24 (RAM usage: 26,87 mb)
2004-12-15 12:01:49 Total data fetch time: 00:00:29 (RAM usage: 26,87 mb)
2004-12-15 12:01:49 Inserting/updating 1000 entities...
2004-12-15 12:02:59 Write SQL time: 00:01:10

2004-12-15 12:04:06 data fetch time: 00:00:29 (RAM usage: 27,48 mb)
2004-12-15 12:04:06 Total data fetch time: 00:01:06 (RAM usage: 27,48 mb)
2004-12-15 12:04:06 Inserting/updating 1000 entities...
2004-12-15 12:05:30 Write SQL time: 00:01:23

2004-12-15 12:06:05 data fetch time: 00:00:31 (RAM usage: 27,03 mb)
2004-12-15 12:06:05 Total data fetch time: 00:00:35 (RAM usage: 27,03 mb)
2004-12-15 12:06:05 Inserting/updating 1000 entities...
2004-12-15 12:07:37 Write SQL time: 00:01:32

As one can see, the Write SQL time increases with every batch.
I would like this time to stay around one minute per batch.

There is one trigger per table. One parent table has a primary key-foreign
key relationship to the three child tables.

I have 2% automatic file size growth set on both the data and the log file.
Thank you in advance to the guru which helps me out with this!


Jul 23 '05 #1
Magnus Österberg (ma******@abo.fi) writes:
My application fetches a batch of data through a web service and writes
1000 entities per batch to a SQL Server 2000 database. Each batch touches
4 tables. On average, the following numbers of SQL commands are executed
per batch:

Table #1: always 1
Table #2: 5
Table #3: 5
Table #4: 3

The problem is that the performance slows down with every batch. Below is
an excerpt from my log file:


There are a number of possible reasons, some of which it would have been
possible to rule out, had you provided more information.

Are the tables empty when you insert the first batch? Or do you get this
behaviour if you restart SQL Server and start to insert rows into tables
that already have a million rows?

If you start with empty tables, the problem could be in the triggers.

Another possibility is that you are paying a penalty for auto-growth. A 2%
growth increment is rather small, so the files have to grow frequently.
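If auto-growth turns out to be the culprit, a fixed, larger growth increment avoids the frequent small grows. A minimal sketch, assuming a database named MyDb; the logical file names are assumptions:

```sql
-- Grow in fixed 100 MB steps instead of 2% increments.
-- Database and logical file names are assumptions; check the real
-- logical names with EXEC sp_helpfile in the target database.
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Data, FILEGROWTH = 100MB)
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Log,  FILEGROWTH = 100MB)
```

Pre-sizing the data and log files to their expected final size avoids auto-growth during the load altogether.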

I would recommend that you run a Profiler trace and add the events
SP:Completed, SP:Recompile and the auto-grow events. That way you may
be able to see where the time is spent. Note that you don't get any
duration for recompilations, but you should always pay attention to
recompilations, as they take some time depending on batch size.
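On SQL Server 2000 the same trace can also be set up server-side with the sp_trace_* procedures instead of the Profiler GUI. A sketch; the output path is an assumption, and the event/column ids used (37 = SP:Recompile, 43 = SP:Completed, 92/93 = Data/Log File Auto Grow; 1 = TextData, 13 = Duration) are the standard trace ids:

```sql
-- Server-side equivalent of the suggested Profiler trace.
DECLARE @TraceID int
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\traces\batchload'  -- .trc is appended

DECLARE @on bit
SET @on = 1
EXEC sp_trace_setevent @TraceID, 37, 1,  @on   -- SP:Recompile, TextData
EXEC sp_trace_setevent @TraceID, 43, 1,  @on   -- SP:Completed, TextData
EXEC sp_trace_setevent @TraceID, 43, 13, @on   -- SP:Completed, Duration
EXEC sp_trace_setevent @TraceID, 92, 13, @on   -- Data File Auto Grow, Duration
EXEC sp_trace_setevent @TraceID, 93, 13, @on   -- Log File Auto Grow, Duration

EXEC sp_trace_setstatus @TraceID, 1            -- start the trace
-- ...run a few batches, then stop and close it:
-- EXEC sp_trace_setstatus @TraceID, 0
-- EXEC sp_trace_setstatus @TraceID, 2
```

The resulting .trc file can be opened in Profiler, or queried with fn_trace_gettable.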

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 23 '05 #2

"Erland Sommarskog" <es****@sommarskog.se> wrote in message
news:Xn**********************@127.0.0.1...
Magnus Österberg (ma******@abo.fi) writes:
My application fetches a batch of data through a web service and writes
1000 entities per batch to a SQL Server 2000 database. Each batch touches
4 tables. On average, the following numbers of SQL commands are executed
per batch:

Table #1: always 1
Table #2: 5
Table #3: 5
Table #4: 3

The problem is that the performance slows down with every batch. Below is
an excerpt from my log file:
There are a number of possible reasons, some of which it would have been
possible to rule out, had you provided more information.

Are the tables empty when you insert the first batch? Or do you get this
behaviour if you restart SQL Server and start to insert rows into tables
that already have a million rows?

Yes, all four tables are empty at start.

If you start with empty tables, the problem could be in the triggers.

Another possibility is that you are paying a penalty for auto-growth. A 2%
growth increment is rather small, so the files have to grow frequently.
I changed it back to 10... no change.

I would recommend that you run a Profiler trace and add the events
SP:Completed, SP:Recompile and the auto-grow events. That way you may
be able to see where the time is spent. Note that you don't get any
duration for recompilations, but you should always pay attention to
recompilations, as they take some time depending on batch size.

I'll try that and tell you the results!

Thanks!


--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp

Jul 23 '05 #3
With the way MS SQL Server caches query plans for stored procedures and
indexes, you could be running into a problem with a stale cached query
plan.

Try adding WITH RECOMPILE to the stored procedure. This may slow down
the stored procedure a little on each execution (and it won't help if you
are using dynamic queries in the SP), but if this is the problem then you
will no longer see the time increase as you go along.
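For reference, WITH RECOMPILE goes in the procedure header; the procedure name, parameters and body below are hypothetical:

```sql
-- WITH RECOMPILE forces a fresh query plan on every execution
-- instead of reusing the cached one.
CREATE PROCEDURE dbo.usp_InsertParent
    @Id     int,
    @Status int
WITH RECOMPILE
AS
    INSERT dbo.ParentTable (Id, Status) VALUES (@Id, @Status)
GO
```

An existing procedure can also be flagged without editing its source: `EXEC sp_recompile 'dbo.usp_InsertParent'` marks it so a new plan is built on the next execution (a one-time recompile, not on every call).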

Jul 23 '05 #4

<wd******@rmi.net> wrote in message
news:11*****************@z14g2000cwz.googlegroups.com...
With the way MS SQL Server caches query plans for stored procedures and
indexes, you could be running into a problem with a stale cached query
plan.

Try adding WITH RECOMPILE to the stored procedure. This may slow down
the stored procedure a little on each execution (and it won't help if you
are using dynamic queries in the SP), but if this is the problem then you
will no longer see the time increase as you go along.


I now added WITH RECOMPILE to all four SPs. I am still seeing the growth in
execution time:

#1 1min 1s
#2 1min 5s
#3 1min 20s
#4 1min 25s
...

Jul 23 '05 #5
Magnus Österberg (ma**************@abo.fi) writes:
Yes, all four tables are empty at start.


In that case, I would first fill the tables up to the size you expect
in production, and then try to improve performance from there.
Most likely the problems are in the triggers.

If you are lucky, the problem will go away with increasing size. The query
plans generated for empty tables may not be good once the tables fill up.
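One way to act on this, sketched with hypothetical object names: once the tables are at production size, refresh the statistics and flag the procedures so plans are rebuilt against realistic row counts:

```sql
-- Rebuild statistics against the now-full table (name is an assumption).
UPDATE STATISTICS dbo.ParentTable WITH FULLSCAN

-- Mark the table so procedures and triggers that reference it get a
-- fresh plan on their next execution.
EXEC sp_recompile 'dbo.ParentTable'
```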

A tip for triggers: if you refer to the inserted/deleted pseudo-tables in
several places in the trigger, copy the columns you need into table
variables first, since repeated scans of inserted/deleted are usually slow.
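A minimal sketch of that pattern; the table, column and trigger names are hypothetical:

```sql
CREATE TRIGGER tri_ParentTable ON dbo.ParentTable AFTER INSERT
AS
BEGIN
    -- Copy the needed columns out of inserted once, then work from the
    -- table variable instead of scanning inserted repeatedly.
    DECLARE @ins TABLE (Id int, Status int)
    INSERT @ins (Id, Status)
    SELECT Id, Status FROM inserted

    UPDATE s
    SET    s.ParentStatus = i.Status
    FROM   dbo.SubTable1 AS s
    JOIN   @ins AS i ON i.Id = s.ParentId
    -- ...any further statements also read from @ins, not from inserted
END
GO
```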

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 23 '05 #6
