Performance Tuning for Row-by-Row Update Statement

Hi,

For an unavoidable reason, I have to use row-by-row processing
(updates) on a temporary table to update a history table every day.
I have around 60,000 records in the temporary table and about 2 million
in the history table.

Could anyone please suggest different methods to improve the runtime
of the query?

Would highly appreciate it!
Jul 20 '05 #1
Is the row-by-row processing done in a cursor? Must you update exactly one
row at a time (if so, why?), or would it be acceptable to update 2, 3, or
50 rows at a time?

You can use SET ROWCOUNT and a loop to fine-tune the batch size of rows to
be updated. Bigger batches should improve performance over updating single
rows.

SET ROWCOUNT 50

WHILE 1 = 1
BEGIN

    UPDATE SomeTable
    SET ...
    WHERE /* row not already updated */

    IF @@ROWCOUNT = 0
        BREAK

END

SET ROWCOUNT 0
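
To make the pattern concrete, here is a minimal sketch under stated
assumptions: History, Staging, HistoryID, SomeColumn and Processed are
illustrative names, not from the original post.

-- Hedged sketch: all object names are placeholders for your own schema.
SET ROWCOUNT 50                 -- cap each UPDATE at 50 rows

WHILE 1 = 1
BEGIN

    UPDATE h
    SET    h.SomeColumn = s.SomeColumn,
           h.Processed  = 1     -- mark the row so it is not touched again
    FROM   History h
    JOIN   Staging s ON s.HistoryID = h.HistoryID
    WHERE  h.Processed = 0      -- only rows not already updated

    IF @@ROWCOUNT = 0
        BREAK                   -- nothing left to update

END

SET ROWCOUNT 0                  -- always reset, or later statements stay capped

Tune the batch size empirically: very small batches pay per-statement
overhead, while very large ones hold locks and log space longer.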

--
David Portas
SQL Server MVP
--
Jul 20 '05 #2

"Muzamil" <mu*****@hotmail.com> wrote in message
news:5a**************************@posting.google.c om...
hi

For an unavoidable reason, I have to use row-by-row processing
(update) on a temporary table to update a history table every day.
I have around 60,000 records in temporary table and about 2 million in
the history table.
Not much you can do if you absolutely HAVE to do row-by-row updating.

You might want to post DDL, etc. so others can take a crack at it. I've
seen it many times: someone will say "I have to use a cursor" or "I have
to update one row at a time", and then someone posts a much better/faster
solution.

Also, how are you handling transactions? Explicitly or implicitly? If
you're doing them implicitly, are you wrapping each update in its own
transaction, or can you batch, say, 20 updates?
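
For example, here is a sketch of committing once per batch of 20 single-row
updates instead of once per row; Staging, History, KeyCol and Val are
made-up names for illustration only:

-- Hedged sketch: one explicit transaction (one log flush) per batch of 20
-- rows, instead of one implicit transaction per UPDATE statement.
DECLARE @key int, @done int, @i int
SELECT  @key = 0, @done = 0

WHILE @done = 0
BEGIN
    BEGIN TRANSACTION

    SET @i = 0
    WHILE @i < 20
    BEGIN
        -- walk the staging table in key order, one row at a time
        SELECT TOP 1 @key = KeyCol FROM Staging
        WHERE  KeyCol > @key ORDER BY KeyCol

        IF @@ROWCOUNT = 0
        BEGIN
            SET @done = 1       -- staging table exhausted
            BREAK
        END

        UPDATE h
        SET    h.Val = s.Val
        FROM   History h
        JOIN   Staging s ON s.KeyCol = h.KeyCol
        WHERE  h.KeyCol = @key

        SET @i = @i + 1
    END

    COMMIT TRANSACTION          -- one commit covers the whole batch
END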

Finally, where are your log files? Separate physical drives?

Jul 20 '05 #4
Hi,
Thanks for your reply.

The row-by-row update is mandatory because the legacy system is
sending us information such as "Add", "Modify" or "Delete", and
this information HAS to be processed in the same order, otherwise we'll
get erroneous data.
I know it's a dumb way of doing things, but this is what our IT
department and theirs have chosen as the correct course of action after
several meetings. Hence the batch idea will not work here.
I am not using cursors; instead I am using a loop based on the
primary key.

The log files are on different drives.

I've also tried using "WITH (ROWLOCK)" in the update statement, but
it's not helping much.

Can you please still throw in some ideas? It would be a great help!

Thanks
"Greg D. Moore \(Strider\)" <mo****************@greenms.com> wrote in message news:<tO*******************@twister.nyroc.rr.com>. ..
"Muzamil" <mu*****@hotmail.com> wrote in message
news:5a**************************@posting.google.c om...
hi

For an unavoidable reason, I have to use row-by-row processing
(update) on a temporary table to update a history table every day.
I have around 60,000 records in temporary table and about 2 million in
the history table.


Not much you can do if you absolutely HAVE to do row-by-row updating.

You might want to post DDL, etc. so others can take a crack at it. I've
seen many times someone will say, "I have to use a cursor", "I have to
update one row at a time" and then someone posts a much better/faster
solution.

Also, how are you handling transactions? Explicitly or implicitely? If
you're doing them implicitely, are you wrapping each update in its own, or
can up batch say 20 updates?

Finally, where's your log files? Separate physical drives?


Could any one please suggest different methods to imporve the runtime
of the query?

Would highly appreciate!

Jul 20 '05 #5
Muzamil (mu*****@hotmail.com) writes:
> The row-by-row update is mandatory because the legacy system is
> sending us information such as "Add", "Modify" or "Delete", and
> this information HAS to be processed in the same order, otherwise
> we'll get erroneous data.

Ouch. Life is cruel, sometimes.

I wonder what possibilities there might be to find parallel streams,
that is, updates that could be performed independently. Maybe you
could modify 10 rows at a time then. But it does not sound like a very
easy thing to do.

Without knowing the details of the system, it is difficult to give
much advice. But any sort of pre-aggregation you can do is probably
going to pay off.
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #6
Details of the system:
The legacy system sends us records flagged with "Add", "Modify" or
"Delete".
The purpose of these flags is self-explanatory. But the fun began when
we noticed that within the same file, the legacy system sends us "Add"
and then "Modify". Thus, we were left with no option except to do
row-by-row processing.
We came up with the following logic:

a) If the record's StatusFlag is 'A' and the record's key does not exist
in the DataWareHouse table, then the record is inserted into the
DataWareHouse table.

b) If the record's StatusFlag is 'A', but the record's key exists in the
DataWareHouse table, then the record is marked as invalid and will be
inserted into the InvalidTable.

c) If the record's StatusFlag is 'M', the record's key exists in the
DataWareHouse table, and the record is active, then the corresponding
record in the DataWareHouse table will be updated.

d) If the record's StatusFlag is 'M' and the record's key exists in the
DataWareHouse table but the record is inactive, then the record is
marked as invalid and will be inserted into the InvalidTable.

e) If the record's StatusFlag is 'M' and the record's key does not exist
in the DataWareHouse table, then the record is marked as invalid and
will be inserted into the InvalidTable.

f) If the record's StatusFlag is 'D', the record's key exists in the
DataWareHouse table, and the record is active, then the corresponding
record in the DataWareHouse table will be updated as inactive.

g) If the record's StatusFlag is 'D' and the record's key exists in the
DataWareHouse table but the record is inactive, then the record is
marked as invalid and will be inserted into the InvalidTable.

h) If the record's StatusFlag is 'D' and the record's key does not exist
in the DataWareHouse table, then the record is marked as invalid and
will be inserted into the InvalidTable.

This logic takes care of ALL the anomalies we were facing before, but
at the cost of long processing time (a sketch of the per-record dispatch
follows below).
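
In outline, the per-record dispatch could look like this. This is a hedged
sketch only: DWTable, RecKey, Payload and Active are illustrative names,
while StatusFlag and InvalidTable come from the rules above.

DECLARE @flag char(1), @key int, @payload varchar(100), @active bit
-- Inside the row-by-row loop; @flag, @key, @payload hold the staging row.

SET @active = NULL                      -- NULL afterwards = key not found
SELECT @active = Active FROM DWTable WHERE RecKey = @key

IF @flag = 'A' AND @active IS NULL
    INSERT INTO DWTable (RecKey, Payload, Active)   -- rule a)
    VALUES (@key, @payload, 1)
ELSE IF @flag = 'M' AND @active = 1
    UPDATE DWTable SET Payload = @payload           -- rule c)
    WHERE  RecKey = @key
ELSE IF @flag = 'D' AND @active = 1
    UPDATE DWTable SET Active = 0                   -- rule f)
    WHERE  RecKey = @key
ELSE
    INSERT INTO InvalidTable (RecKey, StatusFlag)   -- rules b), d), e), g), h)
    VALUES (@key, @flag)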

I await your comments.
Thanks

Jul 20 '05 #7
Muzamil (mu*****@hotmail.com) writes:
> Details of the system:
> The legacy system sends us records flagged with "Add", "Modify" or
> "Delete".
> The purpose of these flags is self-explanatory. But the fun began when
> we noticed that within the same file, the legacy system sends us "Add"
> and then "Modify". Thus, we were left with no option except to do
> row-by-row processing.
> We came up with the following logic:

Hm, you might be missing a few cases. What if you get an Add, and the record
exists in the DW, but is marked inactive? With your current logic, the
input record is moved to the Invalid table.

And could that feeding system be as weird as to send Add, Modify, Delete,
and then Add again? Well, for a robust solution this is what we should assume.

It's a tricky problem, and I was about to defer it, when I
recalled a solution that a colleague did for one of our stored procedures.
The secret word for tonight is bucketing! Assuming that there are
only a couple of input records for each key value, this should be
an excellent solution. You create buckets, so that each bucket has
at most one row per key value. Here is an example of how to do it:

UPDATE a
SET    bucket = (SELECT COUNT(*)
                 FROM   inputtbl b
                 WHERE  b.keyval = a.keyval
                   AND  b.rownumber < a.rownumber) + 1
FROM   inputtbl a

Here inputtbl.keyval holds the keys of the records in the DW table, and
rownumber is a column that describes the processing order; I assume you
have such a column. The earliest record for each key lands in bucket 1,
the next in bucket 2, and so on.

So now you can iterate over the buckets, and for each bucket you can do
set-based processing. You still have to iterate, but instead of over
60,000 rows, only over a couple of buckets.
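
For instance, the outer loop over buckets might look like this: a hedged
sketch in which DWTable, RecKey, Payload and Active are illustrative names,
and the one statement shown stands in for the set-based version of each of
the rules a)-h).

-- Within one bucket every key is unique, so each rule can run as a
-- single set-based statement instead of thousands of single-row ones.
DECLARE @bucket int, @maxbucket int
SELECT  @maxbucket = MAX(bucket) FROM inputtbl
SET     @bucket = 1

WHILE @bucket <= @maxbucket
BEGIN
    -- Example: rule c) for this bucket, as one set-based UPDATE.
    UPDATE d
    SET    d.Payload = i.Payload
    FROM   DWTable d
    JOIN   inputtbl i ON i.keyval = d.RecKey
    WHERE  i.bucket = @bucket
      AND  i.StatusFlag = 'M'
      AND  d.Active = 1

    -- ...rules a), b), d) through h) as similar set-based statements...

    SET @bucket = @bucket + 1
END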
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #8
I think I was not articulate enough to convey the logic properly.
Anyway, thanks to everyone for your help.
By using ROWLOCK and proper indexes, I was able to reduce the time
considerably.
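
(For context, a "proper index" here might be something like the following
sketch; the names are hypothetical, but the point is to index the key that
every per-row lookup and update seeks on, so each statement becomes an
index seek instead of a scan of the 2-million-row history table.)

-- Hedged sketch: illustrative names only.
CREATE INDEX IX_DWTable_RecKey ON DWTable (RecKey)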


Jul 20 '05 #9
Muzamil (mu*****@hotmail.com) writes:
> I think I was not articulate enough to convey the logic properly.
> Anyway, thanks to everyone for your help. By using ROWLOCK and
> proper indexes, I was able to reduce the time considerably.

Good indexes are always useful, and for iterative processing they are
even more imperative, since the cost of a less-than-optimal plan is
multiplied by the number of iterations.

I'm just curious: would my bucketing idea be applicable to your problem?
It should give you even more speed, but if what you have now is good
enough, there is of course no reason to spend more time on it.
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #10
