
deleting records without writing to the transaction logs

vb
I need to delete records from several large tables - 1 million plus
rows. Some of these tables have nested foreign key relationships.
When I perform the delete statements, it can take forever as the
database works out the key relationships and commits everything to the log.

Is there a way to speed up some of these deletes without writing the
information to the database logs?

I am probably going to drop the RI on the tables and manually delete
from the child tables to improve things.

Any tips or suggestions would be appreciated. We have been using the
db2 loader with "replace" functionality, but have new requirements that
prevent us from overwriting all the data.

Thanks,
Michael

Nov 12 '05 #1

You can try the following (I have no clue if all steps will work):
1. Turn the RI into NOT ENFORCED (using ALTER TABLE).
2. ACTIVATE NOT LOGGED INITIALLY (be aware of what happens in case of
error!).
3. DELETE (make sure auto-commit is OFF if you use the CLP).
4. COMMIT.
5. SET INTEGRITY OFF for the involved tables.
6. Turn the RI back on.
7. SET INTEGRITY using UNCHECKED.

You can also just turn the RI back on without SET INTEGRITY, but DB2
will then re-check the data, which may or may not take too long for
your taste.
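
A minimal sketch of those steps in SQL, assuming a single parent/child
pair - the names PARENT, CHILD, FK_CHILD_PARENT, and PURGE_FLAG are all
made up:

-- Run from the CLP with auto-commit off (db2 +c). NOT LOGGED
-- INITIALLY lasts only for the current unit of work, and if a
-- statement fails in that unit of work the table can be left
-- inaccessible, so take a backup first.
ALTER TABLE child ALTER FOREIGN KEY fk_child_parent NOT ENFORCED;
ALTER TABLE child ACTIVATE NOT LOGGED INITIALLY;
ALTER TABLE parent ACTIVATE NOT LOGGED INITIALLY;
DELETE FROM child WHERE purge_flag = 'Y';
DELETE FROM parent WHERE purge_flag = 'Y';
COMMIT;
-- Put the table in check pending, re-enforce the RI, then skip the
-- data re-check on the way out:
SET INTEGRITY FOR child OFF;
ALTER TABLE child ALTER FOREIGN KEY fk_child_parent ENFORCED;
SET INTEGRITY FOR child FOREIGN KEY IMMEDIATE UNCHECKED;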

Cheers
Serge
--
Serge Rielau
DB2 SQL Compiler Development
IBM Toronto Lab
Nov 12 '05 #2
How about unloading the rows that should remain in the tables,
reloading them with LOAD ... REPLACE, and then running SET INTEGRITY
... IMMEDIATE UNCHECKED? This works well when the number of records to
be kept is much smaller than the number that must be deleted.
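
For example (the schema, table, column, and file names are made up, and
the WHERE clause selects the rows you want to keep):

-- Unload the survivors, then LOAD REPLACE them back in with
-- minimal logging.
EXPORT TO keep_rows.ixf OF IXF
  SELECT * FROM myschema.big_table WHERE keep_flag = 'Y';
LOAD FROM keep_rows.ixf OF IXF REPLACE INTO myschema.big_table;
-- LOAD leaves a table with constraints in set integrity pending;
-- clear that state without re-checking the data:
SET INTEGRITY FOR myschema.big_table ALL IMMEDIATE UNCHECKED;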

HTH
Joachim

Nov 12 '05 #3


Don't know if this will work for you, but in case you don't want to
drop your RI constraints and activate non-logging, and if your concern
about deletion time is primarily that it is slowing down a linear
workflow...

Then you can move the delete step out of the normal load workflow into
an independent process. This process would be scheduled to run at
frequent intervals (e.g., every 5 minutes) and would delete candidate
rows up to some small cap (e.g., 50,000 rows) per run. This approach
won't lessen the load on the server, and won't delete the rows any
sooner. But it will do the work with minimal impact on logs and
locking, and might allow you to complete the load process more quickly.
Of course, you'll want to be able to find the candidate rows quickly,
with an index or MDC.
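
One common way to sketch such a capped delete in DB2 is to delete from
a fullselect (the names here are hypothetical):

-- Remove at most 50,000 candidate rows per run; committing after
-- each pass keeps the log and lock footprint small.
DELETE FROM (
  SELECT 1 FROM myschema.big_table
  WHERE purge_date < CURRENT DATE
  FETCH FIRST 50000 ROWS ONLY
);
COMMIT;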

buck


Nov 12 '05 #4
