Bytes | Software Development & Data Engineering Community

Bulk delete on operational data

Hello,

I read several newsgroup articles about bulk deletes, and one approach I
found is to:
-create a temporary table with all constraints of original table
-insert rows to be retained into that temp table
-drop constraints on original table
-drop the original table
-rename the temporary table
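A minimal T-SQL sketch of those steps (table and column names are placeholders, not from the original post; note that SELECT ... INTO copies only the data, so keys, indexes, and constraints have to be re-created by hand):

```sql
BEGIN TRANSACTION;

-- 1. Build a work table holding only the rows to retain.
--    SELECT ... INTO copies data, not constraints or indexes.
SELECT *
INTO   dbo.Orders_new
FROM   dbo.Orders
WHERE  OrderDate >= DATEADD(day, -30, GETDATE());

-- 2. Re-create the PK, indexes, and FKs on dbo.Orders_new here.

-- 3. Swap the tables.
DROP TABLE dbo.Orders;
EXEC sp_rename 'dbo.Orders_new', 'Orders';

COMMIT TRANSACTION;
```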

My purge is a daily job, and my question is: how does this work on a
heavily loaded operational database? Thousands of records are written to
my tables (the same tables I want to purge rows from) every second.
While I am copying rows to the temp table and dropping the original
table, what happens to that operational data?

I also found that another way of doing a bulk delete is to use BCP:

1) BCP out rows to be deleted to an archive file
2) BCP out rows to be retained
3) Drop indexes and truncate table
4) BCP in rows to be retained
5) Create indexes
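The command-line side of those steps might look like this (server name, database, and date cutoff are made up for illustration; steps 3 and 5 are plain T-SQL, not bcp):

```
REM 1) Export rows to be deleted to an archive file (queryout allows a filter).
bcp "SELECT * FROM MyDb.dbo.Orders WHERE OrderDate < '20051101'" queryout archive.dat -c -T -S MYSERVER

REM 2) Export rows to be retained.
bcp "SELECT * FROM MyDb.dbo.Orders WHERE OrderDate >= '20051101'" queryout keep.dat -c -T -S MYSERVER

REM 3) In SQL: drop the indexes and TRUNCATE TABLE dbo.Orders.

REM 4) Import the retained rows.
bcp MyDb.dbo.Orders in keep.dat -c -T -S MYSERVER

REM 5) In SQL: re-create the indexes.
```

(-c is character mode, -T a trusted connection, -S the server name.)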

Again, the same question: while the BCP is running, is there any
blocking of inserts into my original table? What happens to rows that
arrive in the meantime?
Does BCP acquire an exclusive lock on the table that prevents any
other insertion?
Does anyone have experience using BCP to export 2 million records, and
how long does it take?

I appreciate your help.

Nov 23 '05 #1
Zarrin (zl******@yahoo.com) writes:
My purge is a daily job, and my question is: how does this work on a
heavily loaded operational database? Thousands of records are written to
my tables (the same tables I want to purge rows from) every second.
While I am copying rows to the temp table and dropping the original
table, what happens to that operational data?


I don't think it is a good idea to create new tables every day,
particularly not if data are being added while you delete.

I can think of two ways:

1) Keep it all in the same table. It's imperative that the condition for
   the DELETE is aligned with the clustered index, so that INSERTs are
   not blocked by the DELETE.
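For instance, with a purge that filters on an insert date, aligning the clustered index with that column keeps the deleted range physically away from the pages new rows land on (illustrative schema, not from the thread):

```sql
-- Clustered index leads on the same column the purge filters on,
-- so the DELETE touches old pages while INSERTs hit new ones.
CREATE CLUSTERED INDEX ix_Orders_OrderDate
    ON dbo.Orders (OrderDate);

DELETE dbo.Orders
WHERE  OrderDate < DATEADD(day, -30, GETDATE());
```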

2) Use a partitioned view. A partitioned view consists of a number of
   tables with the same structure, where the PK has a CHECK constraint
   that defines what goes into which table. The tables are then combined
   in a SELECT with UNION ALL. Again, the partition condition has to be
   aligned with the condition for the DELETE; more exactly, so that a
   DELETE is simply a TRUNCATE TABLE on a single table. Since you do this
   on a daily basis, it will be a lot of tables unless there is a
   round-robin mechanism. You can of course change the view definition
   every now and then, and while this is a swift operation, you must
   still lock out the INSERT process, so it does not try to insert into
   a non-existing view.
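A sketch of such a view with monthly member tables (names and columns invented for illustration; note the partitioning column must be part of the PK):

```sql
-- One member table per period; the CHECK constraint on the
-- partitioning column tells the optimizer which table holds what.
CREATE TABLE dbo.Orders_200511 (
    OrderID    int           NOT NULL,
    OrderMonth char(6)       NOT NULL CHECK (OrderMonth = '200511'),
    Amount     decimal(10,2) NOT NULL,
    CONSTRAINT pk_Orders_200511 PRIMARY KEY (OrderMonth, OrderID)
);

CREATE TABLE dbo.Orders_200512 (
    OrderID    int           NOT NULL,
    OrderMonth char(6)       NOT NULL CHECK (OrderMonth = '200512'),
    Amount     decimal(10,2) NOT NULL,
    CONSTRAINT pk_Orders_200512 PRIMARY KEY (OrderMonth, OrderID)
);
GO
CREATE VIEW dbo.Orders AS
    SELECT OrderID, OrderMonth, Amount FROM dbo.Orders_200511
    UNION ALL
    SELECT OrderID, OrderMonth, Amount FROM dbo.Orders_200512;
GO

-- Purging a whole month is then just:
TRUNCATE TABLE dbo.Orders_200511;
```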

Read more about partitioned views in Books Online.
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Nov 23 '05 #2
Thanks Erland, and sorry for replying so late. I worked with it for
several days and finally got to the point where the execution time of
my purge improved from more than 24 hours to 1 hour, by using
clustered indexes on all the tables created to do the purge.

To clarify how I do the delete: I first select the IDs of all records
to be deleted from the main table and keep them in another table, then
I go through my 10 other tables that have related records and delete
those related IDs. I do all my deletes in chunks of 1,000 rows to avoid
blocking inserts into the tables in the meantime.

So what I changed was to put a clustered index on the intermediate
table that holds the "to be deleted" IDs.
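That scheme could look roughly like this (table and column names are guesses, not the poster's actual schema; SET ROWCOUNT is the SQL 2000-era way to cap a DELETE's batch size):

```sql
-- Collect the IDs to purge; the clustered index lets the batched
-- DELETE below join against it efficiently.
SELECT OrderID
INTO   dbo.PurgeIDs
FROM   dbo.Orders
WHERE  OrderDate < DATEADD(day, -30, GETDATE());

CREATE CLUSTERED INDEX ix_PurgeIDs ON dbo.PurgeIDs (OrderID);

-- Delete in chunks of 1,000 so concurrent INSERTs are not
-- blocked for long; repeat per related table.
SET ROWCOUNT 1000;
WHILE 1 = 1
BEGIN
    DELETE d
    FROM   dbo.OrderDetails d
    JOIN   dbo.PurgeIDs p ON p.OrderID = d.OrderID;

    IF @@ROWCOUNT = 0 BREAK;
END;
SET ROWCOUNT 0;
```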

If I later end up with more than a few million records to purge, I will
probably have to use your second solution, partitioned views.

Thanks again it helped.

Dec 1 '05 #3


