
Bulk delete on operational data


I read several newsgroup articles about bulk deletes, and one approach I found is to:
-create a temporary table with all the constraints of the original table
-insert the rows to be retained into that temp table
-drop the constraints on the original table
-drop the original table
-rename the temporary table
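
The steps above might be sketched roughly as follows; table and column names here are made up for illustration, and note that SELECT INTO does not copy constraints or indexes, so those have to be re-created by hand:

```sql
BEGIN TRANSACTION;

-- Copy the rows to keep into a new table (no constraints/indexes are copied).
SELECT *
INTO   dbo.MyTable_new
FROM   dbo.MyTable
WHERE  CreatedDate >= DATEADD(day, -30, GETDATE());   -- rows to retain

-- Drop any foreign keys referencing dbo.MyTable here, then swap the tables.
DROP TABLE dbo.MyTable;
EXEC sp_rename 'dbo.MyTable_new', 'MyTable';

-- Re-create the PK, indexes and constraints on the renamed table here.
COMMIT TRANSACTION;
```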

My purge is a daily job, and my question is: how does this work on a heavily
loaded operational database? I mean, thousands of records are written into
my tables (the same tables I want to purge rows from) every second. While
I am copying to the temp table and dropping the original table, what
happens to that operational data?

I also found another way of doing the bulk delete, using BCP:

1) BCP out rows to be deleted to an archive file
2) BCP out rows to be retained
3) Drop indexes and truncate table
4) BCP in rows to be retained
5) Create indexes
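
The five steps above might look roughly like this; the bcp commands run from the command line, and the server, index, file and column names are all made up for illustration:

```sql
-- From the command line (not T-SQL):
--   bcp "SELECT * FROM MyDB.dbo.MyTable WHERE CreatedDate <  '20051101'" queryout archive.dat  -n -S myserver -T
--   bcp "SELECT * FROM MyDB.dbo.MyTable WHERE CreatedDate >= '20051101'" queryout retained.dat -n -S myserver -T

-- Then, in T-SQL: drop the nonclustered indexes and empty the table.
DROP INDEX MyTable.IX_MyTable_CreatedDate;
TRUNCATE TABLE dbo.MyTable;

-- Load the retained rows back (bcp ... in retained.dat would also work).
BULK INSERT dbo.MyTable
FROM 'retained.dat'
WITH (DATAFILETYPE = 'native', TABLOCK);

-- Finally, re-create the indexes.
CREATE INDEX IX_MyTable_CreatedDate ON dbo.MyTable (CreatedDate);
```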

Again the same question: while the BCP is running, are insertions into my
original table blocked? Does BCP acquire an exclusive lock on the table
that prevents any other insertion?
Does anyone have experience with BCP-ing out 2 million records, and how
long does it take?

I appreciate your help.

Nov 23 '05 #1
Zarrin (zl******@yahoo.com) writes:
> My purge is a daily job, and my question is how this work on a heavy
> load operational database? I mean thousand of records are written into
> my tables (the same table that I want to purge some rows from) every
> second. While I am doing the copy to temp table and drop the table what
> happens to those operational data?

I don't think it is a good idea to create new tables every day. Particularly
not if data are being added as you delete.

I can think of two ways:

1) Keep it all in the same table. It's imperative that the condition for
the DELETE is aligned with the clustered index, so that INSERTs are
not blocked by the DELETE.
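
A minimal sketch of that first option, assuming (purely for illustration) that the clustered index is on a CreatedDate column, so each batched DELETE touches a contiguous range and holds its locks only briefly:

```sql
SET ROWCOUNT 1000;                 -- SQL 2000 style; on 2005+ use DELETE TOP (1000)
WHILE 1 = 1
BEGIN
    -- The WHERE clause matches the clustered index, so the DELETE scans
    -- a contiguous range instead of the whole table.
    DELETE FROM dbo.MyTable
    WHERE CreatedDate < DATEADD(day, -30, GETDATE());
    IF @@ROWCOUNT = 0 BREAK;
END;
SET ROWCOUNT 0;                    -- restore the default
```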

2) Use a partitioned view. A partitioned view consists of a number of
tables with the same structure, where the PK has a CHECK constraint
that defines what goes into which table. The tables are then combined
in a SELECT with UNION ALL. Again, the partitioning condition has to be
aligned with the condition for the DELETE; more exactly, so that a
DELETE is simply a TRUNCATE TABLE on a single table. Since you do this
on a daily basis, that means a lot of tables, unless there is a round-
robin mechanism. You can of course change the view definition every
now and then, and while this is a swift operation, you must still lock
out the INSERT process, so it does not try to insert into a non-
existing view.
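
A minimal partitioned-view sketch with two daily member tables (all names are illustrative; note the partitioning column must be part of the primary key and carry the CHECK constraint):

```sql
CREATE TABLE dbo.Orders_20051121 (
    OrderDate datetime NOT NULL
        CHECK (OrderDate >= '20051121' AND OrderDate < '20051122'),
    OrderID   int      NOT NULL,
    CONSTRAINT PK_Orders_20051121 PRIMARY KEY (OrderDate, OrderID)
);
CREATE TABLE dbo.Orders_20051122 (
    OrderDate datetime NOT NULL
        CHECK (OrderDate >= '20051122' AND OrderDate < '20051123'),
    OrderID   int      NOT NULL,
    CONSTRAINT PK_Orders_20051122 PRIMARY KEY (OrderDate, OrderID)
);
GO
CREATE VIEW dbo.Orders AS
    SELECT OrderDate, OrderID FROM dbo.Orders_20051121
    UNION ALL
    SELECT OrderDate, OrderID FROM dbo.Orders_20051122;
GO
-- Purging a whole day is then just a metadata operation:
TRUNCATE TABLE dbo.Orders_20051121;
```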

Read more about partitioned views in Books Online.
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Nov 23 '05 #2
Thanks Erland, and sorry for replying so late. I worked on this for
several days and finally got it to the point where the execution time of
my purge improved from more than 24 hours to 1 hour, by using clustered
indexes on all the tables created to do the purge.

To clarify how I do the delete: I first select all the IDs of records to
be deleted from the main table and keep them in another table; then I go
through my 10 other tables that have related records and delete those
related IDs. I do all my deletes in chunks of 1000 rows to avoid blocking
the inserts into the tables in the meantime.

So what I changed was to put a clustered index on the intermediate
table that I use for keeping the "to be deleted" IDs.
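
The chunked, ID-table-driven delete described above might be sketched like this; the table and column names are made up, since the original post doesn't give them:

```sql
-- Intermediate table of IDs to purge, clustered on ID for fast joins.
CREATE TABLE #ToDelete (ID int NOT NULL PRIMARY KEY CLUSTERED);

INSERT INTO #ToDelete (ID)
SELECT ID
FROM   dbo.MainTable
WHERE  CreatedDate < DATEADD(day, -30, GETDATE());

SET ROWCOUNT 1000;                       -- delete in chunks of 1000 rows
WHILE 1 = 1
BEGIN
    DELETE d
    FROM   dbo.DetailTable1 AS d
    JOIN   #ToDelete AS t ON t.ID = d.MainID;
    IF @@ROWCOUNT = 0 BREAK;
END;
SET ROWCOUNT 0;
-- ...repeat the loop for the other detail tables, then for dbo.MainTable itself.
```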

If I later end up having more than a few million records to purge, I will
probably have to use your second solution, partitioned views.

Thanks again, it helped.

Dec 1 '05 #3

This thread has been closed and replies have been disabled. Please start a new discussion.

