Bytes | Developer Community
Large Data needs to be Deleted??? HOW???

Sam
I have a database that is 70GB big. One of the tables has over 350
million rows of data. I need to delete about 1/3 of the data in that
one table.

I was going to use a simple delete command to remove the unnecessary
data. Something like:

Delete from tablename where columnname > '100'

However, it takes so long to run: over 8 hours, if not longer.

What is the best way to delete this data?

One idea was to create a temp table, move the needed data to the
temp table, truncate the original table, and then move that data back.
I'm not very good with SQL, so can someone give me an
example of how to do this? Or if you have a different idea that would
be faster, please let me know.

thanks,

Sam
Jul 20 '05 #1
s0***@go.com (Sam) wrote in message news:<ed**************************@posting.google.com>...
I have a database that is 70GB big. One of the tables has over 350
million rows of data. I need to delete about 1/3 of the data in that
one table.

I was going to use a simple delete command to remove the unnecessary
data. Something like:

Delete from tablename where columnname > '100'

However, it takes so long to run: over 8 hours, if not longer.

What is the best way to delete this data?

One idea was to create a temp table, move the needed data to the
temp table, truncate the original table, and then move that data back.
I'm not very good with SQL, so can someone give me an
example of how to do this? Or if you have a different idea that would
be faster, please let me know.

thanks,

Sam


One possible way is like this:

1. Create a new table with identical structure to the existing one
2. Insert the data you want to keep:

insert into dbo.NewTable
select col1, col2, ...
from dbo.OldTable
where ...

3. Drop the old table
4. Rename the new table:

exec sp_rename 'NewTable', 'OldTable'
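Putting the four steps together, a minimal sketch might look like this (the table definition, column names, and the WHERE condition are all placeholders; in practice you would also re-create keys, constraints, and indexes on the new table):

```sql
-- 1. Create a new table with the same structure as the old one
CREATE TABLE dbo.NewTable (col1 INT NOT NULL, col2 VARCHAR(50) NULL)

-- 2. Copy only the rows you want to keep
INSERT INTO dbo.NewTable (col1, col2)
SELECT col1, col2
FROM dbo.OldTable
WHERE col1 <= 100   -- placeholder: the inverse of the delete condition

-- 3. Drop the old table
DROP TABLE dbo.OldTable

-- 4. Rename the new table to the old name
EXEC sp_rename 'NewTable', 'OldTable'
```

Since the INSERT is fully logged, make sure the log has room for two thirds of the table's data, or run it in batches as well.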

This approach can be a little awkward if you have lots of keys and
constraints, so using a batch deletion is another option:

declare @rows int
set @rows = -1

set rowcount 50000 -- batch size
while @rows <> 0
begin
    delete from dbo.MyTable where ...
    set @rows = @@rowcount
end

set rowcount 0

You can also truncate the log periodically inside the loop, if
necessary. This assumes that you don't need to be able to roll back
the deletion.
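One way to truncate the log inside the loop is a sketch like the following (the table, condition, and database name are placeholders; BACKUP LOG ... WITH TRUNCATE_ONLY is SQL Server 2000-era syntax and discards log records, which breaks the log backup chain, so a full backup afterwards is essential):

```sql
declare @rows int
set @rows = -1

set rowcount 50000 -- batch size
while @rows <> 0
begin
    delete from dbo.MyTable where col1 > 100  -- placeholder condition
    set @rows = @@rowcount
    -- throw away inactive log records between batches
    backup log MyDatabase with truncate_only
end

set rowcount 0
```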

Simon
Jul 20 '05 #2
Sam (s0***@go.com) writes:
I have a database that is 70GB big. One of the tables has over 350
million rows of data. I need to delete about 1/3 of the data in that
one table.

I was going to use a simple delete command to remove the unnecessary
data. Something like:

Delete from tablename where columnname > '100'

However, it takes so long to run: over 8 hours, if not longer.

What is the best way to delete this data?

One idea was to create a temp table, move the needed data to the
temp table, truncate the original table, and then move that data back.


Since you are keeping two thirds of the data, that is likely to take
even longer.

A better strategy is to do the operation in portions. If you are using
simple recovery, SQL Server will truncate the log between rounds.
If you are using full or bulk-logged recovery, you have to truncate it
yourself. Either way, don't forget to take a full backup when you're
done. The significance of the log here is that you avoid costly
autogrows of the log.

To do it in portions, there are a couple of strategies.

If there is a key column which determines the delete condition, you
can use it for iteration. But the simplest is probably to do:

SET ROWCOUNT 100000
WHILE 1 = 1
BEGIN
    DELETE tbl WHERE ...
    IF @@rowcount < 100000
        BREAK
END
SET ROWCOUNT 0
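For the key-column variant mentioned above, a sketch along these lines could iterate over key ranges instead of using SET ROWCOUNT (this assumes an indexed integer key named id and a placeholder delete condition; both names are illustrative):

```sql
DECLARE @min INT, @max INT, @batch INT
SET @batch = 100000
SELECT @min = MIN(id), @max = MAX(id) FROM tbl WHERE colname > 100

WHILE @min <= @max
BEGIN
    DELETE tbl
    WHERE id BETWEEN @min AND @min + @batch - 1
      AND colname > 100
    SET @min = @min + @batch
END
```

Because each batch touches a contiguous key range, the deletes can seek on the index rather than scanning the whole table on every round.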

Also, watch out for these things:

* Drop all indexes on the table that are not good for finding the rows
to delete. Reapply the indexes when you are done.
* If there is a trigger on the table, use ALTER TABLE ... DISABLE TRIGGER.
  Don't forget to re-enable it when you are done, and check that the
  trigger does not perform any cascading updates or deletes.
* If the table is referenced by a foreign key in another table, make
sure that that referencing table has an index on that column.
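For instance, if a hypothetical dbo.Orders table had a foreign key referencing the table you are deleting from, SQL Server must check dbo.Orders for every deleted row, so an index on the referencing column helps (all names here are illustrative):

```sql
-- Index the foreign key column in the referencing table so the
-- constraint check during DELETE is a seek, not a table scan
CREATE INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID)
```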
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #3
