
Deleting 100 million rows from a table weekly (SQL Server 2000)


Hi All

We have a table in SQL Server 2000 which has about 250 million records
and is growing by 100 million every week. At any time the table
should contain just 13 weeks of data: when the 14th week's data needs to
be loaded, the first week's data has to be deleted.

That means deleting 100 million rows every week, and since the delete
takes a lot of transaction log space, the job does not complete successfully.

Can you please suggest approaches we can take to fix
this problem?

Performance and transaction log growth are the issues we are facing. We
tried deleting in batches too, but that also takes too long. What are the
different ways we can address this quickly?

Please reply as soon as possible.

Thanks
Harish

Nov 5 '05 #1


Hi Harish,

You should look at partitioning: keep a cycle of weekly partition tables
and simply CREATE TABLE and DROP TABLE them as the window moves. That way
you won't have to log every deleted row.
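
Something along these lines (just a sketch; the table and column names are
made up, you'd use your real schema and indexes):

-- one physical table per week, e.g. SalesData_2005w31 .. SalesData_2005w43;
-- the weekly purge is then a metadata operation, not 100 million logged deletes

-- drop the oldest week
DROP TABLE SalesData_2005w31

-- create the table for the incoming week with the same schema
CREATE TABLE SalesData_2005w44 (
    SaleID   int           NOT NULL,
    SaleDate datetime      NOT NULL,
    Amount   decimal(18,2) NOT NULL
)

-- then bulk load the new week into SalesData_2005w44 as usual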

Tony

--
Tony Rogerson
SQL Server MVP
http://sqlserverfaq.com - free video tutorials
"harish" <ha*************@gmail.com> wrote in message
news:11**********************@o13g2000cwo.googlegr oups.com...
DELETING 100 million from a table weekly SQl SERVER 2000

Hi All

We have a table in SQL SERVER 2000 which has about 250 million records
and this will be growing by 100 million every week. At a time the table
should contain just 13 weeks of data. when the 14th week data needs to
be loaded the first week's data has to be deleted.

And this deletes 100 million every week, since the delete is taking lot
of transaction log space the job is not successful.

Can you please help with what are the approaches we can take to fix
this problem?

Performance and transaction log are the issues we are facing. We tried
deletion in steps too but that also is taking time. What are the
different ways we can address this quickly.

Please reply at the earliest.

Thanks
Harish

Nov 5 '05 #2



In this special case I would think about using a table per week. There is
no faster way than DROP/CREATE or maybe TRUNCATE TABLE. You have to change
a lot in the way you work with this data, but you have UNION and maybe you
can use views.
Or you buy a big solid-state disk for your database :-))
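
For example (only a sketch, the names are invented), you can hide the
weekly tables behind a view so that most queries don't have to change:

-- redefine this view each week after dropping/creating the weekly tables
CREATE VIEW SalesData
AS
SELECT SaleID, SaleDate, Amount FROM SalesData_2005w32
UNION ALL
SELECT SaleID, SaleDate, Amount FROM SalesData_2005w33
-- ... one SELECT per remaining week, 13 branches in total ...
UNION ALL
SELECT SaleID, SaleDate, Amount FROM SalesData_2005w44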

bye,
Helmut
Nov 5 '05 #3



Since one table per week becomes quite a job to manage, I would go for
one table per month, and then truncate once per month.

If that would be too much data, I would then try one table per ten days.
Fewer tables also makes it a lot easier to set up the check constraints
for the partitions.
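
Roughly like this (a sketch with invented names); the CHECK constraint on
the date column is what lets SQL Server 2000 skip whole member tables of
the partitioned view when a query filters on date:

-- one table per month, constrained to exactly that month
CREATE TABLE SalesData_200511 (
    SaleID   int           NOT NULL,
    SaleDate datetime      NOT NULL,
    Amount   decimal(18,2) NOT NULL,
    CONSTRAINT CK_SalesData_200511
        CHECK (SaleDate >= '20051101' AND SaleDate < '20051201')
)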
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp

Nov 5 '05 #4



Another way to handle this, which is SQL Server specific, is to set a
rowcount of, say, 10,000 and loop, deleting 10,000 rows at a time.

And either back up the log frequently enough or use the SIMPLE recovery
model.
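
For example (an untested sketch; the table name and cutoff date are invented):

-- cap each DELETE at 10,000 rows so every transaction stays small
SET ROWCOUNT 10000

DELETE FROM SalesData WHERE SaleDate < '20050808'

WHILE @@ROWCOUNT > 0
BEGIN
    -- a BACKUP LOG here keeps the log from growing under full recovery
    DELETE FROM SalesData WHERE SaleDate < '20050808'
END

SET ROWCOUNT 0   -- reset so later statements are unaffected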




Nov 6 '05 #5
