Bytes | Software Development & Data Engineering Community

DELETING 100 million from a table weekly, SQL Server 2000

Hi All

We have a table in SQL Server 2000 which has about 250 million records,
and it will grow by 100 million every week. At any time the table
should contain just 13 weeks of data: when the 14th week's data needs
to be loaded, the first week's data has to be deleted.

That means deleting 100 million rows every week, and since the delete
consumes a lot of transaction log space, the job is not completing.

Can you please suggest approaches we can take to fix this problem?

Performance and transaction log growth are the issues we are facing. We
tried deleting in steps too, but that also takes too long. What are the
different ways we can address this quickly?

Please reply at the earliest.

Thanks
Harish

Nov 5 '05 #1
Hi Harish,

You should look at partitioning: keep the partitions on a cycle, and
simply CREATE TABLE for the incoming week and DROP TABLE for the oldest
one. That way you avoid almost all of the logging.
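A minimal sketch of that rotation, with illustrative table and column names (the original post does not give the schema):

```sql
-- One member table per week; the CHECK constraint lets SQL Server 2000
-- treat the UNION ALL view below as a partitioned view (names are
-- hypothetical, not from the original post).
CREATE TABLE dbo.Sales_Week14 (
    WeekNo   int      NOT NULL CHECK (WeekNo = 14),
    SaleDate datetime NOT NULL,
    Amount   money    NOT NULL
)

-- A view presents the 13 live weeks as one logical table:
-- CREATE VIEW dbo.Sales AS
--     SELECT * FROM dbo.Sales_Week02
--     UNION ALL ...
--     UNION ALL SELECT * FROM dbo.Sales_Week14

-- Retiring the oldest week is then a quick metadata operation,
-- not 100 million individually logged row deletes:
DROP TABLE dbo.Sales_Week01
```

After the DROP, the view has to be recreated without the retired member table, which is easy to script as part of the weekly load.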

Tony

--
Tony Rogerson
SQL Server MVP
http://sqlserverfaq.com - free video tutorials

Nov 5 '05 #2

In this special case I would think about using a table per week. There
is no faster way than DROP/CREATE, or maybe TRUNCATE. You will have to
change a lot in the way you work with this data, but you have UNION,
and maybe you can use views.
Or you buy a big solid-state disk for your database :-))
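To illustrate why TRUNCATE is so much cheaper than DELETE here (the table name is made up for the example): TRUNCATE only logs page deallocations, while DELETE logs every row.

```sql
-- Minimally logged: deallocates whole pages, so it completes quickly
-- even for a table holding 100 million rows.
TRUNCATE TABLE dbo.Sales_Week01

-- Fully logged: one log record per deleted row, which is exactly
-- what is filling the transaction log in the original problem.
-- DELETE FROM dbo.Sales_Week01
```

Note that TRUNCATE always removes every row (it takes no WHERE clause) and requires that no foreign keys reference the table, which is another reason the one-table-per-period layout fits it well.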

bye,
Helmut
Nov 5 '05 #3

Since one table per week becomes quite a job to manage, I would go for
one table per month, and then truncate once per month.

If that is still too much data per table, I would then try a new table
every tenth day. Fixed ten-day boundaries make it a lot easier to set
up the check constraints for the partitions.
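A sketch of what those check constraints might look like with fixed ten-day boundaries (dates and names are illustrative):

```sql
-- Each member table owns exactly one ten-day range. Because every
-- range is the same length, the boundaries are trivial to generate,
-- unlike calendar months of varying length.
CREATE TABLE dbo.Sales_20051101 (
    SaleDate datetime NOT NULL
        CHECK (SaleDate >= '20051101' AND SaleDate < '20051111'),
    Amount   money    NOT NULL
)
```

With such constraints in place, a query against the UNION ALL view that filters on SaleDate only touches the member tables whose ranges qualify.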
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp

Nov 5 '05 #4

Another way to handle this, which is SQL Server specific, is to set a
rowcount of, say, 10,000 and loop, deleting 10,000 rows at a time.

And either back up the log frequently enough, or use the SIMPLE
recovery model.
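A sketch of that loop, using the SQL Server 2000 SET ROWCOUNT idiom (table, column, and cutoff date are illustrative):

```sql
-- Cap every subsequent statement at 10,000 affected rows.
SET ROWCOUNT 10000

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.Sales
    WHERE SaleDate < '20050801'   -- the 13-week cutoff

    IF @@ROWCOUNT = 0 BREAK       -- nothing left to delete

    -- Under the FULL recovery model, back up the log between batches
    -- so the space used by committed batches can be reused:
    -- BACKUP LOG MyDatabase TO DISK = 'D:\backup\MyDatabase_log.bak'
END

-- Important: reset the cap, or every later statement in this
-- session stays limited to 10,000 rows.
SET ROWCOUNT 0
```

Each DELETE runs as its own transaction, so the log only has to hold 10,000 rows' worth of undo at a time instead of 100 million.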




Nov 6 '05 #5

