Bytes | Software Development & Data Engineering Community

Defragment Heap Tables

Hi gurus,

I've been at my new company for only a month and have started analysing index fragmentation.

After I ran DBCC DBREINDEX and captured the data into a permanent table, I've seen lots of tables with no indexes. These tables showed:

Very low scan density
High extent fragmentation
High Avg. Bytes Free per Page
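[Editor's note: figures like these are what DBCC SHOWCONTIG reports. A minimal sketch of capturing them for every table, assuming SQL Server 2000 syntax:]

```sql
-- Report fragmentation statistics for all tables and indexes in the
-- current database as a result set (rather than plain text), which can
-- then be saved into a permanent table for trend analysis.
-- On SQL Server 2005+, sys.dm_db_index_physical_stats replaces this.
DBCC SHOWCONTIG WITH ALL_INDEXES, TABLERESULTS
```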

What are the best strategies to defragment tables with no indexes?

I'm planning to make a rule that each table must have a clustered index, and that this index must be created on the best column (highest selectivity).

Please help.

Thanks,
Silaphet

Oct 13 '05 #1
km********@yahoo.com (sm********@bremer.com) writes:
I've been at my new company for only a month and have started analysing index fragmentation.

After I ran DBCC DBREINDEX and captured the data into a permanent table, I've seen lots of tables with no indexes. These tables showed:

Very low scan density
High extent fragmentation
High Avg. Bytes Free per Page

What are the best strategies to defragment tables with no indexes?
Create a clustered index on them. If the index is absolutely undesired,
drop it once you're done.
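[Editor's note: a sketch of that create-then-drop approach; the table, column, and index names are placeholders:]

```sql
-- Building a clustered index rewrites the heap's rows in key order,
-- compacting pages and removing extent fragmentation.
CREATE CLUSTERED INDEX IX_MyHeap_Defrag ON dbo.MyHeap (SomeColumn)

-- If the table really should stay a heap, drop the index afterwards.
-- The pages stay compacted for now, but the heap will fragment again
-- over time as rows are inserted and deleted.
DROP INDEX dbo.MyHeap.IX_MyHeap_Defrag
```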
I'm planning to make a rule that each table must have a clustered index
and this index must be created on the best column (highest
selectivity).


Yes, that is a good rule. Heap tables with deletions can easily become
very fragmented.

As for which column to cluster on, you may want to find a column
that grows monotonically, if all you want to do is avoid fragmentation.
Of course, adding indexes to improve queries is a good idea too!
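[Editor's note: a hypothetical example of clustering on a monotonically growing column; an IDENTITY column is one common choice:]

```sql
-- New rows always get the next IDENTITY value, so with the clustered
-- index on OrderID they are appended at the end of the table. This
-- avoids page splits in the middle of the index and keeps
-- fragmentation low.
CREATE TABLE dbo.Orders (
    OrderID    int IDENTITY(1,1) NOT NULL,
    OrderDate  datetime NOT NULL,
    CustomerID int NOT NULL
)

CREATE CLUSTERED INDEX IX_Orders_OrderID ON dbo.Orders (OrderID)
```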
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp

Oct 13 '05 #2


