
Delete Duplicate Records from a large table

Hi All,

I want to delete duplicate records from a large table.

There is one index (INDEX_U1) on 4 columns (col1, col2, col3, col4) which is in an unusable state.
When I first tried to rebuild the index, it failed with a unique key violation error.

So I want to delete the duplicate records for the col1, col2, col3, col4 combination.
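For reference, a query along these lines (my_table stands in for the real table name here) should list the duplicate combinations:

SELECT col1, col2, col3, col4, COUNT(*) AS cnt
FROM my_table
GROUP BY col1, col2, col3, col4
HAVING COUNT(*) > 1;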

How can I delete the duplicate records from this large table?

Please suggest an effective way of deleting the duplicate records so that I can rebuild the index.

Thanks
Dilip
Sep 1 '08 #1
debasisdas
Please check some of the articles in the How-To section.
Sep 2 '08 #2
Thanks debasisdas

I wanted to delete the records from that table without dropping the index or truncating the table.

Is there any way to do this?

Thanks
Dilip
Sep 2 '08 #3
amitpatel66
What did you try so far?
Sep 3 '08 #4
I tried using ALTER SESSION SET SKIP_UNUSABLE_INDEXES = TRUE.

Still, I could not delete the records.
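For reference, this is roughly what I ran (at session level):

ALTER SESSION SET SKIP_UNUSABLE_INDEXES = TRUE;

-- DML on the table still failed with ORA-01502; as far as I can tell,
-- this setting only skips unusable NON-unique indexes, not an unusable
-- unique index like mine.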

Thanks
Dilip
Sep 3 '08 #5
amitpatel66
I tried using ALTER SESSION SET SKIP_UNUSABLE_INDEXES = TRUE.

Still, I could not delete the records.

Thanks
Dilip
You will need to write a DELETE query to remove the duplicates from the table, for example along the lines sketched below.
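A rough sketch of one way to do it, assuming your table is called my_table (a placeholder) and using ROW_NUMBER() to keep one row per col1, col2, col3, col4 combination. Untested here, so adjust it for your schema:

-- keeps the first row (by rowid) of each combination and deletes the rest
DELETE FROM my_table
WHERE rowid IN (
    SELECT rid
    FROM (
        SELECT rowid AS rid,
               ROW_NUMBER() OVER (
                   PARTITION BY col1, col2, col3, col4
                   ORDER BY rowid
               ) AS rn
        FROM my_table
    )
    WHERE rn > 1
);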
Sep 4 '08 #6
I have already tried writing a query to delete the duplicate records.

Query:
DELETE FROM my_table
WHERE rowid NOT IN (
    SELECT MIN(rowid)
    FROM my_table
    GROUP BY col1, col2, col3, col4
);

Since I am not able to delete even a single record from the table while the index is in an unusable state, the above query also gave the same error.

Finally, I had to drop the index, delete the duplicate records, and then recreate the index.
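This is roughly what I ended up running (index and table names are placeholders, and the CREATE statement assumes the original index was a plain unique index with no special storage options):

DROP INDEX index_u1;

-- ran the DELETE shown above, then:
COMMIT;

CREATE UNIQUE INDEX index_u1 ON my_table (col1, col2, col3, col4);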

But I wanted to do the same without dropping the index.

Is there any way to do so?

Thanks & Regards
Dilip
Sep 4 '08 #7
