Deleting duplicate entries from MySQL database table

This question relates to a MySQL database.

Suppose a table "address" contains the following records

-------------------------------------------------------
| name | address | phone |
-------------------------------------------------------
| mr x | 8th lane | 124364 |
| mr x | 6th lane | 435783 |
| mrs x | 6th lane | 435783 |
| mr x | 8th lane | 124364 |
-------------------------------------------------------
Execute a single query (MySQL version: no restriction), with a sub-query or some other method.
After executing the query, the table should contain:
-------------------------------------------------------
| name | address | phone |
-------------------------------------------------------
| mr x | 8th lane | 124364 |
| mr x | 6th lane | 435783 |
| mrs x | 6th lane | 435783 |
-------------------------------------------------------
Please suggest how to do this.
Jul 7 '06 #1
*** Manish wrote (7 Jul 2006 04:11:54 -0700):
>>Please suggest how to do this.
If you don't ask a question, I don't really know what kind of suggestions
you are expecting. Anyway, the best way to avoid duplicate information is
a good design: normalise the database and use primary keys and unique
indexes so that dupes cannot be physically inserted.
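A minimal sketch of that preventive approach, using the table from the
original question (the column types and the index name are assumptions,
not something given in the thread):

    CREATE TABLE address (
        name    VARCHAR(50),
        address VARCHAR(100),
        phone   VARCHAR(20),
        UNIQUE KEY uq_name_address_phone (name, address, phone)
    );

    -- Inserting the same row twice now fails with a duplicate-key
    -- error (1062) instead of silently creating a dupe.
    INSERT INTO address VALUES ('mr x', '8th lane', '124364');
    INSERT INTO address VALUES ('mr x', '8th lane', '124364');  -- rejected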
--
-+ http://alvaro.es - Álvaro G. Vicario - Burgos, Spain
++ My site about web programming: http://bits.demogracia.com
+- My humor site with UVA rays: http://www.demogracia.com
--
Jul 7 '06 #2
I think he wants to get a distinct set of all the addresses.

Use the GROUP BY clause or the DISTINCT modifier.
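For example, either of these reads back the deduplicated rows from the
table in the original question (a sketch only; it selects the distinct
rows but deletes nothing):

    SELECT DISTINCT name, address, phone FROM address;

    SELECT name, address, phone
    FROM address
    GROUP BY name, address, phone;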

Alvaro G. Vicario wrote:
[quoted text snipped]
Jul 7 '06 #3
Sorry for not formulating the question properly.

The scenario is as follows:
1. A table is created.
2. It has no primary key, unique key, or validation process.
3. Various duplicate entries got inserted into the table (all fields
contain the same data).

Maybe it is a programming logic bug.

In the edit process, nothing is edited (all data are the same), but
instead of updating the record, a new row gets inserted.

4. Now it is desired that all the duplicate entries get deleted and
only one record (of each duplicate set) remains in the table.

As shown in the example below.

Suppose a table "address" contains the following records

-------------------------------------------------------
| name | address | phone |
-------------------------------------------------------
| mr x | 8th lane | 124364 |
| mr x | 6th lane | 435783 |
| mrs x | 6th lane | 435783 |
| mr x | 8th lane | 124364 |
-------------------------------------------------------
Execute a single query (MySQL version: no restriction), with a sub-query or some other method.
After executing the query:
-------------------------------------------------------
| name | address | phone |
-------------------------------------------------------
| mr x | 8th lane | 124364 |
| mr x | 6th lane | 435783 |
| mrs x | 6th lane | 435783 |
-------------------------------------------------------

Here, instead of deleting both copies of the duplicate entry

| mr x | 8th lane | 124364 |

only one of the two gets deleted, so no duplicate entries remain in the
table.

I hope this clarifies my question.

Thanks.

Manish
Alvaro G. Vicario wrote:
[quoted text snipped]
Jul 9 '06 #4
Manish wrote:
[quoted text snipped]

Manish,

Since your question is related to a MySQL database, might I suggest you
ask in a MySQL newsgroup - like comp.databases.mysql?

That's where the MySQL gurus hang out.

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================
Jul 9 '06 #5
Manish wrote:
[quoted text snipped]
Use DISTINCT and create a new table:

INSERT INTO nodupes SELECT DISTINCT * FROM dupes;

The DISTINCT modifier will make sure all the rows of the result set are
different.

Note: this is under the assumption that all fields of the duplicate
rows are the same.
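Spelled out as a complete table swap, under that same assumption (a
sketch; CREATE TABLE ... LIKE needs MySQL 4.1 or later):

    CREATE TABLE nodupes LIKE dupes;
    INSERT INTO nodupes SELECT DISTINCT * FROM dupes;
    DROP TABLE dupes;                 -- caution: irreversible
    RENAME TABLE nodupes TO dupes;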

Another way is to create a second table and set the primary key to
whatever the unique part should be. Then do

INSERT IGNORE INTO nodupes SELECT * FROM dupes;

The IGNORE modifier will ignore any duplicate-key errors.
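A hedged sketch of that second approach, using the columns from the
original question (the column types are assumptions):

    CREATE TABLE nodupes (
        name    VARCHAR(50),
        address VARCHAR(100),
        phone   VARCHAR(20),
        PRIMARY KEY (name, address, phone)
    );

    -- Rows that would violate the primary key are silently skipped,
    -- so exactly one copy of each duplicate survives.
    INSERT IGNORE INTO nodupes SELECT * FROM dupes;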
Alvaro G. Vicario wrote:
[quoted text snipped]
Jul 9 '06 #6
