How do I get rid of duplicate records?

mysql> select id, student_first_name, student_last_name, email,
    application_date, modification_date, unique_key
    from student where id in (7268, 862);
+------+--------------------+-------------------+---------------+---------------------+---------------------+------------------+
| id   | student_first_name | student_last_name | email         | application_date    | modification_date   | unique_key       |
+------+--------------------+-------------------+---------------+---------------------+---------------------+------------------+
|  862 | Phil               | Powell            | ph**@blah.com | 2006-02-27 00:00:00 | 2006-02-27 00:00:00 | dF0WByrCP0vACftA |
| 7268 | Phil               | Powell            | ph**@blah.com | 2006-02-27 00:00:00 | 2006-02-27 00:00:00 | dF0WByrCP0vACftA |
+------+--------------------+-------------------+---------------+---------------------+---------------------+------------------+
I accidentally created duplicate records while migrating data from one
server to another. There are 2, 3 or more records with every field
identical except the ID. The easiest way to spot the dups is by
"application_date" together with "unique_key".
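For reference, a query along these lines lists the duplicate groups (it only assumes, as described above, that rows sharing application_date and unique_key are dups):

-- count how many copies exist in each duplicate group
SELECT unique_key, application_date, COUNT(*) AS copies
FROM student
GROUP BY unique_key, application_date
HAVING COUNT(*) > 1;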

How do I get rid of the dups?

Thanx
Phil

Mar 2 '06 #1
ph**************@gmail.com wrote:
[quoted original post snipped]

This is going to depend on which values you want to keep.
mysql> select * from testa;
+------+------+------+
| a    | c    | d    |
+------+------+------+
|    1 |    2 |    3 |
|    2 |    2 |    3 |
|    3 |    3 |    4 |
|    4 |    3 |    4 |
+------+------+------+
4 rows in set (0.01 sec)

mysql> select c,d,min(a) e,count(*) from testa group by c,d having count(*) > 1;
+------+------+--------+----------+
| c    | d    | e      | count(*) |
+------+------+--------+----------+
|    2 |    3 |      1 |        2 |
|    3 |    4 |      3 |        2 |
+------+------+--------+----------+
2 rows in set (0.01 sec)

mysql> select c,d,max(a) e from testa group by c,d having count(*) > 1;
+------+------+------+
| c    | d    | e    |
+------+------+------+
|    2 |    3 |    2 |
|    3 |    4 |    4 |
+------+------+------+
2 rows in set (0.00 sec)
Test the statement below first by changing the "DELETE FROM" to "SELECT * FROM".

DELETE FROM some_table WHERE primaryKey NOT IN
(SELECT MIN(primaryKey) FROM some_table GROUP BY some_column)
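One caveat, which depends on the MySQL version: MySQL normally rejects a DELETE whose subquery reads from the table being deleted from (error 1093, "You can't specify target table ... for update in FROM clause"). Wrapping the subquery in a derived table is a common workaround; a sketch against the student table from the original post (assuming id is the primary key and the remaining columns define a duplicate) would be:

-- Keep the lowest id in each duplicate group and delete the rest.
-- The extra derived table (keep_rows) sidesteps MySQL error 1093.
DELETE FROM student
WHERE id NOT IN (
    SELECT keep_id FROM (
        SELECT MIN(id) AS keep_id
        FROM student
        GROUP BY student_first_name, student_last_name, email,
                 application_date, modification_date, unique_key
    ) AS keep_rows
);

To keep the newest copy in each group instead, swap MIN(id) for MAX(id).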

Mar 3 '06 #2
noone wrote:
[quoted original post and reply snipped]


What I wound up doing very late last night (love working at midnight..
*sigh*).. was

CREATE TABLE temp_student SELECT min(id) AS id, student_first_name,
student_last_name, email, application_date, modification_date,
unique_key from student

DELETE s.* FROM student s, temp_student t WHERE s.id != t.id AND
s.student_first_name = t.student_first_name ...

It worked, but it crashed both MySQL and Apache while the statements ran :(
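For anyone repeating this, a fuller sketch of the same temp-table idea (hedged: it adds the GROUP BY that the statement above leaves implicit and spells out the join that the "..." elides, reusing the column names from the original post):

-- Sketch only: assumes rows are duplicates when every column except id matches,
-- and keeps the row with the lowest id in each duplicate group.
CREATE TABLE temp_student
SELECT MIN(id) AS id, student_first_name, student_last_name, email,
       application_date, modification_date, unique_key
FROM student
GROUP BY student_first_name, student_last_name, email,
         application_date, modification_date, unique_key;

-- Remove every student row whose id was not kept above.
DELETE s.* FROM student s
LEFT JOIN temp_student t ON s.id = t.id
WHERE t.id IS NULL;

DROP TABLE temp_student;

Joining on id alone, rather than comparing every column between the two tables, keeps the delete cheap and may avoid the kind of load that caused the crash.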

Phil

Mar 3 '06 #3

This thread has been closed and replies have been disabled. Please start a new discussion.
