Bytes | Software Development & Data Engineering Community

[perl-python] a program to delete duplicate files

here's a large exercise that uses what we built before.

suppose you have tens of thousands of files in various directories.
Some of these files are identical, but you don't know which ones are
identical with which. Write a program that prints out which files are
redundant copies.

Here's the spec.
--------------------------
The program is to be used on the command line. Its arguments are one or
more full paths of directories.

perl del_dup.pl dir1

prints the full paths of all files in dir1 that are duplicates
(including files in sub-directories). More specifically, if file A has
duplicates, A's full path will be printed on a line, immediately
followed by the full paths of all other files that are copies of A. These
duplicates' full paths will be prefixed with the "rm " string. An empty
line follows each group of duplicates.

Here's a sample output.

inPath/a.jpg
rm inPath/b.jpg
rm inPath/3/a.jpg
rm inPath/hh/eu.jpg

inPath/ou.jpg
rm inPath/23/a.jpg
rm inPath/hh33/eu.jpg

Order does not matter (i.e., which file is left without the "rm "
prefix does not matter).

------------------------

perl del_dup.pl dir1 dir2

will do the same as above, except that duplicates within dir1 or dir2
themselves are not considered. That is, all files in dir1 are compared to
all files in dir2 (including subdirectories), and only files in dir2
will have the "rm " prefix.

One way to understand this is to imagine lots of image files in both
dirs. One is certain that there are no duplicates within each dir
itself (imagine that del_dup.pl has already run on each). Files in
dir1 have already been categorized into sub-directories by hand, so
when there are duplicates between dir1 and dir2, one wants the
version in dir2 to be deleted, leaving the organization in dir1 intact.

perl del_dup.pl dir1 dir2 dir3 ...

does the same as above, except files in later dirs will get the "rm "
prefix. So, if there are these identical files:

dir2/a
dir2/b
dir4/c
dir4/d

then c and d will both have the "rm " prefix for sure (which one in dir2
gets "rm " does not matter). Note that although dir2 doesn't compare files
inside itself, duplicates may still be found implicitly by indirect
comparison: i.e. a==c and b==c, therefore a==b, even though a and b are
never compared directly.
--------------------------
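The spec's grouping can be sketched with no pairwise comparisons at all, by keying files on a content hash. This is only one possible approach, not the poster's own solution: the function names are mine, and it uses modern Python (hashlib, os.walk) rather than the 2005-era idioms seen elsewhere in this thread.

```python
import os
import hashlib
from collections import defaultdict

def content_hash(path, bufsize=1 << 16):
    """Hash file contents in chunks so large files fit in memory."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for block in iter(lambda: f.read(bufsize), b''):
            h.update(block)
    return h.hexdigest()

def duplicate_report(dirs):
    """Group every file under the given dirs by content hash.

    Within each group the copy from the earliest dir is kept;
    all later copies are prefixed with 'rm '.
    """
    groups = defaultdict(list)
    for rank, d in enumerate(dirs):
        for root, _, files in os.walk(d):
            for name in files:
                path = os.path.join(root, name)
                groups[content_hash(path)].append((rank, path))
    lines = []
    for members in groups.values():
        if len(members) > 1:
            members.sort()                    # earliest dir wins
            lines.append(members[0][1])
            lines.extend("rm " + p for _, p in members[1:])
            lines.append("")                  # blank line between groups
    return "\n".join(lines)
```

Because equal hashes stand in for equal files, within-directory duplicates are also caught implicitly, exactly as the transitivity note above describes.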

Write a Perl or Python version of the program.

An absolute requirement in this problem is to minimize the number of
comparisons made between files. This is part of the spec.

feel free to write it however you want. I'll post my version in a few
days.

http://www.xahlee.org/perl-python/python.html

Xah
xa*@xahlee.org
http://xahlee.org/PageTwo_dir/more.html

Jul 18 '05 #1
44 4050
On 9 Mar 2005 04:56:13 -0800, rumours say that "Xah Lee" <xa*@xahlee.org> might
have written:
Write a Perl or Python version of the program.

An absolute requirement in this problem is to minimize the number of
comparisons made between files. This is part of the spec.


http://groups-beta.google.com/group/...8e292ec9adb82d

The whole thread is about finding duplicate files.
--
TZOTZIOY, I speak England very best.
"Be strict when sending and tolerant when receiving." (from RFC1958)
I really should keep that in mind when talking with people, actually...
Jul 18 '05 #2
On Wednesday 09 March 2005 06:56 am, Xah Lee wrote:
here's a large exercise that uses what we built before.

suppose you have tens of thousands of files in various directories.
Some of these files are identical, but you don't know which ones are
identical with which. Write a program that prints out which files are
redundant copies.


For anyone interested in responding to the above, a starting
place might be this maintenance script I wrote for my own use. I don't
think it exactly matches the spec, but it addresses the problem. I wrote
this to clean up a large tree of image files once. The exact behavior
described requires the '--exec="ls %s"' option as mentioned in the help.

#!/usr/bin/env python
# (C) 2003 Anansi Spaceworks
#---------------------------------------------------------------------------
# find_duplicates
"""
Utility to find duplicate files in a directory tree by
comparing their checksums.
"""
#---------------------------------------------------------------------------
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#---------------------------------------------------------------------------

import os, sys, md5, getopt

def file_walker(tbl, srcpath, files):
    """
    Visit a path and collect data (including checksum) for files in it.
    """
    for file in files:
        filepath = os.path.join(srcpath, file)
        if os.path.isfile(filepath):
            chksum = md5.new(open(filepath).read()).digest()
            if not tbl.has_key(chksum): tbl[chksum] = []
            tbl[chksum].append(filepath)

def find_duplicates(treeroot, tbl=None):
    """
    Find duplicate files in directory.
    """
    dup = {}
    if tbl is None: tbl = {}
    os.path.walk(treeroot, file_walker, tbl)
    for k, v in tbl.items():
        if len(v) > 1:
            dup[k] = v
    return dup

usage = """
USAGE: find_duplicates <options> [<path> ...]

Find duplicate files (by matching md5 checksums) in a
collection of paths (defaults to the current directory).

Note that the order of the paths searched will be retained
in the resulting duplicate file lists. This can be used
with --exec and --index to automate handling.

Options:
  -h, -H, --help
        Print this help.

  -q, --quiet
        Don't print normal report.

  -x, --exec=<command string>
        Python-formatted command string to act on the indexed
        duplicate in each duplicate group found. E.g. try
        --exec="ls %s"

  -n, --index=<index into duplicates>
        Which in a series of duplicates to use. Begins with '1'.
        Default is '1' (i.e. the first file listed).

Example:
  You've copied many files from path ./A into path ./B. You want
  to delete all the ones you've processed already, but not
  delete anything else:

  % find_duplicates -q --exec="rm %s" --index=1 ./A ./B
"""

def main():
    action = None
    quiet = 0
    index = 1
    dup = {}

    opts, args = getopt.getopt(sys.argv[1:], 'qhHn:x:',
                               ['quiet', 'help', 'exec=', 'index='])

    for opt, val in opts:
        if opt in ('-h', '-H', '--help'):
            print usage
            sys.exit()
        elif opt in ('-x', '--exec'):
            action = str(val)
        elif opt in ('-n', '--index'):
            index = int(val)
        elif opt in ('-q', '--quiet'):
            quiet = 1

    if len(args) == 0:
        dup = find_duplicates('.')
    else:
        tbl = {}
        for arg in args:
            dup = find_duplicates(arg, tbl=tbl)

    for k, v in dup.items():
        if not quiet:
            print "Duplicates:"
            for f in v: print "\t%s" % f
        if action:
            os.system(action % v[index - 1])

if __name__ == '__main__':
    main()

--
Terry Hancock (hancock at anansispaceworks.com)
Anansi Spaceworks http://www.anansispaceworks.com

Jul 18 '05 #3
I wrote something similar, have a look at
http://www.homepages.lu/pu/fdups.html.
Jul 18 '05 #4
On Wed, 9 Mar 2005 16:13:20 -0600, rumours say that Terry Hancock
<ha*****@anansispaceworks.com> might have written:
For anyone interested in responding to the above, a starting
place might be this maintenance script I wrote for my own use. I don't
think it exactly matches the spec, but it addresses the problem. I wrote
this to clean up a large tree of image files once. The exact behavior
described requires the '--exec="ls %s"' option as mentioned in the help.


The drawback of this method is that you have to read everything. For example,
if you have ten files less than 100KiB each and one file more than 2 GiB in
size, there is no need to read the 2 GiB file, is there?

If it's a one-shot attempt, I guess it doesn't matter much.

On POSIX filesystems, one also has to avoid comparing files having the same
(st_dev, st_ino) pair, because you know that they are the same file.
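The size-first idea above, which keeps the hypothetical 2 GiB file from ever being read, could be sketched like this (a hedged illustration in modern Python, not part of any posted script; the helper name is mine):

```python
import os
from collections import defaultdict

def same_size_groups(paths):
    """Bucket files by size; only files sharing a size can be identical,
    so a file with a unique size never needs its contents read."""
    by_size = defaultdict(list)
    for p in paths:
        by_size[os.path.getsize(p)].append(p)
    return [group for group in by_size.values() if len(group) > 1]
```

Only the surviving groups would then be hashed or compared byte by byte.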
Jul 18 '05 #5
I've written a python GUI wrapper around some shell scripts:
http://www.pixelbeat.org/fslint/

the shell script logic is essentially:

exclude hard linked files
only include files where there are more than 1 with the same size
print files with matching md5sum

Pádraig.
Jul 18 '05 #6
On Thu, 10 Mar 2005 10:54:05 +0100, rumours say that Patrick Useldinger
<pu*********@gmail.com> might have written:
I wrote something similar, have a look at
http://www.homepages.lu/pu/fdups.html.


That's fast and good.

A minor nit-pick: `fdups.py -r .` does nothing (at least on Linux).

Have you found any way to test if two files on NTFS are hard linked without
opening them first to get a file handle?
Jul 18 '05 #7
Christos TZOTZIOY Georgiou wrote:
On POSIX filesystems, one also has to avoid comparing files having the same
(st_dev, st_ino) pair, because you know that they are the same file.


I then have a bug here - I consider all files with the same inode equal,
but according to what you say I need to consider the tuple
(st_dev, st_ino). I'll have to fix that for 0.13.

Thanks ;-)
-pu
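The (st_dev, st_ino) check being discussed might look like this in modern Python (a sketch only; both function names are mine, not from fdups):

```python
import os

def file_identity(path):
    """(device, inode) pair; hard links to the same file share this key."""
    st = os.stat(path)
    return (st.st_dev, st.st_ino)

def drop_hard_links(paths):
    """Keep one path per underlying file, skipping hard-linked twins
    so they are never needlessly compared or checksummed."""
    seen = set()
    unique = []
    for p in paths:
        key = file_identity(p)
        if key not in seen:
            seen.add(key)
            unique.append(p)
    return unique
```

This works on POSIX filesystems; as the thread notes, detecting hard links on NTFS without first opening the files is another matter.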
Jul 18 '05 #8
Christos TZOTZIOY Georgiou wrote:
That's fast and good.
Nice to hear.
A minor nit-pick: `fdups.py -r .` does nothing (at least on Linux).
I'll look into that.
Have you found any way to test if two files on NTFS are hard linked without
opening them first to get a file handle?


No. And even then, I wouldn't know how to find out.

-pu
Jul 18 '05 #9
In article <11**********************@l41g2000cwc.googlegroups.com>,
"Xah Lee" <xa*@xahlee.org> wrote:
An absolute requirement in this problem is to minimize the number of
comparisons made between files. This is part of the spec.


You need do no comparisons between files. Just use a sufficiently
strong hash algorithm (SHA-256 maybe?) and compare the hashes.

--
David Eppstein
Computer Science Dept., Univ. of California, Irvine
http://www.ics.uci.edu/~eppstein/
Jul 18 '05 #10
