kj7ny wrote:
I am attempting to incrementally back up files using python. How can I
compare a file on a hard drive to a file in a python created zip file
to tell if the file has changed? Or should I be using some other
method to determine if a file has changed?
You can easily determine size and date from the hard drive/filesystem
and from zip files. Look at the Python docs for the os and zipfile
modules. That would be the fastest way to detect changes for most files
you will encounter.
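As a rough sketch of that approach (the helper name `looks_unchanged` is my own, and the 2-second tolerance accounts for the zip format storing timestamps at 2-second resolution):

```python
import os
import zipfile
from datetime import datetime

def looks_unchanged(path, archive):
    """Compare size and modification time of a file on disk against
    its entry in a zip archive. A False result means 'probably changed';
    a True result means size and mtime still match."""
    with zipfile.ZipFile(archive) as zf:
        info = zf.getinfo(path)       # raises KeyError if not archived yet
    st = os.stat(path)
    # ZipInfo.date_time is a (year, month, day, hour, minute, second)
    # tuple in local time, with 2-second resolution.
    zip_mtime = datetime(*info.date_time).timestamp()
    same_size = st.st_size == info.file_size
    same_time = abs(st.st_mtime - zip_mtime) < 2
    return same_size and same_time
```

This catches the common cases cheaply; a file edited in place without changing size or timestamp would slip through, which is where the checksum ideas below come in.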
Zip files have the CRC of each file (generated from the original
uncompressed file) for checksumming purposes. You can calculate the CRC
for the file on the hard drive and then compare the CRC values.
Calculating the CRC should be faster than decompressing the ZIPped file
or compressing the file on the hard drive. While not perfect it should
definitely give you a very good indication of changed content. Python's
standard library already includes a CRC-32 function, zlib.crc32 (also
available as binascii.crc32), so no third-party modules are needed.
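The standard library's zlib.crc32 can compute the checksum of the on-disk file for comparison against the CRC stored in the archive's central directory (ZipInfo.CRC). A minimal sketch, with a helper name of my own choosing:

```python
import zlib
import zipfile

def crc_matches(path, archive, arcname=None):
    """Compare the CRC-32 recorded in the zip's directory with a
    CRC-32 computed over the file currently on disk. No
    decompression is performed."""
    crc = 0
    with open(path, "rb") as f:
        # Feed the file through zlib.crc32 in chunks to keep
        # memory use constant; the running value is passed back in.
        for chunk in iter(lambda: f.read(65536), b""):
            crc = zlib.crc32(chunk, crc)
    with zipfile.ZipFile(archive) as zf:
        # zlib.crc32 returns an unsigned 32-bit value in Python 3,
        # matching ZipInfo.CRC directly.
        return zf.getinfo(arcname or path).CRC == crc
```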
A better plan would be to store a list of MD5 or SHA1 hashes for each
file in your archive. These functions are built into Python (in the
hashlib module) and, being cryptographic hash functions rather than
simple checksums like CRC, are far less likely to produce collisions,
so they detect changes more reliably. You can store the list of hashes
in a separate file, or as a file in the ZIP file, or even in the ZIP
file header.
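One way to sketch that plan, storing the hashes as an extra member of the zip itself (the manifest filename `MANIFEST.json` and the helper names are arbitrary choices for this example):

```python
import hashlib
import json
import zipfile

def file_sha1(path):
    """SHA-1 digest of a file, read in chunks (hashlib is built in)."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def write_manifest(archive, paths):
    """Append a {path: sha1} manifest to the archive as a JSON member."""
    manifest = {p: file_sha1(p) for p in paths}
    with zipfile.ZipFile(archive, "a") as zf:
        zf.writestr("MANIFEST.json", json.dumps(manifest))
    return manifest

def changed_files(archive):
    """Return the paths whose current on-disk hash differs from the
    hash recorded in the archive's manifest."""
    with zipfile.ZipFile(archive) as zf:
        manifest = json.loads(zf.read("MANIFEST.json"))
    return [p for p, digest in manifest.items() if file_sha1(p) != digest]
```

On the next backup run, `changed_files` gives exactly the set of files worth re-archiving.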
Your last resort would be to decompress each file and compare it with
the one on disk. This will take a lot of time and memory. Not at all
advisable but definitely foolproof.
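For completeness, the brute-force comparison can at least be streamed so neither file has to sit in memory at once (helper name is my own):

```python
import zipfile

def identical_content(path, archive, arcname=None, chunk=65536):
    """Byte-for-byte comparison of the archived copy against the file
    on disk, decompressing the zip member on the fly in fixed-size
    chunks rather than loading either file whole."""
    with zipfile.ZipFile(archive) as zf:
        with zf.open(arcname or path) as zipped, open(path, "rb") as disk:
            while True:
                a = zipped.read(chunk)
                b = disk.read(chunk)
                if a != b:    # differing bytes, or one stream ended early
                    return False
                if not a:     # both streams exhausted together: identical
                    return True
```

Slow, as noted above, but it leaves no room for checksum collisions.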
There are other issues you will have to contend with, such as renaming
of files and directories, and deletions. If I were you, I'd take a look
at the work done on programs like rsync and unison (both file
synchronization tools), or CVS and Subversion (both version control
systems), to see how they deal with such issues.