On Tue, 11 Apr 2006 19:07:19 +1200, rumours say that Lawrence D'Oliveiro
<ld*@geek-central.gen.new_zealand> might have written:
In article <11********************@j33g2000cwa.googlegroups.com>,
s9************@yahoo.com wrote:
I need to go into a directory to grab some files and do some
processing.
The thing is, I need to wait until the process that generates the files
in that directory has finished before I can grab them. E.g. if file A is
being generated and has not finished, my Python script should not go
into the directory.
How can I check that file A has actually finished?
I wrote a similar system that watches for new files arriving in an
"uploads" directory, whether copied there via FTP or using a GUI desktop
script. My heuristic was to process only those files whose last-modified
date/time was at least 5 minutes in the past. The assumption was that a
file still being written to is unlikely to go 5 minutes between
successive writes.
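The heuristic above can be sketched as follows. This is a minimal illustration, not the poster's actual code; the directory name and the 5-minute threshold are taken from the description, everything else (function and variable names) is assumed for the example.

```python
import os
import time

QUIET_PERIOD = 5 * 60  # seconds; the 5-minute threshold from the post

def settled_files(directory, quiet_period=QUIET_PERIOD):
    """Return paths of files in `directory` whose last-modified time
    is at least `quiet_period` seconds in the past."""
    now = time.time()
    ready = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        # Skip subdirectories and anything modified too recently.
        if os.path.isfile(path) and now - os.path.getmtime(path) >= quiet_period:
            ready.append(path)
    return ready
```

A caller would poll this periodically (e.g. from a cron job or a sleep loop) and process only the returned paths.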
This works (though long network timeouts during a download can defeat it),
but another idea is to wait for a zero-byte sentinel file that the producer
creates last, after the transfer (or the generating process, in general)
has finished. This method is the one that has worked best for me.
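The sentinel approach might look like the sketch below on the consumer side. The marker name `done.flag`, the polling interval, and the timeout are all illustrative assumptions; the only requirement is that the producer creates the marker as its very last step.

```python
import os
import time

def wait_for_sentinel(directory, sentinel="done.flag",
                      poll_interval=1.0, timeout=60.0):
    """Poll until the sentinel file appears in `directory`.
    Return True if it appeared within `timeout` seconds, else False."""
    marker = os.path.join(directory, sentinel)
    deadline = time.time() + timeout
    while time.time() < deadline:
        if os.path.exists(marker):
            return True
        time.sleep(poll_interval)
    return False
```

Once this returns True, every data file in the directory is known to be complete, so no per-file mtime checking is needed.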
--
TZOTZIOY, I speak England very best.
"Dear Paul,
please stop spamming us."
The Corinthians