Ami
System performance is going to be incredibly poor with 40K files in the same
directory; just opening a file will take forever. You should probably
be using a message queue here. However, assuming you are stuck with it ...
I agree with Harry that FileSystemWatcher is unreliable, and it doesn't
account for the files that were present when the program started up.
Try:
string path = @"C:\MyFiles";
string pattern = "*.xml";
string[] files = Directory.GetFiles(path, pattern);
int numberOfFiles = files.Length;
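If it helps, one way to use that scan from the service is a simple polling
loop - a sketch only, where the path, pattern, interval, and scan count are
placeholders, not values from your setup:

```csharp
using System;
using System.IO;
using System.Threading;

class QueueMonitor
{
    // One scan: count the matching files. The array is discarded as soon
    // as the method returns, so nothing large stays in memory between scans.
    public static int ScanQueue(string dir, string pattern)
    {
        return Directory.GetFiles(dir, pattern).Length;
    }

    // Run this from the service's worker thread. A periodic rescan also
    // picks up files that were already present when the service started,
    // which FileSystemWatcher alone won't. Bounded by maxScans here for
    // demonstration; a real service would loop until it is stopped.
    public static void Poll(string dir, string pattern, int intervalMs, int maxScans)
    {
        for (int i = 0; i < maxScans; i++)
        {
            Console.WriteLine("{0} files waiting", ScanQueue(dir, pattern));
            Thread.Sleep(intervalMs);
        }
    }
}
```

For example, Poll(@"C:\MyFiles", "*.xml", 5000, ...) rescans every five
seconds; tune the interval to how quickly files arrive.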
Another problem you may run into is that you'll occasionally try to process
files that are still being written. The safe way out of this is to insist
that the file writer write each file under a temporary extension - .tmp,
say - and then rename it to the correct name after closing it. Otherwise
you'll have to develop a strategy for dealing with files you can't open
because they are still being written.
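For what it's worth, the writer side of that rename trick might look
something like this - a sketch with made-up names; adapt the final
extension to whatever pattern your reader actually matches:

```csharp
using System.IO;

class SafeWriter
{
    // Write the file under a .tmp extension, then rename it to its final
    // name. The reader's search pattern (e.g. "*.abc") never matches the
    // .tmp file, so it can never pick up a half-written one.
    public static void WriteAtomically(string dir, string finalName, string contents)
    {
        string tmpPath = Path.Combine(dir, finalName + ".tmp");
        string finalPath = Path.Combine(dir, finalName);

        using (StreamWriter w = new StreamWriter(tmpPath))
        {
            w.Write(contents);
        }

        // The rename happens only after the writer is closed, so the file
        // appears under its final name already fully written.
        File.Move(tmpPath, finalPath);
    }
}
```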
Nick
"Amy L." <am**@paxemail.com> wrote in message
news:%2****************@TK2MSFTNGP11.phx.gbl...
I am working on some code that will be used in a Windows Service that will
monitor specific files in a queue.
I would like to get an integer value of the number of specific files in a
directory. Now I have been using this code to get what I need:
private DirectoryInfo di ;
di = new DirectoryInfo( @"c:\temp" ) ;
int files_in_directory_value = di.GetFileSystemInfos( "*.abc" ).Length ;
Now the issue I believe I will face is that the directory I will be
checking can at times have 40K+ files that match the search string.
Seeing that GetFileSystemInfos returns an array of all the files I am
searching on, I am concerned about the potential memory usage of this
structure and a general decrease in performance. I am sure everyone would
agree that just getting the integer value would be better.
Any thoughts on other framework functions to get this info?
Amy