Bytes IT Community

speeding up reading opendir...

Hi!

I have a system which reads in the entire tree - all files. As of
now, there are 1312 folders to read.

The first time it takes 53-60 seconds to read. The data is somehow
cached, so the second time it takes 2-3 seconds.

Is there a way to "cache" the data beforehand? Like "preparing" the
directory...?

WBR
Sonnich
Jan 9 '08 #1
3 Replies


jodleren wrote:
Hi!

I have a system which reads in the entire tree - all files. As of
now, there are 1312 folders to read.

The first time it takes 53-60 seconds to read. The data is somehow
cached, so the second time it takes 2-3 seconds.

Is there a way to "cache" the data beforehand? Like "preparing" the
directory...?
Before what? Before you read them?

Bit of a logical impasse there ;-)

If it's *nix, you might execute a cron script every few minutes that
reads the whole directory structure, which will bring it into the disk
file cache.

Of course, under heavy I/O load that cached data may get flushed again.
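A minimal sketch of such a cache-warming job, assuming the tree lives at a hypothetical path like /var/www/data:

```shell
#!/bin/sh
# warm-cache.sh - walk the directory tree so its metadata lands in the
# OS file cache before the application scans it.
# The default path is hypothetical; pass the real tree as $1.
DIR="${1:-/var/www/data}"

# Stat every file; the listing itself is discarded.
find "$DIR" -type f -print > /dev/null 2>&1
```

Run it from cron, e.g. `*/5 * * * * /usr/local/bin/warm-cache.sh /var/www/data`, so the cache stays warm between reads.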

Jan 9 '08 #2

On Jan 9, 9:34 am, jodleren <sonn...@hot.ee> wrote:
I have a system which reads in the entire tree - all files. As of
now, there are 1312 folders to read.

The first time it takes 53-60 seconds to read. The data is somehow
cached, so the second time it takes 2-3 seconds.

Is there a way to "cache" the data beforehand? Like "preparing" the
directory...?

WBR
Sonnich
You might simply direct the output of the dir
command into a file (or a string or array, depending
on which exec-type command you use) and then
parse that yourself. It should be FAR faster.

For reference,
it took my Win XP system about 315 seconds to do:
C:\>dir /s > delme.dir
on the entire C: drive - about 140,000 files
totaling about 44 gigabytes in 32,000 directories.
The resulting file was about 10 megabytes.
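A sketch of that approach on a *nix system (both paths are hypothetical; on Windows the equivalent would be a `dir /s` listing redirected to a file, run via exec()):

```shell
#!/bin/sh
# Capture one recursive listing up front, then let the application
# parse the file instead of walking the tree entry by entry.
# Both default paths are hypothetical; pass real ones as $1 and $2.
TREE="${1:-/var/www/data}"
LIST="${2:-/tmp/listing.txt}"

# One pass over the tree, written to a flat file.
find "$TREE" -type f > "$LIST"

# The application then reads the listing, e.g. counting entries:
wc -l < "$LIST"
```

Parsing a flat listing avoids a separate opendir/readdir round trip per folder, which is where the first-pass cost goes.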

Csaba Gabor from Vienna
Jan 9 '08 #3

Rob
On Jan 9, 12:56 pm, Csaba Gabor <dans...@gmail.com> wrote:
On Jan 9, 9:34 am, jodleren <sonn...@hot.ee> wrote:
I have a system which reads in the entire tree - all files. As of
now, there are 1312 folders to read.
The first time it takes 53-60 seconds to read. The data is somehow
cached, so the second time it takes 2-3 seconds.
Is there a way to "cache" the data beforehand? Like "preparing" the
directory...?
WBR
Sonnich

You might simply direct the output of the dir
command into a file (or a string or array, depending
on which exec-type command you use) and then
parse that yourself. It should be FAR faster.

For reference,
it took my Win XP system about 315 seconds to do:
C:\>dir /s > delme.dir
on the entire C: drive - about 140,000 files
totaling about 44 gigabytes in 32,000 directories.
The resulting file was about 10 megabytes.

Csaba Gabor from Vienna
If you're running this on Windows, you're probably seeing the Windows
cache coming into play, which is why it runs faster the second time.

As previously suggested, try using exec() to output a directory
listing to a file, and then parse that instead.
Rob.
Jan 9 '08 #4

This discussion thread is closed; replies have been disabled.