
file_exists and thousands of files

I have a site working as follows:

Table "articles"
[id]
[name]
[...more fields]

When somebody wants to see an article's details, I generate (through
PHP) a page where all the details are listed, as well as the article's
views (images).

If the article's id is $id, then I search for its images this way:

for ($i = 1; $i < 11; $i++) {
    $filename = $path . $id . '_' . $i . '.jpg';

    if (file_exists($filename)) {
        // show the image here
    }
} // for
This lets me upload from 1 to 10 views and show them as long as they
are on the server.

Pretty easy, as you can see, but the problem is that there could be (in
fact, there will be) more than 10,000 images in the same directory.
I'm wondering whether there will be some kind of speed problem with the
file_exists() function when it has to search among such a number of files...
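
For what it's worth, here is a rough way I could measure it on my own
server (just a sketch, untested; it assumes $path and $id are set as
above and simply times 10,000 existence checks):

$start = microtime(true);
for ($n = 0; $n < 10000; $n++) {
    // file_exists() caches its results in the stat cache, so reset it
    // between checks to avoid measuring only cache hits
    clearstatcache();
    file_exists($path . $id . '_' . (($n % 10) + 1) . '.jpg');
}
printf("%.4f seconds for 10000 checks\n", microtime(true) - $start);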

Has anyone had experiences/problems with this?

regards - jm

Jun 13 '06 #1


julian_m wrote:
[snip]
Pretty easy, as you can see, but the problem is that there could be (in
fact, there will be) more than 10,000 images in the same directory.
I'm wondering whether there will be some kind of speed problem with the
file_exists() function when it has to search among such a number of files...


Hi,

Good question.
If PHP is scanning through the directory 10 times in a row, you are clearly
wasting precious CPU.
Some OSes can be helpful by indexing their filesystem, which makes your
approach doable.

I think you could do a generally faster job by matching a substring against
the filenames in a given directory, and doing it once, not 10 times.
That is sometimes referred to as 'globbing'.
I know this from Perl, but I just saw that it exists in PHP under the
name... glob. What a surprise. ;-)

So go to www.php.net and look up the glob() function.
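
Something like this should do it (an untested sketch, assuming $path and
$id are set as in your loop):

$views = glob($path . $id . '_*.jpg'); // one directory scan instead of 10 checks

// glob() sorts alphabetically, so 'x_10.jpg' would sort before 'x_2.jpg';
// natsort() restores the natural numeric order
natsort($views);

foreach ($views as $filename) {
    // show the image here
}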

Good luck

Regards,
Erwin Moller
Jun 13 '06 #2

Erwin Moller wrote:
I think you could do a generally faster job by matching a substring against
the filenames in a given directory, and doing it once, not 10 times.
That is sometimes referred to as 'globbing'.
I know this from Perl, but I just saw that it exists in PHP under the
name... glob. What a surprise. ;-)


Somehow I doubt you can do a better job than the OS. Typically a binary
search is used for directory look-up: finding one file amidst a thousand
wouldn't take more than about 10 string comparisons (log2 of 1000), so
for ten look-ups the OS would do no more than 100 comparisons. Unless
you write a binary-search routine in PHP yourself, you'll certainly do
worse. A search with in_array() or array_search() would require on
average 500 comparisons per look-up. With a hash table it's better, but
then constructing it would require generating 1,000 hash keys. And if
you're on Windows you have case-sensitivity to consider...
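
For illustration, the hash-table variant described above might look like
this in PHP (a sketch, not a recommendation):

// read the directory once and turn its ~1,000 names into hash keys,
// then test membership in O(1) with isset()
$names = array_flip(scandir($path));

for ($i = 1; $i < 11; $i++) {
    if (isset($names[$id . '_' . $i . '.jpg'])) {
        // show the image here (scandir() returns bare file names,
        // so prepend $path when outputting the <img> tag)
    }
}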

Jun 13 '06 #3
