lk******@geocities.com wrote in news:1107388121.254255.123800@z14g2000cwz.googlegroups.com:
> I've written some template code, and one thing I'm trying to protect
> against is references to images that don't exist. Because users have
> the ability to muck around with the templates after everything's been
> set up, there is a chance they'll delete an image or ruin some tag
> after the web designer has set everything up perfectly. I want the
> software to catch mistakes like that and, at the very least, not show
> broken links. On the control panel, sometimes as many as 100 thumbnails
> are run for each page. I'm wondering: does running file_exists() on
> all of those slow things down at all?
After some local testing, I believe the answer is no. I built the following
script:
<?php
function getmicrotime(){
    list($usec, $sec) = explode(' ', microtime());
    return ((float)$usec + (float)$sec);
}

# Find out how long it takes to do nothing 1000 times
$start = getmicrotime();
for ($i = 0; $i < 1000; $i++) {
    ; // control - do nothing
}
$finish = getmicrotime();
$latency = sprintf("%.2f", ($finish - $start));
echo "It took $latency seconds to do nothing 1000 times.\n";

# Find out how long it takes to sleep 10 seconds
$start = getmicrotime();
sleep(10);
$finish = getmicrotime();
$latency = sprintf("%.2f", ($finish - $start));
echo "It took $latency seconds to sleep 10 seconds.\n";

# Find out how long it takes to generate 1000 random filenames
$start = getmicrotime();
for ($i = 0; $i < 1000; $i++) {
    $foo = uniqid('');
}
$finish = getmicrotime();
$latency = sprintf("%.2f", ($finish - $start));
echo "It took $latency seconds to call uniqid() 1000 times.\n";

# Find out how long it takes to look up 1000 random filenames
$start = getmicrotime();
for ($i = 0; $i < 1000; $i++) {
    $foo = file_exists('/home/foo/' . uniqid(''));
}
$finish = getmicrotime();
$latency = sprintf("%.2f", ($finish - $start));
echo "It took $latency seconds to look up 1000 files.\n";
?>
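(As an aside: if you're on PHP 5, you can drop the getmicrotime()
helper entirely, since microtime(true) returns the timestamp as a
float directly, e.g. $start = microtime(true);. I'm on 4.3.10 here,
so the script parses the string form instead.)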
The purpose of this script is to test how long it takes to do various
things 1,000 times. First, it times an empty loop as a baseline. Then,
as a sanity check on the timer, it sleeps for 10 seconds. Next, it
generates 1,000 random filenames using uniqid(). Finally, it runs
file_exists() against 1,000 filenames randomly generated the same way.
Here is the output from a few runs on my local dev machine:
[root@winfosec phptest]# php test.php
It took 0.00 seconds to do nothing 1000 times.
It took 10.00 seconds to sleep 10 seconds.
It took 20.00 seconds to call uniqid() 1000 times.
It took 20.05 seconds to look up 1000 files.
[root@winfosec phptest]# php test.php
It took 0.00 seconds to do nothing 1000 times.
It took 10.01 seconds to sleep 10 seconds.
It took 20.04 seconds to call uniqid() 1000 times.
It took 20.06 seconds to look up 1000 files.
[root@winfosec phptest]# php test.php
It took 0.00 seconds to do nothing 1000 times.
It took 10.01 seconds to sleep 10 seconds.
It took 20.02 seconds to call uniqid() 1000 times.
It took 20.04 seconds to look up 1000 files.
What I'm really looking at here are the last two values. Test #3
measures the generation of random strings via uniqid(''). Test #4
measures file_exists() on filenames generated the same way, so the
difference between the two is the cost of the lookups themselves:
about 0.02-0.05 seconds per 1,000 calls, or a few dozen microseconds
each. In other words, nearly all of the time in test #4 is spent in
uniqid(), not in file_exists(). (The surprisingly high uniqid() cost
is most likely its built-in usleep() delay, which on an older kernel
with a coarse timer tick can block for many milliseconds per call.)
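If you want to measure the lookups in isolation rather than by
subtraction, a variant like this pre-generates the filenames before
starting the clock. This is just a sketch I haven't run (it reuses
the getmicrotime() helper and the same made-up /home/foo/ path):

<?php
function getmicrotime(){
    list($usec, $sec) = explode(' ', microtime());
    return ((float)$usec + (float)$sec);
}

# Build the filenames first, so only file_exists() is inside the timed loop
$names = array();
for ($i = 0; $i < 1000; $i++) {
    $names[] = '/home/foo/' . uniqid('');
}

$start = getmicrotime();
for ($i = 0; $i < 1000; $i++) {
    $foo = file_exists($names[$i]);
}
$finish = getmicrotime();
$latency = sprintf("%.4f", ($finish - $start));
echo "It took $latency seconds to look up 1000 pre-generated names.\n";
?>

Based on the subtraction above, I'd expect that to print something in
the 0.02-0.05 second range on this box.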
file_exists() appears to be a pretty cheap function, at least from my
results. This test was performed with PHP 4.3.10 on a FreeBSD 4.10
system: a P3 at 600 MHz with 40 MB of RAM. Faster hardware will no
doubt give better numbers. If your server is at all more modern than
my test machine, you shouldn't have any problem calling file_exists()
hundreds or even thousands of times per execution.
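Applied to your thumbnail problem, the guard itself can be small.
Here's a rough sketch; the thumbs directory, URL prefix, and
placeholder image are all made up, so adjust them to your setup:

<?php
# Emit an <img> tag only if the thumbnail exists on disk; otherwise
# fall back to a placeholder so the page never shows a broken link.
function thumbnail_tag($file)
{
    $name = basename($file); // basename() keeps paths inside the thumbs dir
    if (file_exists('/home/foo/thumbs/' . $name)) {
        return '<img src="/thumbs/' . htmlspecialchars($name) . '" alt="">';
    }
    return '<img src="/images/placeholder.gif" alt="[missing image]">';
}

echo thumbnail_tag('photo1.jpg');
?>

Even at 100 thumbnails per page, that's 100 file_exists() calls, which
by the numbers above costs a few milliseconds at worst.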
hth
--
Bulworth : PHP/MySQL/Unix | Email : str_rot13('f@fung.arg');
--------------------------|---------------------------------
<http://www.phplabs.com/> | PHP scripts, webmaster resources