Bytes IT Community

fsockopen() question

Hi All,

I've been puzzling over this, but can't find a satisfactory answer
anywhere. I'm not sure if the problem is in my code, or if it's
something to do with the PHP/Apache set-up (probably the former, as I
am self-taught and have probably picked up a lot of bad habits).

Anyway, the problem is this: I have a database with about 7,000 or so
URLs stored in it (amongst other things). I need to check these URLs
once a month or so and I have tried doing this with a script, using
fsockopen() and reading the HTTP headers to get the status of the site
(404, 200, etc.). OK in theory, but there is a problem: the script
runs fine until it reaches around link 200, at which point
fsockopen() starts to fail repeatedly. I've tried this a couple
of ways, the outlines of which I've appended below as pseudo code. The
first method just gets all the URLs and loops through them. The second
method gets them 50 at a time. Both methods fail at about the same
point.
Can anyone suggest what may be going wrong, or a better method to
achieve what I'm after?

tia

Chris

// method 1

$sql = "select URL from table";
do {
    fsockopen($URL);
    if (result == 200)
        update row status = ok
    else
        update row status = broken
} while (rows to fetch);
// method 2

while ($idx < total rows) {
    $sql = "select URL from table limit $idx, 50";  // LIMIT offset, count
    do {
        fsockopen($URL);
        if (result == 200)
            update row status = ok
        else
            update row status = broken
    } while (rows to fetch);
    increment $idx by 50;
}
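Fleshed out, the check itself might look something like the sketch below. The `parse_status()` and `check_url()` helpers, the HEAD-request approach, and the connect timeout are illustrative assumptions, not the poster's actual code; the important detail is that every socket that gets opened also gets closed.

```php
<?php
// Parse the status code out of an HTTP status line,
// e.g. "HTTP/1.1 200 OK" -> 200. Returns 0 on a malformed line.
function parse_status($line)
{
    if (preg_match('#^HTTP/\d\.\d\s+(\d{3})#', $line, $m)) {
        return (int) $m[1];
    }
    return 0;
}

// Fetch the status code for one URL with a raw HEAD request.
// Returns 0 if the connection itself fails.
function check_url($url)
{
    $parts = parse_url($url);
    $host  = $parts['host'];
    $path  = isset($parts['path']) ? $parts['path'] : '/';

    // $errno/$errstr capture the failure reason; 10 is a connect timeout.
    $fp = @fsockopen($host, 80, $errno, $errstr, 10);
    if (!$fp) {
        return 0;
    }

    fwrite($fp, "HEAD $path HTTP/1.0\r\nHost: $host\r\n\r\n");
    $status = parse_status(fgets($fp, 128));
    fclose($fp); // release the descriptor before the next iteration
    return $status;
}
```

The loop body then becomes a single `check_url($row['url'])` call, with the row updated to "ok" or "broken" depending on whether it returned 200.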
Jul 17 '05 #1
2 Replies


Chris <Ch************@nez.oc.ku> wrote:
Anyway, the problem is this: I have a database with about 7,000 or so
URLs stored in it (amongst other things). I need to check these URLs
once a month or so and I have tried doing this with a script, using
fsockopen() and reading the HTTP headers to get the status of the site
(404, 200 etc). Ok in theory, but there is a problem. The script runs
ok, but once it reaches around link 200 or so instead of doing its
thing fsockopen() starts to repeatedly fail. I've tried this a couple
of ways, the outlines of which I've appended below as pseudo code. The
first method just gets all the URLs and loops through them. The second
method gets them 50 at a time. Both methods fail at about the same
point.


Stupid question, I know, but do you actually close the sockets? If not, it
might hit the OS limit for concurrent open sockets (probably 255).
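One way to confirm this diagnosis is to capture fsockopen()'s error arguments when it starts failing. A sketch, where `describe_failure()` is a hypothetical helper and example.com is a placeholder host; on Linux, hitting the per-process descriptor limit typically shows up as errno 24 (EMFILE, "Too many open files"):

```php
<?php
// Hypothetical helper: format fsockopen's error args for the log,
// so repeated failures (timeout vs. descriptor limit) can be told apart.
function describe_failure($host, $errno, $errstr)
{
    return "fsockopen failed for $host: [$errno] $errstr";
}

// Usage in the checking loop:
$fp = @fsockopen("example.com", 80, $errno, $errstr, 5);
if (!$fp) {
    error_log(describe_failure("example.com", $errno, $errstr));
} else {
    fclose($fp); // always release the descriptor
}
```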

Jon
Jul 17 '05 #2

On Wed, 01 Oct 2003 10:03:37 +0100, Jon Kraft <jo*@jonux.co.uk> wrote:

Stupid question, I know, but do you actually close the sockets? If not, it
might hit the OS limit for concurrent open sockets (probably 255).


I've been using fclose($fp) as per the manual, but I think you may be
on to something, as the failure is consistently around that number. The
script fails like this on our current server (Linux-based), but used
to work OK (well, better, anyway) on our old server (Solaris).

I'll dig and experiment a bit more. Thanks for the pointer re: maximum
open sockets.
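For reference, the per-process limit can be checked from the shell on the server (on Linux, open sockets count against the open-file limit, which is often 256 or 1024 by default):

```shell
# Show the per-process open-file limit for the current shell;
# every socket fsockopen() leaves open counts against this number.
ulimit -n
```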

C
Jul 17 '05 #3
