Bytes IT Community

how does the PHP interpreter work?

I notice that when my weblog software tries to contact
www.weblogs.com, to use the update service, my whole site (all PHP)
slows down. Contacting www.weblogs.com can take a long time. I can't
figure out what is going on, unless the PHP interpreter gets bogged
down trying to contact www.weblogs.com, and so is less able to serve
up PHP pages to other website visitors. But if the PHP interpreter was
that vulnerable to slow-downs, it doesn't seem like PHP could scale to
handle large sites, and I know that it runs some very large sites.

What sort of situations can slow down the PHP interpreter?
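A minimal sketch of the failure mode being described: a remote fetch with no timeout can hang a PHP request for as long as the remote host keeps it waiting, while a stream-context timeout bounds the damage. The URL path and the 5-second limit are illustrative, not taken from the thread.

```php
<?php
// Cap how long PHP will wait on the remote host before giving up.
$ctx = stream_context_create(array(
    'http' => array('timeout' => 5), // give up after 5 seconds
));
// file_get_contents() returns false on failure (including timeout),
// so the page can degrade gracefully instead of hanging.
$xml = @file_get_contents('http://www.weblogs.com/rssUpdates/changes.xml',
                          false, $ctx);
if ($xml === false) {
    $xml = ''; // remote host slow or down: render the page without it
}
```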
Jul 17 '05 #1
12 Replies


lk******@geocities.com (lawrence) emerged reluctantly from the
curtain and staggered drunkenly up to the mic. In a cracked and
slurred voice he muttered:
> What sort of situations can slow down the PHP interpreter?


Most of the time the bottleneck is the database server, not the
language interpreter.

--
Phil Roberts | Without me its just aweso. | http://www.flatnet.net/

"Mankind differs from the animals only by a little,
and most people throw that away."
- Confucius
Jul 17 '05 #2

> lk******@geocities.com (lawrence) emerged reluctantly from the
> curtain and staggered drunkenly up to the mic. In a cracked and
> slurred voice he muttered:
>> What sort of situations can slow down the PHP interpreter?
>
> Most of the time the bottleneck is the database server, not the
> language interpreter.

Agree with Phil. Your log files are probably big, and you run into
problems accessing them. Try to reduce the size.

The DB is the culprit 90% (or 95%) of the time. Simply adding more
memory, caching keys, improving the structure (including
denormalization of the schema), or optimizing tables usually does the trick.

Jul 17 '05 #3

"lawrence" <lk******@geocities.com> wrote in message
news:da**************************@posting.google.c om...
I notice that when my weblog software tries to contact
www.weblogs.com, to use the update service, my whole site (all PHP)
slows down. Contacting www.weblogs.com can take a long time. I can't
figure out what is going on, unless the PHP interpreter gets bogged
down trying to contact www.weblogs.com, and so is less able to serve
up PHP pages to other website visitors. But if the PHP interpreter was
that vulnerable to slow-downs, it doesn't seem like PHP could scale to
handle large sites, and I know that it runs some very large sites.

What sort of situations can slow down the PHP interpreter?


Trying to read a stream of data over the Internet.

Jul 17 '05 #4

"CJ Llewellyn" <sa****@tmslifeline.com> wrote in message news:<cc**********@slavica.ukpost.com>...
"lawrence" <lk******@geocities.com> wrote in message
news:da**************************@posting.google.c om...
I notice that when my weblog software tries to contact
www.weblogs.com, to use the update service, my whole site (all PHP)
slows down. Contacting www.weblogs.com can take a long time. I can't
figure out what is going on, unless the PHP interpreter gets bogged
down trying to contact www.weblogs.com, and so is less able to serve
up PHP pages to other website visitors. But if the PHP interpreter was
that vulnerable to slow-downs, it doesn't seem like PHP could scale to
handle large sites, and I know that it runs some very large sites.

What sort of situations can slow down the PHP interpreter?


Trying to read a stream of data over the Internet.


I'm not sure my question was understood. Let me clarify this, please,
with another question. Let's pose this as a hypothetical. Suppose I
have a website at www.myDomain.com. It is all PHP. Someone comes to
visit the site. The PHP script that renders the page needs to open an
RSS feed on another website, and then it renders that RSS with a style
sheet. It takes the PHP 10 seconds to get the RSS and render the page.
However, 2 seconds after the first visitor arrived, a second visitor
comes to the site. Are you saying that PHP can not begin to render any
pages for the second visitor until it is done reading the stream of
data over the Internet for the first visitor? That is, the second
visitor must wait 8 seconds before the PHP interpreter even begins to
work on rendering a page for them?

If this is the way PHP works, then how does PHP scale up to sites that
serve millions of hits a week? If the average PHP script took 1 second
to run, then a PHP site would be unable to serve more than 86400 pages
a day.
Jul 17 '05 #5

steve wrote:
> DB is the culprit 90% (or 95%) of the times. Simply adding more
> memory, caching keys, doing a better structure (including
> denormalization of schema), optimizing tables usually does the trick.


And don't forget to index tables as well. I've often seen tables with no
indexes that struggle because there's so much data in them. As soon as
indexes are added on the appropriate columns they're nice and fast.

--
Chris Hope - The Electric Toolbox - http://www.electrictoolbox.com/
Jul 17 '05 #6

lk******@geocities.com (lawrence) wrote in message news:<da**************************@posting.google.com>...
> I notice that when my weblog software tries to contact
> www.weblogs.com, to use the update service, my whole site (all PHP)
> slows down. Contacting www.weblogs.com can take a long time. I can't
> figure out what is going on, unless the PHP interpreter gets bogged
> down trying to contact www.weblogs.com, and so is less able to serve
> up PHP pages to other website visitors. But if the PHP interpreter was
> that vulnerable to slow-downs, it doesn't seem like PHP could scale to
> handle large sites, and I know that it runs some very large sites.


Perhaps session locking? <http://in2.php.net/session_write_close>
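A minimal sketch of the session-locking point: with file-based sessions, session_start() takes an exclusive lock that is normally held until the script exits, which serializes requests from the same visitor. Writing session data early and calling session_write_close() releases the lock before any slow remote work begins. The session key name here is made up for illustration.

```php
<?php
session_start();
$_SESSION['last_ping_attempt'] = time(); // finish all session writes...
session_write_close();                   // ...then release the lock

// From here on, a slow call such as contacting www.weblogs.com no
// longer blocks the visitor's other requests on the session lock:
// $response = file_get_contents('http://www.weblogs.com/...');
```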

--
| Just another PHP saint |
Email: rrjanbiah-at-Y!com
Jul 17 '05 #7

lk******@geocities.com (lawrence) emerged reluctantly from the
curtain and staggered drunkenly up to the mic. In a cracked and
slurred voice he muttered:
> Are you saying that PHP can not begin to render any
> pages for the second visitor until it is done reading the stream
> of data over the Internet for the first visitor? That is, the
> second visitor must wait 8 seconds before the PHP interpreter
> even begins to work on rendering a page for them?

Yes.

> If this is the way PHP works, then how does PHP scale up to
> sites that serve millions of hits a week? If the average PHP
> script took 1 second to run, then a PHP site would be unable to
> serve more than 86400 pages a day.


Most such sites do not rely on external data, or they use
caching to reduce the number of remote data transfers. The Magpie
RSS library, for instance, caches the RSS data in order to speed up
rendering.
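A minimal version of the caching idea: serve a cached copy of the feed when it is fresher than $ttl seconds, and touch the network only on a miss. (MagpieRSS does something similar internally; the function and file names here are made up for illustration.)

```php
<?php
function fetch_with_cache($url, $cache_file, $ttl = 3600)
{
    if (is_file($cache_file) && (time() - filemtime($cache_file)) < $ttl) {
        return file_get_contents($cache_file); // fresh cache hit: no network
    }
    $data = @file_get_contents($url);          // miss: fetch remotely
    if ($data !== false) {
        file_put_contents($cache_file, $data); // refresh the cache
        return $data;
    }
    // Fetch failed: fall back to a stale copy rather than an empty page.
    return is_file($cache_file) ? file_get_contents($cache_file) : false;
}
```

With this in place only one request per TTL window pays the network cost; every other visitor reads a local file.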

--
Phil Roberts | Without me its just aweso. | http://www.flatnet.net/

"Mankind differs from the animals only by a little,
and most people throw that away."
- Confucius
Jul 17 '05 #8

Phil Roberts <ph**********@googlemail.com> wrote in message news:<Xn*************************@216.196.97.132>...
> lk******@geocities.com (lawrence) emerged reluctantly from the
> curtain and staggered drunkenly up to the mic. In a cracked and
> slurred voice he muttered:
>> Are you saying that PHP can not begin to render any
>> pages for the second visitor until it is done reading the stream
>> of data over the Internet for the first visitor? That is, the
>> second visitor must wait 8 seconds before the PHP interpreter
>> even begins to work on rendering a page for them?
>
> Yes.
>
>> If this is the way PHP works, then how does PHP scale up to
>> sites that serve millions of hits a week? If the average PHP
>> script took 1 second to run, then a PHP site would be unable to
>> serve more than 86400 pages a day.
>
> Most such sites do not rely on external data. That or they use
> caching to reduce the number of remote data transfers. The Magpie
> RSS library enables the RSS data to be cached in order to speed up
> rendering.


You are insisting that most PHP scripts run in much less than a
second. Is this true? It is the clear implication of what you've
written. There are only 86400 seconds in a day, so no PHP site would
be able to scale beyond that point, unless it could run in less than a
second. You could not do a site like Slashdot using PHP, unless the
code ran in less than a second, on average. Is this what you are
saying? You are saying that PHP is unable to handle 2 (or 100)
visitors concurrently. I find that hard to believe.

Suppose you want to run a site like Slashdot using PHP. Suppose the
database connection is occasionally hard to get. The average run time
for a script might climb to 1 second, and then you can no longer support
your site. What would the solution be then? Switch to Perl?

Can I ask where you got your information? I'd like to do more research
on this subject. Your argument surprises me. If you are right, then
PHP seems much more limited than I thought.
Jul 17 '05 #9


"lawrence" <lk******@geocities.com> wrote in message
news:da**************************@posting.google.c om...
Phil Roberts <ph**********@googlemail.com> wrote in message

news:<Xn*************************@216.196.97.132>. ..
lk******@geocities.com (lawrence) emerged reluctantly from the
curtain and staggered drunkenly up to the mic. In a cracked and
slurred voice he muttered:
Are you saying that PHP can not begin to render any
pages for the second visitor until it is done reading the stream
of data over the Internet for the first visitor? That is, the
second visitor must wait 8 seconds before the PHP interpreter
even begins to work on rendering a page for them?


Yes.


This is crap. A web server like Apache can run multiple threads at the same
time, and as each HTTP request comes in it is handed over to the first
available thread. It is therefore possible to have multiple instances of PHP
executing at the same time, as each instance is within its own thread.
Having the ability to allow multiple concurrent threads is a function of the
web server, not PHP.

--
Tony Marston

http://www.tonymarston.net

Jul 17 '05 #10

>>> Are you saying that PHP can not begin to render any
>>> pages for the second visitor until it is done reading the stream
>>> of data over the Internet for the first visitor? That is, the
>>> second visitor must wait 8 seconds before the PHP interpreter
>>> even begins to work on rendering a page for them?
>>
>> Yes.
I don't agree. When PHP is used as an Apache module for even an
ancient version of Apache (e.g. 1.3.1, and I suspect 1.1 as well),
or as a CGI, Apache runs several concurrent processes (or threads).
I have one page that can take considerable time to generate (e.g.
20 minutes to overnight), due to heavy database activity, and I
have no trouble getting OTHER PHP pages from the same server in a
different browser window during that time. This page really ought
to cache its output, but not while I'm still debugging the output.

Database locking is a separate issue.
>>> If this is the way PHP works, then how does PHP scale up to
>>> sites that serve millions of hits a week? If the average PHP
>>> script took 1 second to run, then a PHP site would be unable to
>>> serve more than 86400 pages a day.

If a script averages more than one *CPU* second, then you run
out of CPU horsepower at 86400 hits a day. Solution: get a
faster CPU, use multiple processors, add more memory, or round-robin
across multiple servers. Or make the code run faster.

If a script averages more than one second of *clock execution time*,
say because a few people insist on doing PPP over 300 bps links,
or your database uses stone carvings as a storage medium, there's no
problem, except for the impatient users. Multiple PHP threads (for
different requests) can run at once (except in the presence of locking,
where the code deliberately avoids multiple threads doing certain things).
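The arithmetic behind this distinction can be sketched with illustrative numbers: wall-clock time per request limits a single worker, while CPU time per request is the budget the whole machine shares. All values below are made up for the example.

```php
<?php
// 86400 seconds/day caps ONE worker, but Apache runs many workers,
// so wall-clock time per request does not cap the whole site.
$avg_wall_seconds = 1.0;  // average clock time per request (slow feed etc.)
$workers          = 150;  // e.g. Apache's stock MaxClients
$avg_cpu_seconds  = 0.05; // average CPU time actually burned per request

$wall_limited = ($workers / $avg_wall_seconds) * 86400; // all workers
$cpu_limited  = 86400 / $avg_cpu_seconds;               // per CPU

printf("wall-clock bound: %.0f req/day, CPU bound: %.0f req/day\n",
       $wall_limited, $cpu_limited);
```

So a 1-second average page is compatible with millions of hits a day, as long as the CPU seconds per hit stay small.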
>> Most such sites do not rely on external data. That or they use
>> caching to reduce the number of remote data transfers. The Magpie
>> RSS library enables the RSS data to be cached in order to speed up
>> rendering.


> You are insisting that most PHP scripts run in much less than a
> second. Is this true?


CPU second, probably. Real-time-clock second, I doubt it and I
really don't care - it won't limit the number of visitors.
> It is the clear implication of what you've
> written. There are only 86400 seconds in a day, so no PHP site would
> be able to scale beyond that point, unless it could run in less than a
> second. You could not do a site like Slashdot using PHP, unless the
> code ran in less than a second, on average. Is this what you are
> saying? You are saying that PHP is unable to handle 2 (or 100)
> visitors concurrently. I find that hard to believe.
As do I, and I have demonstrated this to my own satisfaction. Try,
for example, putting up a PHP page that goes into an infinite loop
(the run time limit will get it eventually - try setting it for 24
hours). With Apache you can still view other pages - at least on
an OS with a decent scheduler (and I believe all forms of UNIX
qualify, and I thought even various versions of Windows NT and
successors did also).
> Suppose you want to run a site like Slashdot using PHP. Suppose the
> database connection is occasionally hard to get. The average time for
> a script might run up to 1 second, and then you can no longer support
> your site. What would the solution be then? Switch to Perl?
If there really were a single-thread limit like this, it would
probably be an Apache (or whatever web server) problem, and it would
apply equally to PHP, Perl, CGI written in C, or whatever. I think
you can create this problem in Apache by setting "MaxClients 1".
This is NOT a setting you want for a production web server, or even
a toy web server with only one user ever (browsers often request
images in parallel, so one user can easily be responsible for 4
simultaneous requests even if he has only one browser window open.
And yes, images can be generated by PHP - sometimes this is used
by pay sites to protect against deep linking). The distributed
default for this seems to be "MaxClients 150", and for a major site
like slashdot, you'd want to raise it a lot.
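For reference, the relevant knobs in a stock Apache prefork configuration look something like this; apart from the MaxClients 150 default mentioned above, the values are illustrative.

```apache
# httpd.conf (prefork model): each simultaneous request, whether a PHP
# page, an image, or a stylesheet, occupies one child process.
StartServers         5
MinSpareServers      5
MaxSpareServers     10
MaxClients         150   # stock default; cap on concurrent requests
# "MaxClients 1" would reproduce the one-request-at-a-time behaviour
# discussed above: every other visitor queues behind the slow request.
```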
> Can I ask where you got your information? I'd like to do more research
> on this subject. Your argument surprises me. If you are right, then
> PHP seems much more limited than I thought.


Gordon L. Burditt
Jul 17 '05 #11

"Tony Marston" <to**@NOSPAM.demon.co.uk> emerged reluctantly from
the curtain and staggered drunkenly up to the mic. In a cracked
and slurred voice he muttered:
This is crap. A web server like Apache can run multiple threads
at the same time, and as each HTTP request comes in it is handed
over to the first available thread. It is therefore possible to
have multiple instances of PHP being executed at the same time
as each instance is within its own thread. having the ability to
allow multiple concurrent threads is a function of the web
server, not PHP.


The question was in relation to the fetching of remote XML data
feeds though. In which case PHP will not render the data until it
has been fetched. Network lag, not interpreter lag. Sorry for any
confusion.

--
Phil Roberts | Without me its just aweso. | http://www.flatnet.net/

"Mankind differs from the animals only by a little,
and most people throw that away."
- Confucius
Jul 17 '05 #12

>> This is crap. A web server like Apache can run multiple threads
>> at the same time, and as each HTTP request comes in it is handed
>> over to the first available thread. It is therefore possible to
>> have multiple instances of PHP being executed at the same time
>> as each instance is within its own thread. Having the ability to
>> allow multiple concurrent threads is a function of the web
>> server, not PHP.
>
> The question was in relation to the fetching of remote XML data
> feeds though. In which case PHP will not render the data until it
> has been fetched. Network lag, not interpreter lag. Sorry for any
> confusion.


Network lag on one page should not affect other pages much unless
the network (or the other end of the connection) is really being
saturated, or there's some locking going on (like it's enforced
that only one PHP process will go get fresh data from the feed when
the cached data is older than X minutes, and if another process
wants it, it will wait for the first process to get it, then use
it).
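The locking arrangement described here can be sketched with an advisory flock(): only the process that wins a non-blocking lock refreshes the stale cache, while everyone else keeps serving the stale copy instead of queueing on the network. All names are made up for illustration, and the closure syntax is newer PHP than this thread's era.

```php
<?php
function refresh_if_stale($cache_file, $ttl, $fetch_fn)
{
    clearstatcache(); // make sure the freshness check sees current mtimes
    if (is_file($cache_file) && (time() - filemtime($cache_file)) < $ttl) {
        return file_get_contents($cache_file);     // still fresh
    }
    $lock = fopen($cache_file . '.lock', 'c');
    if (flock($lock, LOCK_EX | LOCK_NB)) {         // we won the race: fetch
        $data = $fetch_fn();
        if ($data !== false) {
            file_put_contents($cache_file, $data); // refresh the cache
        }
        flock($lock, LOCK_UN);
    }
    fclose($lock);
    // Lock losers (and failed fetches) fall back to whatever is cached.
    return is_file($cache_file) ? file_get_contents($cache_file) : false;
}
```

This is the behaviour Gordon sketches in prose: one process waits on the feed, the rest serve slightly stale data and stay fast.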

Gordon L. Burditt

Jul 17 '05 #13
