I have a cgi-script that uses the modules cgi, os, sys and time. Of course I
cannot time the time used to import time, but os and sys do not take more
than a millisecond. My script itself takes 3 or 4 milliseconds. But importing
cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is
there a way to make this faster? The import of cgi makes the script at
least 20 times as slow. Something like mod_python is not a possibility. I
could use it on my test machine, but not at the hosting provider.
Cecil Westerhof wrote:
I have a cgi-script that uses the modules cgi, os, sys and time. Of course I
cannot time the time used to import time, but os and sys do not take more
than a millisecond. My script itself takes 3 or 4 milliseconds. But importing
cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is
there a way to make this faster? The import of cgi makes the script at
least 20 times as slow. Something like mod_python is not a possibility. I
could use it on my test machine, but not at the hosting provider.
Does the hosting provider support fastcgi? It would avoid starting the
interpreter at each request too.
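For reference, a minimal sketch of what a FastCGI version could look like, assuming the host allows long-running processes and the third-party flup package is installed; the trivial WSGI app is made up for the example:
#!/usr/bin/env python
# Minimal FastCGI sketch using the third-party "flup" package (an assumption:
# the hosting provider must allow long-running processes and flup must be
# available). The process stays alive between requests, so interpreter
# start-up and module imports are paid only once.
from flup.server.fcgi import WSGIServer

def app(environ, start_response):
    # Trivial WSGI application, purely for illustration.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello from a persistent FastCGI process\n']

if __name__ == '__main__':
    WSGIServer(app).run()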
Daniele Varrazzo wrote:
Cecil Westerhof wrote:
>I have a cgi-script that uses the modules cgi, os, sys and time. Of course I cannot time the time used to import time, but os and sys do not take more than a millisecond. My script itself takes 3 or 4 milliseconds. But importing cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is there a way to make this faster? The import of cgi makes the script at least 20 times as slow. Something like mod_python is not a possibility. I could use it on my test machine, but not at the hosting provider.
Does the hosting provider support fastcgi? It would avoid starting the
interpreter at each request too.
I am afraid not. It is very bare-bones. Also, they do not really support
Python. (When they had a problem, Perl and bash scripts worked within a
day; Python scripts had to wait for one and a half weeks.)
I am thinking about switching my provider. What are important things to keep
in mind when selecting another one?
"Cecil Westerhof" <du***@dummy.nlescribió en el mensaje
news:45*********************@news.xs4all.nl...
>I have a cgi-script that uses the modules cgi, os, sys and time. Of course
I cannot time the time used to import time, but os and sys do not take more
than a millisecond. My script itself takes 3 or 4 milliseconds. But
importing
cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is
there a way to make this faster? The import of cgi makes the script at
least 20 times as slow. Something like mod_python is not a possibility. I
could use it on my test machine, but not at the hosting provider.
Surely os was imported earlier, and was already loaded. sys is a builtin
module. But I think your problem is not how long it takes to import cgi,
but how long it takes to launch a new Python process on each request.
--
Gabriel Genellina
Gabriel Genellina wrote:
"Cecil Westerhof" <du***@dummy.nlescribió en el mensaje
news:45*********************@news.xs4all.nl...
>>I have a cgi-script that uses the modules cgi, os, sys and time. Of course I cannot time the time used to import time, but os and sys do not take more than a millisecond. My script itself takes 3 or 4 milliseconds. But importing cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is there a way to make this faster? The import of cgi makes the script at least 20 times as slow. Something like mod_python is not a possibility. I could use it on my test machine, but not at the hosting provider.
Surely os was imported earlier, and was already loaded. sys is a builtin
module. But I think your problem is not how long it takes to import cgi,
but how long it takes to launch a new Python process on each request.
Nope, it was certainly cgi. When I fetch the time after importing and after
the script finishes, the difference is 4 milliseconds. If I import the
modules apart from cgi after I fetch the first time, about 1 millisecond is
added to the difference. When I also import cgi after taking the time, the
difference grows by 95 milliseconds. So for one reason or another, cgi is
very expensive.
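A minimal sketch of the measurement described here; the timestamp placement follows the description above, while the variable names and the use of stderr for the report are illustrative choices:
#!/usr/bin/env python
# Illustrative sketch of timing the imports separately from the script body.
# time is imported first because it cannot time its own import; os and cgi
# are imported only to mirror the script being discussed.
import time
t0 = time.time()

import os
import sys
t1 = time.time()

import cgi
t2 = time.time()

# ... the actual work of the CGI script would go here ...
t3 = time.time()

# Report on stderr so the CGI response body is not polluted.
sys.stderr.write("os/sys imports: %.1f ms\n" % ((t1 - t0) * 1000))
sys.stderr.write("cgi import:     %.1f ms\n" % ((t2 - t1) * 1000))
sys.stderr.write("script body:    %.1f ms\n" % ((t3 - t2) * 1000))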
"Cecil Westerhof" <du***@dummy.nlescribió en el mensaje
news:45*********************@news.xs4all.nl...
Gabriel Genellina wrote:
>"Cecil Westerhof" <du***@dummy.nlescribió en el mensaje news:45*********************@news.xs4all.nl...
>>>I have a cgi-script that uses the modules cgi, os, sys and time. Of course I cannot time the time used to import time, but os and sys do not take more than a millisecond. My script itself takes 3 or 4 milliseconds. But importing cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is there a way to make this faster? The import of cgi makes the script at least 20 times as slow. Something like mod_python is not a possibility. I could use it on my test machine, but not at the hosting provider.
Surely os was imported earlier, and was already loaded. sys is a builtin module. But I think your problem is not how long it takes to import cgi, but how long it takes to launch a new Python process on each request.
Nope, it was certainly cgi. When I fetch the time after importing and after
the script finishes, the difference is 4 milliseconds. If I import the
modules apart from cgi after I fetch the first time, about 1 millisecond is
added to the difference. When I also import cgi after taking the time, the
difference grows by 95 milliseconds. So for one reason or another, cgi is
very expensive.
I'll try to explain better: the cgi *protocol* (I'm not talking about the
cgi *module*) requires a *new* Python process to be created on *each*
request. Try to measure the time it takes to launch Python, that is, the
time between when you type `python ENTER` in your shell and when the
interpreter prompt appears. That time is wasted for *every* cgi request, and
I bet it is much greater than the 95 ms you measure importing a module (be
it cgi or whatever). You'll gain much more responsiveness if you can switch
to another protocol, be it FastCGI, WSGI, mod_python or something else.
Anyway, comparing the import time between os, sys, and cgi is not very
meaningful. sys is a builtin module, so "import sys" does very little. os is
likely to be already imported by the time your script begins, so "import os"
just verifies that os is already in sys.modules. "import cgi" is the only
case where Python actually has to load something, so it's no surprise that
it takes longer.
--
Gabriel Genellina
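A small sketch of how one could verify these points from a script; the checks and the output formatting are illustrative, not taken from the original posts:
#!/usr/bin/env python
# Illustrative check: sys is compiled into the interpreter, os is normally
# pulled in while the interpreter starts up, and cgi is the only one of the
# three that still has to be located and executed when the script imports it.
import sys
import time

print("sys is builtin:  %s" % ('sys' in sys.builtin_module_names))
print("os preloaded:    %s" % ('os' in sys.modules))
print("cgi preloaded:   %s" % ('cgi' in sys.modules))

t0 = time.time()
import cgi
print("importing cgi took %.1f ms" % ((time.time() - t0) * 1000))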
Cecil Westerhof <du***@dummy.nl> wrote:
> I have a cgi-script that uses the modules cgi, os, sys and time. Of course I cannot time the time used to import time, but os and sys do not take more than a millisecond. My script itself takes 3 or 4 milliseconds. But importing cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is there a way to make this faster? The import of cgi makes the script at least 20 times as slow. Something like mod_python is not a possibility. I could use it on my test machine, but not at the hosting provider.
Realistically, do you plan to support more than a few dozen requests per
minute? If not, then it doesn't matter at all. The script launch overhead
is an insignificant part of the user's browser experience.
If you do expect a hundred requests per minute, then CGI is not an
appropriate choice. You either need to switch to one of the web
frameworks (like CherryPy or Django or WebWare or one of the hundreds of
others), or <gack> move to PHP.
--
Tim Roberts, ti**@probo.com
Providenza & Boekelheide, Inc.
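To illustrate the framework route, a rough CherryPy sketch, assuming CherryPy 3 or later is installed; the handler and its text are made up for the example:
#!/usr/bin/env python
# Rough CherryPy sketch: a single long-running process serves every request,
# so interpreter start-up and module imports are paid once, not per hit.
import cherrypy

class Root(object):
    @cherrypy.expose
    def index(self):
        return "Hello from a long-running process"

if __name__ == '__main__':
    cherrypy.quickstart(Root())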
On 1/18/07, Cecil Westerhof <du***@dummy.nl> wrote:
I have a cgi-script that uses the modules cgi, os, sys and time. Of course I
cannot time the time used to import time, but os and sys do not take more
than a millisecond. My script itself takes 3 or 4 milliseconds. But importing
cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is
there a way to make this faster? The import of cgi makes the script at
least 20 times as slow. Something like mod_python is not a possibility. I
could use it on my test machine, but not at the hosting provider.
Maybe python-launcher-daemon can help you? http://blogs.gnome.org/view/johan/2007/01/18/0 But if you cannot use
mod_python then you probably cannot use any long-running processes
either.
--
mvh Björn
BJörn Lindqvist wrote:
On 1/18/07, Cecil Westerhof <du***@dummy.nl> wrote:
>I have a cgi-script that uses the modules cgi, os, sys and time. Of course I cannot time the time used to import time, but os and sys do not take more than a millisecond. My script itself takes 3 or 4 milliseconds. But importing cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is there a way to make this faster? The import of cgi makes the script at least 20 times as slow. Something like mod_python is not a possibility. I could use it on my test machine, but not at the hosting provider.
Maybe python-launcher-daemon can help you? http://blogs.gnome.org/view/johan/2007/01/18/0 But if you cannot use
mod_python then you probably cannot use any long-running processes
either.
With my current provider I cannot use this. But I am going to look for
another. What are the pros and cons of the different systems? FastCGI,
PyApache, mod_python, and maybe others I am not aware of.
Tim Roberts wrote:
Cecil Westerhof <du***@dummy.nl> wrote:
>> I have a cgi-script that uses the modules cgi, os, sys and time. Of course I cannot time the time used to import time, but os and sys do not take more than a millisecond. My script itself takes 3 or 4 milliseconds. But importing cgi takes 95 milliseconds. (This is on my test system, a PII 300 MHz.) Is there a way to make this faster? The import of cgi makes the script at least 20 times as slow. Something like mod_python is not a possibility. I could use it on my test machine, but not at the hosting provider.
Realistically, do you plan to support more than a few dozen requests per
minute? If not, then it doesn't matter at all. The script launch
overhead is an insignificant part of the user's browser experience.
Not at the moment. The application is in alpha, so it is mostly a few
testers. By the time it goes live I want to have another provider.
So I have a little time to search for another provider and select 'the best'
framework. ;-}
Gabriel Genellina wrote:
I'll try to explain better: the cgi *protocol* (I'm not talking about the
cgi *module*) requires a *new* Python process to be created on *each*
request. Try to measure the time it takes to launch Python, that is, the
time between when you type `python ENTER` in your shell and when the
interpreter prompt appears. That time is wasted for *every* cgi request,
and I bet it is much greater than the 95 ms you measure importing a module
(be it cgi or whatever). You'll gain much more responsiveness if you can
switch to another protocol, be it FastCGI, WSGI, mod_python or something else.
The import of the cgi module takes about 95 milliseconds and the browser
takes around 260 milliseconds to fetch the XML page. That is why I thought
it was important, but I think you are right: I should not worry about this
and switch to another protocol. It is not possible with the current
hosting provider, but I'll switch.
Pointers on what to look for when selecting a protocol are appreciated.
Anyway, comparing the import time between os, sys, and cgi is not very
meaningful. sys is a builtin module, so "import sys" does very little. os
is likely to be already imported by the time your script begins, so
"import os" just verifies that os is already in sys.modules. "import cgi"
is the only case where Python actually has to load something, so it's
no surprise that it takes longer.
Okay, thank you for the explanation. | | |
On Thu, 18 Jan 2007 14:15:44 -0300, Gabriel Genellina <ga******@yahoo.com.ar> wrote:
....
I'll try to explain better: the cgi *protocol* (I'm not talking about the
cgi *module*) requires a *new* Python process to be created on *each*
request. Try to measure the time it takes to launch Python, that is, the
time between when you type `python ENTER` in your shell and when the
interpreter prompt appears.
On my Mac Mini with all of Python on local disk:
tuva:~> time python </dev/null
0.024u 0.012s 0:00.02 150.0% 0+0k 0+0io 0pf+0w
tuva:~> time python < /dev/null
0.028u 0.004s 0:00.02 100.0% 0+0k 0+0io 0pf+0w
tuva:~>
I.e. about 200--300ms. I assume startup time > shutdown time.
If Python were at the other end of an NFS file system, expect much worse figures.
/Jorgen
--
// Jorgen Grahn <grahn@ Ph'nglui mglw'nafh Cthulhu
\X/ snipabacken.dyndns.org R'lyeh wgah'nagl fhtagn!
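Roughly the same measurement can also be made from Python itself; a sketch along those lines, with an arbitrary iteration count:
#!/usr/bin/env python
# Rough Python-side equivalent of the shell measurement above: launch a
# fresh interpreter several times and average the wall-clock cost.
import subprocess
import sys
import time

N = 10
start = time.time()
for _ in range(N):
    subprocess.call([sys.executable, '-c', 'pass'])
elapsed = time.time() - start
print("average interpreter start-up: %.1f ms" % (elapsed / N * 1000.0))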