On 17 Nov 2004 05:47:05 -0800
an********@doxdesk.com (Andrew Clover) wrote:
Nehal <ne*********@gmx.net> wrote:
when uploading small files, you won't notice a difference, but
if you upload files larger than 2 megs, you can notice it.
Yep. Large file upload in cgi.py is slow. I don't immediately
see a way to speed it up without re-architecting some of its
internals.
In any case I dislike(*) the cgi module's interface too, so I
rewrote the lot:
http://www.doxdesk.com/software/py/form.html
This isn't drop-in compatible, and is getting a bit crusty (I'm
expecting to rewrite most of it soon to be more
objecty/threadable, and support WSGI), but in my experience it's
considerably faster than cgi for very large files. (We were
commonly using files in the 10-50MB range.)
(* - more then than now; cgi's interface has got slightly better
since Python 1.5.2's time.)
--
Andrew Clover
I have tested Andrew's 'form.py' module, along with upload CGI
scripts written in other languages, and did some benchmarking: I
uploaded a 6 MB file to localhost and wrote it to an output file
under Apache 2.0.52 on win32. Here are the results (note: I checked
the error log to make sure all scripts were working and processing
the data as expected):
ruby: 2 sec
Andrew Clover's form.py: 2.5 sec
perl: 2.5 sec
tcl (3rd party module): 4.5 sec
python: 8 sec
Of course, in practice most people won't be receiving data at 3
MB/sec, so you won't have to process uploads at that rate.
Nevertheless, the slower parsing puts a greater load on the CPU,
which may be an issue for busy servers.
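As far as I can tell, part of the overhead comes from how the stdlib
module consumes the request body: cgi.FieldStorage scans the multipart
data with readline() calls, comparing each line against the boundary,
whereas a chunked scan reads large blocks and searches them. Here's a
rough, self-contained sketch of the two strategies (the boundary and
payload are made up for illustration, not taken from either module):

```python
import io
import time

# Fake multipart payload: 6 MB of binary data followed by a boundary,
# roughly what a large file upload looks like on the wire.
BOUNDARY = b"--somemadeupboundary--"
BODY = bytes(range(256)) * (6 * 1024 * 1024 // 256) + b"\r\n" + BOUNDARY + b"\r\n"

def read_by_lines(fp):
    """Line-at-a-time scan: every newline byte in the binary data
    forces a short read plus a boundary comparison."""
    data = []
    while True:
        line = fp.readline()
        if not line or line.startswith(BOUNDARY):
            break
        data.append(line)
    return b"".join(data)

def read_by_chunks(fp, chunk_size=64 * 1024):
    """Chunked scan: read big blocks and search them for the boundary."""
    tail = len(BOUNDARY)
    buf = b""
    out = []
    while True:
        chunk = fp.read(chunk_size)
        buf += chunk
        i = buf.find(BOUNDARY)
        if i != -1:
            out.append(buf[:i])
            break
        if not chunk:          # EOF without seeing the boundary
            out.append(buf)
            break
        out.append(buf[:-tail])  # keep a tail in case the boundary
        buf = buf[-tail:]        # straddles two chunks
    return b"".join(out)

for fn in (read_by_lines, read_by_chunks):
    t0 = time.perf_counter()
    n = len(fn(io.BytesIO(BODY)))
    print(f"{fn.__name__}: {n} bytes in {time.perf_counter() - t0:.3f}s")
```

Both produce the same bytes; the chunked version just does far fewer
passes over the data, which is presumably where form.py gets its edge.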
It would not be a good idea to put Andrew's form.py in the official
Python distribution and have two different modules for processing
CGI. Either it would have to be somehow merged into the cgi module
while keeping backward compatibility, or the existing cgi module
must be optimized; maybe the above benchmark data will motivate
someone to do so ;). Until then, I'll stick to form.py.
-- thx, Nehal