Bytes IT Community

BaseHTTPServer fails POST requests: disconnected by peer; maybe sockets bug? help!?

Peace, Pythonphiles!

Python's std lib BaseHTTPServer: I'm stuck with a nasty show-stopper -- I
get erratic "Network error: disconnected by peer" failures for _POST_
requests.


I'm developing a Python CGI app. I was using Xitami, which is really
light and nice, with reasonable performance (circa 200-300 msec to
respond, I guess mostly spent starting a Python process; localhost;
win95 (retch!)).

But, deployed site is the usual Apache/Linux monstrosity. And I wanted
to use mod_rewrite, to export nice URLs. Xitami is inflexible about
URL associations with scripts.

So I tried hacking python22/Lib/SimpleHTTPServer to "simulate"
Apache's mod_rewrite. It's unbelievably easy!
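For illustration, a minimal sketch of what such a rewrite layer can look like (the rule table and script names below are hypothetical, not from this post): regexes map nice external URLs onto internal CGI script paths, much like Apache's RewriteRule.

```python
import re

# Hypothetical rewrite rules, in the spirit of Apache's RewriteRule:
# each pair is (pattern, replacement); the first match wins.
REWRITE_RULES = [
    (re.compile(r"^/article/(\d+)$"), r"/cgi-bin/article.py?id=\1"),
    (re.compile(r"^/user/(\w+)$"), r"/cgi-bin/user.py?name=\1"),
]

def rewrite(path):
    """Return the internal path for a nice external URL, or the path unchanged."""
    for pattern, replacement in REWRITE_RULES:
        if pattern.match(path):
            return pattern.sub(replacement, path)
    return path
```

Inside a BaseHTTPRequestHandler subclass, you would call rewrite(self.path) at the top of do_GET/do_POST before dispatching to the script.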

=The problem=

GET requests work fantastically! 40-100 millisecs to render response
in HTML -- I was really surprised!

But POST requests fail: the script runs OK, the response is fully
generated and written to stdout, but somewhere while rendering,
Netscape aborts with that "Network error: lost connection,
disconnected by peer".

This is erratic: sometimes not a character is displayed before it
disconnects, and sometimes the entire page renders _except the last
line_.

These tools are of course buggy as hell, and I've learned to "live"
with their limitations over the years, but this one really gets me
down: I want to use Python's *HTTPServer to experiment with FCGI and
a standalone, dedicated Web server for this application; the
performance was so nice, and _I haven't a clue as to why this strange
behavior occurs_!



Invoking the script:

sys.stdout = self.wfile
sys.stdin = self.rfile
execfile( "", {} )

with both in/out set to _no buffering_:

rbufsize = 0
wbufsize = 0

(Actually, it's more like

sys.stdout = cgi_output = StringIO.StringIO()
self.send_response( status )
self.wfile.write( cgi_output.getvalue() )

so I can (keep a log and) scan the script's output for a "Status:"
header, aka "parsed-headers CGI" -- I'm simulating Apache, right?)
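A sketch of that "Status:" scan, under my own assumptions about the layout (headers, a blank line, then the body; the function name is made up): the server captures the script's output, pulls a "Status:" header out if present, and defaults to 200 otherwise.

```python
def split_cgi_output(raw):
    """Split a CGI script's raw output into (status, header_lines, body).

    Parsed-headers CGI: the script emits its own headers, optionally
    including a "Status:" line; the server supplies the response line.
    """
    head, sep, body = raw.partition("\n\n")
    if not sep:  # no blank line found: treat everything as body
        return 200, [], raw
    status = 200
    headers = []
    for line in head.splitlines():
        name, _, value = line.partition(":")
        if name.strip().lower() == "status":
            # e.g. "Status: 302 Found" -> 302
            status = int(value.strip().split()[0])
        else:
            headers.append(line)
    return status, headers, body
```

The status would then feed self.send_response(status), with the remaining header lines and body written to self.wfile afterwards.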

I tried pausing just before exiting the request handling call:

time.sleep( 25 ) # Trying to get smart with bug?

and it _sometimes_ helps! I.e., Netscape renders the page, then gives
up waiting for the connection to close, I guess, so... there.
But it's unreliable.


I've no idea where to look -- what's different between a GET and a
POST here?

It's probably some Micrapsoft buffoonery with sockets.
I'd really like to patch this somehow.
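One thing worth checking (an assumption on my part, not something established here): if the handler never reads the POST body from rfile, the unread bytes are still sitting in the socket when the server closes it, and on some TCP stacks that close turns into a reset, which the browser reports as "disconnected by peer". GET requests have no body, which would fit the GET-works/POST-fails pattern. A sketch of draining the body before replying, with rfile and the headers stood in by a BytesIO and a plain dict:

```python
import io

def read_post_body(rfile, headers):
    """Drain the request body before writing a response.

    Leaving the body unread when the connection closes can make the
    peer see a reset instead of a clean close (an assumption here,
    consistent with the GET-works/POST-fails symptom).
    """
    length = int(headers.get("Content-Length", 0))
    return rfile.read(length)

# Stand-ins for self.rfile / self.headers inside a do_POST handler:
rfile = io.BytesIO(b"name=value&x=1")
body = read_post_body(rfile, {"Content-Length": "14"})
```

In the real handler, the drained body would also be what gets handed to the CGI script on its stdin.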

Any suggestions about how to find the exact point of failure?
-- M
Jul 18 '05 #1