
[urllib2] No time-out?

Hello

I'm using urllib2 to download web pages. The strange thing in the code
below is that urllib2.urlopen seems to retry indefinitely by itself
instead of raising an exception:

=====
import socket
import sys
import time
import urllib2

timeout = 30
socket.setdefaulttimeout(timeout)

# headers and connection are defined elsewhere in the script
i = 0
while i < 5:
    try:
        url = 'http://www.acme.com'
        print url
        req = urllib2.Request(url, None, headers)
        response = urllib2.urlopen(req).read()
    except:
        # Never called :-/
        print "Timed-out."
        if i == 4:
            print "Exiting."
            connection.close(True)
            sys.exit()
        else:
            print "Trying again"
            i = i + 1
            time.sleep(10)
            continue
=====
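
(Aside: one way to confirm that the default timeout raises at all is to
point urlopen at a host that never answers. A minimal sketch, assuming
Python 2.x; 10.255.255.1 is only a placeholder for an unroutable
address, so the connect should stall until the socket timeout expires:)

=====
import socket
import urllib2

socket.setdefaulttimeout(5)
try:
    # An unroutable address, so connect() should hang until
    # the default socket timeout fires.
    urllib2.urlopen('http://10.255.255.1/')
except urllib2.URLError, e:
    # urllib2 wraps the underlying socket.timeout in a URLError;
    # e.reason carries the original socket-level error.
    print "urlopen raised:", e.reason
=====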

I haven't found a switch within urllib2 that would tell it to raise an
exception after it times out trying to download a web page. Any idea
how to have it stop trying after 5 tries?
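
(Worth noting: if Python 2.6 is an option, urllib2.urlopen itself
accepts a timeout argument, so the deadline can be set per call instead
of through socket.setdefaulttimeout. A minimal sketch, assuming
Python 2.6+ and the same placeholder URL:)

=====
import urllib2

try:
    # timeout is in seconds; new in Python 2.6, and independent
    # of socket.setdefaulttimeout().
    response = urllib2.urlopen('http://www.acme.com', timeout=30).read()
except urllib2.URLError:
    print "Request failed or timed out."
=====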

Thank you.
Nov 16 '08 #1
1 Reply


On Sun, 16 Nov 2008 12:04:02 +0100, Gilles Ganault wrote:
> Hello
>
> I'm using urllib2 to download web pages. The strange thing in the code
> below is that urllib2.urlopen seems to retry indefinitely by itself
> instead of raising an exception:
Try this instead (untested):

timeout = 30
socket.setdefaulttimeout(timeout)

url = 'http://www.acme.com'
for i in range(5):
    try:
        print url
        req = urllib2.Request(url, None, headers)
        response = urllib2.urlopen(req).read()
        break
    except urllib2.URLError:
        print "Timed-out."
        time.sleep(10)
else:
    print "Exiting."
    connection.close(True)  # What is this?
    sys.exit()  # Do you really want the application to exit?
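
(In case the for/else construct is unfamiliar: the else clause of a for
loop runs only when the loop finishes without hitting break, which makes
it a natural home for the "all five attempts failed" branch. A tiny
illustration:)

for attempt in range(5):
    if attempt == 2:
        break          # success: skip the else clause entirely
else:
    # Runs only if the loop above never hit break.
    print "All attempts failed."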

--
Steven
Nov 16 '08 #2
