Bytes | Developer Community
Re: Retrieve Custom 404 page.

On Mon, 2008-11-17 at 13:59 -0800, godavemon wrote:
I'm using urllib2 to pull pages for a custom version of a web proxy
and am having issues with 404 errors. Urllib2 does a great job of
letting me know that a 404 happened with the following code.

import urllib2
url = 'http://cnn.com/asfsdafsadfasdf/'
try:
    page = urllib2.urlopen(url)
except urllib2.URLError, e:
    print e

returns: HTTP Error 404: Not Found
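(Aside, in today's terms: urllib2 was split into urllib.request and urllib.error in Python 3, where HTTPError is a subclass of URLError and so has to be caught first if you want to tell an HTTP status error apart from a lower-level failure. A minimal sketch of that ordering; the `fetch` helper name is mine, not from the thread:)

```python
import urllib.request
import urllib.error

def fetch(url):
    # HTTPError subclasses URLError, so catch the more specific
    # exception first: an HTTP status error (404, 500, ...) still
    # carries a readable response body, while a plain URLError
    # (DNS failure, refused connection, ...) does not.
    try:
        return urllib.request.urlopen(url)
    except urllib.error.HTTPError as e:
        return e  # file-like: e.code, e.headers, e.read()
    except urllib.error.URLError as e:
        raise RuntimeError('could not reach %s: %s' % (url, e.reason))
```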
From the urllib2 docs: HTTPError is also a valid HTTP response, so you
can treat an HTTP error as an exceptional event or a valid response:

import urllib2
url = 'http://cnn.com/asfsdafsadfasdf/'
try:
    page = urllib2.urlopen(url)
except urllib2.URLError, e:
    # an HTTPError is file-like, so read() returns the
    # body of the server's custom 404 page
    print e.read()
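To see the whole round trip without depending on cnn.com, here is a Python 3 sketch of the same idea (urllib2 became urllib.request/urllib.error): a throwaway local server stands in for the remote site and answers every request with a made-up custom 404 page, which is then read straight out of the HTTPError.

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the remote site: every GET is answered with a 404
# whose body is a (made-up) custom error page.
class NotFoundHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b'<html>custom 404 page</html>'
        self.send_response(404)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example's output quiet

server = HTTPServer(('127.0.0.1', 0), NotFoundHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = 'http://127.0.0.1:%d/asfsdafsadfasdf/' % server.server_port
try:
    page = urllib.request.urlopen(url)
except urllib.error.HTTPError as e:
    # The exception doubles as the response object.
    status, body = e.code, e.read()
    print(status)  # 404
    print(body)    # the server's custom 404 page

server.shutdown()
```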

Nov 17 '08 #1
Perfect! Thanks!

On Nov 17, 4:16 pm, Albert Hopkins <mar...@python.invalid> wrote:
On Mon, 2008-11-17 at 13:59 -0800, godavemon wrote:
I'm using urllib2 to pull pages for a custom version of a web proxy
and am having issues with 404 errors. Urllib2 does a great job of
letting me know that a 404 happened with the following code.

import urllib2
url = 'http://cnn.com/asfsdafsadfasdf/'
try:
    page = urllib2.urlopen(url)
except urllib2.URLError, e:
    print e

returns: HTTP Error 404: Not Found

From the urllib2 docs: HTTPError is also a valid HTTP response, so you
can treat an HTTP error as an exceptional event or a valid response:

import urllib2
url = 'http://cnn.com/asfsdafsadfasdf/'
try:
    page = urllib2.urlopen(url)
except urllib2.URLError, e:
    print e.read()

Nov 17 '08 #2
