Bytes IT Community

urllib2 problem


I'm relatively new to using Python to interact with web pages, but I've
run into a problem that really has me stumped. I wrote a script that
figures out all the variables needed to request data from a website. I
originally just used urllib.urlopen, and everything worked fine on my
Windows PC at work. When I tried the same script at home on my Fedora
Core 3 box with Python 2.4, every attempt to connect to the site fails
with a (110, 'Connection timed out') error.

At first I thought my firewall was interfering with the script, but
then I noticed an odd pattern: if the website asks to accept cookies
(like the site I need), the script times out; if I point it at a site
that doesn't, it works fine. I've made several attempts with urllib2
and HTTPCookieProcessor, and I still have no luck. Can anyone give me
any advice or pointers on what the problem may be here? I apologize if
this is a rookie question, but I've been searching for about a week
with no luck.
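For reference, a minimal sketch of the cookie-aware setup being described. The post uses Python 2.4, where these classes live in the urllib2 and cookielib modules; the sketch below uses their Python 3 names (urllib.request and http.cookiejar) so it runs as written, and the URL is a placeholder, not the actual site from the question. Setting an explicit timeout turns a silent hang into a catchable exception, which can help tell a firewall or proxy problem apart from a cookie-handling problem.

```python
import http.cookiejar
import urllib.request

# A CookieJar stores cookies the server sets; HTTPCookieProcessor
# sends them back on subsequent requests through this opener.
jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(jar)
)

# Some servers stall or reject the default Python user-agent string;
# a browser-like header sometimes avoids hangs that look like timeouts.
opener.addheaders = [("User-Agent", "Mozilla/5.0")]

try:
    # Placeholder URL; an explicit timeout raises instead of hanging.
    response = opener.open("http://example.com/", timeout=10)
    print("fetched %d bytes; %d cookies set" % (len(response.read()), len(jar)))
except OSError as exc:  # urllib.error.URLError subclasses OSError
    print("request failed:", exc)
```

If the same error 110 appears even with the opener above, the cause is more likely network-level (a firewall, or a required HTTP proxy that the Windows machine at work was configured to use) than cookie handling itself.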


Jeremy Martin

Oct 27 '05 #1
This discussion thread is closed.