
Downloading multiple csv files from a website

I'd like to download data from the website
http://www.russell.com/Indexes/performance/daily_values_US.asp. On
this web page, there are links to a number of .csv files, and I'd like
to download all of them automatically each day. The file names are not
visible on the page, but if I click on a link, a csv file opens in
Excel. I've searched this group and looked into urllib, but have not
found functions or code snippets that will allow me to download and
rename each file. Would someone kindly point me to appropriate
libraries/functions and/or code snippets that will get me started?

Thanks in advance

Thomas Philips

Aug 17 '07 #1
2 Replies


On Aug 17, 8:08 am, tkp...@hotmail.com wrote:
> [the original question, quoted in full]
This link shows how to extract a list of URLs:
http://www.java2s.com/Code/Python/Ne...inawebpage.htm

and this one shows how to download:

http://aspn.activestate.com/ASPN/Coo...n/Recipe/83208

Mike

Aug 17 '07 #2
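
Putting those two pieces together, here is a minimal sketch in modern Python 3
(urllib.request and html.parser; the 2007-era thread would have used
urllib/urllib2 and HTMLParser). The .csv-link test, the output directory, and
the file-naming scheme are assumptions about the page, so adjust them to match
what the actual links look like:

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen
import os

PAGE = "http://www.russell.com/Indexes/performance/daily_values_US.asp"
OUT_DIR = "russell_csv"   # assumed output directory; rename as needed

class CsvLinkCollector(HTMLParser):
    """Collect href attributes of <a> tags that end in .csv."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.lower().endswith(".csv"):
                    self.links.append(value)

page_html = urlopen(PAGE).read().decode("utf-8", errors="replace")
collector = CsvLinkCollector()
collector.feed(page_html)

os.makedirs(OUT_DIR, exist_ok=True)
for href in collector.links:
    url = urljoin(PAGE, href)        # resolve relative links against the page
    name = os.path.basename(href)    # or build a dated name of your own
    with open(os.path.join(OUT_DIR, name), "wb") as f:
        f.write(urlopen(url).read())

Scheduling this with cron (or the Windows Task Scheduler) would cover the
"automatically each day" part of the question.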

Our systems administrator suggested that I try wget, a GNU utility
designed to retrieve files over the web. It might prove to be the easiest way
to get the data I want, and I am going to try that first.

Thanks again.

Thomas Philips
Aug 17 '07 #3
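
For reference, one possible wget invocation along those lines, assuming the
.csv files are reachable as ordinary href links one level below that page:
-r recurses, -l 1 limits the depth to one level, -nd avoids recreating the
site's directory tree, -A csv keeps only .csv files, and -P names the local
output directory.

wget -r -l 1 -nd -A csv -P russell_csv \
     http://www.russell.com/Indexes/performance/daily_values_US.asp

If the downloads are generated by a server-side script rather than linked as
static .csv files, wget's link following may not pick them up, in which case
the Python sketch above is the fallback.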
