Bytes | Developer Community

Automating a tedious web browsing task

I have to download files every day from a web based application and
it's getting very repetitious. I would like to find a way to automate
the task. What I have to do each day is go to a website that starts
with a login screen. After logging in, the web application takes me to
a page with about 20 selection buttons. I have to start with the first
selection. That takes me to another page. That page has a menu bar
across the top. I have to go to the "Download" menu, which takes me to
a new page. On this page I have a list of files with dates. I then
look for any new files and download them by clicking on the file name.
After I've done this, I go back and repeat the activity for all 20 of
the selection buttons.

The pages are all javascript driven. I noticed by looking at the
source code that the files I need all seem to be just sitting on the C:
drive of a web server somewhere under a certain directory. This was
the code I saw for a checkbox that is next to a file I needed to
download:

<INPUT TYPE=CHECKBOX NAME="checkbox3"
value="C:\Downloads\SJFile\Report.pdf"
ONCLICK="addFile(this.form.checkbox3.value,this.form.checkbox3.checked),checkIfAllSelected()"
>
Then I also noticed

<a
href="/FA/jspHtml/MultipleDownload.jsp?act=download&file=%5CReport.pdf/"
onMouseOver="return makeTrue(domTT_activate(this, event, 'content',
'<b>Report.pdf</b><br>', 'statusText', 'Click to download'))"><tt><font
class="s2cB">Report.pdf</font></tt></a>

I've tried to get at the files by pasting these URLs into the address
bar but of course it takes me back to the login screen.
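(Pasting the URLs fails because the application almost certainly authenticates with a session cookie that is set when you log in; a bare request from the address bar or a script carries no such cookie, so the server bounces it to the login page. A minimal sketch of replaying the cookie, assuming a modern Node runtime with global fetch and Headers.getSetCookie(); the hostname, login path, and form fields are placeholders, not the real application's:)

```javascript
// Build a Cookie request header from the Set-Cookie headers that the
// login response sends back.  Keeps only the name=value pairs and
// drops attributes like Path or HttpOnly.
function cookieHeaderFrom(setCookieHeaders) {
  return setCookieHeaders
    .map(function (h) { return h.split(';')[0].trim(); })
    .join('; ');
}

// Sketch of the flow (hostname, paths, and form fields are guesses --
// substitute whatever the real login form actually posts):
async function fetchWithLogin() {
  var login = await fetch('https://app.example.com/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: 'user=dave&password=secret',
    redirect: 'manual'
  });
  var cookie = cookieHeaderFrom(login.headers.getSetCookie());
  // Replay the session cookie on every later request, including the
  // download links scraped from the page.
  return fetch(
    'https://app.example.com/FA/jspHtml/MultipleDownload.jsp?act=download&file=%5CReport.pdf/',
    { headers: { Cookie: cookie } }
  );
}
```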

Automating this seems like it should be simple enough, but at the
moment I'm stuck clicking on buttons and waiting endlessly.

Thanks for any advice....

Dave

Nov 23 '06 #1
"Dave" <da********@mail.com> wrote in
news:11**********************@l39g2000cwd.googlegroups.com:

> Thanks for any advice....

There's really not much you can do. What you have described is a horribly
designed user interface. Unfortunately, such interfaces are all too common.

It *might* be possible to use Greasemonkey in some way, but I'm not that
familiar with it, and it would involve more than just cutting and pasting.
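(If it helps, the heart of such a userscript could be little more than a filter over the page's links. A minimal sketch, assuming the real pages use the same MultipleDownload.jsp links as the snippet posted above; the @include pattern is a placeholder you'd replace with the actual site URL:)

```javascript
// ==UserScript==
// @name       Collect download links
// @namespace  example
// @include    http://app.example.com/*
// ==/UserScript==

// Pull every href that looks like a MultipleDownload.jsp link out of
// a list of anchor URLs.  Kept as a pure function so it can be tried
// outside the browser too.
function collectDownloadLinks(hrefs) {
  return hrefs.filter(function (h) {
    return h.indexOf('MultipleDownload.jsp?act=download') !== -1;
  });
}

// In the browser, Greasemonkey runs this against the live page after
// you have logged in; the guard lets the same file load elsewhere.
if (typeof document !== 'undefined') {
  var hrefs = Array.prototype.map.call(document.links, function (a) {
    return a.href;
  });
  collectDownloadLinks(hrefs).forEach(function (f) {
    console.log(f); // or queue each URL for download
  });
}
```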

Nov 23 '06 #2
I'm checking out Greasemonkey now. Thanks, it might be helpful....

Back in the old days I remember a Unix command-line program I used to
use called httpnab. Its argument was a URL and it would return the
contents of the page at that URL. It's probably not feasible nowadays
with JavaScript and frames etc., but I was envisioning something
similar to this. If I could get logged in, I could just "httpnab" the
URLs and use pattern matching to find the next page to go to and which
files to download.

Probably an overly simplistic idea these days, though....
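(That fetch-and-pattern-match instinct still works once a session is established. The matching step could be a regex over the fetched HTML; this sketch bases the pattern on the anchor markup posted above, so it would need adjusting if the real pages differ:)

```javascript
// Given the HTML of the download page, pull out every
// MultipleDownload.jsp href.  The pattern is taken from the snippet
// quoted earlier in the thread, not from the real application.
function findDownloadUrls(html) {
  var re = /href="(\/FA\/jspHtml\/MultipleDownload\.jsp\?act=download[^"]*)"/g;
  var urls = [];
  var m;
  while ((m = re.exec(html)) !== null) {
    urls.push(m[1]);
  }
  return urls;
}
```

Each returned URL would then be fetched with the login session's cookie attached, which is the modern equivalent of "httpnab"-ing the page.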
Good Man wrote:
"Dave" <da********@mail.com> wrote in
news:11**********************@l39g2000cwd.googlegroups.com:
Nov 24 '06 #3
