I'm moving an application from XP to Linux.
The XP version of the application uses COM automation to instantiate
an instance of IE6 and then drive it to various web sites. The pages
are generally fairly sophisticated: content driven by scripts,
frames, and so on. I need to measure the time between the start of
navigation and its completion, reference the DOM to extract certain
data from the page, possibly save the page (not just the HTML, but
exactly what you would get by manually doing a Save As), etc.
Because of the sophistication of the pages, LWP is not a viable
answer (no support for scripts or frames); likewise wget. It appears
that I'll need to automate a real browser (Mozilla?).
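
For context, the XP side boils down to roughly the following (a
minimal sketch, not the real code; the URL is a placeholder):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use Win32::OLE;
  use Time::HiRes qw(time);

  # Instantiate IE via COM and make it visible.
  my $ie = Win32::OLE->new('InternetExplorer.Application')
      or die "could not create IE instance\n";
  $ie->{Visible} = 1;

  # Time the navigation from start to completion.
  my $start = time();
  $ie->Navigate('http://example.com/');

  # Poll until IE reports the page (frames included) fully loaded
  # (READYSTATE_COMPLETE == 4).
  sleep 1 while $ie->{Busy} or $ie->{ReadyState} != 4;
  printf "navigation took %.2fs\n", time() - $start;

  # Reference the DOM, e.g. pull the page title.
  my $title = $ie->{Document}{title};
  print "title: $title\n";

This is the behavior I need to reproduce under Linux.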
I'd like to do this using Perl and Mozilla but am unsure how to
approach the problem. Is it possible? What is the general approach?
Are there any examples? I'm generally aware of plXPCOM and XPCOM but
am unclear on how to apply them to this task.
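
The closest thing I can see so far is Mozilla's X remote protocol,
which handles bare navigation but gives me no timing, DOM access, or
save-as (a sketch, assuming a Mozilla instance is already running):

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Crude navigation via Mozilla's -remote switch; anything beyond
  # this (timing, DOM, saving) seems to need XPCOM, hence the
  # question.
  my $url = 'http://example.com/';
  my $status = system('mozilla', '-remote', "openURL($url)");
  die "no running Mozilla instance found\n" if $status != 0;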
Thanks for any clues.
R