This would only be for my own pages. I've used an ActiveX construct
(included below) that scrapes tables out of the HTML and puts the data
into an Excel spreadsheet. This works well as long as the end user has the
right operating system/browser combination.
In another context, I've used Perl with the Spreadsheet::WriteExcel
and Spreadsheet::ParseExcel modules to generate Excel spreadsheets.
So, I am hoping to be able to send the HTML tables to a Perl CGI where
I can parse and write the data out to an Excel spreadsheet, similar to
the ActiveX method but without ActiveX's lack of portability.
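Here's a rough sketch of the client-side half of that idea: gather the cell
text from each large-enough table and submit it to the CGI through a hidden
form field. The CGI path ("/cgi-bin/make_xls.pl") and the field name
("tabledata") are made-up placeholders, not anything from a real setup.

```javascript
// Serialize a table (an array of row arrays) into tab-separated text,
// one line per row. This part has no DOM dependency.
function rowsToText(rows) {
  return rows.map(function (cells) {
    return cells.join('\t');
  }).join('\n');
}

// Browser-only part: walk the page's tables, keep the first one with
// more than 2 rows and more than 2 columns, and POST its serialized
// text to the CGI via a dynamically built form.
function sendTablesToCgi() {
  var tables = document.getElementsByTagName('table');
  for (var i = 0; i < tables.length; i++) {
    var rows = tables[i].rows;
    if (rows.length > 2 && rows[0].cells.length > 2) {
      var data = [];
      for (var r = 0; r < rows.length; r++) {
        var cells = [];
        for (var c = 0; c < rows[r].cells.length; c++) {
          cells.push(rows[r].cells[c].innerText.replace(/\r/g, ''));
        }
        data.push(cells);
      }
      var form = document.createElement('form');
      form.method = 'POST';
      form.action = '/cgi-bin/make_xls.pl'; // hypothetical CGI path
      var field = document.createElement('input');
      field.type = 'hidden';
      field.name = 'tabledata'; // hypothetical parameter name
      field.value = rowsToText(data);
      form.appendChild(field);
      document.body.appendChild(form);
      form.submit();
      return;
    }
  }
}
```

On the Perl side the CGI would just split the "tabledata" parameter on
newlines and tabs before handing the rows to Spreadsheet::WriteExcel.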
Here's the ActiveX example (I found this example somewhere; sorry, I
can't currently find where to correctly attribute it):
<A href=
"javascript:(function(){Ts=document.getElementsByTagName('table');
if(!Ts)alert('No%20tables.');else{try{X=new%20ActiveXObject('Excel.Application');X.visible=true;for(i=0;T=Ts[i];++i){Rs=T.rows;if(Rs.length>1&&Rs[0].cells.length>2){X.workBooks.add();for(r=0;R=Rs[r];++r){for(c=0;C=R.cells[c];++c){D=X.cells(r+1,c+1);
if(r==0){D.entireColumn.columnWidth=12;D.entireColumn.verticalAlignment=-4160;}D.value=C.innerText.replace(/\r/g,'');}}}}}catch(e){alert('Couldn\'t%20open%20Excel.');}}})();
">Download table to Excel</A><BR>
Thanks again for any help!
Lee <RE**************@cox.net> wrote in message news:<bu********@drn.newsguy.com>...
Matthew said:
I've been trying to find a way to gather up data contained in a table
or tables on a previously generated html page in order to send it to a
cgi for further processing. Ideally this would scrape the page for
the tables' data (or even just those tables with more than 2 rows and
2 columns) in a way that could be sent to the cgi using a form.
Thanks for any help or pointers in the right direction.
How are you planning to use this? To extract data from pages that
you visit on other sites, or from your own pages? In either case,
it would probably be easier to do it on the server than in a web
browser.
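If the scraping does move to the server, the core job is just pulling cell
text out of the HTML before writing the spreadsheet. Here is a deliberately
naive sketch of that extraction using regular expressions; a real CGI would
use a proper HTML parser (HTML::TableExtract on the Perl side, for example),
since regexes break on nested tables and malformed markup.

```javascript
// Naive table extraction from an HTML string: returns an array of
// tables, each table an array of rows, each row an array of cell
// strings. Illustration only -- not robust against nested tables.
function extractTables(html) {
  var tables = [];
  var tableRe = /<table[\s\S]*?<\/table>/gi;
  var rowRe = /<tr[\s\S]*?<\/tr>/gi;
  var cellRe = /<t[dh][^>]*>([\s\S]*?)<\/t[dh]>/gi;
  var t, r, c;
  while ((t = tableRe.exec(html)) !== null) {
    var rows = [];
    while ((r = rowRe.exec(t[0])) !== null) {
      var cells = [];
      while ((c = cellRe.exec(r[0])) !== null) {
        // Strip any nested tags inside the cell and trim whitespace.
        cells.push(c[1].replace(/<[^>]+>/g, '').trim());
      }
      rows.push(cells);
    }
    tables.push(rows);
  }
  return tables;
}
```

From there, writing the rows out with Spreadsheet::WriteExcel (or any
spreadsheet library) is a straightforward loop over rows and cells.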