Hello Everyone,
I am not sure if I am posting this to the right forum. If not, I apologize beforehand.
I want a script that would run on either Windows or Linux, gather information from a website, and save it to a file (maybe a .xls file). I want a program because I need to search more than a thousand pages. The good news is that every page has an identical layout. For example, the program browses http://www.xxxxxx.yyy/1.html, grabs the content of one value from inside the page, and saves it to a file. Then it goes to http://www.xxxxxx.yyy/2.html and does the same thing.
Can Perl do this?
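To make the request concrete, here is a rough sketch of the kind of loop I have in mind. It is written in Python only for illustration; Perl with LWP::Simple plus a CSV module could do the same. The base URL is the placeholder from above, and the `<span id="price">` element is a made-up assumption about where the value sits on each page. I save to .csv rather than .xls, since Excel opens CSV files directly.

```python
import csv
import re
import urllib.request

BASE_URL = "http://www.xxxxxx.yyy"  # placeholder URL from the post


def extract_value(html):
    """Pull the target value out of one page's HTML.

    Hypothetical assumption: the value lives in <span id="price">...</span>.
    Adjust the pattern (or use a real HTML parser) for the actual page layout.
    """
    m = re.search(r'<span id="price">([^<]*)</span>', html)
    return m.group(1) if m else None


def scrape(first, last, out_path):
    """Visit 1.html, 2.html, ... and write one row per page to a CSV file."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["page", "value"])
        for n in range(first, last + 1):
            url = f"{BASE_URL}/{n}.html"
            with urllib.request.urlopen(url) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            writer.writerow([n, extract_value(html)])
```

Since every page is identical, only the number in the URL changes on each pass, which is why a simple counter loop is enough.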
Thanks