It could very easily be that your parser is not identifying itself as a
web browser that their site recognises.
For example, take Panels. These usually render as a div, but only if the
.NET environment recognises the calling browser. If it doesn't, .NET drops
to "downlevel" mode and renders them as tables instead. (This caused me a
headache on a project last year: I had written the .NET page, and a remote
screen scraper was seeing it as tables, not divs.)
To get around it from your direction, you need to identify yourself as a
modern browser, i.e. send a User-Agent header the server recognises.
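As a minimal sketch of that idea (the URL is a placeholder, not their
actual site), you can attach a mainstream browser's User-Agent to the
request with nothing but the Python standard library:

```python
import urllib.request

# Hypothetical target page -- substitute the real ASPX URL you scrape.
URL = "http://example.com/page.aspx"

# ASP.NET sniffs the User-Agent; an agent it doesn't recognise gets the
# "downlevel" (table-based) markup, so we claim to be a normal browser.
UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"

def make_request(url: str) -> urllib.request.Request:
    """Build a request that reports itself as a modern browser."""
    return urllib.request.Request(url, headers={"User-Agent": UA})

req = make_request(URL)
# html = urllib.request.urlopen(req).read()  # uncomment to actually fetch
```

With that header in place the server should serve the same div-based
markup a real browser sees.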
If they are using LinkButtons to navigate their site, they are going to be
in for a lot of problems: search engines (just like you) don't click
LinkButtons. They can't submit the form, which leaves many pages hidden (a
trap that many inexperienced developers easily fall into). It could be
worth mentioning that to them.
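For completeness: a LinkButton navigates by calling __doPostBack, which
submits the form with the hidden ASP.NET fields plus the control's ID. A
scraper can imitate that by reading the hidden inputs off the page and
POSTing them back. A rough sketch (the regex-based extraction is a
simplification, and "ctl00$next" is a made-up control ID):

```python
import re
import urllib.parse

def extract_hidden_fields(html: str) -> dict:
    """Pull the ASP.NET hidden inputs (__VIEWSTATE etc.) out of a page."""
    fields = {}
    for name, value in re.findall(
            r'<input[^>]+name="(__[A-Z]+)"[^>]+value="([^"]*)"', html):
        fields[name] = value
    return fields

def build_postback(html: str, event_target: str) -> bytes:
    """Build the POST body a __doPostBack(event_target, '') call sends."""
    data = extract_hidden_fields(html)
    data["__EVENTTARGET"] = event_target   # the LinkButton's control ID
    data["__EVENTARGUMENT"] = ""
    return urllib.parse.urlencode(data).encode()
```

You would fetch a page, call build_postback with the ID of the "next"
LinkButton, POST the result to the same URL, and repeat for each page.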
--
Best regards,
Dave Colliver.
http://www.AshfieldFOCUS.com
~~
http://www.FOCUSPortals.com - Local franchises available
<ky*****@gmail.com> wrote in message
news:11*********************@q75g2000hsh.googlegroups.com...
Hi,
I wrote a program a long time ago that loaded a web site, parsed the
information on it, and built a database from that information.
Then the company that maintains the web site moved to aspx web
forms, and my old program no longer works, because the server recognizes
that there is no web browser at the other end and does not send the
appropriate page. I also used to parse the page to see if there was
more data and ask for the following pages. Now I cannot do that
either, because it requires that I somehow send a submit
command with the appropriate aspx info.
Is there a way to write a program that would fool the server into loading
the page, so that I can parse it again, ask for the next page, and so on?
Perhaps some examples?
Your help is appreciated.
Thanks.