Andrew,
What I want to do is show the search engines different content, not
prevent them from visiting my site.
The problem is that I have pages containing text in two languages; which one
is shown depends on the browser's preferred language and/or a selected
language saved in a cookie.
Doing it this way, I don't have to show URLs with ugly query strings like
http://www.mysite.com/default.aspx?lang=en
The problem with search engines is that they only see the default language
and can't switch languages to index the content in the other one.
My goal is to detect if the requester is a web crawler, and if it is, show
both languages. If not, continue the normal way.
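To make the idea concrete, here is a minimal sketch of that detection logic, written in Python for illustration (the real site would do the equivalent in ASP.NET). The user-agent substrings and function names are my own assumptions, not code from the linked post:

```python
# Assumption: major crawlers can be recognized by a substring of their
# User-Agent header. This list is illustrative, not exhaustive.
CRAWLER_SIGNATURES = ("googlebot", "msnbot", "slurp")

def is_major_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known major crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in CRAWLER_SIGNATURES)

def select_content(user_agent: str, preferred_lang: str) -> str:
    """Crawlers get both language versions; normal visitors get their
    preferred language (from the cookie or the browser header)."""
    if is_major_crawler(user_agent):
        return "both"           # render both languages on the same page
    return preferred_lang       # continue the normal single-language way
```

Usage: `select_content("Mozilla/5.0 (compatible; Googlebot/2.1)", "en")` returns `"both"`, while an ordinary browser's user agent returns `"en"`. Note that User-Agent strings can be spoofed, so this only distinguishes well-behaved crawlers.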
I have found an interesting post which I believe I will be able to use
(http://forums.asp.net/p/908519/1012090.aspx#1012090).
I should be able to modify it to detect the major search engines - those are
the only ones I'm interested in.
Thanks for the suggestion anyway,
Zolt
"Andrew Morton" wrote:
Zolt wrote:
I have a web site for which I want to show a different content for
search engine bots.
Rather than risk getting the site blacklisted by search engines, why not just
use a robots.txt file to exclude them?
Andrew