
Search engines crawling our .NET site

Our site gets searched by robots all the time. This is great. However,
many of our pages that we want to be cataloged are data driven, so we end up
with pages like:

www.ourdomain.com/products.aspx?productid=356

Let's assume that we stop selling product ID 356, which makes the URL
above invalid. If a user has bookmarked this page, or pastes a URL that
isn't quite right into a browser, we want them to get a "pretty" error
message. With this approach, however, a search engine like Google will
interpret the page as still being OK and will continue to catalog it.

Off the top of my head, I see two solutions:

1. Redirect to a bogus page that doesn't exist, so a 404 message displays.
Search engines would (hopefully?) remove the link from their catalog, but
users would not have a pleasing experience.
2. Throw an unhandled exception. An ugly ASP.NET error page is displayed.
This ticks off the user, and there's no guarantee the search engine
realizes that the page no longer exists.

Suggestions on how to handle this?

Thanks in advance.

Mark
Nov 19 '05 #1
3 Replies


Option 3:
Don't worry about it: send the customer to a page that says "we no longer
carry that product, but suggest this instead," and keep old products in a
"dead product" table with cross-sell possibilities.
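A minimal sketch of that idea in an ASP.NET code-behind. This is only an outline: `Catalog.GetProduct`, `Catalog.GetSuggestions`, the `Product` class, and the `SuggestionsRepeater`/`ProductPanel` controls are all hypothetical names standing in for your own data layer and page markup.

```csharp
// Products.aspx.cs -- sketch only; Catalog, Product, and the controls are assumed
protected void Page_Load(object sender, EventArgs e)
{
    int productId;
    if (!int.TryParse(Request.QueryString["productid"], out productId))
    {
        // Malformed URL: send the user to a friendly error page
        Response.Redirect("~/NotFound.aspx");
        return;
    }

    Product p = Catalog.GetProduct(productId);
    if (p == null)
    {
        // ID never existed: render the friendly page but signal 404,
        // so crawlers drop the URL while users still see real content
        Response.StatusCode = 404;
        Server.Transfer("~/NotFound.aspx");
    }
    else if (p.Discontinued)
    {
        // "Dead product" row: hide the product details, offer cross-sells
        ProductPanel.Visible = false;
        SuggestionsRepeater.DataSource = Catalog.GetSuggestions(p);
        SuggestionsRepeater.DataBind();
    }
}
```

The key point is that `Response.StatusCode` lets the same response carry friendly HTML for people and an honest status code for robots, so you don't have to choose between options 1 and 2 above.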
---

Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

***************************
Think Outside the Box!
***************************

"Mark" wrote:
[original question snipped]
Nov 19 '05 #2

Most search robots honor the robots meta tag; just include this on your
friendly page:

<META name="ROBOTS" content="NOINDEX">
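In ASP.NET 2.0 you can emit that tag only when the product is actually gone, using the `HtmlMeta` control (a sketch; the `productIsGone` check is assumed, and the page needs a `<head runat="server">`):

```csharp
// In Page_Load, after determining the product no longer exists (hypothetical check)
if (productIsGone)
{
    HtmlMeta robots = new HtmlMeta();
    robots.Name = "ROBOTS";
    robots.Content = "NOINDEX";
    Page.Header.Controls.Add(robots);  // Page.Header requires <head runat="server">
}
```

That way live product pages stay indexable and only the discontinued ones ask to be dropped.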

-- bruce (sqlwork.com)
"Mark" <Ma**@nowhere.com> wrote in message
news:u7**************@TK2MSFTNGP12.phx.gbl...
| [original question snipped]
Nov 19 '05 #3

Hi Mark,

You could also place a "robots.txt" text file in the root directory of
your app, containing this:

User-agent: *
Disallow: /products.aspx

That said, I find Gregory's solution very elegant, user-friendly, and
commercially sound, so I'd go that way. Out of the box...

HTH,

Michel

"Mark" <Ma**@nowhere.com> wrote in message news:<u7**************@TK2MSFTNGP12.phx.gbl>...
[original question snipped]

Nov 19 '05 #4
