Robots.txt issue

My robots.txt is as simple as below:

User-agent: *

Disallow: /auth

Is there something wrong with it? Yahoo Slurp still indexes content in
the auth folder...

Advice appreciated.

Jul 23 '05 #1
"Spondishy" <sp*******@tiscali.co.uk> wrote:
My robots.txt is as simple as below:

User-agent: *

Disallow: /auth

Is there something wrong with it? Yahoo Slurp still indexes content in
the auth folder...


Try:
Disallow: /auth/*

Note that not all bots respect robots.txt rules.
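For what it's worth, that rule would sit in the file like the sketch below. Treat the * in the path as a crawler-specific extension (some large crawlers such as Googlebot understand it; many bots do not):

User-agent: *
Disallow: /auth/*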

--
Spartanicus
Jul 23 '05 #2
"Spondishy" ,comp.infosyste ms.www.authoring.html:
My robots.txt is as simple as below:

User-agent: *

Disallow: /auth


There should not be any blank line between the User-agent: and Disallow:
lines; a blank line ends a record, so the Disallow line is no longer
attached to that User-agent. Cf. <URL:http://www.robotstxt.org/wc/norobots.html>.
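For reference, the corrected record reads:

User-agent: *
Disallow: /auth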
Jul 23 '05 #3
Spartanicus wrote:
Is there something wrong with it? Yahoo Slurp still indexes content in
the auth folder...

Try:
Disallow: /auth/*


The Robots Exclusion Protocol does not specify wildcards; some robots
support them as a proprietary extension.

The problem in the original post was probably the blank line between the
User-agent and the Disallow lines.
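If anyone wants to sanity-check a record before deploying it, here is a minimal sketch using Python's standard-library parser (urllib.robotparser; it was plain robotparser back in Python 2). The example.com URLs and the Slurp user-agent token are only illustrative:

from urllib.robotparser import RobotFileParser

# The corrected record from this thread, fed in as a list of lines.
rules = [
    "User-agent: *",
    "Disallow: /auth",
]

rp = RobotFileParser()
rp.parse(rules)

# Anything under /auth should be blocked for every user agent...
print(rp.can_fetch("Slurp", "http://www.example.com/auth/login"))   # False
# ...while the rest of the site stays crawlable.
print(rp.can_fetch("Slurp", "http://www.example.com/index.html"))   # True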

--
Klaus Johannes Rusch
Kl********@atmedia.net
http://www.atmedia.net/KlausRusch/
Jul 23 '05 #4
