Bytes | Software Development & Data Engineering Community

Google Page Rank mystery

I have a hobby website at:

http://www.montana-riverboats.com
which also resolves as:
http://montana-riverboats.com ...without the www.

One address has a Google page rank of three.
The other address has a Google page rank of four.
(at least according to the Firefox page rank plugin).

How can this be? They are the same site.
Does Google count the two (with and without www in the domain name)
as separate sites? Does this mean all the external links
pointing to the www version don't count for the non-www version,
when they should?
I thought Google was smart. How can they be so transparently
stupid, and not fix it? If they store domain names in
a B-Tree-like structure, it would be easy, and not all that
expensive, to detect duplicate addresses and get it right.

In fact, the runtime cost of duplicated
and fractured site information would, you would think, be
greater than that of storing it once...the right way.

--
/* Sandy Pittendrigh >--oO0>
** http://montana-riverboats.com
*/
Aug 4 '05 #1
"sandy" wrote:
I have a hobby website at:

http://www.montana-riverboats.com
(which has 1 inbound link)
which also resolves as:
http://montana-riverboats.com ...without the www.
(which has about 34)
One address has a Google page rank of three.
The other address has a Google page rank of four.
Probably because one has more inbound links than the other.
How can this be? They are the same site.
If they're the same site, then why are you publishing it at two different
URLs?
I thought Google was smart. How can they be so transparently
stupid, and not fix it?


If this is bothering you, then stop publishing the same content at different
URLs.

What makes you think this has anything to do with HTML authoring?

--
phil [dot] ronan @ virgin [dot] net
http://vzone.virgin.net/phil.ronan/
Aug 4 '05 #2
sandy wrote:
How can this be? They are the same site.
Does Google count the two (with and without www in the domain name)
as separate sites? Does this mean all the external links
pointing to the www version don't count for the non-www version,
when they should?


Think about it! *Exactly* how is Google supposed to determine that
<http://www.montana-riverboats.com> and <http://montana-riverboats.com>
are in fact the same site? Since the "www" prefix is only a convention,
and not always followed, it *can't* -- because they are *not*
necessarily the same site! Even if both URLs resolve to the same set of
IP addresses, they could still be separate sites (since, under HTTP
v1.1, the "host" header allows many different virtual sites on the same
server).

If you want them to be treated as the same site, *you* have to tell the
world that they're the same. I'd suggest setting up
<http://montana-riverboats.com/...> to respond to all requests with a
"moved permanently" redirection status to
<http://www.montana-riverboats.com/...>. IIRC this is easy with Apache,
and probably with other web server software. Do *not* try to use the
<META REFRESH...> hack -- it won't do what you need (as well as being
less efficient).
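The Host-header point is worth making concrete. A minimal sketch in Python (build_request is just an illustrative helper, and the hostnames are simply the ones from this thread):

```python
def build_request(host, path="/"):
    # Every HTTP/1.1 request carries a Host header; it is this header,
    # not the IP address the connection goes to, that selects which
    # virtual site the server responds with.
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n"
            "\r\n")

# Both of these could be sent to the very same IP address, yet the
# server is free to answer each with a completely different site:
req_www = build_request("www.montana-riverboats.com")
req_bare = build_request("montana-riverboats.com")
```

So from a spider's point of view, a shared IP address proves nothing by itself; only an explicit redirect from one hostname to the other does.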

Dave

Aug 4 '05 #3

Philip Ronan wrote:
If they're the same site, then why are you publishing it at two different
URLs?

...my shared host ISP does this in their httpd.conf, whether I like it
or not. This is common. Many ISPs do that. So the implied "www" mapping
is something that happens millions of times, all over the net.
Both URLs resolve to the same IP address. It wouldn't be hard for
Google to figure that out.
What makes you think this has anything to do with HTML authoring?

...I suppose I could have posted to a 'servers' group.
Someone there, however, would surely have complained that 'Google'
questions have nothing to do with servers. The author of a book
needs to know how to write, plus how to deal with publishers and
marketing.

Aug 4 '05 #4
On Thu, 04 Aug 2005 05:25:28 -0700, sandy
<sa********@slowtorture.spammers.com> wrote:
I have a hobby website at:

http://www.montana-riverboats.com
which also resolves as:
http://montana-riverboats.com ...without the www.

One address has a Google page rank of three.
The other address has a Google page rank of four.
(at least according to the Firefox page rank plugin).

How can this be? They are the same site.
Does Google count the two (with and without www in the domain name)
as separate sites?
This is off topic for an HTML group ... but here goes:

Perhaps it's the same site but, through links pointing its way, two
different URLs are shared for that same page. To help you understand:
www. - shared in a couple of links pointing your way - is being treated
as a subdomain.

Does this mean all the external links pointing to the www version don't count for the non-www version?
Yes. As a result you have split the PR for the main page across two
different URLs.
When they should?


Why should they, when one or two links out there use www. as part of
the URL they point to?

You need to ask the people linking your way using http://www.
(whether out of habit or from not knowing you wanted the non-www URL
linked to) to adjust the links so they point to the same URL,
or you could try using a redirect from the www URL to the non-www one.
The other search engines also treat www. as a subdomain when
following inbound links; that's why YOU, the site owner, need to
address the problem - not the search engines that are following links
"out there" to your site.

[snip of rest]

Carol

Aug 4 '05 #5
I'm inclined to turn the flame-o-doom on and suggest that you write to
Google and explain your solution to them, but then you seem to make
nice boats so I won't 8-)

These are two entirely separate sites. Maybe they're not different, but
they are separate. As far as Google knows (as far as Google _can_
know), then they're entirely separate.

Why they're separate is up to your ISP. They've chosen to configure
their servers so that both URLs return a 200 and appear to be sites
that are separate (and could even be different).

What they perhaps ought to do is to re-configure things so that one
site (I'd suggest www.) sends a 301 "Moved Permanently" and redirects
to the other.

What you should do is to chill out, then fix it. I wouldn't even bother
badgering the ISP over it (maybe a light moleing, or a brief going over
with a couple of squirrels). You can sort this out for yourself with a
few minutes Googlejuice, and editing yourself a .htaccess file. Then
be grateful you're not on IIS, where only the one-and-only admin can do
stuff like this. Isn't Apache wonderful?

Aug 4 '05 #6
"Sa***************@gmail.com" wrote:
Philip Ronan wrote:
If they're the same site, then why are you publishing it at two different
URLs?

...my shared host ISP does this in their httpd.conf, whether I like it
or not.


So just put up with it.
This is common. Many ISPs do that. So the implied "www" mapping
is something that happens millions of times, all over the net.
And how long did it take you to check out the behaviour of all these
millions of websites?
Both URLs resolve to the same IP address. It wouldn't be hard for
Google to figure that out.
Figure *what* out, exactly? Different websites often exist at the same IP
address and websites frequently serve different content in the "www"
subdomain. Here are some random examples:

http://www.ritualistic.com = 69.17.116.118
http://www.steveforman.com = 69.17.116.118

http://www.com != http://www.www.com

Google has no way of telling if you ever plan to serve different content in
these domains, so it's indexed them both.
What makes you think this has anything to do with HTML authoring?

...I could suppose I could have posted to a 'servers' group.
Someone there, however, would surely have complained that 'Google'
questions have nothing to do with servers.


alt.internet.search-engines is a good place to ask questions about search
engines. And if you have a question about the functioning of your web host's
servers, perhaps you could try asking your web host.
The author of a book needs to know how to write, plus how to deal with
publishers and marketing.


Your point being?? Actually don't answer that -- just take your
complaints/queries somewhere more suitable.

--
phil [dot] ronan @ virgin [dot] net
http://vzone.virgin.net/phil.ronan/
Aug 4 '05 #7
di*****@codesmiths.com wrote:
I'm inclined to turn the flame-o-doom on and suggest that you write to
Google and explain your solution to them, but then you seem to make
nice boats so I won't 8-)

These are two entirely separate sites. Maybe they're not different, but
they are separate. As far as Google knows (as far as Google _can_
know), then they're entirely separate.

Why they're separate is up to your ISP. They've chosen to configure
their servers so that both URLs return a 200 and appear to be sites
that are separate (and could even be different).

What they perhaps ought to do is to re-configure things so that one
site (I'd suggest www.) sends a 301 "Moved Permanently" and redirects
to the other.

What you should do is to chill out, then fix it. I wouldn't even bother
badgering the ISP over it (maybe a light moleing, or a brief going over
with a couple of squirrels). You can sort this out for yourself with a
few minutes Googlejuice, and editing yourself a .htaccess file. Then
be grateful you're not on IIS, where only the one-and-only admin can do
stuff like this. Isn't Apache wonderful?


In order to save you from an overdose of Googlejuice:

I did this in my .htaccess file

--------------------------------
RewriteEngine On

RewriteCond %{HTTP_HOST} ^odahoda\.de$
RewriteRule ^(.*)$ http://www.odahoda.de/$1 [redirect=permanent,last]
--------------------------------

If you get a '500 Internal server error' after this change, mod_rewrite is
not installed and you'll have to drink another cup of Googlejuice...
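Adapted to the domain in this thread (a sketch, untested, and again assuming mod_rewrite is available; note it redirects in the opposite direction, www to bare, to match the original poster's apparent preference):

```apache
RewriteEngine On

# 301-redirect every request for the www hostname to the bare domain,
# so spiders merge the two URLs into one site.
RewriteCond %{HTTP_HOST} ^www\.montana-riverboats\.com$
RewriteRule ^(.*)$ http://montana-riverboats.com/$1 [redirect=permanent,last]
```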

--
Benjamin Niemann
Email: pink at odahoda dot de
WWW: http://www.odahoda.de/
Aug 4 '05 #8

Philip Ronan wrote:

...reasonable stuff.

Ok I get it now. Thank you for the help.
I'll try to keep on topic in the future.

Aug 4 '05 #9
On Thu, 04 Aug 2005 14:19:48 GMT, Carol W <fr******@nomail.com> wrote:
On Thu, 04 Aug 2005 05:25:28 -0700, sandy
<sa********@slowtorture.spammers.com> wrote:
I have a hobby website at:

http://www.montana-riverboats.com
which also resolves as:
http://montana-riverboats.com ...without the www.

One address has a Google page rank of three.
The other address has a Google page rank of four.
(at least according to the Firefox page rank plugin).

How can this be? They are the same site.
Does Google count the two (with and without www in the domain name)
as separate sites?


This is off topic for an HTML group ... but here it goes:


You're back and you're all over, eh?

BB
--
www.kruse.co.uk/ se*@kruse.demon.co.uk
Elvis does my seo
--
Aug 5 '05 #10


Think about it! *Exactly* how is Google supposed to determine that
<http://www.montana-riverboats.com> and <http://montana-riverboats.com>
are in fact the same site?

$domainA = "www.montana-riverboats.com";
($domainB = $domainA) =~ s/www\.//;
$page1 = wget($domainA);
$page2 = wget($domainB);

if (diff($page1, $page2) eq "" && $ip_address1 eq $ip_address2)
{
    assumeSameSite($ip_address1, $ip_address2);
}

--
/* Sandy Pittendrigh >--oO0>
** http://montana-riverboats.com
*/
Aug 6 '05 #11
$domainA = "www.montana-riverboats.com";
($domainB = $domainA) =~ s/www\.//;
$page1 = wget($domainA);
$page2 = wget($domainB);

if (diff($page1, $page2) eq "" && $ip_address1 eq $ip_address2)
{
    assumeSameSite($ip_address1, $ip_address2);
}


....in other words, once you know diff($page1, $page2) eq ""
and both share the same IP address, then you (the spider)
have to make an assumption, whether you like it or not.
If you still assume the two domains are different sites,
you will be wrong more often than not.

The above (messy pseudo code) isn't foolproof.
But the idea is valid.
And it *would* be more accurate--over the long run--than
the ignorant assumption. That's how spidering
works: you don't always get it right (query 'rubbers' and
you get links to both galoshes and condoms).
The point is to get it right more often than not.
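That trade-off can be sketched in a few lines of Python (likely_same_site is purely illustrative; nothing here claims any search engine actually works this way):

```python
import hashlib


def likely_same_site(page_a, page_b, ip_a, ip_b):
    """Heuristic: identical front pages served from hosts that resolve
    to the same IP address are *probably* one site.  A statistical bet,
    not a proof -- which is exactly the point being argued."""
    same_content = hashlib.md5(page_a).digest() == hashlib.md5(page_b).digest()
    return same_content and ip_a == ip_b
```

It will occasionally be wrong (the false-positive case objected to above), but the claim here is that over a large crawl it errs less often than always assuming the hosts are distinct.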

--
/* float like a mayfly, sting like a hook >--oO0>
** http://montana-riverboats.com
*/
Aug 6 '05 #12
sandy wrote:
$domainA = "www.montana-riverboats.com";
($domainB = $domainA) =~ s/www\.//;
$page1 = wget($domainA);
$page2 = wget($domainB);

if (diff($page1, $page2) eq "" && $ip_address1 eq $ip_address2)
{
    assumeSameSite($ip_address1, $ip_address2);
}
...in other words, once you know diff($page1, $page2) eq ""
and both share the same IP address, then you (the spider)
have to make an assumption, whether you like it or not.
If you still assume the two domains are different sites,
you will be wrong more often than not.

The above (messy pseudo code) isn't foolproof.
But the idea is valid.


Ah, I see -- it doesn't matter to you if they screw up some unknown
number of validly-configured other sites, as long as they allow you to
be lazy.

I'm *very* glad that Google isn't stupid enough to do this.
And it *would* be more accurate--over the long run--than
the ignorant assumption. That's how spidering
works: you don't always get it right (query 'rubbers' and
you get links to both galoshes and condoms).
The point is to get it right more often than not.


There's a *huge* difference between being unable to distinguish
different meanings of the same text and deliberately introducing errors.

In any case, why should *they* make an extra effort to handle badly
something that *you* could easily configure?

Dave

Aug 6 '05 #13
In article <77********************@bresnan.com>,
sandy <sa********@slowtorture.spammers.com> wrote:
I have a hobby website at:

http://www.montana-riverboats.com
which also resolves as:
http://montana-riverboats.com ...without the www.

One address has a Google page rank of three.
The other address has a Google page rank of four.
(at least according to the Firefox page rank plugin).

How can this be? They are the same site.
Does Google count the two (with and without www in the domain name)
as separate sites? Does this mean all the external links
pointing to the www version don't count for the non-www version,
when they should?
I thought Google was smart. How can they be so transparently
stupid, and not fix it? If they store domain names in
a B-Tree-like structure, it would be easy, and not all that
expensive, to detect duplicate addresses and get it right.

In fact, the runtime cost of duplicated
and fractured site information would, you would think, be
greater than that of storing it once...the right way.


My post is a little old in this thread, but I'd relax if I were you. A
google on "Montana riverboats" resolves to you. So does "riverboats in
Montana". That's what people look for in Google.

leo

--
<http://web0.greatbasin.net/~leo/
Aug 7 '05 #14
Ah, I see -- it doesn't matter to you if they screw up some unknown
number of validly-configured other sites, as long as they allow you to
be lazy.

I'm *very* glad that Google isn't stupid enough to do this.
I think we've beaten this off-topic horse to death already.
But what the hell. You seem to be getting upset about this.
So I can't resist kicking you while you're down.
It's just too easy.

..."screw up some unknown number of validly-configured other sites"?
Something tells me you're not a programmer. Or if you are, not
an experienced one:

1) Shared-host ISPs often (usually, in fact) alias www.domainname.suffix
to domainname.suffix. It happens all over the net. We're talking
about a common phenomenon here.

2) I have little or no control over the external links people make
to my site. People make links to my site and they often put the "www"
onto the front link out of habit. I can't control that.

A side effect is that my page rank doesn't get incremented by Google,
because Google counts www.domainname.suffix as different from
domainname.suffix. Even though it would be easy to make
a statistically more accurate assumption.

3) Yes, sometimes the same IP addresses map to entirely
different sites. But when
diff(wget(domain1), wget(domain2)) == "" and the two domains
share the same IP address, then the intelligent assumption is that
they are the same site. The chance that two semantically *different*
sites would diff to null is similar to the chance that
two 6-byte strings might map to the same MD5 hash: it ain't gonna
happen, ever. Not in your lifetime or mine.

And it *would* be more accurate--over the long run--than
the ignorant assumption. That's how spidering
works: you don't always get it right (query 'rubbers' and
you get links to both galoshes and condoms).
The point is get it right more often than not.

There's a *huge* difference between being unable to distinguish
different meanings of the same text and deliberately introducing errors.

In any case, why should *they* make an extra effort to handle badly
something that *you* could easily configure?

Dave

--
/* Sandy Pittendrigh >--oO0>
** http://montana-riverboats.com
*/
Aug 14 '05 #15
On Sat, 13 Aug 2005 18:41:09 -0700, sandy
<sa********@slowtorture.spammers.com> wrote:

It happens all over the net. We're talking
about a common phenomenon here.
So why not take the common solutions to the problem instead of sitting
back and waiting for someone else to do it for you?
2) I have little or no control over the external links people make
to my site. People make links to my site and they often put the "www"
onto the front link out of habit. I can't control that.
Yes you DO and you CAN; you have been told the solution to that
dilemma by at least 2, maybe 3, people a week or so ago, but you prefer
to think the search engines should assume or guess rather than
informing the search engines, via the spiders, of your preferences
for your site.

www is treated as a subdomain - and _not_ just by Google but also by
other search engines. This is not news; it has been treated that way
for years. You chose to go the non-www route, so YOU need to inform
the search engines of that preference.
A side effect is that my page rank doesn't get incremented by Google,
because Google counts www.domainname.suffix as different than
domanname.suffix. Even though it would be easy to figure out
a statistically more accurate assumption.


Then, again, YOU, as the site owner, need to take the time to inform
Google, and the other search engines, of your preferences - in this
case, not using the www subdomain being _your_ preference - by
using .htaccess, as already recommended to you, and/or by having the
sites that mislinked your way adjust those links for you, which was
also suggested.

It's up to YOU, not the search engine, to share the information about
your site. If you understood spidering and how it is done, then you
would understand the site owner's role in search engine optimization -
which is why you are interested in page rank, isn't it? - thinking
it will help your ranking.

However, even if the problem created by the erroneous links is
resolved, I doubt your page rank will rise. You have a PR5 and a PR2
(split across two URLs), but that will not add up to PR7; it will
likely remain a PR5, as it only takes maybe 1 or 2 links to get a PR2
going (depending on whether any mirrored usenet posts count as
backlinks from the times you shared the "undesired" URL). A PR4 can
beat out a PR5 or even a PR6 - so PR isn't the "be-all and end-all"
it was a year to a year and a half ago. Part of the Google algo, yes,
but not like it once was.

So instead of complaining about Google not doing it "your way" (by
using some kind of script you thought up in a few minutes), work with
them and put in .htaccess info that redirects to the non-www URL. It
will take only 3 to 5 minutes of your time, including uploading it to
your site, versus the amount of time spent sharing complaints about
something that is and has been in your control all this time.

Follow-up set to the more appropriate group for this
thread/question/complaint.

HTH, HAND

Carol
Aug 14 '05 #16
On Sat, 13 Aug 2005 18:41:09 -0700, sandy
<sa********@slowtorture.spammers.com> wrote:
2) I have little or no control over the external links people make
to my site.


Yes you do. Serve 301 redirects to those you don't like, pointing to
those you do. This will fix everything you want.

Aug 14 '05 #17
Andy Dingley wrote:
...something

Alexa, ironically, treats the two exactly
the same.
--
/* Sandy >--oO0>
**
*/
Aug 16 '05 #18
sandy wrote:
I think we've beaten this off topic horse to death already.
I'm certainly getting tired of your apparent belief that everyone else
should do whatever you think is most convenient for you, regardless of
any harm done to others. Unless you come up with something new, or show
signs of being willing to solve your own problems, I won't be responding
again.
1) shared host ISPs often (usually do) alias www.domainname.suffix
to domainname.suffix. It happens all over the net. We're talking
about a common phenomenon here.
This is obviously true.
2) I have little or no control over the external links people make
to my site. People make links to my site and they often put the "www"
onto the front link out of habit. I can't control that.
This, however, is false. (Well, you can't control what URLs are
originally requested, but you *can* control the URLs to which those
requests eventually resolve -- which is all that you need.) You've been
told by several people, myself included, how you can easily do this.

If you don't understand what you've been told or how it applies to your
situation, you'd do better to admit it and ask for clarification than to
ignore the information.
A side effect is that my page rank doesn't get incremented by Google,
because Google counts www.domainname.suffix as different than
domanname.suffix. Even though it would be easy to figure out
a statistically more accurate assumption.
"Statistically more accurate" admits that your assumption is wrong part
of the time. Anyone with the overall good of the web at heart will try
to follow the principle of "first, do no harm" -- which knowingly
introducing errors violates. If there were some major advantage which
couldn't be achieved in some other way, it might be worth violating this
principle -- but your particular issue could easily be resolved if *you*
were willing to do a small amount of work.
3) Yes, sometimes the same IP addresses map to entirely
different sites. But when
diff(wget(domain1), wget(domain2)) == "" and the two domains
share the same IP address, then the intelligent assumption is that
they are the same site. The chance that two semantically *different*
sites would diff to null is similar to the chance that
two 6-byte strings might map to the same MD5 hash: it ain't gonna
happen, ever. Not in your lifetime or mine.


Do you have *any* idea of just how much extra work you're asking search
engines to do? Not to mention that, since getting the two sites
wouldn't be simultaneous, the two versions of a site that is being
actively updated might well not match...

Dave

Aug 18 '05 #19
Do you have *any* idea of just how much extra work you're asking search
engines to do?


It's not much extra work at all, actually.
The B-tree structures they use to keep track of all this stuff,
combined with the nearly continual spidering they do
in any case, make it a snap.

As I already mentioned, Alexa already does this.
Do you have any idea what you're talking about?

--
/* Sandy Pittendrigh >--oO0>
** http://montana-riverboats.com
*/
Aug 20 '05 #20
begin quotation
from sandy <sa********@slowtorture.spammers.com>
in message <77********************@bresnan.com>
I have a hobby website at:

http://www.montana-riverboats.com
which also resolves as:
http://montana-riverboats.com ...without the www.

One address has a Google page rank of three.
The other address has a Google page rank of four.
(at least according to the Firefox page rank plugin).

How can this be? They are the same site.
No, they aren't. They are two different hostnames, thus two different
sites.

Redirect the latter to the former, and you should be fine. You really
should only use bare domains for mail, not as a separate hostname.
I thought Google was smart. How can they be so transparently
stupid, and not fix it?


Google is pretty smart. It's Webmasters (not just you) that are stupid
in expecting Google to aggregate results for two different hostnames.

--
___ _ _____ |*|
/ __| |/ / _ \ |*| Shawn K. Quinn
\__ \ ' < (_) | |*| sk*****@speakeasy.net
|___/_|\_\__\_\ |*| Houston, TX, USA
Aug 20 '05 #21

Shawn K. Quinn wrote:
Redirect the latter [www.] to the former [bare], and you should be fine.
Why that order? (yes, I always do it the other way)
You really
should only use bare domains for mail, not as a separate hostname.


Why?

Aug 22 '05 #22
begin quotation
from di*****@codesmiths.com <di*****@codesmiths.com>
in message <11**********************@g43g2000cwa.googlegroups .com>
posted at 2005-08-22T11:46

Shawn K. Quinn wrote:
Redirect the latter [www.] to the former [bare], and you should be fine.


Why that order? (yes, I always do it the other way)


Err, if this is really what I said, I got the order backwards. The bare
domain should redirect to hostname www.
You really should only use bare domains for mail, not as a separate
hostname.


Why?


It's better form.

--
___ _ _____ |*|
/ __| |/ / _ \ |*| Shawn K. Quinn
\__ \ ' < (_) | |*| sk*****@speakeasy.net
|___/_|\_\__\_\ |*| Houston, TX, USA
Aug 22 '05 #23
Shawn K. Quinn wrote:
Err, if this is really what I said, I got the order backwards.
No, sorry - I labelled them incorrectly
The bare domain should redirect to hostname www.
That's what I thought. But why ? www. is an old kludge and we're past
it now. I build sites to respond to either, with a 301 pointing from
the www. to the bare domain. The only time I use "www." is when I'm
printing stationery for a non-tech user community that won't recognise
a web address as one unless it begins with "www."

I certainly don't enter addresses with the www. I get quite surprised
these days when a site doesn't respond to it (and a Firefox extension
then tells me and goes there anyway).
It's better form.


Again, why? That answer is just tautological - _Why_ is it better form
? What's the advantage?

Aug 22 '05 #24
begin quotation
from di*****@codesmiths.com <di*****@codesmiths.com>
in message <11*********************@g47g2000cwa.googlegroups. com>
posted at 2005-08-22T13:18
Shawn K. Quinn wrote:
Err, if this is really what I said, I got the order backwards.
No, sorry - I labelled them incorrectly
The bare domain should redirect to hostname www.


That's what I thought. But why ?


Because "www" is the canonical hostname for a Web server.
www. is an old kludge and we're past it now.
It's not an old kludge. There may come a time when you want to run other
services besides a Web server on that domain. You may wish to run more
than one site under the same domain.
I build sites to respond to either, with a 301 pointing from the www.
to the bare domain.
Surprising as it may be to you and some other people, a domain name is
not just a "Web address". Apparently you have fallen for some of the
marketing-speak some registrars have floated.
The only time I use "www." is when I'm printing stationery for a
non-tech user community that won't recognise a web address as one
unless it begins with "www."


Even then, the Web address (URL) begins with "http://", not "www.".
Getting in the habit of leaving off the scheme of a URL will bite you
when it comes time to make an external link and you forget there. This
is why I always type in "http://". Browsers let you get away with
entering just a hostname, but there is always the possibility they will
mis-guess the protocol, especially for a hostname such as "news" or
"ftp".

--
___ _ _____ |*|
/ __| |/ / _ \ |*| Shawn K. Quinn
\__ \ ' < (_) | |*| sk*****@speakeasy.net
|___/_|\_\__\_\ |*| Houston, TX, USA
Aug 22 '05 #25
Shawn K. Quinn wrote:
Because "www" is the canonical hostname for a Web server.
Rubbish. There is absolutely _no_ notion of "www" as a canonical name
for a web server. Servers have canonical names, and CNAMEs may give
aliases for them, but there's nothing more than a vague and obsolete
old convention that web servers were named "www." and absolutely
nothing that binds the canonical name to the www. form of the name more
than any other.
It's not an old kludge. There may come a time when you want to run other
services besides a Web server on that domain.
That's what ports are for. Or subdomains, should I wish to do it that
way.
The only time I use "www." is when I'm printing stationery for a
non-tech user community that won't recognise a web address as one
unless it begins with "www."


Even then, the Web address (URL) begins with "http://", not "www.".


Only if you understand such things. I speak geek on usenet, but base
marketing on what works for the ignorant masses (and if I had a horse,
I'd probably speak German to it). Some of my best customers can't even
find the keys for "://"
Getting in the habit of leaving off the scheme of a URL will bite you
when it comes time to make an external link and you forget there.


Then don't make glaring mistakes, or at least test for them afterwards.
It's a poor excuse to paint "http://" on the side of a truck because a
careless HTML coder might otherwise forget to put it in!

Incidentally, do any other Brits remember the late-90s haulage firm
whose red & white trucks had "email: jrt.co.uk" on their tailgates for
several years before anyone corrected them?

Aug 22 '05 #26
begin quotation
from di*****@codesmiths.com <di*****@codesmiths.com>
in message <11**********************@z14g2000cwz.googlegroups .com>
posted at 2005-08-22T15:11
Shawn K. Quinn wrote:
Because "www" is the canonical hostname for a Web server.
Rubbish. There is absolutely _no_ notion of "www" as a canonical name
for a web server. Servers have canonical names, and CNAMEs may give
aliases for them, but there's nothing more than a vague and obsolete
old convention that web servers were named "www." and absolutely
nothing that binds the canonical name to the www. form of the name more
than any other.


"Vague" and "obsolete" is a matter of opinion. FTP servers have
been named or aliased to "ftp.<domain>" since the beginning. Gopher
servers, back before the Web became what it is, were often named or
aliased to "gopher.<domain>". DNS nameservers are still, to this
day, named or aliased to names like "ns1.<domain>" (increasing the
number as appropriate). Usenet news servers are named or aliased to
"news.<domain>". I could go on and on here... but you see what I'm
getting at? If not, how many different services could you run on *one*
computer called "example.com"? Do you really want to pay $8 per year for
each *computer* on your network to have its own domain name, instead of
naming them all under one domain?
It's not an old kludge. There may come a time when you want to run other
services besides a Web server on that domain.


That's what ports are for. Or subdomains, should I wish to do it that
way.


I think you misunderstand what a subdomain really is, and meant to say
hostnames under the same domain.

Asking people to remember "http://example.com:8000 for software updates"
is ludicrous.
> The only time I use "www." is when I'm printing stationery for a
> non-tech user community that won't recognise a web address as one
> unless it begins with "www."


Even then, the Web address (URL) begins with "http://", not "www.".


Only if you understand such things.


No, URLs always begin with a scheme identifier. Always. RTFRFCs.
I speak geek on usenet, but base marketing on what works for the
ignorant masses (and if I had a horse, I'd probably speak German to
it). Some of my best customers can't even find the keys for "://"


Then they need typing lessons. That's not your fault as a World Wide Web
site author.
Getting in the habit of leaving off the scheme of a URL will bite you
when it comes time to make an external link and you forget there.


Then don't make glaring mistakes, or at least test for them afterwards.
It's a poor excuse to paint "http://" on the side of a truck because a
careless HTML coder might otherwise forget to put it in!


Poor excuse? I'd happily be the first to do business with a company that
actually knows what a URL is instead of just painting a hostname on
their trucks.

Relying on the browser to guess "http://" eventually leads to stuff
like:

<a href="www.yahoo.com">Go to Yahoo!</a>

which, for many good reasons, does *not* go to Yahoo!
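The failure mode is easy to demonstrate: a scheme-less href is resolved as a relative reference against the current page, not as a hostname. A short sketch using Python's standard `urllib.parse.urljoin` (base URL invented for illustration):

```python
from urllib.parse import urljoin

# The page that contains the link
base = "http://example.org/page.html"

# A scheme-less href is treated as a *relative path* on the same site
print(urljoin(base, "www.yahoo.com"))
# -> http://example.org/www.yahoo.com  (never leaves example.org!)

# With the scheme, the reference is absolute and really goes to Yahoo!
print(urljoin(base, "http://www.yahoo.com/"))
# -> http://www.yahoo.com/
```

So the link lands on a (probably nonexistent) page of the linking site, which is exactly why the scheme matters in markup even if browsers guess it in the address bar.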

--
___ _ _____ |*|
/ __| |/ / _ \ |*| Shawn K. Quinn
\__ \ ' < (_) | |*| sk*****@speakeasy.net
|___/_|\_\__\_\ |*| Houston, TX, USA
Aug 22 '05 #27
On Mon, 22 Aug 2005, Shawn K. Quinn wrote:
No, URLs always begin with a scheme identifier. Always. RTFRFCs.

Relying on the browser to guess "http://" eventually leads to stuff
like:
<a href="www.yahoo.com">Go to Yahoo!</a>
which, for many good reasons, does *not* go to Yahoo!


BTW:
On http://groups.google.com/group/alt.html , there's a link to

http://www.authoring.html/

--
Top-posting.
What's the most irritating thing on Usenet?

Aug 22 '05 #28
begin quotation
from Andreas Prilop <nh******@rrzn-user.uni-hannover.de>
in message <Pi**************************************@s5b004.rrzn-user.uni-hannover.de>
posted at 2005-08-22T16:13
On Mon, 22 Aug 2005, Shawn K. Quinn wrote:
No, URLs always begin with a scheme identifier. Always. RTFRFCs.

Relying on the browser to guess "http://" eventually leads to stuff
like:
<a href="www.yahoo.com">Go to Yahoo!</a>
which, for many good reasons, does *not* go to Yahoo!


BTW:
On http://groups.google.com/group/alt.html , there's a link to

http://www.authoring.html/


This is the result of brain-dead software that assumes everything with
"www" or "www." in it should be made into a link to a Web server with
that hostname, and a good reason for differentiating between real URLs
and a hostname with a path slapped on the end (HNWAPSOTE).

--
___ _ _____ |*|
/ __| |/ / _ \ |*| Shawn K. Quinn
\__ \ ' < (_) | |*| sk*****@speakeasy.net
|___/_|\_\__\_\ |*| Houston, TX, USA
Aug 22 '05 #29
http://no-www.org/
http://www.yes-www.org/

FWIW, my site does not have www.

--
Henri Sivonen
hs******@iki.fi
http://hsivonen.iki.fi/
Mozilla Web Author FAQ: http://mozilla.org/docs/web-developer/faq.html
Aug 22 '05 #30
Henri Sivonen wrote:
http://no-www.org/
http://www.yes-www.org/

FWIW, my site does not have www.


I include www. in the URIs for my pages from old habit. I am going to
start omitting it.

--
James Pickering
http://jp29.org/

Aug 22 '05 #31
On Mon, 22 Aug 2005 11:06:01 -0500, "Shawn K. Quinn"
<sk*****@speakeasy.net> wrote:
begin quotation
from di*****@codesmiths.com <di*****@codesmiths.com>
in message <11**********************@z14g2000cwz.googlegroups.com>
posted at 2005-08-22T15:11
Shawn K. Quinn wrote:
Because "www" is the canonical hostname for Web server.
"Vague" and "obsolete" is a matter of opinion.
Indeed. Which is why your very vehement claim that only a "www" name is
correct is over-stating things.
You really
should only use bare domains for mail, not as a separate hostname.
Assuming RFC 2119 usage, that "should" is a strong statement. There's
nothing to back this up.
FTP servers have been named or aliased to "ftp.<domain>" since the beginning. Gopher
servers, back before the Web became what it is, were often named or
aliased to "gopher.<domain>".
Neither FTP nor Gopher was ever the mainstream protocol. Back in the
day, email was the thing, and so the bare name was conventionally used
for the most significant protocol. Less convenient names, such as
ftp.example.org, were used for the less important protocols. In the
early '90s, this upstart new web protocol was obviously one of those
less important services, so it too took a prefixed name: www.

Nowadays though, the web has supplanted email as the primary protocol.
So it's only reasonable that it should encroach upon the "unadorned"
name (and after all, our machines understand ports). The world now
contains "The Intaweb" and "the Internerd" - they're different places
and although one is implemented by the rules of another, the wetware and
its user experience is quite different.

It's not an old kludge. There may come a time when you want to run other
services besides a Web server on that domain.
So do it. Name them ftp.example.org if you wish. We're talking about a
convention for naming web servers.

That's what ports are for. Or subdomains, should I wish to do it that
way.


I think you misunderstand what a subdomain really is, and meant to say
hostnames under the same domain.


No, if I'd meant hostnames I'd have said hostnames. Your meaning of
"hostname" is incorrect anyway: DNS doesn't understand hostnames (such
as "zaphod" in the machine named zaphod.example.org); it just sees all
of these parts as "subdomains", whether or not they represent a physical
machine, and whether or not they're the terminal part of the full name.
As far as DNS is concerned, it's all just fragments that it must treat
as a tree of identifiers.

Additionally, the term "hostname" is an old term defined by RFC 952,
which has clearly been superseded now that names are permitted to begin
with a digit (contrary to 952). "8ball.example.org" is a perfectly legal
domain name, but "8ball" certainly can't be an RFC 952 "hostname".

Asking people to remember "http://example.com:8000 for software updates"
is ludicrous.
Indeed so - but who is suggesting this straw man argument ? That URL
would better be provided as http://example.com/updates/ or similar.
There's no reason to confuse a navigational structure with a transport
protocol and it's bad usability design to have done this.

Of course you _could_ do this. It's valid, and indeed many ASP
developers do exactly that (IIS has poor support for DNS and is easier
to configure if you select by ports, particularly during development)
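One common way to publish the friendlier http://example.com/updates/ while the service itself still listens on its own port is a reverse proxy. A minimal, hypothetical nginx sketch (hostname and port invented for illustration):

```nginx
server {
    listen 80;
    server_name example.com;

    # Visitors see http://example.com/updates/ ...
    location /updates/ {
        # ... while the update service actually listens on port 8000.
        proxy_pass http://127.0.0.1:8000/;
    }
}
```

The navigational structure stays in the path, and the transport detail (the port) never reaches the user.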

> The only time I use "www." is when I'm printing stationery for a
> non-tech user community that won't recognise a web address as one
> unless it begins with "www."

Even then, the Web address (URL) begins with "http://", not "www.".


"Web addresses" aren't URLs. URLs make the protocol work (and have
rigid rules), "web addresses" are a human-factors question and are typed
into browsers.

It is incorrect to enter a URL without the protocol scheme -- but how
many times have you typed that into a browser lately ? I don't type
"HELO" much either - I have machines to do that for me.

The only question here, apart from some geek posturing, is whether the
user usability issues favour "example.org" or "www.example.org" as the
target of site-name redirections, and for printed text versions of URLs.
Relying on the browser to guess "http://" eventually leads to stuff
like:

<a href="www.yahoo.com">Go to Yahoo!</a>


Rubbish. Conscious thought about usability for your users is no excuse
for sloppy coding, nor is it contagious.
--
Cats have nine lives, which is why they rarely post to Usenet.
Aug 22 '05 #32


Andy Dingley wrote:
The only question here, apart from some geek posturing, is whether the
user usability issues favour "example.org" or "www.example.org" as the
target of site-name redirections, and for printed text versions of URLs.


One of the most valid reasons for using "www" is that you can do this:

example.com. IN NS nameserver-01.example.net.
example.com. IN NS nameserver-02.example.net.
www.example.com. IN CNAME webserver-01.example.net.

...but not this:

example.com. IN NS nameserver-01.example.net.
example.com. IN NS nameserver-02.example.net.
example.com. IN CNAME webserver-01.example.net.

Thor

--
http://www.anta.net/OH2GDF
Aug 24 '05 #33
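The constraint Thor points out comes from RFC 1034: a CNAME must not coexist with any other records at the same name, and the zone apex necessarily carries NS (and SOA) records. The usual workaround is an address record on the bare name instead; a hypothetical sketch (address invented for illustration):

```
example.com.      IN NS     nameserver-01.example.net.
example.com.      IN NS     nameserver-02.example.net.
example.com.      IN A      192.0.2.10    ; bare name: A record, not CNAME
www.example.com.  IN CNAME  webserver-01.example.net.
```

The cost is that the apex address must be updated by hand if the web server moves, whereas the www CNAME tracks its target automatically.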
