Bytes | Developer Community
Do splash pages make you invisible to search engines

Here's a question I don't know the answer to:

I have a friend who makes very expensive, hand-made bamboo
flyrods. He's widely recognized (in the fishing industry) as one of
the 3-5 'best' rod makers in the world. He gets (sic) close to $5000
per custom made flyrod. A surprising number of people buy these
fishing rods and never use them....they buy them as art-like
investments. He is, after all, the best there is.

But if you search on Google for 'bamboo flyrod' or 'split cane flyrod'
he doesn't even show up in the first ten pages of links.
Typing "link:" followed by his domain name shows 534 sites
link to his site. This doesn't make sense. It isn't supposed to
work that way.

His site does have a graphical splash page, with only one link
on it, that says "enter"

Is that splash page related to his search engine invisibility, despite
his lofty stature, and despite the large number of links pointing to
his site?

Nov 10 '05
67 replies, 5502 views
On Mon, 21 Nov 2005 14:11:18 +0000, Stewart Gordon <sm*******@yahoo.com>
wrote:
Just because someone's not on Apache doesn't automatically mean that
whatever he/she/it is on instead is inferior.


Apache permits .htaccess

.htaccess puts configuration in the hands of the person buying the
hosting, not centralising it with an uncooperative admin.

Apart from Apache, I don't know of another web server that would permit
this same level of control, for people buying cheap shared hosting.
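
For instance, here is a rough sketch of the sort of thing a customer on
cheap shared hosting can drop into their own .htaccess file, provided the
admin's AllowOverride settings permit it (the paths and file names are
made up for illustration):

# hypothetical .htaccess for a shared-hosting account
ErrorDocument 404 /errors/not-found.html
Redirect permanent /old-catalogue.html http://www.example.com/catalogue.html
DirectoryIndex index.html index.php

All of that happens without ever asking the admin to touch the central
server config.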

So why is anything else _not_ inferior to Apache? This reason alone (and
there are few others as important) is a major factor in judging the merits
of one server against another.

Nov 23 '05 #51
On Mon, 21 Nov 2005, Robert Latest wrote:
Oh. I didn't know sh*t about .htaccess (except that it could be used
for password-protecting pages).


Yes, it's indeed *named* for its traditional role in containing
access restriction statements of various kinds. But already in
NCSA HTTPD, and certainly in Apache, there are many other kinds of
configuration statements which can be placed there. Look at the
individual statements in the Apache documentation to see where they
are eligible, e.g. if we're looking at Apache 1.3:
http://httpd.apache.org/docs/1.3/mod/directives.html
http://httpd.apache.org/docs/1.3/mod...tml#addcharset

The advantage, over putting them in the server configuration, is that
per-directory configurations can be done on the fly, and will be
honoured without reloading the server.

The disadvantage is that reading and parsing .htaccess files is
somewhat of an overhead, relative to just reading static configuration
files at server startup. And of course anyone with access to the
document tree can go and scribble in the .htaccess file, which could
be embarrassing if they don't know what they are doing.

*Some* kinds of stuff that can go there do have server security and/or
privacy implications. Which I suppose is why some server admins
disable the feature altogether. But mostly, the kind of features
which we're discussing here (AddType, AddCharset and friends,
MultiViews, AddLanguage and so on) seem to be left enabled. It's sure
worth a try, anyway.
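
To make that concrete, here is a minimal sketch of the kind of
per-directory configuration being discussed, assuming the admin has left
the FileInfo and Options overrides enabled (the extensions and languages
are purely illustrative):

# illustrative .htaccess using the directives mentioned above
AddType image/svg+xml .svg
AddCharset UTF-8 .html .txt
AddLanguage en .en
AddLanguage fr .fr
Options +MultiViews

If the relevant override is disabled, Apache either ignores the .htaccess
file entirely (AllowOverride None) or answers with a 500 error for that
directory, so it pays to test with a single harmless directive first.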
Nov 23 '05 #52
Andy Dingley wrote:
.htaccess puts configuration in the hands of the person buying the
hosting, not centralising it with an uncooperative admin.

Apart from Apache, I don't know of another web server that would permit
this same level of control, for people buying cheap shared hosting.


IIS can be managed site by site (see the Operators tab in each site's
properties).
It could easily be used for corporate web servers, but I can't imagine a
web hosting company providing such a service: all the administration
tasks are done via an MMC snap-in.
Nov 23 '05 #53
On Tue, 22 Nov 2005 11:00:12 +0100, Pierre Goiffon
<pg******@free.fr.invalid> wrote:
IIS can be managed site by site (see the Operators tab in each site's
properties).
It could easily be used for corporate web servers,
There's nothing easy about IIS configuration.

IMHE, IIS is far too fond of corrupting its metabase and refusing to
re-start. For that reason I never use the admin tool (which is ghastly
anyway) and instead use a few lines of nasty Perl that open IIS as an
ActiveX object and read the config into an XML file. I hand-edit this,
then use the tool to write it back. This is also far better for
deploying sites, or replicating server configs.
I can't imagine a web hosting company providing such a service


Indeed.
Nov 23 '05 #54
Philip Ronan said the following on 11/18/2005 11:01 +0200:
"Harrie" wrote:
I've looked at Apache's mod_alias for the Redirect directive and was
able to figure out how to redirect one domain to another, but without
setting up a new virtual host for the domain name with and without the www
prefix, I don't see how I can achieve something like this, since the
first argument is a URL-path, not a URL itself.


This ought to do it:

RewriteCond %{HTTP_HOST} !^www\..*$ [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteRule ^/(.*) http://www.your-site.example.com/$1 [L,R]


Thanks Philip.

It's not working, though. But I've never played with mod_rewrite before,
so I must be doing something wrong (although "apachectl configtest" says
the syntax is OK and I know the module is loaded). I'll do some reading
about mod_rewrite, and if I can't get it to work I'll ask in an Apache group.
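
One thing worth double-checking before that: mod_rewrite needs an explicit
RewriteEngine On (it is off by default), so a guess at what the complete
block might look like in the server or virtual host config is:

RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\..*$ [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteRule ^/(.*) http://www.your-site.example.com/$1 [L,R]

In a per-directory .htaccess file the leading slash is stripped before
matching, so the pattern would have to be ^(.*) there instead.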

I'll have to go to an Apache group anyhow: another friend of mine has had
his Apache server compromised (as far as we know, only the account Apache
runs under) and I need some hints on securing things. I suspect his PHP
config is not restrictive enough (which is not Apache itself, but I would
expect his logs to reveal something, which they don't (maybe the root
account is compromised after all)).

Oh well, life goes on (and I'm very happy it's not my server) ..

--
Regards
Harrie
Nov 24 '05 #55
On 10 Nov 2005 09:37:24 -0800, "Sa***************@gmail.com"
<Sa***************@gmail.com> wrote:
Here's a question I don't know the answer to:

I have a friend who makes very expensive, hand-made bamboo
flyrods. He's widely recognized (in the fishing industry) as one of
the 3-5 'best' rod makers in the world. He gets (sic) close to $5000
per custom made flyrod. A surprising number of people buy these
fishing rods and never use them....they buy them as art-like
investments. He is, after all, the best there is.

But if you search on Google for 'bamboo flyrod' or 'split cane flyrod'
he doesn't even show up in the first ten pages of links.
Typing "link:" followed by his domain name shows 534 sites
link to his site. This doesn't make sense. It isn't supposed to
work that way.

His site does have a graphical splash page, with only one link
on it, that says "enter"

Is that splash page related to his search engine invisibility, despite
his lofty stature, and despite the large number of links pointing to
his site?


He can check the keywords on other people's pages by clicking
"View" on an Internet Explorer menu, and then "Source". Then all he has
to do is add the keywords commonly used on other people's sites to his
own, plus make sure he's got a few more, and also make sure that, to the
best of his ability, he's got ALL the words that someone wanting to buy
one of his fly-fishing rods would use.
Jan 13 '06 #56
Thu, 12 Jan 2006 22:20:35 -0800 from Scotius <wo******@mnsi.net>:
On 10 Nov 2005 09:37:24 -0800, "Sa***************@gmail.com"
<Sa***************@gmail.com> wrote:
Here's a question I don't know the answer to:

I have a friend who makes very expensive, hand-made bamboo
flyrods. He's widely recognized (in the fishing industry) as one of
the 3-5 'best' rod makers in the world. He gets (sic) close to $5000
per custom made flyrod. A surprising number of people buy these
fishing rods and never use them....they buy them as art-like
investments. He is, after all, the best there is.

But if you search on Google for 'bamboo flyrod' or 'split cane flyrod'
he doesn't even show up in the first ten pages of links.
Typing "link:" followed by his domain name shows 534 sites
link to his site. This doesn't make sense. It isn't supposed to
work that way.

His site does have a graphical splash page, with only one link
on it, that says "enter"


Well, there's your (his) problem.

Search engines can't index Flash. And many visitors hate wasting time
on splash pages.

I never cease being amazed at how many Web sites shoot themselves in
the foot by being fancy instead of useful. If I want to buy a fly
rod, I don't want to watch a movie about fly rods before I can get on
with buying.

--
Stan Brown, Oak Road Systems, Tompkins County, New York, USA
http://OakRoadSystems.com/
HTML 4.01 spec: http://www.w3.org/TR/html401/
validator: http://validator.w3.org/
CSS 2.1 spec: http://www.w3.org/TR/CSS21/
validator: http://jigsaw.w3.org/css-validator/
Why We Won't Help You:
http://diveintomark.org/archives/200..._wont_help_you
Jan 14 '06 #57
I think the point that you're all missing here is that the web is an
evolving entity. Many sites were built in the late 90's using poor
markup, because markup and the lack of good CSS support made it
IMPOSSIBLE to do complicated, beautiful designs without writing
atrocious HTML. Yes, I've done it. Additionally, for most sites built
in the 90s, SEO meant keywords and descriptions. That's it. Things are
different now.

Build me a site that's going to be perfectly compatible with all known
SEO techniques, present and future, that's guaranteed to work well for
the search engines ad infinitum. You can't do it. You can use what you
know now, but future search engines might want very different things
from what works today. (PageRank, anyone?)

For-profit businesses simply cannot afford to reprogram client websites
every two years for free just because the technology has changed. Would
any of you do that? Are you going back and converting all of your
client's websites to Valid XHTML 1.1 Strict from HTML 4.01? For free?
Because the technology has changed?

It's very easy to critique other people's work, but be careful....You
might have a site you did five years ago that looks bad and is coded
worse, and you'd be the next one torn apart on this list.

Jan 26 '06 #58
In article <11**********************@o13g2000cwo.googlegroups.com>,
st*************@gmail.com wrote:
I think the point that you're all missing here is that the web is an
evolving entity. Many sites were built in the late 90's using poor
markup, because markup and the lack of good CSS support made it
IMPOSSIBLE to do complicated, beautiful designs without writing
atrocious HTML. Yes, I've done it. Additionally, for most sites built
in the 90s, SEO meant keywords and descriptions. That's it. Things are
different now.


So that seems to argue not for hand-coding pages, but for generating
pages from content in databases. If a makeover is required, a business
gets a new look (from a contract designer) that it can pass to the
programming contractor, who alters the content generator.

This actually seems to be arguing for increased separation of content
from appearance, with web pages just one subset of documents all
produced from original source. Isn't that how really large companies are
already doing it?

--
http://www.ericlindsay.com
Jan 26 '06 #59
Of course. But site construction is always a matter of budget, and
sites that are out of date from a technology standpoint often didn't
have such systems available when they were constructed, or the systems
required to manage content in this fashion were prohibitively
expensive. A lot of companies that I've worked with essentially
anticipate that sites will have fixed lifetimes, and they will
replace/update/migrate their site into new systems as technology
evolves. Sites cannot be built with an indefinite lifespan and be
expected to conform with all future standards when they don't even
exist yet.

Matt

Jan 27 '06 #60
Tim
On Thu, 26 Jan 2006 12:45:54 -0800, stockliasteroid sent:
For-profit businesses simply cannot afford to reprogram client websites
every two years for free just because the technology has changed. Would
any of you do that? Are you going back and converting all of your client's
websites to Valid XHTML 1.1 Strict from HTML 4.01? For free? Because the
technology has changed?


It isn't necessary, and the way that HTML "works" hasn't changed that much
over time. The HTML specs are *OLD*; what's HTML 4.01 now will continue
to be HTML 4.01 in years to come. The pages that do stupid things now
were usually just as broken back then.

Authors still write crap pages now: they don't write them to the specs
(which don't change); they kludge them to the perceived behaviour of the
clients (which continually changes).

If you do things right in the first place (and I mean "right", not just
"seems to do what you wanted"), there isn't any of this "pages need to be
rewritten" silliness.

What any entity really cannot afford is to kludge together some crap in
the hope that it'll work. *That's* the sort of thing that will need
rewriting, again and again.

--
If you insist on e-mailing me, use the reply-to address (it's real but
temporary). But please reply to the group, like you're supposed to.

This message was sent without a virus, please destroy some files yourself.

Jan 27 '06 #62
What about browser quirks? I've had plenty of experiences having to
write something that I didn't like, just to accomplish a design that I
was given. Believe it or not, coders don't always have the luxury of
deciding what their stuff is going to look like. That's what designers
are for, and they don't always "get" how the web works. If there were
no designers and coders ruled, the web wouldn't have much to look at,
but it would all be beautiful markup.

Once the browsers perfectly support the *same* behavior, all of the
time, I'll stop using the Holly Hack and all kinds of other kludges
that I despise and yet are sadly necessary.

Yep, nested tables used to be necessary. Thank God they aren't anymore.

Jan 27 '06 #63
On 26 Jan 2006 12:45:54 -0800, st*************@gmail.com wrote:
Many sites were built in the late 90's using poor
markup, because markup and the lack of good CSS support made it
IMPOSSIBLE to do complicated, beautiful designs without writing
atrocious HTML.


No - the old browsers (let's assume Netscape 4 for the "popular" period)
never required "atrocious" HTML. Tables and 1-pixel GIFs might be an
ugly _use_ of HTML, but they're not _invalid_ HTML.

Jan 27 '06 #64
Note: Atrocious != Invalid. Atrocious ~= Invalid.

Well, that's a question of form versus function. And the question is,
are you a Zeldman type who cares about both, or do you only care about
validation (seems common on this list), or are you a Mike Davidson who
says that if it produces good visual results, damn the validation.
Sounds like a matter of opinion. I'd like to be a Zeldman, and that's
finally becoming possible while still getting correct visual results.

marginwidth and marginheight were proprietary MS attributes that
essentially generated "invalid" HTML. But if you wanted to get it to
look right, you gotta be invalid.

Oh, and by the way Andy, your home page isn't exactly a miracle of
validation either. As long as we're hashing other people's work, we
should at least be honest about our own.

Matt

Jan 27 '06 #65
On Fri, 27 Jan 2006, Andy Dingley wrote:
No - the old browsers (let's assume Netscape 4 for the "popular"
period) never required "atrocious" HTML.
I think that's a matter of opinion, Andy.
Tables and 1-pixel GIFs might be an
ugly _use_ of HTML, but they're not _invalid_ HTML.


Oh, but HTML can be 100% syntactically valid - and even pass formal
WAI verification checks - and yet be 100% semantically bogus.

I *did* once find what I could rate as an acceptable use for the
ubiquitous 1x1 pixel transparent GIF. In fact, I'm still using it for
that purpose - although, thanks to modern browsers, I could now do
that task better, and, one day when my back-order of Round Tuits comes
in, I'll deal with that. It wasn't for the things that the
pixel-layout merchants used it for, though.

It *certainly* wasn't for the purpose used by this still-live page -
which I energetically protested at the time it was published, several
years ago now (excerpt only), and was ruled out of order for trying
to interfere:

Text Only
university of glasgow
guide to web publishing
for layout only
for layout only
About WWW Service
Web Policy
Access Stats
for layout only
Web Publishing Help
Web Page Templates
Web Accessibility
Web Forms
Computing Service Training
for layout only

[...big snip...]

Disability and Information Systems in Higher Education
for layout only
for layout only
for layout only
[top of page]
for layout only
Level A conformance icon, W3C-WAI Web Content Accessibility Guidelines
1.0 for layout only

[fx: shudder]

- and whose dead hand (via a "mandatory" design template for
centrally-published pages) is still visible - this page, for example,
is linked as "Current Activities and Projects", although its mandatory
title element is EMPTY, which is another piece of semantic bogosity:

university of glasgow
skip navigation
Computing Service
for layout only
is a to z services
for layout only
for layout only
for layout only
Help & Frequently Asked Questions
IT Purchases
Training
Software Downloads
Student Computing
for layout only
for layout only
About Us

etc. etc.

Sigh.
Jan 27 '06 #66
Tim
On Fri, 27 Jan 2006 07:15:34 -0800, stockliasteroid sent:
What about browser quirks?


And which browser quirk do you really *need* to exploit?

And while you're exploiting that quirk, are you going to force each
visitor to use the web browser that you kludged things to work in?
(You can't, of course.) Two weeks after you've bastardised HTML, some
update to a browser invalidates your handiwork, so you kludge things
again. And even before then, the attempt to make a pixel-perfect page
fails, because there are already three or more different browsers in
mainstream use on the WWW, each behaving differently.

Most over-designed pages that I see fail miserably on a collection of web
browsers (midget text, oversized text, overlapping text, text cut off
inside rigidly sized elements, pages wider than my screen, and a plethora
of other stupid faults). Most well made pages work fine on anything I try
them with.

Designers are like lawyers, arguing crap just to earn more money.

--
If you insist on e-mailing me, use the reply-to address (it's real but
temporary). But please reply to the group, like you're supposed to.

This message was sent without a virus, please destroy some files yourself.

Jan 28 '06 #67
Lol...Nice points, of course. However, I don't live in an ideal world
where I can just tell a client "Yes, I know that's a pretty design, but
I can't do it. Well, actually I can, but it won't be really valid, and
I'll have to hack it together. And I refuse to use hacks, kludges, or
anything else that conflicts with my markup morality." Client response:
"What the heck does 'valid' mean, and why do I care? Oh wait. I don't.
Fine, I'll go find someone who can get the job done." And then someone
else will get it done, using the same hacks that I would have had to
use.

If I wait for browsers to behave perfectly, I'll not only be waiting a
long time, but I'll be so far behind the leading edge that I won't be
able to deliver anything that my clients want. (and no, I'm not a
designer. I'm an implementor)

It seems that each generation of browsers allows us to surrender one
type of hack (1px clear GIFs), only to trade it for another (Holly
Hack). Fortunately, it seems to be moving in the right direction, and
the HTML/CSS I'm able to use to generate the same types of designs
improves all the time, and might just someday be perfectly semantically
valid.

For now, I must continue trying to get the designers I work with to
realize the evils of graphical text elements (among many other evils
that they commit in the name of good design), and use browser quirk
workarounds when I can't convince them to design otherwise.
Unfortunately or fortunately, depending on your point of view, my
company is known for leading edge design, which means that I must
implement what I am given.

Matt

Jan 30 '06 #68
