Opinion: Do web standards matter?

Just out of curiosity, while checking on a site I was working on, I
decided to throw a couple of the web's most popular URLs into the W3C
Markup Validator.

Out of microsoft.com, google.com, amazon.com, yahoo.com, aol.com, and
mozilla.org, only Mozilla's site came back "Valid HTML".

So if all these places, with their teams of web developers, don't seem to
care, should the rest of us small-time web devs concern ourselves with
standards? I do, but sometimes I feel it's a wasted effort. What do yinz
think?

P.S. Slashdot returned a 403 Forbidden to the validator but when I saved
the homepage locally, it failed too.
--
[ Sugapablo ]
[ http://www.sugapablo.net <--personal | http://www.sugapablo.com <--music ]
[ http://www.2ra.org <--political | http://www.subuse.net <--discuss ]

http://www.subuse.net : text-only, low bandwidth, anonymous web forums
Jul 23 '05
Jim Moe wrote:
Benjamin Niemann wrote:

You should note that it can be pretty hard to get such pages (I mean the
large portals like microsoft, yahoo...) valid. Such pages are dynamically
constructed with content from various sources. [...]
Not so. Large sites use a Content Management System of some sort. Once
the CMS is set up to generate valid code, having a standards-compliant
site is automatic.

But what if the WCMS allows the author to enter tag soup that is stored
directly in the DB without validation? Many WCM systems do, and most of them
did a few years ago. You'd still have to plow through thousands of
documents and correct them.
Really it's some combination of hubris, sloth, poor education and
ignorant management.

The vast amount of existing content makes it a great challenge for these
sites. And much of this content, and the software behind it (probably a
homegrown WCM system), originates from a time when standards unfortunately
did not matter...

I totally agree that it would indeed be a very good signal if they did
undertake this challenge (especially MS, which co-authored the web standards
they are violating or neglecting).
But I think it is only natural that such giant sites have much higher
inertia, and it would be rather surprising if they reached the goal before
the bulk of small sites do.

If I wanted to know whether these big players care about web standards, I
would look at their newer pages. E.g. what about the new MSN search? It
just recently went public. THAT would be a shame, if they still used
tag soup for those sites. (In fact, it looks pretty good: XHTML 1.0 Strict
with just a few glitches.)
--
Benjamin Niemann
Email: pink at odahoda dot de
WWW: http://www.odahoda.de/
Jul 23 '05 #31
Barbara de Zoete wrote:
What I miss in your explanation to me is the passion. The reasoning
about Transitional is something they can read or figure out themselves. The
passion tells them about reaching all people on earth who are somehow
connected to the internet. No matter what machine, what browser, whatever
means: if someone is connected, you can reach them.


Very difficult to get the passion across in a text format like this.
And when I teach on-line, as I often do, it is difficult to get that
across as well. This fall I will be in the classroom, and the students will
experience my passion. I take pride in doing the best job I can do, no
matter what that job may be.

Far from perfect, my pride (soon to get a facelift, I hope) is
http://alamo.nmsu.edu/, much of my 40-hour-a-week job. I use the
difference between editing one of the older table-layout pages and one of
the newer 4.01 Strict with CSS pages as an example of the ease of updating.

I am really looking forward to the promised development of a new layout.
Others, better at design, will come up with the layout/design, while
I'll be involved in telling them what is practical/feasible and will
eventually implement the facelift. Exciting times.

--
Stan McCann "Uncle Pirate" http://stanmccann.us/pirate.html
Webmaster/Computer Center Manager, NMSU at Alamogordo
Coordinator, Tularosa Basin Chapter, ABATE of NM; AMA#758681; COBB
'94 1500 Vulcan (now wrecked) :( http://motorcyclefun.org/Dcp_2068c.jpg
A zest for living must include a willingness to die. - R.A. Heinlein
Jul 23 '05 #32
Sugapablo wrote:

That's what a CMS (Content Management System) is for. You write the web
code, and all they do is fill in the content via web forms (etc.); no
technical knowledge is needed on their part beyond just using the website.

I deal with CMSs all the time. Once you have it laid out, they simply
spit out all kinds of good code.


Problem is the expense of a good one. I'll probably be piecing it together
a little at a time as I've been doing for years. Although not the best
coder, I've converted several things into dynamic pages over the years
at NMSU-A. Although somewhat buggy (error_log entries all the time),
http://alamo.nmsu.edu/cgi-bin/directory.pl works well for what it was
designed for and is much easier to maintain than editing the several web
pages containing that information as I used to. And I'm not afraid to
find code so that I don't have to reinvent ... Kinda why I'm looking
into PHP right now. Lots of stuff written out there that I don't quite
understand yet.

--
Stan McCann "Uncle Pirate" http://stanmccann.us/pirate.html
Webmaster/Computer Center Manager, NMSU at Alamogordo
Coordinator, Tularosa Basin Chapter, ABATE of NM; AMA#758681; COBB
'94 1500 Vulcan (now wrecked) :( http://motorcyclefun.org/Dcp_2068c.jpg
A zest for living must include a willingness to die. - R.A. Heinlein
Jul 23 '05 #33
accooper wrote:
I too try and follow the standards but I don't take much stock in the W3C
validator. Sometimes it will say stupid stuff like "a space is not allowed
here".
It is the validator's job to report *all* syntax errors, not to decide
"Oh, I think this is a minor error, I won't bother the author by
mentioning it this time".
I mean is that really gunna make a difference.


It may not make a difference to you, as someone reading the source code;
but to an SGML parser, a space (or any other character) where one is
not expected is a syntax error, from which a parser would have to employ
possibly undefined error-handling techniques to recover. In the case
of XML, such errors are fatal, so yes, it really does make a difference.
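
To make that concrete, here is a made-up fragment (not taken from any of the
sites discussed) with the sort of stray space the validator complains about:

  <p>Standards are < strong>not</strong> optional.</p>

A tag-soup HTML browser will recover somehow, typically by treating the stray
"<" as ordinary text, and the exact result varies between parsers. Serve the
equivalent fragment as XML (e.g. XHTML with an XML content type) and the
parser is required to stop at the first well-formedness error, so nothing
after that point gets rendered at all.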

--
Lachlan Hunt
http://lachy.id.au/
http://GetFirefox.com/ Rediscover the Web
http://GetThunderbird.com/ Reclaim your Inbox
Jul 23 '05 #34
Sugapablo wrote:
Out of microsoft.com, google.com, amazon.com, yahoo.com, aol.com, and
mozilla.org, only Mozilla's site came back "Valid HTML".
At least the microsoft.com home page is getting very close (only 3
errors: 1 missing alt attribute, a proprietary nowrap attribute, and using
checked="true" instead of checked="checked").
So if all these places, with their teams of web developers, don't seem to
care, should the rest of us small-time web devs concern ourselves with
standards? I do, but sometimes I feel it's a wasted effort. What do yinz
think?
This sounds like an attempt to justify the presence of errors simply
because they're made by many other organisations, whereas this really
should be a case of learning from others' mistakes so you don't make
them yourself.

Many people attempt to ignore standards, conformance and validation by
saying that it doesn't matter and/or it doesn't affect anything.
However, the simple fact is that there is little chance we will ever see
a mainstream browser that conforms 100% to HTML 4, simply because doing
so would "break" many more existing (broken) pages than it would
benefit. i.e. Because there are so many poorly coded web pages out there
that *don't* conform to the standards, we will never see a mainstream
browser that does; thus non-conformance has had, and *does have*, a very
detrimental effect.

The SHORTTAG NET features of SGML are one example I can think of, which
will not be implemented for this reason, at least not in Mozilla any
time soon [1].
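
(For anyone unfamiliar with it: NET, the Null End Tag, is the SGML shorthand
that closes an element with a single slash instead of a full end tag. Under
the HTML 4.01 SGML declaration, which sets SHORTTAG YES, markup like this is
technically legal:

  <em/really/ matters    <!-- equivalent to <em>really</em> matters -->
  <p/A one-line paragraph, NET style./

No mainstream browser parses it that way, and the much-quoted consequence is
that, read strictly as SGML, <br/> means <br> followed by a literal ">"
character, so turning SHORTTAG support on would visibly break a great many
existing pages.)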
P.S. Slashdot returned a 403 Forbidden to the validator but when I saved
the homepage locally, it failed too.


That's very strange. At first I thought it might be related to bug
1069 [2] (some hosts reject UAs with libwww-perl in their User-Agent
string), although that didn't work in this case, so Slashdot must be
rejecting based on some other factor. However, I was able to validate
with the latest development version of the validator, which reported
invalid HTML 3.2 with 130 errors.

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=94284
[2] http://www.w3.org/Bugs/Public/show_bug.cgi?id=1069

--
Lachlan Hunt
http://lachy.id.au/
http://GetFirefox.com/ Rediscover the Web
http://GetThunderbird.com/ Reclaim your Inbox
Jul 23 '05 #35
Lachlan Hunt wrote:
That's very strange, at first I thought that might be related to bug
1069 [2] (some hosts reject UAs with libwww-perl in their User-Agent
string), although that didn't work in this case, so slashdot must be
rejecting based on some other factor.


Slashdot blocks the W3C validator by IP address. They've been doing it for
quite some time. (Embarrassed about poor quality code.)

--
Toby A Inkster BSc (Hons) ARCS
Contact Me ~ http://tobyinkster.co.uk/contact

Jul 23 '05 #36
Toby Inkster wrote:
Lachlan Hunt wrote:
That's very strange, at first I thought that might be related to
bug 1069 [2] (some hosts reject UAs with libwww-perl in their
User-Agent string), although that didn't work in this case, so
slashdot must be rejecting based on some other factor.


Slashdot blocks the W3C validator by IP address. They've been doing
it for quite some time. (Embarrassed about poor quality code.)


Heh, they forgot this one. <g>
<URL:http://www.htmlhelp.com/cgi-bin/validate.cgi?url=http%3A%2F%2Fslashdot.org%2F&warnings=yes>

"The maximum number of errors was reached. Further errors in the
document have not been reported."

In all fairness to Slashdot, the "maximum number of errors" was 50.

--
-bts
-This space intentionally left blank.
Jul 23 '05 #37
Barbara de Zoete wrote [in part]:

On Sat, 26 Mar 2005 11:21:14 -0800, David Ross <no****@nowhere.not> wrote:
If you
don't care, then I don't choose to view your page.


Maybe you have learned something about authoring markup and styles for pages. Now
it is about time to learn about replying in this newsgroup:

- quote the part you reply to;
- attribute the quote.

Just to show you care about the realm you've entered. If you don't care, then I
don't choose to read your postings.


Each newsgroup seems to have its own, distinct conventions for
replying. Some want top-posting of replies; others want
bottom-posting. Some want complete quoting of the entire thread
when replying; others want only the relevant portion of the
previous message. I can't possibly remember all the rules for each
of the 21 newsgroups where I frequently participate. (For Web
standards, however, there is only one HTML 4.01 specification and
one CSS1 specification; and I still have to read the
specifications for details.)

In this case, I was replying to the Subject, which indeed did
appear as part of my reply.

--

David E. Ross
<URL:http://www.rossde.com/>

I use Mozilla as my Web browser because I want a browser that
complies with Web standards. See <URL:http://www.mozilla.org/>.
Jul 23 '05 #38
in comp.infosystems.www.authoring.html, David Ross wrote:
Each newsgroup seems to have its own, distinct conventions for
replying. Some want top-posting of replies; others want
bottom-posting.
I have yet to find a group that prefers top-posting or senseless quoting.
Some groups accept it (it just gets a note at the bottom of the message).

I subscribe to and participate in about 60 groups in 3 languages, of which
less than half are technical, and all follow the same guidelines for
posting. Not all are as strict on posting style as this group, but all
prefer the same style (and some are stricter).

Usually it has much to do with the volume of the group. This is not exactly
a quiet group, and many people also follow other ciwa* groups...
Some want complete quoting of the entire thread
when replying; others want only the relevant portion of the
previous message. I can't possibly remember all the rules for each
of the 21 newsgroups where I frequently participate.
The conventions I have used have been accepted in all the groups I subscribe to.
(For Web
standards, however, there is only one HTML 4.01 specification and
one CSS1 specification; and I still have to read the
specifications for details.)
There are many, many websites out there explaining Usenet netiquette.
In this case, I was replying to the Subject, which indeed did
appear as part of my reply.


Common courtesy would be to include the subject in the body to make it clear.

--
Lauri Raittila <http://www.iki.fi/lr> <http://www.iki.fi/zwak/fonts>
Utrecht, NL.
Jul 23 '05 #39
Quick funny follow-up, should you care to look at it. Being that it's
political in nature, I'll simply provide the link:

http://www.subuse.net/article.php?g=14&id=4
--
[ Sugapablo ]
[ http://www.sugapablo.net <--personal | http://www.sugapablo.com <--music ]
[ http://www.2ra.org <--political | http://www.subuse.net <--discuss ]

Jul 23 '05 #40
