Bytes IT Community

XHTML, JavaScript, and CSS incompatibility

I am trying to develop very standards-compliant content using XHTML
and CSS. Because I am using CSS positioning, I need to include my
stylesheet only on browsers I have tested to make sure they display it
correctly. The page is readable without the CSS, though the formatting
is not pretty, but browsers like Netscape 4.x bungle the positioning
if I include the stylesheet, making it unreadable. I am using the
JavaScript Browser Sniffer by Eric Krok, Andy King, Michel Plungjan
(http://www.webreference.com/) to determine what browser my users are
accessing the page with and only writing the stylesheet link if the
browser passes muster, as follows:

if ((is_mac && is_ie5up) || (is_win && is_ie6up) || is_nav6up ||
    is_opera7up || is_khtml || is_gecko) {
  var CSS = true;
} else {
  var CSS = false;
}
if (CSS) {
  document.write('<link href="style.css" rel="stylesheet" type="text/css"/>');
}

This seems to work in everything but Konqueror on Linux. Konqueror
will display the CSS properly, so I want it to have the link to the
stylesheet, but Konqueror refuses to write the stylesheet link.
Apparently, from what I have read elsewhere, this is because it is
invalid in XHTML to write to the page on the fly with JavaScript, and
Konqueror is the only browser that strictly enforces this.
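For what it is worth, the same link can be added without document.write by using DOM calls, which remain valid in an XHTML document. The sketch below is not from the thread; the function name and the document-object parameter are assumptions for illustration.

```javascript
// Hypothetical sketch: insert a stylesheet <link> via DOM calls rather
// than document.write.  DOM insertion works in XML/XHTML parsing modes
// where document.write does not.
function addStylesheet(doc, href) {
  var link = doc.createElement('link');
  link.rel = 'stylesheet';
  link.type = 'text/css';
  link.href = href;
  // Append the new element into the document head.
  doc.getElementsByTagName('head')[0].appendChild(link);
  return link;
}

// In a browser one would call: addStylesheet(document, 'style.css');
```

The sniffing logic itself would stay the same; only the final write step changes.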

I want to comply with the standards, but I don't know how to here. I am
also complying with an educational technology web interoperability
standard called SCORM, which prevents me from using server-side
script to do the browser check. Has anyone else encountered this
problem? Has anyone come up with a better solution?

Thanks,
Rob
Jul 20 '05 #1
10 Replies


Rob Fentress wrote:
I am trying to develop very standards-compliant content using XHTML
and CSS. I am using CSS positioning and thus need to only include my
stylesheet on browsers I have tested to make sure they display
correctly. The page is readable without the CSS though the formatting
is not pretty, but browsers like Netscape 4.x bungle the positioning
if I include the stylesheet, making it unreadable. I am using the
JavaScript Browser Sniffer by Eric Krok, Andy King, Michel Plungjan
(http://www.webreference.com/) to determine what browser my users are
accessing the page with and only writing the stylesheet link if the
browser passes muster, as follows:
So people who have JavaScript disabled, or who happen to use a browser
you haven't heard of, get no styling? CSS hiding is an established
concept - don't reinvent the wheel badly.

<URL:http://w3development.de/css/hide_css_from_browsers/>
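One widely documented variant of the hiding idea, sketched below with hypothetical filenames (this snippet is not taken from the linked page), exploits the fact that Netscape 4 does not implement @import:

```html
<!-- Sketch of CSS hiding; filenames are hypothetical.  basic.css is
     safe for every browser; advanced.css, pulled in via @import, is
     never fetched by Netscape 4, which does not understand @import. -->
<link rel="stylesheet" type="text/css" href="basic.css" />
<style type="text/css">
  @import url("advanced.css");
</style>
```

No script is involved, so the hiding works even for users with JavaScript disabled.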
[snip]
This seems to work on everything but konqueror on linux. Konqueror
will display the CSS properly, so I want it to have the link to the
stylesheet, but konqueror refuses to write the stylesheet link.
Apparently, from what I have read elsewhere, this is because it is
invalid in xhtml to write the page on the fly with javascript, and
konqueor is the only browser that strictly enforces this.

[snip]

That's incorrect; Konqueror doesn't have a clue about XHTML. There's an
open bug report relating to this issue:

<URL:http://bugs.kde.org/show_bug.cgi?id=52665>

--
Jim Dabell

Jul 20 '05 #2

Thanks for your help.

Jim Dabell <ji********@jimdabell.com> wrote in message news:<D-********************@giganews.com>...
Rob Fentress wrote:
So people with Javascript disabled or happen to use a browser you haven't
heard of get no styling? CSS hiding is an established concept - don't
reinvent the wheel badly.

<URL:http://w3development.de/css/hide_css_from_browsers/>

Those methods are not specific enough. They hide css from browsers I
don't want to hide it from and allow it for browsers that I don't want
to use it. The main concern is that the content be readable for all
viewers, its formatting being secondary. Therefore, I would prefer to
err on the side of caution and just ditch the formatting for the
minuscule percentage of users who are on browsers I haven't tested.
Also, the context I am operating in requires javascript to be enabled
for other reasons.

[snip]
This seems to work on everything but konqueror on linux. Konqueror
will display the CSS properly, so I want it to have the link to the
stylesheet, but konqueror refuses to write the stylesheet link.
Apparently, from what I have read elsewhere, this is because it is
invalid in xhtml to write the page on the fly with javascript, and
konqueor is the only browser that strictly enforces this.

[snip]

That's incorrect; Konqueror doesn't have a clue about XHTML. There's an
open bug report relating to this issue:

<URL:http://bugs.kde.org/show_bug.cgi?id=52665>

Yes. It turns out I misdiagnosed the problem. The reason the CSS
wasn't working in Konqueror is that I set a variable in one external
JavaScript file as a flag indicating whether to write the link to the
stylesheet. When, in a second external JavaScript file, I used that
variable to decide whether to write the link, it always
evaluated to false, because it was out of scope. It worked in all other
browsers, though. Strange.
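That symptom is consistent with the flag having been declared inside a function in the first file, which keeps it out of the global scope that external scripts share. A minimal sketch of the pitfall, with hypothetical variable names:

```javascript
// Sketch of the scoping pitfall (names are hypothetical, not from the
// thread).  A `var` at the top level of an external script becomes a
// global visible to later scripts; a `var` inside a function does not.

// --- first external script ---
var useCSS = true;            // top level: shared with other scripts

function detect() {
  var hiddenFlag = true;      // function-scoped: invisible elsewhere
}
detect();

// --- second external script ---
var visible = (typeof useCSS !== 'undefined');       // flag is reachable
var invisible = (typeof hiddenFlag === 'undefined'); // flag is out of scope
```

Moving the declaration to the top level of the first script would make it visible to the second.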

Rob
Jul 20 '05 #3

Jim Dabell <ji********@jimdabell.com> wrote in message news:<D-********************@giganews.com>...

So people with Javascript disabled or happen to use a browser you haven't
heard of get no styling? CSS hiding is an established concept - don't
reinvent the wheel badly.

<URL:http://w3development.de/css/hide_css_from_browsers/>


But the real problem is that those are mostly hacks that depend
on the peculiarities of particular browsers. What if, in the next
version of the prohibited browser, they fix the peculiarity that you
are depending on to block the CSS, but they don't fix the actual problem
with their CSS support? What if a new browser comes out that doesn't
support CSS well but also doesn't have a peculiarity that you are depending on
to block CSS? I understand the desire to not depend on javascript and
the unending (though infrequent) updating that is required to make a
javascript-based browser checker continue to work, but I think there
are greater problems using hacks. IMO, it is better to give them a
readable unformatted page and tell them to turn on javascript and get
a better browser. Actually, it is probably best to use the hacks in
addition to the javascript to deal with issues of browser spoofing.
The important thing is to err on the side of caution (read: no CSS).

Rob
Jul 20 '05 #4

Robert Fentress wrote:
But the real problem is that those are mostly hacks that are
dependent on the peculiarities of particular browsers.
All attempts at browser sniffing are either very unreliable or are hacks
& probably still quite unreliable.
What if, in the next version of the prohibited browser, they fix the
peculiarity that you are depending on to block CSS, but they don't
fix the actual problem with the CSS?
What if another such modification leads to your sniffing failing?
You'll alienate a lot of visitors who know they could access your site,
but are being denied because of stupid assumptions that it is making &
getting wrong.
What if a new browser comes out that doesn't support CSS well but
also doesn't have a peculiarity that you are depending on to block
CSS? I understand the desire to not depend on javascript and the
unending (though infrequent) updating that is required to make a
javascript-based browser checker continue to work,
You cut off what you consider to be valuable features from any user of a
compliant browser that you don't know about, & consider that acceptable
because they're only small in number. This is discriminatory. I won't
bother drawing detailed parallels with the physical access denied to many
disabled people at shops, cafes & so forth, for very similar reasons.

Many jurisdictions have laws which would make affected sites (typically
any of governmental, educational, etc. origin) illegal, were they to
adopt your approach.
but I think there are greater problems using hacks. IMO, it is
better to give them a readable unformatted page and tell them to turn
on javascript and get a better browser.
No. Even if you persist in doing everything else the same, at the very
most explain why your site isn't fully accessible. Do not tell any
visitor what to do. Most won't or can't follow your instructions &
you'll piss off a great many people with your directive attitude.
Actually, it is probably best to use the hacks in addition to the
javascript to deal with issues of browser spoofing. The important
thing is to err on the side of caution (read no CSS).
CSS may 'just' be about presentation; however, for a great many sites it
is more than an extra nicety. Form can't be more important than
function, but it can be a key part of it, imo.

Please think about what you're doing & do some research. Try contacting
some organisations involved with accessibility issues & ask their opinions.
Rob

--
Michael
m r o z a t u k g a t e w a y d o t n e t
Jul 20 '05 #5

Michael Rozdoba <mr**@nowhere.invalid> wrote in message news:<40*********************@lovejoy.zen.co.uk>...

All attempts at browser sniffing are either very unreliable or are hacks
& probably still quite unreliable.
I disagree. Though imperfect, they are less overtly hacks, as they
depend on standard variables such as userAgent and appVersion rather
than unrelated implementation features. I recognise that this is a
judgement call.

What if another such modification leads to your sniffing failing?
You'll alienate a lot of visitors who know they could access your site,
but are being denied because of stupid assumptions that it is making &
getting wrong.
As mentioned, it requires steady though infrequent maintenance of one
file. Unless you are negligent, it should never result in many people
being denied access to the styles who should have access to them.

You cut off what you consider to be valuable features from any user of a
compliant browser that you don't know about & consider it acceptable as
they're only small in number.
And you potentially present them with a page that is unreadable.
Which is more frustrating?

This is discriminatory. I won't bother drawing detailed parallels with physical access being denied to many
disabled people to shops, cafes & so forth, for very similar reasons.

Many areas have laws which make affected sites (typically any of
governmental, educational, etc origin) illegal, were they to adopt your
approach.

That is absolutely untrue. Pages designed to degrade gracefully
should still be readable without styles, and that is, in fact, a part of
accessible web design. The Web Content Accessibility Guidelines 1.0
state:

6.1 Organize documents so they may be read without style sheets. For
example, when an HTML document is rendered without associated style
sheets, it must still be possible to read the document. [Priority 1]
(Checkpoint 6.1)

Rule (d) of Section 508, §1194.22 (Web-based intranet and internet
information and applications) states:

(d) Documents shall be organized so they are readable without
requiring an associated style sheet.

This implies that the document's readability is more important than
its presentation.

No. Even if you persist in doing everything else the same, at the very
most explain why your site isn't fully accessible. Do not tell any
visitor what to do. Most won't or can't follow your instructions &
you'll piss off a great many people with your directive attitude.

Perhaps, but your method presents the same problem. What happens to
people who view YOUR site in Netscape 4.7? Are they not directed to
upgrade their browser?

Other parameters of this project require JavaScript to access key
features of the site. We have a captive audience and can mandate that
they enable JS. JS is mandatory because it is part of another
standard we must comply with, SCORM, and SCORM doesn't allow
server-side script. In other projects we would use server-side script
to sniff the browser.

CSS may 'just' be about presentation, however for a great many sites it
is more than an extra nicety. Form can't be more important than
function, but it can be a key part of it, imo.
It is true that form can be important to function. However, I ask you
to consider which is the better scenario:
1. The majority of users receive a well-formatted page and a minority
a poorly formatted, but readable page.
2. The majority of users receive a well-formatted page and a minority
a page that cannot be read because images and text overlap one
another.

Please think about what you're doing & do some research. Try contacting
some organisations involved with accessibility issues & ask their opinions.


That is what I am doing. However, as someone quite familiar with
Section 508 and the WAI Guidelines, I fail to see how presenting a
subset of users with an unreadable page is better than presenting them
with one they can read.

Respectfully,
Rob
Jul 20 '05 #6

On Wed, 21 Jan 2004, Robert Fentress wrote:
Michael Rozdoba <mr**@nowhere.invalid> wrote in message news:<40*********************@lovejoy.zen.co.uk>...

All attempts at browser sniffing are either very unreliable or are hacks
& probably still quite unreliable.
I disagree. Though imperfect, they are less overtly hacks, as they
depend on standard variables such as userAgent and appVersion rather
than unrelated implementation features.


I disagree. The vast majority of web pages have no reason to be
uncacheable, and, as such, cache proxy servers will serve them out to
a whole range of different client agents. You'll only get to see the
client agent string that was presented by the first retrieval, and the
cache server will then serve that out to whoever requests it. Unless
you prevent that happening by making them uncacheable, which brings
its own disbenefits.

One proper way to serve out different page variants to different
client agents is to use content negotiation. BUT client agent strings
are very variable, so by trying to negotiate on a client agent string
you would make the page effectively uncacheable anyway, and we're back
with the above-mentioned disbenefits.
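In HTTP terms, the negotiation described here is signalled with the Vary response header; a response negotiated on the client string would have to say so. The headers below are illustrative, not from the thread:

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=iso-8859-1
Vary: User-Agent
```

Because User-Agent strings are so variable, a shared cache honouring that header must keep a separate copy per distinct string, which in practice means almost no cache hits - the "effectively uncacheable" outcome.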

On top of that, one has to recognise that due to widespread misuse on
the server side of client agent strings, there has developed a
widespread practice of client agents faking these strings in their
requests.

So much for user agent strings. As for javascripting, who guarantees
that scripting will be enabled? So the page needs to work without it.
I recognise that this is a judgement call.
I don't disagree: but an accurate judgment can only come from full
knowledge of the circumstances, and I don't believe I have that full
knowledge, nor would I be prepared to believe anyone who claimed that
they did. "When in doubt, leave it out" - unless its necessity has
been proven. I say that attempts to cure the perceived disease have
produced negative effects far worse than the original symptoms.
That is absolutely untrue. Pages designed to degrade gracefully
should still be readable without styles, and that is, in fact, a part of
accessible web design.


Correct.

It remains the case that if you've provided a challenging stylesheet,
and are using some mechanism to shield less-capable browsers from it -
or from some part of it - then if that mechanism goes wrong, the user
might end up with a page that they cannot use, in the form in which it
is presented. This remains true no matter what "mechanism" is being
used - the ones favoured by you, or the ones promoted by the previous
poster (the latter being my preferred approach, but that's by the by).

Now, whereas the "disabled" readership has probably met this kind of
mess many times before, and knows how to deal with it, I'd surmise
that many of the "normals" (no offence intended) meet such problems
only rarely, and would not have a clue what to do to find their
browser's "disable stylesheets" option. They'd leave.

all the best.
Jul 20 '05 #7

Robert Fentress wrote:
Michael Rozdoba <mr**@nowhere.invalid> wrote in message news:<40*********************@lovejoy.zen.co.uk>...
All attempts at browser sniffing are either very unreliable or are hacks
& probably still quite unreliable.

I disagree. Though imperfect, they are less overtly hacks, as they
depend on standard variables such as userAgent and appVersion rather
than unrelated implementation features. I recognise that this is a
judgement call.


The trouble with the standard variables is that many browsers lie about
them. As a long-time user of browsers on the RISC OS platform, which most
of the world has never heard of, if I didn't have them configured to lie
about their identity, I would be cut off from many sites telling me to
upgrade to IE or NS; browsers which don't even exist for that platform.
What if another such modification leads to your sniffing failing?
You'll alienate a lot of visitors who know they could access your site,
but are being denied because of stupid assumptions that it is making &
getting wrong.


As mentioned, it requires steady though infrequent maintenance of one
file. Unless you are negligent, it should never result in many people
being denied access to the styles who should have access to them.


Barring such oversights, even if the numbers aren't large, certain
groups will be cut out for the simple reason that you don't know they
are capable of accessing those features; if they know they are, this
will be a very frustrating experience for them & create a lot of bad
feeling towards your site due to the perceived unfairness.

This is quite different from discovering one isn't able to access such
features because one's browser doesn't support them. Unless reliance on
those features is seen to be gratuitous, in that case the browser's maker
will tend to bear the brunt of any annoyance.
You cut off what you consider to be valuable features from any user of a
compliant browser that you don't know about & consider it acceptable as
they're only small in number.


And you potentially present them with a page that is unreadable.
Which is more frustrating?


I'm no expert on html or css design & am unaware of how bad such
problems can be. Is it not possible to offer a manual option for those
afflicted to turn off style sheets for your site?
This is discriminatory. I won't bother
drawing detailed parallels with physical access being denied to many
disabled people to shops, cafes & so forth, for very similar reasons.

Many areas have laws which make affected sites (typically any of
governmental, educational, etc origin) illegal, were they to adopt your
approach.

That is absolutely untrue. Pages designed to degrade gracefully
should still be readable without styles and that is, in fact, a part of
accessible web design.


I get the impression that both you & the other respondent to your
post know far more about this matter than I do, so I withdraw my claim
& bow to your judgement.

I realise a well designed site should degrade gracefully in the absence
of css & if the laws concerned make a clear distinction between content
& presentation /and/ define these terms in the same way as the w3c, then
you will certainly be correct.

However I still can't help but feel uneasy. I suspect that there will
come a time, if it has not already passed, when removal of these
so-called presentational features might render many a site virtually
unusable - even though all the content is still there to be seen.

[snip sound argument in support of...]
This implies that the document's readability is more important than
its presentation.
It would certainly seem to, & I have no argument against this.
No. Even if you persist in doing everything else the same, at the very
most explain why your site isn't fully accessible. Do not tell any
visitor what to do. Most won't or can't follow your instructions &
you'll piss off a great many people with your directive attitude.


Perhaps, but your method presents the same problem. What happens to
people who view YOUR site in Netscape 4.7? Are they not directed to
upgrade their browser?


No. Hopefully it will be clear to them that there is a problem, &
they can either disable style sheets for the site or upgrade their
browser - at their own decision. If it is possible they might experience
problems but not realise the cause, I'd aim to provide some sort of
indicator or warning, with guidelines as to what they could do about it.

If I ever write a directive instructing users to modify their system in
order to access my content, I'd have to beat myself to death with a
large blunt object & I don't really fancy that. A personal promise to
myself :)
Other parameters of this project require javascript to access key
features of the site. We have a captive audience and can mandate that
they enable js. JS is mandatory because it is a part of another
standard which we must comply with, SCORM, and the SCORM doesn't allow
server-side script. In other projects we would use server-side script
to sniff the browser.
An audience captive by their own choice (or that of a parent
organisation such as an employer) makes it a totally different story. If
you know or can dictate the users' systems to that degree (that is, the
individuals within aren't free to choose otherwise), then anything goes
within those limits.
CSS may 'just' be about presentation, however for a great many sites it
is more than an extra nicety. Form can't be more important than
function, but it can be a key part of it, imo.


It is true that form can be important to function. However, I ask you
to consider which is the better scenario:
1. The majority of users receive a well-formatted page and a minority
a poorly formatted, but readable page.
2. The majority of users receive a well-formatted page and a minority
a page that cannot be read because images and text overlap one
another.


If it was that simple, I might well agree. I'm going on my gut feeling &
something just seems wrong. To draw an over the top analogy, if we could
cut the murder rate by 90%, by some process which involved choosing to
execute a very much smaller group, some of whom we'd expect to be
innocent, I couldn't support it. Even if numerically it would mean far
fewer died unjustly.

It's about making a choice which penalises those who needn't be
penalised, for arbitrary reasons, as opposed to allowing a group to
suffer because we can't find a means to help them without hurting others.

If I'm making no sense, or just seem to have lost the plot, please
ignore my ramblings :)
Please think about what you're doing & do some research. Try contacting
some organisations involved with accessibility issues & ask their opinions.

That is what I am doing.
[snip]

Yes, that's clear. I apologise for suggesting otherwise without any
evidence to back my case.
Respectfully,
Rob


Regards,

--
Michael
m r o z a t u k g a t e w a y d o t n e t
Jul 20 '05 #8

Alan J. Flavell wrote:
On Wed, 21 Jan 2004, Robert Fentress wrote:

Michael Rozdoba <mr**@nowhere.invalid> wrote in message news:<40*********************@lovejoy.zen.co.uk>...
All attempts at browser sniffing are either very unreliable or are hacks
& probably still quite unreliable.


I disagree. Though imperfect, they are less overtly hacks, as they
depend on standard variables such as userAgent and appVersion rather
than unrelated implementation features.

I disagree. The vast majority of web pages have no reason to be
uncacheable, and, as such, cache proxy servers will serve them out to
a whole range of different client agents. You'll only get to see the
client agent string that was presented by the first retrieval, and the
cache server will then serve that out to whoever requests it. Unless
you prevent that happening by making them uncacheable, which brings
its own disbenefits.


Thanks for your feedback Alan. I was not aware of this issue. Can you
point me in the direction of some sort of tutorial material that
explains this process in more detail? I'm having a little bit of a hard
time following you. I'm not checking the user agent string on the
server. Doesn't a javascript browser checker just get the user-agent
string from the browser locally? Why would it get that information from
some cached source on a remote server, when it is right there on the
client locally? If you are correct, then why does my script work when I
view a page using the script in different browsers within one minute of
each other? Am I misunderstanding you?

Respectfully,
Rob
Jul 20 '05 #9

On Tue, 27 Jan 2004, Robert Fentress wrote:
Thanks for your feedback Alan. I was not aware of this issue.
I see there's been a bit of a misunderstanding here. I was talking
about the server side of things...
Can you point me in the direction of some sort of tutorial material
that explains this process in more detail?
Well, while we're about it, the "standard" tutorial on-line is Mark
Nottingham's excellent work at http://www.mnot.net/cache_docs/ -
but as you say, that's about server-side negotiation.
I'm having a little bit of a hard time following you. I'm not
checking the user agent string on the server.
Sorry, I didn't realise that you were aiming to make your technique
entirely JavaScript-dependent. Estimates of the proportion of web
clients which either don't implement JS, have it disabled for whatever
reason, or have it filtered out by their corporate firewall, seem to
range from 10% (probably an under-estimate) to above 20%.
Am I misunderstanding you?


Us both, I'm afraid. Sorry.

Jul 20 '05 #10

Robert Fentress wrote:
Alan J. Flavell wrote:
The vast majority of web pages have no reason to be uncacheable,
and, as such, cache proxy servers will serve them out to a whole
range of different client agents. You'll only get to see the
client agent string that was presented by the first retrieval,
and the cache server will then serve that out to whoever requests
it.


I'm not checking the user agent string on the server. Doesn't a
javascript browser checker just get the user-agent string from the
browser locally?


Yes, but even then I wouldn't count on it. Consider this story:

I had a devil of a time installing a java engine for Mozilla for my
mother. After installing the java runtime engine, and then restarting
Mozilla, I got a warning message not to use this particular java
engine with IE, only with Mozilla. I spent a couple of hours
uninstalling and reinstalling the runtime engine. Finally, I noticed
that I had set up Mozilla to lie by declaring itself to be -- yes, you
guessed it -- MSIE. When I changed it to identify itself as Mozilla,
the error disappeared.

Java is not JavaScript. And the Java engine is an external plug-in,
unlike JavaScript. Still, I would not be surprised if Mozilla,
configured to lie to web servers, also lies to JavaScript sniffers.
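This is one reason feature detection is usually preferred to identity sniffing: it tests the capability rather than the (possibly spoofed) identity strings. A sketch, not from the thread, with a hypothetical function name and a window-like parameter:

```javascript
// Sketch of feature detection (hypothetical helper, not from the thread):
// probe for the DOM methods the stylesheet logic actually needs, instead
// of trusting navigator.userAgent, which users can and do override.
function canUseDomStyling(win) {
  var doc = win.document;
  return !!(doc &&
            typeof doc.getElementById === 'function' &&
            typeof doc.createElement === 'function');
}

// In a browser one would call: canUseDomStyling(window);
```

A browser lying about its name still answers such probes truthfully, because the probe exercises the implementation itself.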

--
Brian (follow directions in my address to email me)
http://www.tsmchughs.com/

Jul 20 '05 #11
