
Browser inconsistencies : what is the most efficient development regime?

I seem to spend far too much of my time struggling with browser
inconsistencies concerning Javascript (not to mention CSS).

What do you think is the most efficient development regime, including both
PC and Mac? In particular, two questions come to mind:

1 In which browser should I develop to ensure that the maximum amount of
code is correct, so that it will work in the maximum number of browsers?

2 In which browsers should I test the code to ensure that it will work for
practically all users? I'd like to think that this is a short list of three
but I hope it's not more than five.

I've been driven to ask these questions after a very frustrating week
building a multi-level navigation bar which the customer can design and
build. The structure is held in an SQL database which is transferred into
Javascript arrays when the page loads. I've had lots of problems between IE
and NN and particularly between NN7 and NN8. I'm beginning to think that NN7
is rubbish.
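
To give a rough idea of the data involved (this is just an illustrative
sketch, not my actual code; the fields, names and URLs are made up), the
arrays look something like this:

// Illustrative sketch only: each row mirrors a record from the SQL
// table as id, parent id, label and URL.
var menuItems = [
    [1, 0, "Products", "/products.asp"],
    [2, 1, "Widgets",  "/products/widgets.asp"],
    [3, 1, "Gadgets",  "/products/gadgets.asp"],
    [4, 0, "Contact",  "/contact.asp"]
];

// Group the rows by parent id so each level of the menu can be built
// from its parent's entry.
var childrenOf = {};
for (var i = 0; i < menuItems.length; i++) {
    var parentId = menuItems[i][1];
    if (!childrenOf[parentId]) {
        childrenOf[parentId] = [];
    }
    childrenOf[parentId].push(menuItems[i]);
}

The real structure is more involved, but that is the shape of the data
the page has to turn into a menu.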

If I were to answer above, I think I would say NN8 to Q1 and NN8 (on PC and
Mac), IE and Safari to Q2.

I'd be most interested in your views.


Oct 10 '05 #1
"Roger Withnell" <ro*********@TH ISupperbridge.c o.uk> writes:
> 1 In which browser should I develop to ensure that the maximum amount of
> code is correct, so that it will work in the maximum number of browsers?

Use more than one browser all the way through development. It's much
easier to fix problems when they appear than after layers of code have
been added on top of them.

> 2 In which browsers should I test the code to ensure that it will work for
> practically all users? I'd like to think that this is a short list of three
> but I hope it's not more than five.

I was about to give a list, but really, supporting specific browsers
apart from IE 6 is less of a problem than supporting browsers with
Javascript turned off. There are more people browsing without
Javascript than there are using any single non-IE6 browser. That's
what you have to support first.

Then you can try to fix the relatively minor differences between
browsers with Javascript afterwards. Using standards-compliant HTML
and DOM code will make most things easy.
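
As a rough sketch of the approach (illustrative only; the "nav" id and
the nested-list markup are assumptions, not the original poster's actual
page): keep the navigation as plain links that work with scripting
disabled, and only layer behaviour on top with standard DOM calls,
bailing out quietly where support is missing:

// Sketch only: assumes a <ul id="nav"> of ordinary links with nested
// <ul> elements for the sub-menus. The page stays usable without this.
function enhanceNav() {
    if (!document.getElementById || !document.getElementsByTagName) {
        return; // no W3C DOM support: leave the plain links alone
    }
    var nav = document.getElementById("nav");
    if (!nav) { return; }
    var subMenus = nav.getElementsByTagName("ul");
    for (var i = 0; i < subMenus.length; i++) {
        subMenus[i].style.display = "none"; // collapse until needed
        subMenus[i].parentNode.onmouseover = makeToggle(subMenus[i], "block");
        subMenus[i].parentNode.onmouseout  = makeToggle(subMenus[i], "none");
    }
}
function makeToggle(element, value) {
    return function () { element.style.display = value; };
}
window.onload = enhanceNav;

With scripting unavailable, the links still work as ordinary navigation;
the script only adds the folding behaviour.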

> If I were to answer above, I think I would say NN8 to Q1 and NN8 (on PC and
> Mac), IE and Safari to Q2.

Don't forget Opera (not just because it's what I use :)

/L
--
Lasse Reichstein Nielsen - lr*@hotpop.com
DHTML Death Colors: <URL:http://www.infimum.dk/HTML/rasterTriangleDOM.html>
'Faith without judgement merely degrades the spirit divine.'
Oct 10 '05 #2
You're bound to make a decision: will you support generation 3
browsers, and generation 4? If you do, you'll have to test on Netscape 3
and 4, Netscape 5+, IE4, IE5+, Firefox, Opera 4, Opera 7+; there's no
way out.

It also depends on what you mean by supporting older browsers. I do
not support anything below generation 5. Anything below that, with CSS
turned off, can still see the full page as plain text.

Users that browse without javascript enabled do not exist. They are
like dragons and unicorns, fantasy creatures one may believe in.
There exist Search Engine Bots that do. If such a rare creature as a user
browsing without JS exists, you can lose them safely: they probably
never realized they were unable to make any proper transaction online
or use web-based email like Hotmail or Gmail, so that's just not
the type of user who would do anything meaningful with your page.

My suggestion is: develop using Firefox 1.x, THEN also check on IE6. You
just cannot work testing on one browser alone, sorry.

If your scripts work on BOTH, the chances that they work everywhere, as
long as it is a generation 5 browser, are within a fraction of 100%.

As for the story about W3C compliance, have a look at:
http://www.unitedscripters.com/spell...texplorer.html
Scroll to HALFWAY down that page; you do NOT have to read it. Just
locate the middle of the page, where there is a list of over 200 sites,
from Google to Intel, from Yahoo to Logitech, from Amazon to
McGraw-Hill, from alpha to omega, that do NOT pass the W3C test, each
with several hundred errors.

That is what the true importance of full compliance with W3C
guidelines comes down to.

They'll eat you alive if you promote this position, for the vulgata
goes that W3C compliance is paramount, but those links PROVE that
nearly ALL of what we have online today is NOT W3C compliant in the
LEAST, and yet those are all sites browsed by MILLIONS of users every day,
on ALL possible platforms, versions, browsers, operating systems and
paraphernalia: none validates, ALL are successfully used.
NONE validates in the LEAST.

Oct 11 '05 #3
> My suggestion is: develop using Firefox 1.x, THEN also check on IE6. You
> just cannot work testing on one browser alone, sorry.


I do that. But I also check it every now and then, sometimes only when
I've finished a project, in Opera. If it works in Mozilla then Opera
usually seems fairly happy. Occasionally I have to adjust some CSS and a bit
of markup for cosmetic reasons to keep Opera happy, but otherwise it's fine.

Javascript is a notable exception. My big sexy book on JS says that only
Firefox has implemented support for DOM level 3. IE and Opera have
implementations of level 1 but not much more, so if you want safe JS code
I'd stick to the features of DOM level 1. DOM level 2 has node iterators
and tree-walking methods and stuff, which all sounds very cool but which
you shouldn't use yet, and level 3 is too new for anyone to care about. As
for ECMAScript (the core language of JS) I think all browsers are
fairly unified there.
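
A rough illustration of what I mean (only a sketch with made-up names;
test it before relying on it): check that the newer methods exist before
calling them, and fall back to plain DOM level 1 walking:

// Sketch only: gather the text nodes under an element, preferring the
// DOM level 2 TreeWalker where it exists and falling back to level 1.
function collectTextNodes(root) {
    if (document.createTreeWalker && typeof NodeFilter != "undefined") {
        var walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT,
                                                null, false);
        var nodes = [];
        var node;
        while ((node = walker.nextNode())) {
            nodes.push(node);
        }
        return nodes;
    }
    // Level 1 fallback: recurse through childNodes by hand.
    var found = [];
    for (var i = 0; i < root.childNodes.length; i++) {
        var child = root.childNodes[i];
        if (child.nodeType == 3) {            // 3 == TEXT_NODE
            found.push(child);
        } else {
            found = found.concat(collectTextNodes(child));
        }
    }
    return found;
}

Either branch returns the same thing, so the calling code never needs to
know which DOM level it actually got.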

There are enough inconsistencies between Firefox 1 and IE6 to give me enough
headaches, so I don't see the point in supporting too many more. IE6 is used
by 90%+ so make that the priority, but I like to respect those with
better browser tastes, like myself, by testing in Mozilla and Opera.

If you want to support minorities then support the partially sighted,
colorblind, blind and physically impaired. Here's how:
http://diveintoaccessibility.org/table_of_contents.html

And btw, my name is Ollie, from the UK. I should be around on this
newsgroup a fair bit now. I've just started my own web
design/development business. I'm shouting out to all you group regulars!

Ollie
Oct 11 '05 #4
On 11/10/2005 01:33, va*****@gmail.com wrote:
> [...] would you support generation 3 browsers,

Those browsers are entirely obsolete, and are not suitable for use on
the modern Web.

> and generation 4?

These just about qualify. However, serious effort to make them look or
behave like modern browsers is probably wasted. Any CSS that makes these
browsers fall over can be hidden[1]. Competent script design and
implementation will retain core functionality in any browser.

[snip]
> Users that browse without javascript enabled do not exist. [...]

Yes, they do. They might be a minority, but they do exist. Even so, a
browser that has client-side scripting disabled is not the only reason
to understand how to write decent scripts, and provide fallbacks when
support is lacking.

> There exist Search Engine Bots that do.

Most search engines don't. I remember a rumour that an experimental
GoogleBot does, but I don't know its planned capabilities.

If you want search engines to index your content, then this is reason
enough not to depend on client-side scripting.

> As for the story about W3C compliance, have a look at:
> http://www.unitedscripters.com/spell...texplorer.html


I assume that your point is that many sites don't validate, so
attempting to do so is a waste of time? This has been debated many times
in HTML groups, so I do /not/ want to start anew here[1]. However, I
will say that your decision to write bad markup is your decision and
your responsibility. Don't recommend it to others.

Incidentally, your example for an apparent error in Firefox (wrapped for
posting):

<div style="border:# 000000 1px solid; color:#ff0000;
width:200; height:20; overflow:visibl e;">
<br>a<br>b<br>c <br>d<br>e<br>f <br>g</div>

is bogus (and contains broken CSS). Block elements with explicit height
declarations do /not/ expand with their content. That behaviour is
achieved using the min-height property. As for why Opera displays your
desired behaviour: add units to the length values (units are required
for non-zero lengths) and it renders identically.
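
For comparison, a quick sketch (not code from the page in question; note
also that IE 6 does not support min-height): the same box with units
added and min-height used behaves as apparently intended:

// Quick sketch for comparison only: same content, but with units on the
// lengths and min-height, so the box starts at 20px yet grows to fit.
var box = document.createElement("div");
box.style.border = "1px solid #000000";
box.style.color = "#ff0000";
box.style.width = "200px";       // units are required for non-zero lengths
box.style.minHeight = "20px";    // expands with content (ignored by IE 6)
box.style.overflow = "visible";
box.innerHTML = "<br>a<br>b<br>c<br>d<br>e<br>f<br>g";
document.body.appendChild(box);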

[snip]

Mike
[1] Read Google archives before you consider starting any discussion.
Unless you have something new to add, don't bother. If you do,
post to alt.html or comp.infosystems.www.authoring.html where
such a debate is on-topic, not clj.

--
Michael Winter
Prefix subject with [News] before replying by e-mail.
Oct 11 '05 #5
va*****@gmail.com wrote:
> You're bound to make a decision: will you support generation 3
> browsers, and generation 4? If you do, you'll have to test on Netscape 3
> and 4, Netscape 5+, IE4, IE5+, Firefox, Opera 4, Opera 7+; there's no
> way out.

Any list of browsers will be incomplete, but browsers on devices such as
PDAs and phones will become increasingly popular and important, so they
should be considered too. Many 3G networks are offering free internet
browsing during certain off-peak times to encourage usage; the biggest
drawback is the lack of suitable content.

[...]
> Users that browse without javascript enabled do not exist.

I don't think that is correct - there are a good many surfers who, when
frightened by various security issues, turned scripting off and have
never turned it back on. 10% appears to be the accepted number, but
there is no way of knowing for certain.

[...]
> As for the story about W3C compliance, have a look at:
> http://www.unitedscripters.com/spell...texplorer.html
> Scroll to HALFWAY down that page; you do NOT have to read it. Just
> locate the middle of the page, where there is a list of over 200 sites,
> from Google to Intel, from Yahoo to Logitech, from Amazon to
> McGraw-Hill, from alpha to omega, that do NOT pass the W3C test, each
> with several hundred errors.
>
> That is what the true importance of full compliance with W3C
> guidelines comes down to.

In deference to Mike's request, I'll just say that the statement is
plain wrong.

The page incorrectly reports some sites as invalid, it offers no
analysis of site errors or what their cause or effect might be, nor does
it delve below the home page. It is at best superficial and no
meaningful conclusion can be drawn from it.

> They'll eat you alive if you promote this position, for the vulgata
> goes that W3C compliance is paramount, but those links PROVE that
> nearly ALL of what we have online today is NOT W3C compliant in the
> LEAST, and yet those are all sites browsed by MILLIONS of users every day,
> on ALL possible platforms, versions, browsers, operating systems and
> paraphernalia: none validates, ALL are successfully used.
> NONE validates in the LEAST.


There is a large gulf between 'valid' and 'compliant' - getting a tick
from a validator does not prove compliance.

While compliance is a laudable goal, it is not of itself a sign of a well
designed or implemented site. It is quite possible to have a site that
passes the W3C validator yet is accessible to less than 10% of browsers.

Sites with a few trivial errors are not a problem; of far more concern
are sites that are coded to suit the vagaries of a particular browser
without regard for, and to the exclusion of, other browsers. That is
possible with a great many browsers, not just those in common use.

A second major concern is sites that should be publicly accessible yet
aren't because they don't consider disabled or disadvantaged users. A
simple validator test does not offer any guidance as to accessibility.

--
Rob
Oct 11 '05 #6
Michael Winter wrote:
> On 11/10/2005 01:33, va*****@gmail.com wrote:
>> [...] would you support generation 3 browsers,
>
> Those browsers are entirely obsolete, and are not suitable for use on
> the modern Web.

Just a little story from me: two years ago, I got a phone call from a user
who wanted to ask why his computer always hangs when visiting our company site.
I was really surprised that any computer should hang when visiting our site...
Asking him about his browser, he said: Netscape 2! After some minutes, I could
explain to him that Netscape 2 isn't a good idea for browsing through today's
internet. Oh, and his computer didn't just hang on visiting our site but on
most sites :o)
>> Users that browse without javascript enabled do not exist. [...]
>
> Yes, they do. They might be a minority, but they do exist. Even so, a
> browser that has client-side scripting disabled is not the only reason
> to understand how to write decent scripts, and provide fallbacks when
> support is lacking.


And besides people annoyed by boring JS gadgets, there are groups of people
who run into trouble when main functionality is hidden in JS. There are not
just the common browsers but also user agents or assistive tools for disabled
people, for example.
>> As for the story about W3C compliance, have a look at:
>> http://www.unitedscripters.com/spell...texplorer.html
>
> I assume that your point is that many sites don't validate, so
> attempting to do so is a waste of time? This has been debated many times
> in HTML groups, so I do /not/ want to start anew here[1]. However, I
> will say that your decision to write bad markup is your decision and
> your responsibility. Don't recommend it to others.


Yes, and to reinforce this point: always validate your code. If your code
doesn't validate, browsers can do whatever they want with it. So if you're
building invalid code and the current versions render it as you expect,
that doesn't mean that the next generation of browsers will do so, and you'll
probably have to start again, asking in some newsgroups why these buggy new
browsers can't display your site as expected and as the former versions did,
although those browsers are doing exactly what you told them to do... errors.

greetings,

martin
Oct 11 '05 #7
On Tue, 11 Oct 2005 02:38:50 GMT, RobG wrote:
> va*****@gmail.com wrote:
>> Users that browse without javascript enabled do not exist.
>
> I don't think that is correct - there are a good many surfers who, when
> frightened by various security issues, turned scripting off and have
> never turned it back on. 10% appears to be the accepted number, but
> there is no way of knowing for certain.


One nice feature of Firefox is a plugin called noscript. By default
javascript is off for a site unless I turn it on. I can temporarily
turn on javascript for that site (lets me test out sites I'm
interested in). If I visit a site with no content without turning
javascript on, I usually go no further. Remember, I'm there for the
content, not the javascript.

BTW, I make sure javascript is off for big news sites such as CNN. I
don't need the pop-ups. They're annoying!

--
Linux Home Automation Neil Cherry nc*****@comcast.net
http://home.comcast.net/~ncherry/ (Text only)
http://hcs.sourceforge.net/ (HCS II)
http://linuxha.blogspot.com/ My HA Blog
Oct 11 '05 #8
Thanks for all the comments on this.

I like the idea of testing on Firefox and IE. But what about Apple
Macintosh? Is it necessary to test on Safari also? I don't want to make a
significant investment unnecessarily (i.e. an Apple Mac).

Oct 11 '05 #9
Neil Cherry said the following on 10/11/2005 9:28 AM:
> On Tue, 11 Oct 2005 02:38:50 GMT, RobG wrote:
>> va*****@gmail.com wrote:
>>> Users that browse without javascript enabled do not exist.
>>
>> I don't think that is correct - there are a good many surfers who, when
>> frightened by various security issues, turned scripting off and have
>> never turned it back on. 10% appears to be the accepted number, but
>> there is no way of knowing for certain.
>
> One nice feature of Firefox is a plugin called noscript. By default
> javascript is off for a site unless I turn it on. I can temporarily
> turn on javascript for that site (lets me test out sites I'm
> interested in). If I visit a site with no content without turning
> javascript on, I usually go no further. Remember, I'm there for the
> content, not the javascript.


So many people claim that they are "there for the content" when most
don't even realize how the content got there in the first place.

And, most sites that employ javascript simply do not function properly
without it. That leaves you surfing where?
> BTW, I make sure javascript is off for big news sites such as CNN. I
> don't need the pop-ups. They're annoying!


You don't need a FF plug-in for that; you need to learn how to use the FF
popup blocker.

--
Randy
comp.lang.javascript FAQ - http://jibbering.com/faq & newsgroup weekly
Oct 11 '05 #10
