Bytes IT Community

C99 compiler access

P: n/a
I have access to a wide variety of different platforms here at JPL,
and they all have pretty good C99 compilers.

Some people claim that they have not moved to the new standard
because of the lack of C99-compliant compilers.
Is this just a lame excuse for back-sliding?
Nov 14 '05
233 Replies


Chris Hills wrote:
If any of them could see any commercial advantage in producing a C99
compiler they would. It is for this reason that most of the world's major
embedded compiler writers are making their compilers MISRA-C compliant
(MISRA-C was first launched in 1997) and they are eager to become
MISRA-C2 compliant even before the new version has been published (on
the 13th October, BTW).


Last time I looked, MISRA-C was a set of programming guidelines,
not a compiler specification.

Anyway, there are several compilers that have incorporated C99
features, on a path to full compliance.
Nov 14 '05 #101

Chris Hills wrote:
I have heard several C standards people express exactly that thought.


I haven't heard any.
Nov 14 '05 #102

Chris Hills wrote:
I am told there are a lot of problems with the maths in C99. This was by
a mathematician whose arguments were way beyond my ken...


No, there aren't "a lot of problems with the maths".
There were a couple of minor issues with the
specification of some functions which were addressed
by technical corrigenda. The actual mathematics is
okay.

You appear to have a personal agenda that includes
torpedoing the C standard. Could this be related to
your notion that we should have allowed type int to
be only 8 bits wide?
Nov 14 '05 #103

Chris Hills wrote:
I know some people with real concerns about
some parts of the C99 model. (their job is in validation)


I'm not sure what you mean by "C99 model", but there
is no essential difference from C89 in any of the
fundamentals, so whatever criticism there may be
would not be specific to C99.

My guess is that this refers to the continual UK
complaint about "too much undefined and unspecified
behavior". There are solid reasons for most of the
existing specifications involving these, which have
been explained in comp.std.c on numerous occasions.
As a result, most implementations have C99 extensions, but this doesn't
help people writing portable code, as there is no commonly supported
subset of C99 (with the possible exception of // comments).

Not even the // are supported in the same way. :-(


Really? If they can't follow the spec for even such
a simple feature, why would you trust those compilers
at all?
Nov 14 '05 #104

Chris Hills wrote:
.... snip ...
If any of them could see any commercial advantage in producing a
C99 compiler they would. It is for this reason that most of the
world's major embedded compiler writers are making their compilers
MISRA-C compliant (MISRA-C was first launched in 1997) and they
are eager to become MISRA-C2 compliant even before the new
version has been published (on the 13th October, BTW).


I expect that is because there is significant competition in that
field, inasmuch as gcc is apparently not very tolerant of
restricted devices, such as 16-bit integers.

--
"A man who is right every time is not likely to do very much."
-- Francis Crick, co-discoverer of DNA
"There is nothing more amazing than stupidity in action."
-- Thomas Matthews
Nov 14 '05 #105

Chris Hills wrote:
Dan Pop <Da*****@cern.ch> writes
.... snip ...
two tiny businesses, Comeau and Dinkumware, have produced
implementations claiming C99 conformance for quite a while.
The bigger vendors could have done it a lot faster.


Also the Tasking Tricore compiler.

Has anyone actually validated these two/three compilers?


Never heard of TT. I believe the Dinkum effort is a library, not
a compiler. Comeau uses it. I think Comeau's system outputs C90
tuned to specific compilers, not executable code.

--
"A man who is right every time is not likely to do very much."
-- Francis Crick, co-discoverer of DNA
"There is nothing more amazing than stupidity in action."
-- Thomas Matthews
Nov 14 '05 #106

Douglas A. Gwyn wrote:
Kelsey Bjarnason wrote:
Granted. However, it seems to me that VLAs are sort of a "lazy man's
malloc", but without the error handling ability.


Used with caution they're no worse than any other auto variable.


Not true. Because auto variables are constant size and recursion can be
limited, it may be the case on a particular platform that a given program
will *never* have undefined behaviour due to a stack overflow. This is
very unlikely to be true if the program uses VLAs, because the main point
of using VLAs is to allocate objects of arbitrary sizes.

Preferring malloc to VLAs because of this error handling problem is
entirely justified.

David Hopwood <da******************@blueyonder.co.uk>
Nov 14 '05 #107

Joseph Myers wrote:
Suppose that you got adequate funding tomorrow.
How long would it take to produce an ANSI/ISO compliant C 99 compiler?


The C99 implementation could be completed in a few months. (That is
for targets with sane floating-point hardware; not 387 floating point
on x86 which makes it hard to do computations in the range and
precision of float / double or round those done in long double to the
range and precision of float / double without storing to memory.)


Why do you think you need true float or true double support?
Set FLT_EVAL_METHOD to 2 and do (almost) everything in long double.
The only time (that I know of) you need true float or true double is
for: assignment (which is going to be a store to memory anyway) and
cast (which I grant you is a pain to do via a store/load through memory).
---
Fred J. Tydeman Tydeman Consulting
ty*****@tybor.com Programming, testing, numerics
+1 (775) 287-5904 Vice-chair of J11 (ANSI "C")
Sample C99+FPCE tests: ftp://jump.net/pub/tybor/
Savers sleep well, investors eat well, spenders work forever.
Nov 14 '05 #108

In article <qc******************@fe2.news.blueyonder.co.uk> ,
David Hopwood <da******************@blueyonder.co.uk> wrote:
Preferring malloc to VLAs because of this error handling problem is
entirely justified.


Unfortunately, on all the platforms I - and many others - use,
malloc() has just the same problem.

-- Richard
Nov 14 '05 #109

Richard Tobin wrote:
David Hopwood <da******************@blueyonder.co.uk> wrote:
Preferring malloc to VLAs because of this error handling problem is
entirely justified.


Unfortunately, on all the platforms I - and many others - use,
malloc() has just the same problem.


You mean malloc() causes undefined behaviour when there is insufficient
memory? Get a better platform.

David Hopwood <da******************@blueyonder.co.uk>
Nov 14 '05 #110

In article <dq*******************@fe2.news.blueyonder.co.uk >,
David Hopwood <da******************@blueyonder.co.uk> wrote:
You mean malloc() causes undefined behaviour when there is insufficient
memory?
I mean malloc() may return a non-null value and then fail when you
try to actually use the memory. Presumably you already know about this.
Get a better platform.


For me, the behaviour of malloc() is not the only consideration in
choosing a platform.

-- Richard
Nov 14 '05 #111

In article <41***************@tybor.com>,
Fred J. Tydeman <ty*****@tybor.com> wrote:
Joseph Myers wrote:
>Suppose that you got adequate funding tomorrow.
>How long would it take to produce an ANSI/ISO compliant C 99 compiler?


The C99 implementation could be completed in a few months. (That is
for targets with sane floating-point hardware; not 387 floating point
on x86 which makes it hard to do computations in the range and
precision of float / double or round those done in long double to the
range and precision of float / double without storing to memory.)


Why do you think you need true float or true double support?
Set FLT_EVAL_METHOD to 2 and do (almost) everything in long double.
The only time (that I know of) you need true float or true double is
for: assignment (which is going to be a store to memory anyway) and
cast (which I grant you is a pain to do via a store/load through memory).


Assignment isn't necessarily going to be a store to memory in the
presence of optimisation. (There is indeed an existing option
-ffloat-store which forces it to be a store to memory, at significant
performance cost; it isn't currently enabled by the conformance
options.) There are old architectural issues around the x86 floating
point support, and while I have some notion of how they can be fixed,
I don't have the expertise in those areas of the compiler to allow for
fixing them (and producing a design for a fix that gets generally
accepted) in a time estimate for C99 support. FLT_EVAL_METHOD is set
to 2, though it should actually be set to -1 (another consequence of
those issues: sometimes the excess precision of intermediate
computations can be lost), but casts of expressions to the same type
do not do anything. (While C99 seems to say that when a float being
stored with the range and precision of long double is converted to
double, the excess range and precision is not removed, though it would
be when a double is converted to double.) So the estimate instead
considers everything not specific to a particular CPU target; almost
all CPUs supported by GCC do not have these problems.

--
Joseph S. Myers
js***@gcc.gnu.org
Nov 14 '05 #112

Richard Tobin wrote:
David Hopwood <da******************@blueyonder.co.uk> wrote:
You mean malloc() causes undefined behaviour when there is insufficient
memory?


I mean malloc() may return a non-null value and then fail when you
try to actually use the memory. Presumably you already know about this.


You mean overcommitment of virtual memory, then? The behaviour in that
case is not entirely undefined; most platforms that overcommit virtual
memory will start killing processes, but will not cause them to have
otherwise undefined behaviour.

David Hopwood <da******************@blueyonder.co.uk>
Nov 14 '05 #113

ri*****@cogsci.ed.ac.uk (Richard Tobin) wrote in message news:<ch**********@pc-news.cogsci.ed.ac.uk>...
In article <dq*******************@fe2.news.blueyonder.co.uk >,
David Hopwood <da******************@blueyonder.co.uk> wrote:
You mean malloc() causes undefined behaviour when there is insufficient
memory?


I mean malloc() may return a non-null value and then fail when you
try to actually use the memory. Presumably you already know about this.


On the platform I use at work (IRIX on an SGI machine) that behavior
is controlled by a configuration parameter. The first time it caused a
problem, we tracked it down, roundly criticized the idiot who turned
that feature on, and got the SAs to turn it off.
Nov 14 '05 #114

On Fri, 03 Sep 2004 18:00:10 +0000, Joseph Myers wrote:
In article <ch**********@nntp1.jpl.nasa.gov>,
E. Robert Tisdale <E.**************@jpl.nasa.gov> wrote:
GCC does not have a conforming C90 implementation either.
Note that this is a direct contradiction of Robert Gamble's assertion:

'The documentation implies full conformance to c90:

"GCC supports three versions of the C standard, although
support for the most recent version is not yet complete."'


ERT: Not really an assertion, just pointing out that from the
documentation a reasonable person could easily walk away with the
implication that full support is there. I think that even after reading
the parts referenced below this is a reasonable interpretation, although
apparently incorrect.
I recommend reading the whole of the relevant section of the manual
<http://gcc.gnu.org/onlinedocs/gcc/Standards.html> rather than just an
isolated extract. In particular

For each language compiled by GCC for which there is a standard,
GCC attempts to follow one or more versions of that standard,
possibly with some exceptions, and possibly with some extensions.

and

GCC aims towards being usable as a conforming freestanding
implementation, or as the compiler for a conforming hosted
implementation.

- note the "attempts to follow" and "aims towards".

Support for C99 is not feature-complete (nor fully correct for some
features where there is approximate support for the standard).
Support for C90 is not fully correct, though the language features are
there and most users who do not read comp.std.c may not notice that
constant expression constraints don't work.


I read all of the document but didn't find anything that really
contradicted my interpretation. Even the parts you pointed out don't
clearly state anything one way or another. Is there a place where these
exceptions are documented? Making this information more prevalent may
prevent this type of confusion in the future.

Rob Gamble

Nov 14 '05 #115

Joseph Myers wrote:
.... snip ...
(While C99 seems to say that when a float being stored with the
range and precision of long double is converted to double, the
excess range and precision is not removed, though it would be
when a double is converted to double.) ... snip ...


That makes absolutely no sense. I don't believe in quantum
creation of bits on demand.

--
Chuck F (cb********@yahoo.com) (cb********@worldnet.att.net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net> USE worldnet address!
Nov 14 '05 #116

In article <41***************@null.net>, Douglas A. Gwyn
<DA****@null.net> writes
Chris Hills wrote:
If any of them could see any commercial advantage in producing a C99
compiler they would. It is for this reason that most of the world's major
embedded compiler writers are making their compilers MISRA-C compliant
(MISRA-C was first launched in 1997) and they are eager to become
MISRA-C2 compliant even before the new version has been published (on
the 13th October, BTW).
Last time I looked, MISRA-C was a set of programming guidelines,
not a compiler specification.


It is a coding guideline.

My point was that most of the worlds embedded compiler suites are making
their library code MISRA-C compliant because they think it is a Good
Idea (commercially).

Remember that MISRA-C was launched, from nowhere, as a small
industry-specific guide, in 1998.

Along the same lines, very few of the world's compiler writers are making
their compilers C99 compliant (with any real effort), despite the fact
they knew it was coming and despite the fact that in theory it is the
specification for their compilers.

This is VERY sad.

There is no impetus or requirement by the majority of C programmers, or
their companies, to have a C99 (or an ISO-C per se) compiler.

BTW this is one of the reasons why MISRA-C2 (launched next month) is
still referring to C90+TCs, because in reality that is what 98% of its
target market is using.

So the question is: how do we create an environment in the industry in
general where ISO-C is automatically considered a prerequisite for a
compiler?

We need to foster the idea among programmers and their managers that
ISO-C [C99] is a necessity for their compilers.

Anyway, there are several compilers that have incorporated C99
features, on a path to full compliance.


"Several" and "features" are not good enough. We need "most" and "full
compliance".

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #117

In article <41***************@yahoo.com>, CBFalconer
<cb********@yahoo.com> writes
Chris Hills wrote:

... snip ...

If any of them could see any commercial advantage in producing a
C99 compiler they would. It is for this reason that most of the
world's major embedded compiler writers are making their compilers
MISRA-C compliant (MISRA-C was first launched in 1997) and they
are eager to become MISRA-C2 compliant even before the new
version has been published (on the 13th October, BTW).


I expect that is because there is significant competition in that
field, inasmuch as gcc is apparently not very tolerant of
restricted devices, such as 16 bit integers.


Yes, there is a lot of competition in the embedded world. From 4 to 128
bit. There is also Gcc and many other free tools for most embedded
platforms.

However, whilst they are "rushing" to make their library code MISRA-C
compliant (in as much as it can be), they are not rushing to make their
compilers C99 compliant...

This is strange. MISRA-C is "just a coding guideline" NOTE "guideline"
not "Standard" whereas ISO-C is THE standard for their compiler.

The problem is that the ISO-C standard is not seen by the industry as
important, or as a prerequisite for a compiler.

How do we change this perception? How do we get the industry to demand
ISO-C compliant compilers?
Part of the problem is that a large part of the industry uses GCC because
"it's free and you get the source" rather than for any engineering
reasons.

I suspect that until the law or the insurance companies REQUIRE ISO-C
compliant compilers they will not become essential in the industry.

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #118

In article <ln************@nuthaus.mib.org>, Keith Thompson <kst-
u@mib.org> writes
Chris Hills <ch***@phaedsys.org> writes:
[...]
This is one of the major problems. There is no requirement to have an
ISO-C compiler. For some reason, unlike the ADA crowd the C community
seems to be a lot less interested in standards and "SW Engineering" as
opposed to "programming"


A small off-topic quibble: It's Ada, not ADA. (The name isn't an
acronym.)


Thanks

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #119

In article <41***************@null.net>, Douglas A. Gwyn
<DA****@null.net> writes
Chris Hills wrote:
I have heard several C standards people express exactly that thought.


I haven't heard any.


It was at a UK panel meeting.

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #120

In article <41***************@null.net>, Douglas A. Gwyn
<DA****@null.net> writes
Chris Hills wrote:
I am told there are a lot of problems with the maths in C99. This was by
a mathematician whose arguments were way beyond my ken...
No, there aren't "a lot of problems with the maths".
There were a couple of minor issues with the
specification of some functions which were addressed
by technical corrigenda. The actual mathematics is
okay.


We will have to agree to differ, as I do not know the maths well enough,
but I know several experts who are still complaining about the C99 maths.
(I think you know Nick)
You appear to have a personal agenda that includes
torpedoing the C standard.
On the contrary. I want a C standard that is used widely. It fills me
with great sadness that ISO-C compliant compilers are not seen as a
requirement by most programmers.

How do we go about creating the climate where ISO-C (is the current
version) is automatically considered necessary?
Could this be related to
your notion that we should have allowed type int to
be only 8 bits wide?


No. Why have an int that is 8 bits? That would not be as much use
generally. Unsigned and signed char are fine for 8 bits.

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #121

In article <41***************@null.net>, Douglas A. Gwyn
<DA****@null.net> writes
Chris Hills wrote:
>As a result, most implementations have C99 extensions, but this doesn't
>help people writing portable code, as there is no commonly supported
>subset of C99 (with the possible exception of // comments).

Not even the // are supported in the same way. :-(


Really? If they can't follow the spec for even such
a simple feature, why would you trust those compilers
at all?

Because both compilers hold an 80% market share in their own markets
(different platforms), and both are the "standard" compiler for their
market.

I had a problem where someone was trying out some code in one development
environment that he wanted to move to the other.

The problem (I will see if I can dig it out) involved multi line macros
and // style comments. I am not sure which one was more correct to the
standard.

The problem is that in both development areas (worldwide) both groups
of developers are happy with their system, and for neither group,
generally, is "ISO-C compliance" a major point.
I have been told on more than one occasion by a compiler user that "the
ISO-C committee has got it wrong" if the standard does not behave the
way their "industry standard" compiler does.
This is the problem. How do we change the perception of programmers so
that they insist on their compiler conforming to the current ISO-C
standard, rather than saying the standard is wrong for not tracking their
compiler?

It is all well and good creating a "Good Standard" technically. It is a
commercial world.

Who is responsible for ensuring that it is a required standard? ISO? And
locally BSI, ANSI, DIN?

or the professional bodies IEE, IEEE, BCS etc

or governments?

I don't think it fits the UN field somehow.
/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #122


Note: for ISO-C below, read "current version of ISO-C, whatever that may
be at the point in question".

In article <41***************@yahoo.com>, CBFalconer
<cb********@yahoo.com> writes
Chris Hills wrote:
Dan Pop <Da*****@cern.ch> writes
... snip ...
two tiny businesses, Comeau and Dinkumware, have produced
implementations claiming C99 conformance for quite a while.
The bigger vendors could have done it a lot faster.


Also the Tasking Tricore compiler.

Has anyone actually validated these two/three compilers?


Never heard of TT.


Tasking, unlike Comeau and Dinkumware, is a major compiler company
with development offices in several countries. They do a wide range of
embedded cross-compilers. They are in the top 2 or 3 in most of the
areas they do compilers for.

The Infineon Tricore is a 32-bit MCU that is used widely in the
automotive sector for ECUs, and I have come across it in image
processing.
I believe the Dinkum effort is a library, not
a compiler.
Just out of curiosity, can a C90 compiler compile C99 libraries?
Comeau uses it. I think Comeaus system outputs C90
tuned to specific compilers, not executable code.


So this means that only the Tasking Tricore compiler is claiming C99
compliance for a complete compiler suite? Mind you, I have not
tested this claim...

Which brings me on to another point... there seems to be no interest in
the industry for validated ISO-C compilers.

The industry does not really care if the compiler is ISO-C (90/99 or
abc), GNU C or splat C, as long as "the product" gets out the door on time
and makes a profit... (OK, there are exceptions some of us could name,
but I am talking generally)

How do we change this?
It is down, partly to individual Engineers and Programmers.
You are using C because that is what programmers wanted. Managers,
lawyers, accountants, marketing people don't care what language you use.

So now we need to create the environment where programmers and engineers
start insisting on ISO-C compilers. If they do, compiler writers will
start producing them.
A technically good standard that no one uses is pointless.
How do you get programmers at large (and not just the infinitesimally
small group who read this NG) to think about and want ISO-C compilers
and tools?

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #123

In article <41***************@yahoo.com>,
CBFalconer <cb********@worldnet.att.net> wrote:
Joseph Myers wrote:

... snip ...
(While C99 seems to say that when a float being stored with the
range and precision of long double is converted to double, the
excess range and precision is not removed, though it would be
when a double is converted to double.) ... snip ...


That makes absolutely no sense. I don't believe in quantum
creation of bits on demand.


6.3.1.5 says

[#1] When a float is promoted to double or long double, or a
double is promoted to long double, its value is unchanged.

[#2] When a double is demoted to float, a long double is
demoted to double or float, or a value being represented in
greater precision and range than required by its semantic
type (see 6.3.1.8) is explicitly converted to its semantic
type, if the value being converted can be represented
exactly in the new type, it is unchanged. If the value
being converted is in the range of values that can be
represented but cannot be represented exactly, the result is
either the nearest higher or nearest lower representable
value, chosen in an implementation-defined manner. If the
value being converted is outside the range of values that
can be represented, the behavior is undefined.

So if FLT_EVAL_METHOD is 2, and x and y are of type float, then (x*y)
has type float but is represented with the range and precision of long
double. (float)(x*y) is represented with the range and precision of
float; any excess bits are removed by the explicit cast. But
(double)(x*y) is represented with the range and precision of long
double; it may have excess bits beyond those in the precision of
double (i.e. be a value not exactly representable as a double).
However (double)(double)(x*y) has only the range and precision of a
double.

--
Joseph S. Myers
js***@gcc.gnu.org
Nov 14 '05 #124

In article <pa****************************@hotmail.com>,
Robert Gamble <ro**********@hotmail.com> wrote:
I read all of the document but didn't find anything that really
contradicted my interpretation. Even the parts you pointed out don't
clearly state anything one way or another. Is there a place where these
exceptions are documented? Making this information more prevalent may
prevent this type of confusion in the future.


http://gcc.gnu.org/bugzilla/
(specifically the dependencies of bugs 16620 and 16989)

http://gcc.gnu.org/c99status.html

It would be a bit odd to have documentation that alternates between
saying "complete" and "not complete" whenever conformance bugs are
found and fixed.

--
Joseph S. Myers
js***@gcc.gnu.org
Nov 14 '05 #125

"Chris Hills" <ch***@phaedsys.org> wrote in message
news:5i**************@phaedsys.demon.co.uk...
In article <41***************@yahoo.com>, CBFalconer
<cb********@yahoo.com> writes
Chris Hills wrote:

... snip ...

If any of them could see any commercial advantage in producing a
C99 compiler they would. It is for this reason that most of the
world's major embedded compiler writers are making their compilers
MISRA-C compliant (MISRA-C was first launched in 1997) and they
are eager to become MISRA-C2 compliant even before the new
version has been published (on the 13th October, BTW).


I expect that is because there is significant competition in that
field, inasmuch as gcc is apparently not very tolerant of
restricted devices, such as 16 bit integers.


Yes, there is a lot of competition in the embedded world. From 4 to 128
bit. There is also Gcc and many other free tools for most embedded
platforms.

However, whilst they are "rushing" to make their library code MISRA-C
compliant (in as much as it can be), they are not rushing to make their
compilers C99 compliant...

This is strange. MISRA-C is "just a coding guideline" NOTE "guideline"
not "Standard" whereas ISO-C is THE standard for their compiler.

The problem is that the ISO-C standard is not seen by the industry as
important, or as a prerequisite for a compiler.

How do we change this perception? How do we get the industry to demand
ISO-C compliant compilers?
Part of the problem is that a large part of the industry uses GCC because
"it's free and you get the source" rather than for any engineering
reasons.

I suspect that until the law or the insurance companies REQUIRE ISO-C
compliant compilers they will not become essential in the industry.


So I can go back to my world of K&R2 and tell Keith and Dan to stuff it?
;-)

--
Mabden
Nov 14 '05 #126

Joseph Myers wrote:
CBFalconer <cb********@worldnet.att.net> wrote:
Joseph Myers wrote:
... snip ...
(While C99 seems to say that when a float being stored with the
range and precision of long double is converted to double, the
excess range and precision is not removed, though it would be
when a double is converted to double.) ... snip ...


That makes absolutely no sense. I don't believe in quantum
creation of bits on demand.


6.3.1.5 says

[#1] When a float is promoted to double or long double, or a
double is promoted to long double, its value is unchanged.

[#2] When a double is demoted to float, a long double is
demoted to double or float, or a value being represented in
greater precision and range than required by its semantic
type (see 6.3.1.8) is explicitly converted to its semantic
type, if the value being converted _can be represented_
_exactly_ in the new type, it is unchanged. If the value
being converted is in the range of values that _can be_
_represented_ but cannot be represented exactly, the result is
either the nearest higher or nearest lower representable
value, chosen in an implementation-defined manner. If the
value being converted is outside the range of values that
_can be represented_, the behavior is undefined.

So if FLT_EVAL_METHOD is 2, and x and y are of type float, then (x*y)
has type float but is represented with the range and precision of long
double. (float)(x*y) is represented with the range and precision of
float; any excess bits are removed by the explicit cast. But
(double)(x*y) is represented with the range and precision of long
double; it may have excess bits beyond those in the precision of
double (i.e. be a value not exactly representable as a double).
However (double)(double)(x*y) has only the range and precision of a
double.


See the underlined clauses above. Also the following in
5.2.4.2.2. (N869)

14) The floating-point model is intended to clarify the
description of each floating-point characteristic and
does not require the floating-point arithmetic of the
implementation to be identical.

--
Chuck F (cb********@yahoo.com) (cb********@worldnet.att.net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net> USE worldnet address!
Nov 14 '05 #127

Chris Hills wrote:
.... snip ...
I suspect that until the law or the insurance companies REQUIRE ISC-C
complient compilers they will not become essential in the industry.


That in turn requires a freely (in practice) available standard
test suite for evaluation purposes. Then every purchaser can
check compliance.

The suite should have various levels. To do anything it must use
at least getchar and putchar with stdin and stdout. After that
levels can check compliance with each possible #include of
standard headers.

The suite could be developed in much the same manner as the late
Pascal test suite, circa 1980, was created. Unfortunately that
collection was handed over to a firm for maintenance, with rights,
and is now lost. Each test monitored compliance with an
identifiable (via test no.) clause in the standard and was a
complete program. For example
p2-3-4t5 was a program implementing a 5th test on clause 2.3.4.

--
Chuck F (cb********@yahoo.com) (cb********@worldnet.att.net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net> USE worldnet address!
Nov 14 '05 #128

P: n/a
Chris Hills wrote:
.... snip ...
Who is responsible for ensuring that it is a required standard?
ISO? and locally BSI? ANSI?, DIN?

or the professional bodies IEE, IEEE, BCS etc

or governments?

I don't think it fits the UN field somehow.


If you can implant the idea in GWBs head, he can then incarcerate
any who fail to comply in the Gitmo gulag forever. You have about
2 months. :-)

Seriously, it means some form of legality. Such as allowing
conformance with the standard to be a defence against certain
liabilities. That also requires insisting that software vendors,
contractors, etc. assume liability for their products. Similarly
hardware. My head hurts.

--
Chuck F (cb********@yahoo.com) (cb********@worldnet.att.net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net> USE worldnet address!
Nov 14 '05 #129

P: n/a
On Sat, 04 Sep 2004 09:59:12 +0000, Joseph Myers wrote:
In article <pa****************************@hotmail.com>,
Robert Gamble <ro**********@hotmail.com> wrote:
I read all of the document but didn't find anything that really
contradicted my interpretation. Even the parts you pointed out don't
clearly state anything one way or another. Is there a place where these
exceptions are documented? Making this information more prevalent may
prevent this type of confusion in the future.


http://gcc.gnu.org/bugzilla/
(specifically the dependencies of bugs 16620 and 16989)

http://gcc.gnu.org/c99status.html

It would be a bit odd to have documentation that alternates between
saying "complete" and "not complete" whenever conformance bugs are
found and fixed.


If gcc is believed to be C90 conformant aside from the bugs that pop up
from time to time, I would try to make this fact clearer in the
documentation and point the user to the bugzilla site for the latest
possible conformance issues.

If there are known conformance problems when a new major version is
released, this should be documented. Maybe a small table similar to the
c99status page that just lists the areas where the last x releases do not
conform to the C90 standard?

Rob Gamble
Nov 14 '05 #130

P: n/a
"Mabden" <mabden@sbc_global.net> writes:
[...]
So I can go back to my world of K&R2 and tell Keith and Dan to stuff it?
;-)


What the hell are you talking about?

An insult with a smiley is still an insult, you know, and if there's
no apparent motivation for it (why would you want to tell me to "stuff
it" in this context?), it's just bizarre.

Yes, it's common to engage in good-natured insults among friends and
acquaintances, but based on what I've seen here you don't seem to have
the knack for it. On several occasions, you've written something that
was apparently intended as harmless teasing, but it's come across as a
seriously offensive insult. I mean no offense; you could be perfectly
charming and witty in person.

A grossly exaggerated example might be, "So-and-so is an ax-murdering
pedophile -- just kidding!" (No, you're not nearly that bad.)

You might consider just avoiding irony and sarcasm when posting to
technical newsgroups. Or just avoid posting anything with a smiley
that you wouldn't post without one.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 14 '05 #131

P: n/a
In comp.std.c Joseph Myers <js***@gcc.gnu.org> wrote:

So if FLT_EVAL_METHOD is 2, and x and y are of type float, then (x*y)
has type float but is represented with the range and precision of long
double. (float)(x*y) is represented with the range and precision of
float; any excess bits are removed by the explicit cast. But
(double)(x*y) is represented with the range and precision of long
double; it may have excess bits beyond those in the precision of
double (i.e. be a value not exactly representable as a double).


I think you've found a bug in the standard -- the intent was that casts
(and assignments) to a narrower type than the representation should
scrape off the extra bits.

-Larry Jones

You know how Einstein got bad grades as a kid? Well MINE are even WORSE!
-- Calvin
Nov 14 '05 #132

P: n/a
Joseph Myers wrote:

So if FLT_EVAL_METHOD is 2, and x and y are of type float, then (x*y)
has type float but is represented with the range and precision of long
double. (float)(x*y) is represented with the range and precision of
float; any excess bits are removed by the explicit cast. But
(double)(x*y) is represented with the range and precision of long
double; it may have excess bits beyond those in the precision of
double (i.e. be a value not exactly representable as a double).
However (double)(double)(x*y) has only the range and precision of a
double.


I understand your example and see why you have that interpretation
(since Standard C is talking about types, not representations, in
most cases). I think Defect Report 290, which addresses a similar
case, needs to be revisited with this example in mind.
---
Fred J. Tydeman Tydeman Consulting
ty*****@tybor.com Programming, testing, numerics
+1 (775) 287-5904 Vice-chair of J11 (ANSI "C")
Sample C99+FPCE tests: ftp://jump.net/pub/tybor/
Savers sleep well, investors eat well, spenders work forever.
Nov 14 '05 #133

P: n/a
Chris Hills wrote:
How do we change this?
It is down, partly to individual Engineers and Programmers.
Why would a competent engineer/programmer introduce an unnecessary dependency
of a project on C99 when compilers for C99 are so thin on the ground? At most,
they will use C99 features that are supported by the compiler they want to use
for other reasons.
So now we need to create the environment where programmers and engineers
start insisting on ISO-C compilers. If they do compiler writers will
start producing them.


C99 doesn't provide enough benefit over C90 for that to happen.
It's not *just* a "marketing" issue, it's a technical issue as well.

To break the chicken-and-egg cycle, someone has to create a conforming C99
implementation *despite* the fact programmers are not asking for it. For
example, someone could fund Gnu to solve gcc's remaining C99 conformance
issues.

Incidentally, this is exactly why IETF don't standardize things that haven't
been implemented.

--
David Hopwood <da******************@blueyonder.co.uk>
Nov 14 '05 #134

P: n/a
Douglas A. Gwyn wrote:
Dan Pop wrote:
Then, please explain the lack of interest in C99 among the C programming
community at large.


How do you know there is a "lack of interest"? Have you taken
a scientifically valid poll?

Anyway, until C99 compliance is sufficiently widely available,
it is unlikely to be a project requirement. That alone would
be sufficient to explain any apparent "lack of interest".


Surely this is putting the chicken before the horse.

The following is my analysis. It is not based on scientifically
validated evidence and may be wrong.

Around the time when the first C standard was introduced, C was a
very popular programming language. The C world had a strong reason
to avoid the proliferation of dialects and the standard was
welcome. Compiler vendors had a strong incentive to produce
standard-compatible compilers.

By the time C99 came out, C had lost a fair amount of ground to
C++ and Java (and now it, and/or the latter languages, are losing
ground to C#). Microsoft has no particular reason to invest a lot
of resources in a C99 compiler, since it has hitched its fortunes
to C# and .NET.

Nonetheless, C remains the lingua franca of open-source programming.
I suspect that the great bulk of newly-written C code is open
source. In that context, people are mostly compiling it with gcc,
so the relationship between gcc and C99 is of key importance.
But as Dan has said, C99 to a large extent reinvents wheels that
were already available as gcc extensions. Hence C99 is in
trouble.

Just how much of this was foreseeable prior to 1999, I'm not sure.

Allin Cottrell

Nov 14 '05 #135

P: n/a
Chris Hills wrote:
<DA****@null.net> writes
Chris Hills wrote:
... most of the worlds major
embedded compiler writers are making their compilers MISRA-C compliant

Last time I looked, MISRA-C was a set of programming guidelines,
not a compiler specification.

It is a coding guideline.
My point was that most of the worlds embedded compiler suites are making
their library code MISRA-C compliant because they think it is a Good
Idea (commercially).
On the same line very few of the worlds compiler writers are making
their compilers C99 compliant (with any real effort) despite the fact
they knew it was coming and despite the fact that in theory it is the
specification for their compilers.


You aren't making sense. You agree that MISRA C is a coding
guideline. Apparently it assumes that such code will be
compiled etc. using a C90 conforming implementation. The
same code can be compiled etc. unchanged using a C99
implementation.
So the question is how do we create an environment in the Industry in
general where ISO-C is automatically considered a prerequisite for a
compiler?


Standard conformance is normally a contractual requirement
such as found in the U.S. FIPS. If you don't specify anything
for a product you buy, then you get whatever the vendor wants
to deliver.
Anyway, there are several compilers that have incorporated C99
features, on a path to full compliance.

"several" "features" this is not good enough. We need "most" and "full
compliance"


My point was that the 1999 standard is playing a role in the
evolution of C compilers. It is unrealistic to expect fully
conforming implementations on day 1 of the standard.

Nov 14 '05 #136

P: n/a
CBFalconer wrote:
That in turn requires a freely (in practice) available standard
test suite for evaluation purposes. Then every purchaser can
check compliance.


No, in fact historically compiler validation has been
done by profesional validation services, and not for
free. The U.S. government purchased a validation
suite for use in validating Federal C compiler
procurements against the initial C FIPS.

Nov 14 '05 #137

P: n/a
Chris Hills wrote:
I have been told on more than one occasion by a compiler user that "the
ISO-C committee has got it wrong" if the standard does not behave the
way their "industry standard" compiler does.


Many people would find that grounds for switching to
another vendor with a better appreciation of standards.

Nov 14 '05 #138

P: n/a
David Hopwood wrote:
Not true. Because auto variables are constant size and recursion can be
limited, it may be the case on a particular platform that a given program
will *never* have undefined behaviour due to a stack overflow. This is
very unlikely to be true if the program uses VLAs, because the main point
of using VLAs is to allocate objects of arbitrary sizes.


Actually it is to allow *parametric* array sizes,
not *arbitrarily large* sizes. Even if the entire
app were coded using constant, largest supported
sizes for every array, in general you still wouldn't
know whether the stack will overflow at run time,
unless you do careful analysis (and happen to have
an algorithm that is not too dynamic).

Nov 14 '05 #139

P: n/a
Richard Tobin wrote:
I mean malloc() may return a non-null value and then fail when you
try to actually use the memory. Presumably you already know about this.
I know that such an implementation is badly broken.
For me, the behaviour of malloc() is not the only consideration in
choosing a platform.


What, reliable execution of carefully written programs
is not important?

Nov 14 '05 #140

P: n/a
"Douglas A. Gwyn" <DA****@null.net> writes:
[...]
My point was that the 1999 standard is playing a role in the
evolution of C compilers.
Agreed. Even if you can't assume that a given implementation will
support, say, complex numbers, it's probably safe at this point to
assume that an implementation that does support them will do so in a
manner consistent with the C99 specification.
It is unrealistic to expect fully
conforming implementations on day 1 of the standard.


That's a straw man. Nobody expects fully conforming implementations
on day 1 of the standard. A lot of us expected more conforming
implementations than we have on day 1740 of the standard (if I've done
the math right). (And the reasons for that have been discussed at
length here.)

I suspect the best way to improve the situation would be to devote the
necessary resources to make gcc support C99 at least as well as it
currently supports C90. If that happened, other vendors might feel
more pressure to be gcc-compatible than they now feel to be
C99-compatible.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 14 '05 #141

P: n/a
"Douglas A. Gwyn" wrote:
CBFalconer wrote:
That in turn requires a freely (in practice) available standard
test suite for evaluation purposes. Then every purchaser can
check compliance.


No, in fact historically compiler validation has been done by
profesional validation services, and not for free. The U.S.
government purchased a validation suite for use in
validating Federal C compiler procurements against the
initial C FIPS.


So what? The presence of a free suite would result in many users
checking their system, and complaining about failures. It would
advance the cause of C99 compliance. It could be revised for C09
or whatever, and aid in promulgating any future versions. An open
source style licence could maintain control. Even if the suite
has failings, such would be published and generally known, and
future revisions would incorporate corrections.

The presence of the Pascal suite in 1980 for the cost of media
certainly aided in compliance by most. Borland being the glaring
exception.

--
"A man who is right every time is not likely to do very much."
-- Francis Crick, co-discover of DNA
"There is nothing more amazing than stupidity in action."
-- Thomas Matthews
Nov 14 '05 #142

P: n/a
In article <Vu********************@comcast.com>, Douglas A. Gwyn
<DA****@null.net> writes
Chris Hills wrote:
<DA****@null.net> writes
Chris Hills wrote:
... most of the worlds major

Anyway, there are several compilers that have incorporated C99
features, on a path to full compliance.

"several" "features" this is not good enough. We need "most" and "full
compliance"


My point was that the 1999 standard is playing a role in the
evolution of C compilers. It is unrealistic to expect fully
conforming implementations on day 1 of the standard.


However, we are now 4 years down the line, not "day 1". As has been
pointed out, at least one compiler is C99 conformant (the Tasking one,
plus a couple of smaller players), so it is possible to do. However,
there seems to be no commercial impetus to do it.

If C99 was really a requirement anywhere they would all be producing C99
compilers.

Somehow we need to generate that feeling amongst programmers that it is
important.

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #143

P: n/a
In article <Np*******************@newssvr29.news.prodigy.com> , Mabden
<mabden@sbc_global.net> writes
"Chris Hills" <ch***@phaedsys.org> wrote in message
news:5i**************@phaedsys.demon.co.uk...

I suspect that until the law or the insurance companies REQUIRE ISO-C
compliant compilers they will not become essential in the industry.


So I can go back to my world of K&R2 and tell Keith and Dan to stuff it?
;-)


Probably, unfortunately.

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #144

P: n/a
In article <41***************@yahoo.com>, CBFalconer
<cb********@yahoo.com> writes
Chris Hills wrote:
... snip ...

I suspect that until the law or the insurance companies REQUIRE ISO-C
compliant compilers they will not become essential in the industry.


That in turn requires a freely (in practice) available standard
test suite for evaluation purposes. Then every purchaser can
check compliance.


The standard is freely available from a multitude of sources. The
recognised test suites are not; they cost a lot of money. However there
is a lot of work in them.
The suite should have various levels. To do anything it must use
at least getchar and putchar with stdin and stdout. After that
levels can check compliance with each possible #include of
standard headers.


See Plum-Hall and Perennial.
/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #145

P: n/a
In article <41***************@yahoo.com>, CBFalconer
<cb********@yahoo.com> writes
"Douglas A. Gwyn" wrote:
CBFalconer wrote:
That in turn requires a freely (in practice) available standard
test suite for evaluation purposes. Then every purchaser can
check compliance.
No, in fact historically compiler validation has been done by
profesional validation services, and not for free. The U.S.
government purchased a validation suite for use in
validating Federal C compiler procurements against the
initial C FIPS.


So what? The presence of a free suite would result in many users
checking their system, and complaining about failures.


No it wouldn't. No one is asking for ISO-C compliance much anyway, let
alone testing it. There are a few fields where they do look at the
compiler, but these tend to be on micros where there are extensions and
you would not run it in ISO-C mode anyway even if it could do it.
It would
advance the cause of C99 compliance. It could be revised for C09
or whatever, and aid in promulgating any future versions. An open
source style licence could maintain control. Even if the suite
has failings, such would be published and generally known, and
future revisions would incorporate corrections.
OK... you write it and post it here.
The presence of the Pascal suite in 1980 for the cost of media
certainly aided in compliance by most. Borland being the glaring
exception.


And their compiler became Delphi. AFAIK Borland had one of the more
successful Pascal compilers.

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #146

P: n/a
In article <ch***********@f1n1.spenet.wfu.edu>, Allin Cottrell
<co******@wfu.edu> writes
Douglas A. Gwyn wrote:
Dan Pop wrote:
Then, please explain the lack of interest in C99 among the C programming
community at large.
How do you know there is a "lack of interest"? Have you taken
a scientifically valid poll?

Anyway, until C99 compliance is sufficiently widely available,
it is unlikely to be a project requirement. That alone would
be sufficient to explain any apparent "lack of interest".


Surely this is putting the chicken before the horse.

The following is my analysis. It is not based on scientifically
validated evidence and may be wrong.

Around the time when the first C standard was introduced, C was a
very popular programming language. The C world had a strong reason
to avoid the proliferation of dialects and the standard was
welcome. Compiler vendors had a strong incentive to produce
standard-compatible compilers.


So far so good..
By the time C99 came out, C had lost a fair amount of ground to
C++ and Java (and now it, and/or the latter languages, are losing
ground to C#). Microsoft has no particular reason to invest a lot
of resources in a C99 compiler, since it has hitched its fortunes
to C# and .NET.
Yes.
Nonetheless, C remains the lingua franca of open-source programming.
I suspect that the great bulk of newly-written C code is open
source.
Not by a long long way... most of the C programming is in the embedded
world where they use a lot of commercial compilers because GCC is not
efficient enough.
In that context, people are mostly compiling it with gcc,
so the relationship between gcc and C99 is of key importance.
But as Dan has said, C99 to a large extent reinvents wheels that
were already available as gcc extensions. Hence C99 is in
trouble.


I have heard it said that C will tend to the GCC "standard" because it
is the most common single version on many platforms and targets.

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #147

P: n/a
In article <41***************@yahoo.com>, CBFalconer
<cb********@yahoo.com> writes
Chris Hills wrote:

... snip ...

Who is responsible for ensuring that it is a required standard?
ISO? and locally BSI? ANSI?, DIN?

or the professional bodies IEE, IEEE, BCS etc

or governments?

I don't think it fits the UN field somehow.


If you can implant the idea in GWBs head, he can then incarcerate
any who fail to comply in the Gitmo gulag forever. You have about
2 months. :-)

Seriously, it means some form of legality. Such as allowing
conformance with the standard to be a defence against certain
liabilities. That also requires insisting that software vendors,
contractors, etc. assume liability for their products. Similarly
hardware. My head hurts.


I think I said way back in this thread (and in several articles in
magazines over the years) that standardisation and improvements in Sw
engineering will come about via the legal and insurance professions not
the software industry.

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #148

P: n/a
In article <Vu********************@comcast.com>, Douglas A. Gwyn
<DA****@null.net> writes
Chris Hills wrote:
I have been told on more than one occasion by a compiler user that "the
ISO-C committee has got it wrong" if the standard does not behave the
way their "industry standard" compiler does.


Many people would find that grounds for switching to
another vendor with a better appreciation of standards.

I am confused... the USERS of the compiler were saying "the ISO committee
had got it wrong" because the standard was not tracking their compiler,
which (they assumed) should be used as the reference....

The user in question was using an MS VC++ compiler to write and test
embedded code and wondered why it did not work when they tried to re-
compile it with an 8-bit embedded compiler....

In other words there are many users out there who think the standards
committee should be writing the standard to fit the compilers! Where the
compiler and standard disagree the standard is wrong.

So how do you sell the importance of C99 to people like that and there
are a LOT of them.

What was your comment about the vendor supposed to mean? There was no
vendor mentioned in the quote.

/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Nov 14 '05 #149

P: n/a
On Sun, 05 Sep 2004 11:58:28 +0000, CBFalconer wrote:
So what? The presence of a free suite would result in many users
checking their system, and complaining about failures. It would
advance the cause of C99 compliance. It could be revised for C09
or whatever, and aid in promulgating any future versions. An open
source style licence could maintain control. Even if the suite
has failings, such would be published and generally known, and
future revisions would incorporate corrections.
What economic incentive is there for someone to develop a "free" C test
suite? For that matter, who is going to pay for development of a C99
compiler?

It simply isn't going to happen. People developing "free" software are
content with GCC, which does a fairly good job of approaching C99
compliance -- in comparison to most commercial compilers, that is. If GCC
is to attain full C99 compliance, someone is going to have to fund it (or
find a lot of available graduate students?)
The presence of the Pascal suite in 1980 for the cost of media
certainly aided in compliance by most. Borland being the glaring
exception.


More code was written in Borland Pascal/Delphi than in any other
dialect of Pascal, including Standard Pascal.

Beyond the numerical community (of which I am one), most programmers would
be hard-pressed to find anything in C99 that is a "must have" -- and as
such, there is no demand for C99 compilers.

If C99 *does* define non-numeric "must have" features (and I have
reviewed the Standard extensively), then those who want C99 compilers need
to do a bit of evangelizing.

--
Scott Robert Ladd
Coyote Gulch Productions (http://www.coyotegulch.com)
Software Invention for High-Performance Computing
Nov 14 '05 #150
