Bytes | Software Development & Data Engineering Community

Is C99 the final C?

I was just thinking about this, specifically wondering if there are any
features that the C specification currently lacks, and which may be
included in some future standardization.

Of course, I speak only of features in the spirit of C; something like
object-orientation, though a nice feature, does not belong in C.
Something like being able to #define a #define would be very handy,
though, e.g:

#define DECLARE_FOO(bar) #define FOO_bar_SOMETHING \

I'm not sure whether the features of cpp are even included in the C
standard though (and GCC has definitely taken quite a nonstandard
approach with regard to certain token expansions and whatnot), but
that's one area of improvement I see.

I would also like to see something along the lines of C++ templating,
except without the really kludgy implementation that the C++ folks decided
to go to ( and without the OOP ).

.... Mike pauses for the sound of a thousand *plonks*

Templates save a lot of time when it comes to commonly-used data
structures, and as they are entirely implemented at compile-time and don't
include, by their definition, OOP (although they can be well suited to
it), I think they would be a nice addition and in the spirit of C.

Your thoughts? I'm sure there's some vitriol coming my way but I'm
prepared 8)

Mike's Patented Blocklist; compile with gcc:

i=0;o(a){printf("%u",i>>8*a&255);if(a){printf(".");o(--a);}}
main(){do{o(3);puts("");}while(++i);}

Nov 13 '05
In article <3F***************@saicmodis.com>, James Kuyper
<ku****@saicmodis.com> writes
No - I think that &&& and ||| would be so rarely used that their
meanings would become trivia questions.

see Rule 36 of the MISTRAY-C C coding standard at

It is the antidote to MISRA-C and should raise a smile or three.

I am looking for some code examples to illustrate some of the rules.


\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/\
/\/\/ ch***@phaedsys.org www.phaedsys.org \/\/
Nov 13 '05 #51
Sidney Cadot <si****@jigsaw.nl> wrote in message news:<bq**********@news.tudelft.nl>...
Arthur J. O'Dwyer wrote:
The last statement I do not understand. Care to elaborate?

const int foo = 5;
char arr[foo];

Legal C++ (because in C++, const objects are really "constant"),
but illegal C (because 'foo' is an object whose value needn't
be stored [or even computed] at compile time).

This is legal at least in C99, and I think also in C89 (at least my gcc
doesn't even warn on it).

That's because C99 now allows variable-length arrays, so an integer
constant expression is no longer required in that context. However,
they were required in C89 in this context, so if a compiler fails to
issue a diagnostic, it isn't a conforming implementation of C89. All
of the following uses of foo except the one that is marked are still
illegal in C99, because they still require integer constant
expressions:

struct bf_holder {
    int bit_field:foo;
} x[foo+1] =        // Only legal use of foo in this section.
{[foo] = {10}};     // Subscript for designated initializer

enum {enumeration_value=foo};

case foo:

struct bf_holder *p = x + foo;
Nov 13 '05 #52
"Arthur J. O'Dwyer" <aj*@nospam.andrew.cmu.edu> wrote in message news:<Pi*********************************@unix44.andrew.cmu.edu>...
is easier for me to grasp. And I think (but this is just IMH
and uneducated O) that C++ compilers need a lot of extra baggage
to deal with constant "consts," and I wouldn't want to foist all
that on C compilers.

I'm curious - why do you think that? I don't know that you're wrong,
but I can't think of any reason why it would be a significant cost.
Nov 13 '05 #53
On Mon, 01 Dec 2003 13:24:23 GMT, in comp.lang.c , pete
<pf*****@mindspring.com> wrote:
Mark McIntyre wrote:

On Sun, 30 Nov 2003 13:54:02 GMT, in comp.lang.c , pete
<pf*****@mindspring.com> wrote:
>'long' might be a better choice
>for when you need an exactly 32-bit integer type.

There's nothing that requires long to be exactly 32 bits any more
than int. Both would be equally nonportable assumptions.

A type which is guaranteed to have at least 32 bits, is a better choice
than one which isn't guaranteed to have at least 32 bits,
for when you need an exactly 32-bit integer type.

*shrug*. I understand your point, but when in implementation-dependent
territory, it's quite immaterial what ISO requires; it's never going to
port safely anyway, so abandon all hope, ye who enter here....
Mark McIntyre
CLC FAQ <http://www.eskimo.com/~scs/C-faq/top.html>
CLC readme: <http://www.angelfire.com/ms3/bchambless0/welcome_to_clc.html>
Nov 13 '05 #54
In <3F***********@mindspring.com> pete <pf*****@mindspring.com> writes:
Dan Pop wrote:

In <3F**********@mindspring.com> pete <pf*****@mindspring.com> writes:
>Arthur J. O'Dwyer wrote:
>> I think // comments are obscene; I have no use for variable-width
>> arrays; any time I need an exactly 32-bit integer type, I'm probably
>> not writing completely portably anyway and might as well use 'int'
>> itself; and so on.
>'long' might be a better choice
>for when you need an exactly 32-bit integer type.
Nope. long should be a better choice for when you need a 64-bit
integer type.
The only unwasteful type assignation for the usual processors
currently used in hosted implementations is:

8-bit: char
16-bit: short
32-bit: int
64-bit: long

and there are C89 implementations that do the integral types this way.
No need for a long long at all.

Having int as an "alias" for either short or long was a historical
mistake that should have been exposed and corrected long ago. The C
standards keep blessing it, instead...

That might be the case, if you're talking about implementing C,
but I believe that Arthur J. O'Dwyer
was talking about the C programmer's choice
when it comes to choosing a type, while writing C code.

The C programmer's choice is not that clear. See below.
I use long to implement pseudorandom number generators
which generate 32-bit values, portably.
I can't do that with int.

It depends on your definition of portability. For current hosted
implementations, int will give you 32 bits, while long may give you more
than that. If you think that being portable to MSDOS is a great idea,
then you have to use long and accept the fact that you're wasting memory
on other platforms. No big deal for scalars, but it may become a
performance issue when dealing with arrays.

Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Nov 13 '05 #55
Sidney Cadot wrote:
Mark Gordon wrote:

.... snip ...

That was why I suggested something other than enum. I'm not
sure what else the person who said he wanted enums to be more
special might have meant.

That would have been me. I was aiming for about the same status
for 'enum' types as enumeration types have in Pascal. However,
in C you can of course assign definite integer values to members
of an enum, including duplicates - this complicates matters

I guess the only thing that could be done without breaking too
much code is mandating a compiler diagnostic on implicit
conversion of an enum value to an int value. Personally, I think
that would be a good idea.

IMO you cannot correct the existing enum without breaking valid
code. To gain the abilities of the Pascal enumeration requires an
entirely new type. This could be conveniently combined with the
addition of sub-range types, none of which would break old code,
but would provide much additional safety.

Chuck F (cb********@yahoo.com) (cb********@worldnet.att.net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net> USE worldnet address!
Nov 13 '05 #56
Dan Pop wrote:
Mark Gordon <sp******@fla sh-gordon.me.uk> writes:

.... snip ...

Definitely not. CHAR_BIT==9 may be rare these days, but
CHAR_BIT==16 is not once you start looking at DSP processors,
which often don't have the ability to access less than 16
bits in one operation.

But those are used in freestanding implementations only, and we
ignore such implementations by default, here.

"We" don't, but maybe you do. Such implementations are among the
most important users of C today.

Chuck F (cb********@yahoo.com) (cb********@worldnet.att.net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net> USE worldnet address!
Nov 13 '05 #57
Michael B. wrote:
[Are there any] features that the C specification currently lacks
and which may be included in some future standardization.

The future of C is C++. The question now is,
"Will any future C++ standard adopt the features introduced in C99?"

restricted pointers,
variable-length arrays,

Nov 13 '05 #58
glen herrmannsfeldt <ga*@ugcs.calte ch.edu> wrote:
Morris Dovey wrote:
Michael B. wrote:
I was just thinking about this, specifically wondering if there's any
features that the C specification currently lacks, and which may be
included in some future standardization .
Of course there are. Can you imagine that people will /ever/ stop fixing
things just because they ain't broke?

There was a quote some years ago, I believe by one of the authors of the
original Fortran compiler, though it is hard to verify the source.
Something like: "I don't know what the language of the year 2000 will
look like, but it will be called Fortran." (That is from memory,
hopefully it is close.)

Nah, not really hard at all. It was Tony Hoare (now living here in

"I don't know what the language of the year 2000 will look like, but
it will be called Fortran."
- C. A. R. Hoare, 1982

Nov 13 '05 #59
Chris Torek wrote:
* triple-&& and triple-|| operators: &&& and ||| with semantics
like the 'and' and 'or' operators in python:

a &&& b ---> if (a) then b else a
a ||| b ---> if (a) then a else b

(I think this is brilliant, and actually useful sometimes).

Besides the syntactic shift (from being parsed as "&& &b" today),
I think it is worth pointing out that if "a" is false, it must compare
equal to 0; so assuming "a &&& b" means "if a then b else a", it
also means "if a then b else 0", which can be expressed today as
"a ? b : 0".

(GCC offers "a ||| b" as "a ?: b", which means "a ? a : b" without
evaluating "a" twice.)

Especially this latter form is quite useful, expressing the idea that
'a' is to be used if it has a non-zero value, else use 'b' as a
fallback. This could be threaded, as in 'a ?: b ?: c'... I'd seriously
hope the next committee would consider this. This actually /is/ useful.
* a way to "bitwise invert" a variable without actually
assigning, complementing "&=", "|=", and friends.

The expression "x = ~x" can always be transformed into "x ^= ~0U",
although the constant may need to be written as 0UL, 0ULL,
(uintmax_t)0, or some such. (Using ^= ~(uintmax_t)0 probably
always works, but probably draws warnings on some compilers.)

Ok, then I'd rather stick to 'x=~x', which is clearer. It has always
sort-of annoyed me that the binary operators '+', '-', and so on have
assignment forms, while the unary '~' hasn't.
* 'min' and 'max' operators (following gcc: ?< and ?>)

It is also worth noting that Dennis Ritchie's early C compiler(s)
*had* min and max operators, spelled \/ and /\ respectively. They
were dropped, most likely from lack of use.
Funny, I didn't know that - weird syntax.

Personally, I would like to see min/max operators make a comeback. They
are quite often needed in practice, supported by some hardware, and
have clear algebraic foundations (max-plus algebras, and all that).
Someone else asked (further down in the thread) whether some CPUs
might have "min" and "max" instructions. I have never seen precisely
this myself, but many architectures have "synthetic" (and branchless)
min/max sequences -- usually involving the carry bit (many CPUs)
or "set if less than" instructions (MIPS) or the like -- and even
the x86 now has conditional-move. GCC will generate these for the
branch-ful "a < b ? a : b" sequence, e.g.:

int imin(int a, int b) { return a < b ? a : b; }

compiles to, e.g.:

cmpl %eax, %ecx
cmovle %ecx, %eax

when gcc is given the right options (-msse). (V9 sparc also has
conditional move, and ARM has conditional *everything*. :-) )

Interesting. The only processor I've seen that has this is the Philips
Trimedia, an embedded processor optimized for multimedia streaming. It's
a VLIW processor with five parallel instruction units. Branch
prediction failure rollback is quite expensive on these beasts.

Best regards,


Nov 13 '05 #60
