Bytes | Software Development & Data Engineering Community

Clearly, it is too late to fix c99 - C is dead

Back in 2002, Harry H. Cheng wrote:
Agreed. gcc is a C compiler for different platforms.
VC++ is a C/C++ compiler for Windows.
SCC is a C compiler for Cray machine.
Ch is an embeddable C interpreter for different platforms.
They use different names and have different extensions to C.
However, they all conform to the ISO C90 standard.
I think they will conform to C99 eventually.

In reply to a rant I posted on comp.lang.c. The point of my rant was
that a large majority of existing compilers should have conformed with
a large percentage of C99 back in '99. The purpose of releasing a
standard is to codify existing practice. So when C99 was released it
should have taken a few weeks for the majority of existing compilers
to be tweaked to conform. That wasn't the case. The committee didn't
codify existing practice, they made up a new language and they
released it as a standard.

It is now 2004, almost 5 years since the release of C99. One of the
most popular C compilers in the world, GCC, has yet to implement the
standard (see http://gcc.gnu.org/c99status.html) and they never will.
How can I say this with such certainty? In fact, I'll say more:
no-one intends to ever implement the standard. If you look at what
Fergus Henderson said when referring to a test program written by Tony
Finch which exercises GCC's implementation of variable length structs
you can appreciate the problem:
I think this is a defect in C99. C99 does not match existing practice
here. I note that Sun C, Compaq C, and GNU C all use the same layout
for this struct. This layout is more efficient (uses less space) than
the layout mandated by C99. I don't think the committee intended to
force implementations to be inefficient in this way and I don't think
the committee intended to force implementations to break binary
compatibility with previous releases.


- http://gcc.gnu.org/ml/gcc/2002-05/msg02858.html

GCC still produces the output Tony Finch discovered back in 2002, as
do the other compilers Fergus Henderson mentioned. I don't doubt
there is a compiler somewhere that implements this part of the
standard correctly but the existing practice of a vast majority of C
compilers is to ignore the standard and, as this is the only standard
we have, I believe that to be the death knell of the language.

Trent Waddington
Nov 14 '05 #1
In order to kill C, you would antecedently have to kill me, which is not so
easily done. If C99 falls flat, there will be a C2006 or something
similar. Lack of standards makes everyone pull their hair out. It borders
on language-bashing for me to ask: if not C, then what? MPJ
Nov 14 '05 #2
There is nothing in the C99 standard that I find especially appealing or,
how shall I put it, a requirement to use C99. I'll be using C89 for the
foreseeable future. C99 attempts to help with internationalisation, but the
fact is that to write portable, internationalised programs one still has to
make one's own way, and not rely on the compiler system to support anything,
since it doesn't have to provide for any UTF encodings or anything
particularly, concretely, useful.

These are only my musings; they may change, but I doubt it.
Nov 14 '05 #3
QuantumG wrote:
In reply to a rant I posted on comp.lang.c. The point of my rant was
that a large majority of existing compilers should have conformed with
a large percentage of C99 back in '99. The purpose of releasing a
standard is to codify existing practice. So when C99 was released it
should have taken a few weeks for the majority of existing compilers
to be tweaked to conform. That wasn't the case. The committee didn't
codify existing practice, they made up a new language and they
released it as a standard.


ANSI C did make some significant changes. I think part of it also has to
do with demand. In 1990, C was pretty much *the* main language in
industry, and so people cared about it a lot more. Now Java, C++, Visual
Basic, and others are (each!) more prevalent. If any of them changed
significantly, I think its changes would be picked up more quickly by
compilers and developers alike (this may have already happened with
VB6->VB.NET).
--
Derrick Coetzee
I grant this newsgroup posting into the public domain. I disclaim all
express or implied warranty and all liability. I am not a professional.
Nov 14 '05 #4
Derrick Coetzee <dc****@moonflare.com> wrote in message news:<cj**********@news-int.gatech.edu>...

ANSI C did make some significant changes. I think part of it also has to
do with demand. In 1990, C was pretty much *the* main language in
industry, and so people cared about it a lot more. Now Java, C++, Visual
Basic, and others are (each!) more prevalent.


I believe each of these languages is now more prevalent because the
greatest strength of C - standardization - was deliberately sabotaged
by an uncaring, unthinking committee that exceeded its charter. The
results were easy to predict, and now, 5 years later, have come to
pass exactly as predicted.
Nov 14 '05 #5
QuantumG wrote:
Derrick Coetzee <dc****@moonflare.com> wrote in message news:<cj**********@news-int.gatech.edu>...
ANSI C did make some significant changes. I think part of it also has to
do with demand. In 1990, C was pretty much *the* main language in
industry, and so people cared about it a lot more. Now Java, C++, Visual
Basic, and others are (each!) more prevalent.

I believe each of these languages are now more prevalent because the
greatest strength of C - standardization - was deliberately sabotaged
by an uncaring, unthinking committee that exceeded its charter. The
results were easy to predict, and now, 5 years later, have come to
pass exactly as predicted.


I don't think so. There are certain kinds of programs that ought to
be written in C. Neither Java nor Visual Basic are alternatives for
these programs. Some will claim that C++ is 'a better C' and can be
used in its place. There is not broad agreement on this point.

C lives.
--
Joe Wright mailto:jo********@comcast.net
"Everything should be made as simple as possible, but not simpler."
--- Albert Einstein ---
Nov 14 '05 #6
"Derrick Coetzee" <dc****@moonflare.com> wrote in message
news:cj**********@news-int.gatech.edu...
ANSI C did make some significant changes. I think part of it also has to
do with demand. In 1990, C was pretty much *the* main language in
industry, and so people cared about it a lot more.
Now Java, C++, Visual
Basic, and others are (each!) more prevalent.


Your computing sphere is very limited if you believe that.

-Mike
Nov 14 '05 #7
On Sat, 25 Sep 2004 19:57:26 -0400, Derrick Coetzee wrote:
ANSI C did make some significant changes. I think part of it also has to
do with demand. In 1990, C was pretty much *the* main language in
industry, and so people cared about it a lot more. Now Java, C++, Visual
Basic, and others are (each!) more prevalent. If any of them changed
significantly, I think its changes would be picked up more quickly by
compilers and developers alike (this may have already happened with
VB6->VB.NET).


This is really, really narrow of you. C, Java, and Visual Basic all have
their own, usually mutually-exclusive, problem domains. Using a C program
when you should be programming in VB is pretty damned stupid, and
vice-versa. Anyone who would say the above probably has only programmed in
one environment, making one kind of program in one language. If, indeed,
you have programmed at all.

C is still the only real language for OS development and other standalone
environments. C has displaced assembly and a lot of other languages like
Forth in that domain, and it shows precisely no signs of going anyplace.

In the Unix world, C is the dominant programming language for userland
applications as well. Even though gcc (the dominant compiler in that
realm) offers full and efficient C++ support, object-oriented code simply
has not caught on in that realm outside some notable exceptions.

There is plenty of legacy code written in C. This is why compilers usually
support pre-Standard constructs, and it is why languages that look like C
(and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
descendants have failed. There's nothing to suggest that C is becoming any
less relevant with time.

Nov 14 '05 #8
Chris Barts <ch*****@gmail.com> wrote in message news:<pa****************************@gmail.com>...
There is plenty of legacy code written in C. This is why compilers usually
support pre-Standard constructs, and it is why languages that look like C
(and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
descendants have failed. There's nothing to suggest that C is becoming any
less relevant with time.


I think the statement "Fortran has failed" is wrong. Many scientists
and engineers still consider it the best tool for getting their work
done. Enough people are writing code in Fortran 95 to support about 10
compiler vendors (see
http://www.dmoz.org/Computers/Progra...ran/Compilers/
). There is a general trend towards using higher-level languages, and
Fortran is a higher-level language than either C or C++ for numerical
work, especially involving multidimensional arrays.

The Fortran 2003 standard was ratified a few weeks ago, and its new
features include OOP with inheritance, interoperability with C, IEEE
arithmetic, and stream I/O. Already some F95 compilers have added some
of these features.
Nov 14 '05 #9
In <cj**********@news-int.gatech.edu> Derrick Coetzee <dc****@moonflare.com> writes:
Now Java, C++, Visual Basic, and others are (each!) more prevalent.


Do you have some hard data to back up this statement? Whenever I look at
some open source project, I see C source code. Exceptionally Fortran or
C++.

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Currently looking for a job in the European Union
Nov 14 '05 #10
In article <cj**********@sunnews.cern.ch>, Dan Pop <Da*****@cern.ch> wrote:
In <cj**********@news-int.gatech.edu> Derrick Coetzee <dc****@moonflare.com> writes:
Now Java, C++, Visual Basic, and others are (each!) more prevalent.


Do you have some hard data to back up this statement? Whenever I look at
some open source project, I see C source code. Exceptionally Fortran or
C++.


It just depends on where you look.

When I look at code written in the real world, for pay, to solve (badly)
real world problems, I see VB (and/or other MS abominations) code.

Nov 14 '05 #11
>>>Now Java, C++, Visual Basic, and others are (each!) more prevalent.

Do you have some hard data to back up this statement? Whenever I look at
some open source project, I see C source code. Exceptionally Fortran or
C++.


It just depends on where you look.

When I look at code written in the real world, for pay, to solve (badly)
real world problems, I see VB (and/or other MS abominations) code.


And if you look at small-scale implementations of numerical or other
algorithms, everyone is well advised to do that in Matlab, as the
development time in my experience is shorter than with many other
languages.

<OT rant>
If the language serves the purpose, then it is okay to use it.
The thing I find objectionable is that too large a part of the
"programmers" using (for example) VB do not really want to think,
let alone learn to write programs in the Right Way for their
language/platform, thinking they are king when they put together
a GUI and write some action routines called at a mouseclick or the
like. I do not want to ascribe this to the language, even though
some languages, C among them, force you to learn to use the grey stuff
between your ears. I just get very annoyed when interviewing someone
for a job who is not even aware that he/she cannot do serious,
reliable programming but thinks the world of themselves...
</OT rant>

As long as I think that my students benefit more from learning C
and doing seemingly unnecessarily complicated things from scratch,
I will continue to teach them C (including C99 basics). I guess
this will remain true for as many years as I teach, for the same
reason that learning maths and proving some basic results does a
world more good than obtaining a top-of-the-line scientific
calculator.
-- Michael

Nov 14 '05 #12
In <cj**********@yin.interaccess.com> ga*****@yin.interaccess.com (Kenny McCormack) writes:
In article <cj**********@sunnews.cern.ch>, Dan Pop <Da*****@cern.ch> wrote:
In <cj**********@news-int.gatech.edu> Derrick Coetzee <dc****@moonflare.com> writes:
Now Java, C++, Visual Basic, and others are (each!) more prevalent.


Do you have some hard data to back up this statement? Whenever I look at
some open source project, I see C source code. Exceptionally Fortran or
C++.


It just depends on where you look.

When I look at code written in the real world, for pay, to solve (badly)
real world problems, I see VB (and/or other MS abominations) code.


When I do this kind of exercise, I usually see either badly written
C code or badly written C code using a few C++ features. In my segment
of the real world, Windows is seldom used as a programming platform: it's
far too inadequate for this kind of usage.

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Currently looking for a job in the European Union
Nov 14 '05 #13
On 25 Sep 2004 00:26:10 -0700, qg@biodome.org (QuantumG) wrote:
Back in 2002, Harry H. Cheng wrote:
Agreed. gcc is a C compiler for different platforms.
VC++ is a C/C++ compiler for Windows.
SCC is a C compiler for Cray machine.
Ch is an embeddable C interpreter for different platforms.
They use different names and have different extensions to C.
However, they all conform to the ISO C90 standard.
I think they will conform to C99 eventually.

In reply to a rant I posted on comp.lang.c. The point of my rant was
that a large majority of existing compilers should have conformed with
a large percentage of C99 back in '99. The purpose of releasing a
standard is to codify existing practice.


That's an incorrect assumption very early in your argument.

--
Al Balmer
Balmer Consulting
re************************@att.net
Nov 14 '05 #14

In article <30**************************@posting.google.com>, be*******@aol.com writes:
Chris Barts <ch*****@gmail.com> wrote in message news:<pa****************************@gmail.com>...
There is plenty of legacy code written in C. This is why compilers usually
support pre-Standard constructs, and it is why languages that look like C
(and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
descendants have failed. There's nothing to suggest that C is becoming any
less relevant with time.


I think the statement "Fortran has failed" is wrong.


COBOL and Lisp are still going strong, too. Chris is zero-for-three
on this one.

--
Michael Wojcik mi************@microfocus.com

Even 300 years later, you should plan it in detail, when it comes to your
summer vacation. -- Pizzicato Five
Nov 14 '05 #15

<be*******@aol.com> wrote in message
news:30**************************@posting.google.com...
Chris Barts <ch*****@gmail.com> wrote in message news:<pa****************************@gmail.com>...
There is plenty of legacy code written in C. This is why compilers usually
support pre-Standard constructs, and it is why languages that look like C
(and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
descendants have failed. There's nothing to suggest that C is becoming any
less relevant with time.


I think the statement "Fortran has failed" is wrong.

Yes it's wrong. But Chris did not make that statement. He stated:
"Fortran and Cobol and Lisp descendants have failed."

Whether this is true or not I have no idea, especially when
'failed' is a subjective issue.

-Mike
Nov 14 '05 #16

"Kenny McCormack" <ga*****@yin.interaccess.com> wrote in message
news:cj**********@yin.interaccess.com...
In article <cj**********@sunnews.cern.ch>, Dan Pop <Da*****@cern.ch> wrote:
In <cj**********@news-int.gatech.edu> Derrick Coetzee <dc****@moonflare.com> writes:
Now Java, C++, Visual Basic, and others are (each!) more prevalent.


Do you have some hard data to back up this statement? Whenever I look at
some open source project, I see C source code. Exceptionally Fortran or
C++.


It just depends on where you look.


Yes. But it seems that you 'look' with a very narrow view.
When I look at code written in the real world, for pay, to solve (badly)
real world problems, I see VB (and/or other MS abominations) code.


Your 'real world' must be very small then. PC's running Microsoft
Windows (or any other OS) are a tiny minority of computer systems
in existence.

-Mike
Nov 14 '05 #17
On Mon, 27 Sep 2004 16:43:58 GMT, in comp.lang.c ,
ga*****@yin.interaccess.com (Kenny McCormack) wrote:
In article <cj**********@sunnews.cern.ch>, Dan Pop <Da*****@cern.ch> wrote:
In <cj**********@news-int.gatech.edu> Derrick Coetzee <dc****@moonflare.com> writes:
Now Java, C++, Visual Basic, and others are (each!) more prevalent.
Do you have some hard data to back up this statement? Whenever I look at
some open source project, I see C source code. Exceptionally Fortran or
C++.


It just depends on where you look.


This is true.
When I look at code written in the real world, for pay, to solve (badly)
real world problems, I see VB (and/or other MS abominations) code.


whereas, in perhaps a more real-world situation than you can imagine, I see
C, C++, C#, Java, VB, VB.Net, Python and a heck of a lot of scripting
languages. Each being used where its most appropriate.

Only an idiot uses VB to write processor-intensive libraries or code that
you need to have accessible on Solaris, Redhat and WinXP. Similarly, only a
masochist uses C to write a WinXP GUI app.
--
Mark McIntyre
CLC FAQ <http://www.eskimo.com/~scs/C-faq/top.html>
CLC readme: <http://www.ungerhu.com/jxh/clc.welcome.txt>
Nov 14 '05 #18
In article <kn********************************@4ax.com>,
Mark McIntyre <ma**********@spamcop.net> wrote:
....
whereas, in perhaps a more real-world situation than you can imagine, I see
C, C++, C#, Java, VB, VB.Net, Python and a heck of a lot of scripting
languages. Each being used where its most appropriate.
I stand by my original assertion - whatever it was - but it's not worth
arguing about. I think I made it clear that my ego is not in this.
Only an idiot uses VB to write processor-intensive libraries or code that
you need to have accessible on Solaris, Redhat and WinXP. Similarly, only a
masochist uses C to write a WinXP GUI app.


So true. But there's a f*** lot of idiots in the world, and more than your
fair share of masochists.

Nov 14 '05 #19
In article <9Z******************@newsread1.news.pas.earthlink.net>,
Mike Wahler <mk******@mkwahler.net> wrote:
....
Your 'real world' must be very small then. PC's running Microsoft
Windows (or any other OS) are a tiny minority of computer systems
in existence.


You sure about that? As I've tried to make clear, my ego's not in this, but
I'm pretty sure that, counting boxes or counting CPUs, IBM compatible PCs
(including all the server boxes which are really just overgrown PCs)
running on x86 chips make up a substantial percentage of the total number
of boxes in the world. I wouldn't be surprised if it was at least 60%.

Now, if you want to do it by computing power - megaflops or whatever - you
might have a defensible position.

Nov 14 '05 #20
Chris Barts wrote:
This is really, really narrow of you. C, Java, and Visual Basic all have
their own, usually mutually-exclusive, problem domains. Using a C program
when you should be programming in VB is pretty damned stupid, and
vice-versa. Anyone who would say the above probably has only programmed in
one environment, making one kind of program in one language. If, indeed,
you have programmed at all.


Please don't insult me. I'm aware that C is used extensively in many
standalone environments, and I have used it in such environments in
industry, and it is one of my favourite languages. Speaking of the
industry in general, however, a lot of code produced nowadays is
high-level web and database applications that run on desktop PCs, and
other similarly boring stuff. I am not asserting that C has been
supplanted universally, but only that it is no longer dominant across
most of the industry as it once was.
--
Derrick Coetzee
I grant this newsgroup posting into the public domain. I disclaim all
express or implied warranty and all liability. I am not a professional.
Nov 14 '05 #21
>In article <9Z******************@newsread1.news.pas.earthlink.net>,
Mike Wahler <mk******@mkwahler.net> wrote:
Your 'real world' must be very small then. PC's running Microsoft
Windows (or any other OS) are a tiny minority of computer systems
in existence.

I think you need to define "any other OS" here. :-)

In article <cj**********@yin.interaccess.com>
Kenny McCormack <ga*****@interaccess.com> wrote:
You sure about that? As I've tried to make clear, my ego's not in this, but
I'm pretty sure that, counting boxes or counting CPUs, IBM compatible PCs
(including all the server boxes which are really just overgrown PCs)
running on x86 chips make up a substantial percentage of the total number
of boxes in the world. I wouldn't be surprised if it was at least 60%.
And I think you need to define "box" here.

Your microwave has a microprocessor. If your refrigerator is new
and high-end, it has one. Your TV has one; your VCR or DVD player
has one; your car, if it was built within the last decade, has at
least one, and probably over a dozen, CPUs. Are these "boxes"?

For that matter, even your desktop PC has more than one CPU. In
particular, every disk drive has a microprocessor, and a modern
multisync monitor has one. Your keyboard and mouse have (more
limited) microprocessors in them. The CPU on your motherboard --
which is the only one running Windows, if it is indeed running
Windows -- is quite outnumbered.
Now, if you want to do it by computing power - megaflops or whatever - you
might have a defensible position.


Actually, here, the Pentium-clones may have the edge. Many of the
small microprocessors (e.g., even the PowerPC in the TiVo) are
running at lower clock frequencies to reduce power dissipation
(which leads to heat, which requires a fan, which makes the TiVo
too noisy).
--
In-Real-Life: Chris Torek, Wind River Systems
Salt Lake City, UT, USA (40°39.22'N, 111°50.29'W) +1 801 277 2603
email: forget about it http://web.torek.net/torek/index.html
Reading email is like searching for food in the garbage, thanks to spammers.
Nov 14 '05 #22
In article <cj*********@news1.newsguy.com>,
Chris Torek <no****@torek.net> wrote:
In article <9Z******************@newsread1.news.pas.earthlink.net>,
Mike Wahler <mk******@mkwahler.net> wrote:
Your 'real world' must be very small then. PC's running Microsoft
Windows (or any other OS) are a tiny minority of computer systems
in existence.
I think you need to define "any other OS" here. :-)


That's not me. That was the other poster's concept.
You sure about that? As I've tried to make clear, my ego's not in this, but
I'm pretty sure that, counting boxes or counting CPUs, IBM compatible PCs
(including all the server boxes which are really just overgrown PCs)
running on x86 chips make up a substantial percentage of the total number
of boxes in the world. I wouldn't be surprised if it was at least 60%.


And I think you need to define "box" here.


As far as I'm concerned, a microwave is not a computer. It does have
a microprocessor, as you note. I think the point is that these days, just
about everything has a microprocessor, but that doesn't mean they are
computers.

And, the previous poster did say that:
PC's running Microsoft Windows (or any other OS) are a tiny minority of
computer systems in existence.

^^^^^^^^

Nov 14 '05 #23
>>>In article <9Z******************@newsread1.news.pas.earthlink.net>,
Mike Wahler <mk******@mkwahler.net> wrote:
Your 'real world' must be very small then. PC's running Microsoft
Windows (or any other OS) are a tiny minority of computer systems
in existence.
In article <cj*********@news1.newsguy.com>,
Chris Torek <no****@torek.net> wrote:
I think you need to define "any other OS" here. :-)

In article <news:cj**********@yin.interaccess.com>
Kenny McCormack <ga*****@interaccess.com> wrote:
That's not me. That was the other poster's concept.
Indeed -- and note which name is the only available referent for "you"
at that point. :-)

Anyway, my point was that it is difficult to count "computers" and
"systems" and "OSes" without first defining each. Is an embedded
system a "system"? It has one or more microprocessors, and these
days, many of them are programmed in C (and some are even being
done in both Java and C, including high-end car "infotainment"
systems).
As far as I'm concerned, a microwave is not a computer. ...


This fits well with my personal definition of an "embedded system
computer", which is "any time you don't constantly think: there is
a computer in here" when you use it. :-)
--
In-Real-Life: Chris Torek, Wind River Systems
Salt Lake City, UT, USA (40°39.22'N, 111°50.29'W) +1 801 277 2603
email: forget about it http://web.torek.net/torek/index.html
Reading email is like searching for food in the garbage, thanks to spammers.
Nov 14 '05 #24
"Mike Wahler" <mk******@mkwahler.net> wrote:
<be*******@aol.com> wrote in message
Chris Barts <ch*****@gmail.com> wrote in message
(and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
descendants have failed. There's nothing to suggest that C is becoming any
less relevant with time.


I think the statement "Fortran has failed" is wrong.


Yes it's wrong. But Chris did not make that statement. He stated:
"Fortran and Cobol and Lisp descendants have failed."

Whether this is true or not I have no idea, especially when
'failed' is a subjective issue.


If he meant "Fortran and Cobol and (Lisp descendants)", it's clearly
untrue because of the first two languages.
However, I suspect he meant "(Fortran and Cobol and Lisp) descendants";
in which case, no, I don't know of any successful Fortran- and Cobol-alikes
either, but AFAIK Scheme is doing reasonably well in the same areas for
which Lisp was originally used.

Richard
Nov 14 '05 #25
Derrick Coetzee <dc****@moonflare.com> wrote:
Chris Barts wrote:
This is really, really narrow of you. C, Java, and Visual Basic all have
their own, usually mutually-exclusive, problem domains. Using a C program
when you should be programming in VB is pretty damned stupid, and
vice-versa. Anyone who would say the above probably has only programmed in
one environment, making one kind of program in one language. If, indeed,
you have programmed at all.


Please don't insult me. I'm aware that C is used extensively in many
standalone environments, and I have used it in such environments in
industry, and it is one of my favourite languages. Speaking of the
industry in general, however, a lot of code produced nowadays is
high-level web and database applications that run on desktop PCs,


Quite. And I don't want any database application written in VB or Java
on _my_ network, thank you very much. Java is for slow, broken Web code;
VB is for slow, broken amateurs' programs. For production code, one
either uses a domain-specific language (such as, for database programs,
dBase 2000, or even FoxPro), or a high quality general language such as
C.

Richard
Nov 14 '05 #26
On Tue, 28 Sep 2004 03:07:41 GMT, ga*****@yin.interaccess.com (Kenny
McCormack) wrote:
As far as I'm concerned, a microwave is not a computer. It does have
a microprocessor, as you note. I think the point is that these days, just
about everything has a microprocessor, but that doesn't mean they are
computers.


My office is not a computer, either, but it has one in it. (Three,
actually.)

--
Al Balmer
Balmer Consulting
re************************@att.net
Nov 14 '05 #27
In article <lg********************************@4ax.com>,
Alan Balmer <al******@spamcop.net> wrote:
On Tue, 28 Sep 2004 03:07:41 GMT, ga*****@yin.interaccess.com (Kenny
McCormack) wrote:
As far as I'm concerned, a microwave is not a computer. It does have
a microprocessor, as you note. I think the point is that these days,
just about everything has a microprocessor, but that doesn't mean they
are computers.


My office is not a computer, either, but it has one in it. (Three,
actually.)


My car is not a door, but it has one in it. (2, actually)

Neither is my house a door, though it has several.

Are we having fun yet?
Nov 14 '05 #28

"Richard Bos" <rl*@hoekstra-uitgeverij.nl> wrote in message
news:41****************@news.individual.net...
"Mike Wahler" <mk******@mkwahler.net> wrote:
<be*******@aol.com> wrote in message
Chris Barts <ch*****@gmail.com> wrote in message

> (and, to some extent, act like C) thrive, while Fortran and Cobol and Lisp
> descendants have failed. There's nothing to suggest that C is becoming any
> less relevant with time.

I think the statement "Fortran has failed" is wrong.
Yes it's wrong. But Chris did not make that statement. He stated:
"Fortran and Cobol and Lisp descendants have failed."

Whether this is true or not I have no idea, especially when
'failed' is a subjective issue.


If he meant "Fortran and Cobol and (Lisp descendants)", it's clearly
untrue because of the first two languages.


Yes.
However, I suspect he meant "(Fortran and Cobol and Lisp) descendants";


Yes, that's how I interpreted it also. Perhaps Chris will clarify.

-Mike
Nov 14 '05 #29
Richard Bos wrote:
Java is for slow, broken Web code; VB is for slow, broken amateurs' programs.


If you believe either of these languages is confined to such small
domains, then it's your sphere that is small. Java is used in many
domains where portability is important, or where the added safety or
security of Java is important, including domains traditionally belonging
to C such as compilers and raytracers. Moreover, Java *can* be compiled
directly to native code that runs as fast as any other native code, and
you're confusing the Java language with the Java environment if you
imagine it can't be. Speed comparisons between modern VMs like the Java
HotSpot VM and native code also show that Java VMs can no longer really
be considered unacceptably slow in most cases (although heavyweight,
certainly). Welcome to 2004.

As for VB, well... er... I'm not about to defend VB.
--
Derrick Coetzee
I grant this newsgroup posting into the public domain. I disclaim all
express or implied warranty and all liability. I am not a professional.
Nov 14 '05 #30
On Tue, 28 Sep 2004 16:15:10 GMT, ga*****@yin.interaccess.com (Kenny
McCormack) wrote:
In article <lg********************************@4ax.com>,
Alan Balmer <al******@spamcop.net> wrote:
On Tue, 28 Sep 2004 03:07:41 GMT, ga*****@yin.interaccess.com (Kenny
McCormack) wrote:
As far as I'm concerned, a microwave is not a computer. It does have
a microprocessor, as you note. I think the point is that these days,
just about everything has a microprocessor, but that doesn't mean they
are computers.


My office is not a computer, either, but it has one in it. (Three,
actually.)


My car is not a door, but it has one in it. (2, actually)

Neither is my house a door, though it has several.

Are we having fun yet?


If we were discussing how many doors exist, your remark would be
apropos. As it is, we are discussing how many computers there are, and
your microwave has one, whether the fact supports your argument or
not.

--
Al Balmer
Balmer Consulting
re************************@att.net
Nov 14 '05 #31

In article <41****************@news.individual.net>, rl*@hoekstra-uitgeverij.nl (Richard Bos) writes:
"Mike Wahler" <mk******@mkwahler.net> wrote:
<be*******@aol.com> wrote in message
Chris Barts <ch*****@gmail.com> wrote in message

> ... Fortran and Cobol and Lisp descendants have failed ...

I think the statement "Fortran has failed" is wrong.
Yes it's wrong. But Chris did not make that statement. He stated:
"Fortran and Cobol and Lisp descendants have failed."


If he meant "Fortran and Cobol and (Lisp descendants)", it's clearly
untrue because of the first two languages.
However, I suspect he meant "(Fortran and Cobol and Lisp) descendants";


On reflection, I believe you're right, though I made the same error
as beliavsky in my previous post. That said, however, I still think
Chris is wrong.
in which case, no, I don't know any successful Fortran- and Cobol-alikes,
either,
Both Fortran and COBOL feature current standards which offer
significant new features beyond what those languages traditionally
provided, and both seem to have communities of developers who use
only the traditional features, as well as communities who use the
new ones.

COBOL, for example, now features OO. Few COBOL programmers use OO
COBOL, but enough do to make supporting it profitable.

So in a sense, Fortran and COBOL *are* descendants of Fortran and
COBOL. They just didn't bother renaming the language.
but AFAIK Scheme is doing reasonably well in the same areas for
which Lisp was originally used.


Yes, and again with Lisp we have Common Lisp with its OO support
(CLOS), which is substantially different from traditional Lisp.

What chiefly distinguishes C and its "descendants" from the other
cases is that there was a strong movement to preserve C with only
relatively unobtrusive changes; thus most of the various languages
that diverged from C had to present themselves as new languages to
gain wide acceptance.

--
Michael Wojcik mi************@microfocus.com

It's like being shot at in an airport with all those guys running
around throwing hand grenades. Certain people function better with
hand grenades coming from all sides than other people do when the
hand grenades are only coming from inside out.
-- Dick Selcer, coach of the Cinci Bengals
Nov 14 '05 #32

In article <cj**********@yin.interaccess.com>, ga*****@yin.interaccess.com (Kenny McCormack) writes:

As far as I'm concerned, a microwave is not a computer.
I don't believe Chris claimed that it was. He said there was a
computer in it, and he questioned the definition of "box". I believe
the implication was that "box" should not be defined as "computer".
It does have
a microprocessor, as you note. I think the point is that these days, just
about everything has a microprocessor, but that doesn't mean they are
computers.
The people who write software for them probably feel differently.
And, the previous poster did say that:
PC's running Microsoft Windows (or any other OS) are a tiny minority of
computer systems in existence.

^^^^^^^^


And he's right.

You may choose to define "computer" as "general-purpose computer", but
don't be surprised if the rest of us continue to use a more sensible
definition.
--
Michael Wojcik mi************@microfocus.com

I would never understand our engineer. But is there anything in this world
that *isn't* made out of words? -- Tawada Yoko (trans. Margaret Mitsutani)
Nov 14 '05 #33
In article <cj*********@news2.newsguy.com>,
Michael Wojcik <mw*****@newsguy.com> wrote:

You may choose to define "computer" as "general-purpose computer", but
don't be surprised if the rest of us continue to use a different
definition.


No problem. And I won't lose any sleep over it, either.

Keep in mind that if you told the average man on the street that there was
a computer in his microwave, he'd rush home and open the door to remove his
PC from the microwave (and hope that no one turned the microwave on while
the PC was in there).

Nov 14 '05 #34
In article <cj*********@news2.newsguy.com>,
Michael Wojcik <mw*****@newsguy.com> wrote:
In article <41****************@news.individual.net>,
rl*@hoekstra-uitgeverij.nl (Richard Bos) writes:
COBOL, for example, now features OO.
That would be ADD ONE TO COBOL GIVING COBOL?

So in a sense, Fortran and COBOL *are* descendants of Fortran and
COBOL. They just didn't bother renaming the language.
but AFAIK Scheme is doing reasonably well in the same areas for
which Lisp was originally used.
Yes, and again with Lisp we have Common Lisp with its OO support
(CLOS), which is substantially different from traditional Lisp.


What chiefly distinguishes C and its "descendants" from the other
cases is that there was a strong movement to preserve C with only
relatively unobtrusive changes; thus most of the various languages
that diverged from C had to present themselves as new languages to
gain wide acceptance.
This looks like a (slight) overstatement of the case to me.

C++ is really the only "true" descendant of C; the other C-like languages
(at least the ones I know about) tend to have entirely different lineage
with C-like syntax pasted on (because curly braces are so much k3wLer than
"begin" and "end").

Interestingly, most C implementations come packaged with C++
implementations, and it's usually not difficult (though seldom entirely
trivial) to convert a well-written C program to a program that does The
Right Thing when given to a C++ compiler. (Not that there's often a
good reason to do this.)

So it seems that C and C++, taken as a pair, are not at all unlike F77
and F90 (if I've got those names right), or Lisp and Lisp-with-CLOS,
or old-COBOL and new-COBOL-with-OO, and languages like Java are just
hangers-on that add to the confusion.
dave

--
Dave Vandervies dj******@csclub.uwaterloo.ca The only rule is 'read everything Chris Torek writes'.

Have you seen his latest shopping list? Heavy stuff...
--CBFalconer and Richard Heathfield in comp.lang.c
Nov 14 '05 #35

In article <cj**********@yin.interaccess.com>, ga*****@yin.interaccess.com (Kenny McCormack) writes:

Keep in mind that if you told the average man on the street that there was
a computer in his microwave, he'd rush home and open the door to remove his
PC from the microwave (and hope that no one turned the microwave on while
the PC was in there).


I give the average man on the street more credit than that - as long
as we're talking about anglophone men who know what a microwave
[oven] is, and have some idea of what a computer is. (I can hardly
expect someone who doesn't meet those conditions to understand the
mooted statement.)

I suspect most educated people in the industrialized world are
conversant with the idea of embedded computers. For example, I don't
believe I've run into an automobile owner in the past decade or so
who wasn't aware that there was some kind of "computer" controlling
their car's engine, even if they had little idea what it might
actually be doing.

--
Michael Wojcik mi************@microfocus.com

Push up the bottom with your finger, it will puffy and makes stand up.
-- instructions for "swan" from an origami kit
Nov 14 '05 #36
On Tue, 28 Sep 2004 00:55:13 GMT, in comp.lang.c ,
ga*****@yin.interaccess.com (Kenny McCormack) wrote:
In article <kn********************************@4ax.com>,
Mark McIntyre <ma**********@spamcop.net> wrote:
...
Only an idiot uses VB to write processor-intensive libraries or code that
you need to have accessible on Solaris, Redhat and WinXP. Similarly only a
masochist uses C to write a WinXP GUI app.


So true. But there's a f*** lot of idiots in the world, and more than your
fair share of masochists.


True. However this doesn't make them any less idiotic, and I can report
that none of them work for me, or will do in the future.
--
Mark McIntyre
CLC FAQ <http://www.eskimo.com/~scs/C-faq/top.html>
CLC readme: <http://www.ungerhu.com/jxh/clc.welcome.txt>
----== Posted via Newsfeed.Com - Unlimited-Uncensored-Secure Usenet News==----
http://www.newsfeed.com The #1 Newsgroup Service in the World! >100,000 Newsgroups
---= 19 East/West-Coast Specialized Servers - Total Privacy via Encryption =---
Nov 14 '05 #37
In article <cj********@news3.newsguy.com>,
Michael Wojcik <mw*****@newsguy.com> wrote:
....
I suspect most educated people in the industrialized world are
conversant with the idea of embedded computers. For example, I don't
believe I've run into an automobile owner in the past decade or so
who wasn't aware that there was some kind of "computer" controlling
their car's engine, even if they had little idea what it might
actually be doing.


The fact that you put "computer" in quotes proves my point.

Anyone with any sense knows that a microprocessor isn't a computer any more
than a door is a house. Or that a CRT is a TV.

Now, to be fair, I have come across a fair number of uneducated people who
refer to PCs as CPUs - no doubt because they think it makes them sound cool.
You know - as in, "Hey Fred, could you go install Word on Joe's CPU?".

Nov 14 '05 #38
On Tue, 28 Sep 2004 00:58:50 GMT, in comp.lang.c ,
ga*****@yin.interaccess.com (Kenny McCormack) wrote:
In article <9Z******************@newsread1.news.pas.earthlink .net>,
Mike Wahler <mk******@mkwahler.net> wrote:
...
Your 'real world' must be very small then. PC's running Microsoft
Windows (or any other OS) are a tiny minority of computer systems
in existence.


You sure about that? As I've tried to make clear, my ego's not in this, but
I'm pretty sure that, counting boxes or counting CPUs, IBM compatible PCs
(including all the server boxes which are really just overgrown PCs)
running on x86 chips make up a substantial percentage of the total number
of boxes in the world. I wouldn't be surprised if it was at least 60%.


There are CONSIDERABLY more nonobvious computers in the world than there
are personal computers. How many people in the US have mobile phones?
Cars? Microwaves? Digital alarm clocks? Video recorders? DVD players? MP3
players? PDAs? And we've not even started to think about ATMs, Pin card
readers, cash registers, vote counters, etc etc etc....

Mind you, if they're using VB to write vote-counting software, I can
predict the Nov result now:

G Bush -0x80090317
J Kerry -0x8009030D
R Nader Out of Cheese error +++ Redo From Start +++++


--
Mark McIntyre
CLC FAQ <http://www.eskimo.com/~scs/C-faq/top.html>
CLC readme: <http://www.ungerhu.com/jxh/clc.welcome.txt>
Nov 14 '05 #39
On Tue, 28 Sep 2004 21:35:03 GMT, ga*****@yin.interaccess.com (Kenny
McCormack) wrote:
In article <cj********@news3.newsguy.com>,
Michael Wojcik <mw*****@newsguy.com> wrote:
...
I suspect most educated people in the industrialized world are
conversant with the idea of embedded computers. For example, I don't
believe I've run into an automobile owner in the past decade or so
who wasn't aware that there was some kind of "computer" controlling
their car's engine, even if they had little idea what it might
actually be doing.
The fact that you put "computer" in quotes proves my point.

Anyone with any sense knows that a microprocessor isn't a computer any more
than a door is a house. Or that a CRT is a TV.

So anyone who disagrees with your personal definition has no sense.
You should probably be warned that there are a *lot* of people who
don't agree with you.
Now, to be fair, I have come across a fair number of uneducated people who
refer to PCs as CPUs - no doubt because they think it makes them sound cool.
You know - as in, "Hey Fred, could you go install Word on Joe's CPU?".


I've come across a few who think that a microprocessor is not a
computer.

--
Al Balmer
Balmer Consulting
re************************@att.net
Nov 14 '05 #40
In article <ip********************************@4ax.com>,
Alan Balmer <al******@spamcop.net> wrote:
....
Anyone with any sense knows that a microprocessor isn't a computer any more
than a door is a house. Or that a CRT is a TV.
So anyone who disagrees with your personal definition has no sense.


Pretty much, yeah.
You should probably be warned that there are a *lot* of people who
don't agree with you.


I've learned to live with it. I'm used to it by now.
Now, to be fair, I have come across a fair number of uneducated people who
refer to PCs as CPUs - no doubt because they think it makes them sound cool.
You know - as in, "Hey Fred, could you go install Word on Joe's CPU?".


I've come across a few who think that a microprocessor is not a
computer.


It's not. It is a component of a computer. Note that a computer may have
more than one microprocessor. (*)

(*) Or, it may have none (certain mainframes...)

Nov 14 '05 #41
Dan Pop wrote:
In <cj**********@news-int.gatech.edu> Derrick Coetzee <dc****@moonflare.com> writes:

Now Java, C++, Visual Basic, and others are (each!) more prevalent.

Do you have some hard data to back up this statement? Whenever I look at
some open source project, I see C source code. Exceptionally Fortran or
C++.


Yes, agreed. The only caveat is that this is in a sense a biased
sample, because you're talking about cases where the source code
is visible, and hence the programming language known. Sadly, a
great deal of code is still invisible and uncheckable, and
written in who-knows-what source language (though one may guess).

Allin Cottrell
Nov 14 '05 #42
In <cj**********@news-int2.gatech.edu> Derrick Coetzee <dc****@moonflare.com> writes:
to C such as compilers and raytracers. Moreover, Java *can* be compiled
directly to native code that runs as fast as any other native code,
You're really naive if you believe this. Just because it's native code
it doesn't mean that it runs necessarily as fast as the native code
generated by a C compiler from a C program solving the same problem.

Java's portability comes at the cost of Java being an overspecified
programming language. If the native behaviour of the underlying
processor doesn't match the Java virtual machine specification, additional
code is required to provide the behaviour of the Java virtual machine.
Then, there are issues related to the bound checking and garbage
collection *required* by Java.

It is highly unrealistic to expect a language designed on the principle
"the programmer is incompetent and cannot be trusted" to be as efficient
as languages that trust the programmer to know what he's doing. There
are redeeming advantages for Java's approach, but it is sheer foolishness
to believe that it comes at no cost.
and you're confusing the Java language with the Java environment if you
imagine it can't be.


The Java language is defined in terms of the Java virtual machine.
No Java to native code translation device can ignore the specification
of the Java virtual machine.

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Currently looking for a job in the European Union
Nov 14 '05 #43
Derrick Coetzee <dc****@moonflare.com> wrote:
Speed comparisons between modern VMs like the Java
HotSpot VM and native code also show


I am not impressed by IndustrhyStones. I _am_ impressed by my own
observations.

Richard
Nov 14 '05 #44
Mark McIntyre <ma**********@spamcop.net> wrote:
There are CONSIDERABLY more nonobvious computers in the world than there
are personal computers. How many people in the US have mobile phones?
Cars? Microwaves? Digital alarm clocks? Video recorders? DVD players? MP3
players? PDAs? And we've not even started to think about ATMs,
Many ATMs _are_ normal PCs. I've seen a picture of one showing a BSOD.
Ditto, but in person, the information terminals at an airport. I've
forgotten which; statistics say it's probably Schiphol, but my memory
insists that it was an English airport, which means it has to be either
Heathrow or Stansted.

Yes, actually, I _was_ a little worried. If they run the terminals on
Win-BSOD-dows, who knows WTF they run their important systems on?
Mind you, if they're using VB to write vote-counting software, I can
predict the Nov result now:

G Bush -0x80090317
J Kerry -0x8009030D
R Nader Out of Cheese error +++ Redo From Start +++++


Having read comp.risks for the last couple of years, I wouldn't put it
past Diebold.

Richard
Nov 14 '05 #45
ga*****@yin.interaccess.com (Kenny McCormack) wrote:
Michael Wojcik <mw*****@newsguy.com> wrote:

You may choose to define "computer" as "general-purpose computer", but
don't be surprised if the rest of us continue to use a different
definition.


No problem. And I won't lose any sleep over it, either.

Keep in mind that if you told the average man on the street that there was
a computer in his microwave, he'd rush home and open the door to remove his
PC from the microwave (and hope that no one turned the microwave on while
the PC was in there).


Keep in mind that to the average man on the street, "my computer" is his
monitor, and his computer is "my diskdrive".

Richard
Nov 14 '05 #46
"Kenny McCormack" <ga*****@yin.interaccess.com> wrote in message
news:cj**********@yin.interaccess.com...
In article <9Z******************@newsread1.news.pas.earthlink .net>,
Mike Wahler <mk******@mkwahler.net> wrote:
...
Your 'real world' must be very small then. PC's running Microsoft
Windows (or any other OS) are a tiny minority of computer systems
in existence.
You sure about that?


Absolutely positive.
As I've tried to make clear, my ego's not in this,
Egos are not relevant to the facts.
but
I'm pretty sure that, counting boxes or counting CPUs,
Not every computer is enclosed in a 'box'.
IBM compatible PCs
(including all the server boxes which are really just overgrown PCs)
Every server is an 'overgrown PC'? Huh? Even a 390? A VAX?
running on x86 chips make up a substantial percentage of the total number
of boxes in the world.
Not every computer is enclosed in a 'box'. Also, I feel that
the use of the word 'box' to indicate a computer is an attempt
to sound 'kewl', which impresses me not.
I wouldn't be surprised if it was at least 60%.
It's not.

Now, if you want to do it by computing power - megaflops or whatever - you
might have a defensible position.


The 'power' of various computers is moot to my assertion. I was
simply talking about the *number* of computers in existence.

I have a typical American house with mostly modern appliances,
and four automobiles. At any given time my 'office' (a converted
bedroom) houses from five to seven PC's, and three or four other
specialized computers which are *not* PC's. There are also several
dozen other computers scattered throughout my house and automobiles.
There's even a computer system that controls when and how long my
garden gets watered, based upon how much Mother Nature has already
done so.

-Mike
Nov 14 '05 #47

In article <cj**********@yin.interaccess.com>, ga*****@yin.interaccess.com (Kenny McCormack) writes:
In article <cj********@news3.newsguy.com>,
Michael Wojcik <mw*****@newsguy.com> wrote:
...
I suspect most educated people in the industrialized world are
conversant with the idea of embedded computers. For example, I don't
believe I've run into an automobile owner in the past decade or so
who wasn't aware that there was some kind of "computer" controlling
their car's engine, even if they had little idea what it might
actually be doing.
The fact that you put "computer" in quotes proves my point.


It does no such thing. Perhaps you are applying some idiosyncratic
restricted definition of the function of the quotation mark as well?
Anyone with any sense knows that a microprocessor isn't a computer any more
than a door is a house. Or that a CRT is a TV.


This appears to be your own personal neurosis. You may believe it
applies to "anyone with any sense", of course, though you'd be verging
into outright psychosis.

Not that it matters. This entire discussion boils down to "most
computers are Windows boxes, provided we define 'computers' in a way
which makes that statement true, even though no one else here agrees
it should be defined that way". Since that is tautologically true,
pointless, and stupid, there's really no need to discuss it further.
It fails to support whatever argument you introduced it for - not, I
imagine, that anyone besides you cares.

--
Michael Wojcik mi************@microfocus.com

However, we maintain that our mission is more than creating high-tech
amusement--rather, we must endeavor to provide high-tech, high-touch
entertainment with an emphasis on enkindling human warmth.
-- "The Ultimate in Entertainment", from the president of video game
producer Namco
Nov 14 '05 #48

In article <41****************@news.individual.net>, rl*@hoekstra-uitgeverij.nl (Richard Bos) writes:

Many ATMs _are_ normal PCs. I've seen a picture of one showing a BSOD.
Ditto, but in person, the information terminals at an airport. I've
forgotten which; statistics say it's probably Schiphol, but my memory
insists that it was an English airport, which means in has to be either
Heathrow or Stansted.


I don't remember whether I've seen a crashed airport information
display at Heathrow, but I've seen them from time to time at the
Lansing, MI airport, and they're definitely running Windows. I don't
think I've seen a BSOD, but I have seen various Windows system error
dialog boxes.

My cell phone, on the other hand, is not, though it has a complete
Java runtime on it, and a web browser, and all sorts of other
nonsense. (It has yet to crash.)

So there are two more data points which demonstrate pretty much
nothing. Some embedded systems run Windows. Some do not.

--
Michael Wojcik mi************@microfocus.com

I'm not particularly funny, but I wanted to do something outrageous.
And I'm Norwegian, so I wasn't going to go too far. -- Darlyne Erickson
Nov 14 '05 #49
"Mike Wahler" <mk******@mkwahler.net> writes:
"Kenny McCormack" <ga*****@yin.interaccess.com> wrote in message
news:cj**********@yin.interaccess.com...

[...]
IBM compatible PCs
(including all the server boxes which are really just overgrown PCs)


Every server is an 'overgrown PC?' Huh? Even a 390? A VAX?


Note the lack of a comma after "boxes". I think he was referring to
the subset of server boxes which are "really just overgrown
PCs" (many of them are), not asserting that all server boxes are
overgrown PCs.
running on x86 chips make up a substantial percentage of the total number
of boxes in the world.


Not every computer is enclosed in a 'box'. Also, I feel that
the use of the word 'box' to indicate a computer is an attempt
to sound 'kewl', which impresses me not.


I suspect we can all agree on the following statements:

1. Most non-embedded computer systems are more or less PC-compatible
systems running x86 processors. This includes most desktop and laptop
PCs and many (but by no means all) servers.

2. Most computer systems, embedded or not, are *not* PC-compatibles.
This includes the engine computer(s) in your car and the CPUs in your
keyboard, your mobile phone, your washing machine, and your DVD
player.

I think the only point of disagreement is whether the term "computer"
applies to embedded systems as well as to standalone computers. That
may be an interesting question, but it's off-topic here.

Another question, that's more nearly topical, is how much programming
(C or otherwise) is done for embedded systems vs. non-embedded
systems, where "programming" might be measured in lines of code or in
programmer hours. Certainly the vast majority of the programmers I've
known haven't worked on embedded systems, but my experience is almost
certainly not representative.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 14 '05 #50

This thread has been closed and replies have been disabled. Please start a new discussion.

Similar topics

4
by: Christian Tismer | last post by:
Dear Former Stackless Users, I have to use this list to announce something really bad to you, since all the Stackless lists are defunct: The Stackless project is finally dead, now and forever....
5
by: Nick Stansbury | last post by:
Hi, Sorry for the obscure title but I'm afraid I can't think of a better way to describe what happened to one of my clerks last night. The guy was working late, made a series of changes (accross a...
1
by: JD Kronicz | last post by:
Hi .. I have an issue I have been beating my head against the wall on for some time. I am trying to use late binding for MS graph so that my end users don't have to worry about having the right...
9
by: Zlatko Matić | last post by:
I was reading about late binding, but I'm not completely sure what is to be done in order to adjust code to late binding... For example, I'm not sure if this is correct: early binding: Dim ws...
5
by: eBob.com | last post by:
In another thread VJ made me aware of Tag. Fantastic! I've been wanting this capability for a long time. But it seems that I cannot use it with Option Strict On. In an event handler I have ......
30
by: lgbjr | last post by:
hi All, I've decided to use Options Strict ON in one of my apps and now I'm trying to fix a late binding issue. I have 5 integer arrays: dim IA1(500), IA2(500), IA3(500), IA4(500), IA5(500) as...
39
by: Mark Odell | last post by:
I've always declared variables used as indexes into arrays to be of type 'size_t'. I have had it brought to my attention, recently, that size_t is used to indicate "a count of bytes" and that using...
3
by: Sloan.Kohler | last post by:
Is Jython development dead or has it just seemed that way for over a year?. The jython.org website has a recent new appearance (but no new content) and there is some message traffic on the...
4
by: bukzor | last post by:
Does anyone have a pythonic way to check if a process is dead, given the pid? This is the function I'm using is quite OS dependent. A good candidate might be "try: kill(pid)", since it throws an...
0
by: DolphinDB | last post by:
Tired of spending countless mintues downsampling your data? Look no further! In this article, you’ll learn how to efficiently downsample 6.48 billion high-frequency records to 61 million...
0
by: ryjfgjl | last post by:
ExcelToDatabase: batch import excel into database automatically...
0
isladogs
by: isladogs | last post by:
The next Access Europe meeting will be on Wednesday 6 Mar 2024 starting at 18:00 UK time (6PM UTC) and finishing at about 19:15 (7.15PM). In this month's session, we are pleased to welcome back...
0
by: ArrayDB | last post by:
The error message I've encountered is; ERROR:root:Error generating model response: exception: access violation writing 0x0000000000005140, which seems to be indicative of an access violation...
1
by: CloudSolutions | last post by:
Introduction: For many beginners and individual users, requiring a credit card and email registration may pose a barrier when starting to use cloud servers. However, some cloud server providers now...
1
by: Defcon1945 | last post by:
I'm trying to learn Python using Pycharm but import shutil doesn't work
0
by: af34tf | last post by:
Hi Guys, I have a domain whose name is BytesLimited.com, and I want to sell it. Does anyone know about platforms that allow me to list my domain in auction for free. Thank you
0
by: Faith0G | last post by:
I am starting a new it consulting business and it's been a while since I setup a new website. Is wordpress still the best web based software for hosting a 5 page website? The webpages will be...
0
isladogs
by: isladogs | last post by:
The next Access Europe User Group meeting will be on Wednesday 3 Apr 2024 starting at 18:00 UK time (6PM UTC+1) and finishing by 19:30 (7.30PM). In this session, we are pleased to welcome former...

By using Bytes.com and it's services, you agree to our Privacy Policy and Terms of Use.

To disable or enable advertisements and analytics tracking please visit the manage ads & tracking page.