Bytes | Software Development & Data Engineering Community

Why C Is Not My Favourite Programming Language

I've been using C for lots of small and a few medium-sized personal
projects over the past decade, and I've lately realised just how
little progress the language has made in all that time. I've
increasingly been using scripting languages (especially Python and
Bourne shell) which offer comparable speed and yet are far simpler and
safer to use. I can no longer understand why anyone would willingly
use C to program anything but the lowest of the low-level stuff. Even
system utilities, text editors and the like could be trivially written
in Python with no loss of functionality or efficiency. Anyway, here
are my reasons. I'd be interested to hear some intelligent advantages
(not rationalisations) for using C.

No string type
--------------

C has no string type. Huh? Most sane programming languages have a
string type which allows one to just say "this is a string" and let the
compiler take care of the rest. Not so with C. It's so stubborn and
dumb that it has only three kinds of variable; everything is either a
number, a bigger number, a pointer, or a combination of the three.
Thus, we don't have proper strings but "arrays of small integers".
"char" is basically just a really small number. And now we have to
press wider integer types into service to represent multibyte
characters.

What. A. Crock. An ugly hack.

Functions for insignificant operations
--------------------------------------

Copying one string from another requires including <string.h> in your
source code, and there are two functions for copying a string. One
could even conceivably copy strings using other functions (if one
wanted to, though I can't imagine why). Why does any normal language
need two functions just for copying a string? Why can't we use the
assignment operator ('=') like for the other types? Oh, I forgot.
There's no such thing as strings in C; just a big contiguous chunk of
memory. Great! Better still, there's no syntax for:

* string concatenation
* string comparison
* substrings

Ditto for converting numbers to strings, or vice versa. You have to use
something like atol(), or strtod(), or a variant on printf(). Three
families of functions for variable type conversion. Hello? Flexible
casting? Hello?

And don't even get me started on the lack of an exponentiation
operator.

No string type: the redux
-------------------------

Because there's no real string type, we have two options: arrays or
pointers. Array sizes can only be constants. This means we run the risk
of buffer overflow since we have to try (in vain) to guess in advance
how many characters we need. Pathetic. The only alternative is to use
malloc(), which is just filled with pitfalls. The whole concept of
pointers is an accident waiting to happen. You can't free the same
pointer twice. You have to always check the return value of malloc()
and you mustn't cast it. There's no builtin way of telling if a spot of
memory is in use, or if a pointer's been freed, and so on and so forth.
Having to resort to low-level memory operations just to be able to
store a line of text is asking for...

The encouragement of buffer overflows
-------------------------------------

Buffer overflows abound in virtually any substantial piece of C code.
This is caused by programmers accidentally putting too much data in one
space or leaving a pointer pointing somewhere because a returning
function ballsed up somewhere along the line. C includes no way of
telling when the end of an array or allocated block of memory is
overrun. The only way of telling is to run, test, and wait for a
segfault. Or a spectacular crash. Or a slow, steady leakage of memory
from a program, agonisingly 'bleeding' it to death.

Functions which encourage buffer overflows
------------------------------------------

* gets()
* strcat()
* strcpy()
* sprintf()
* vsprintf()
* bcopy()
* scanf()
* fscanf()
* sscanf()
* getwd()
* getopt()
* realpath()
* getpass()

The list goes on and on and on. Need I say more? Well, yes I do.

You see, even if you're not writing any memory you can still access
memory you're not supposed to. C can't be bothered to keep track of the
ends of strings; the end of a string is indicated by a null '\0'
character. All fine, right? Well, some functions in your C library,
such as strlen(), perhaps, will just run off the end of a 'string' if
it doesn't have a null in it. What if you're using a binary string?
Careless programming this may be, but we all make mistakes and so the
language authors have to take some responsibility for being so
intolerant.

No builtin boolean type
-----------------------

If you don't believe me, just watch:

$ cat > test.c
int main(void)
{
    bool b;
    return 0;
}

$ gcc -ansi -pedantic -Wall -W test.c
test.c: In function 'main':
test.c:3: 'bool' undeclared (first use in this function)

Not until the 1999 ISO C standard were we finally able to use 'bool' as
a data type. But guess what? It's implemented as a macro and one
actually has to include a header file to be able to use it!

High-level or low-level?
------------------------

On the one hand, we have the fact that there is no string type and
little automatic memory management, implying a low-level language. On
the other hand, we have a mass of library functions, a preprocessor and
a plethora of other things which imply a high-level language. C tries
to be both, and as a result spreads itself too thinly.

The great thing about this is that when C is lacking a genuinely useful
feature, such as reasonably strong data typing, the excuse "C's a
low-level language" can always be used, functioning as a perfect
'reason' for C to remain unhelpfully and fatally sparse.

The original intention for C was for it to be a portable assembly
language for writing UNIX. Unfortunately, from its very inception C has
had extra things packed into it which make it fail as an assembly
language. Its kludgy strings are a good example. If it were at least
portable these failings might be forgivable, but C is not portable.

Integer overflow without warning
--------------------------------

Self explanatory. One minute you have a fifteen digit number, then try
to double or triple it and - boom - its value is suddenly
-234891234890892 or something similar. Stupid, stupid, stupid. How hard
would it have been to give a warning or overflow error or even just
reset the variable to zero?

This is widely known as bad practice. Most competent developers
acknowledge that silently ignoring an error is a bad attitude to have;
this is especially true for such a commonly used language as C.

Portability?!
-------------

Please. There are at least four official specifications of C I could
name from the top of my head and no compiler has properly implemented
all of them. They conflict, and they grow and grow. The problem isn't
subsiding; it's increasing each day. New compilers and libraries are
developed and proprietary extensions are being developed. GNU C isn't
the same as ANSI C isn't the same as K&R C isn't the same as Microsoft
C isn't the same as POSIX C. C isn't portable; all kinds of machine
architectures are totally different, and C can't properly adapt because
it's so muttonheaded. It's trapped in The Unix Paradigm.

If it weren't for the C preprocessor, then it would be virtually
impossible to get C to run on multiple families of processor hardware,
or even just slightly differing operating systems. A programming
language should not require a C preprocessor so that it can run on both
FreeBSD, Linux or Windows without failing to compile.

C is unable to adapt to new conditions for the sake of "backward
compatibility", throwing away the opportunity to get rid of stupid,
utterly useless and downright dangerous functions for a nonexistent
goal. And yet C is growing new tentacles and unnecessary features
because of idiots who think adding seven new functions to their C
library will make life easier. It does not.

Even the C89 and C99 standards conflict with each other in ridiculous
ways. Can you use the long long type or can't you? Is a certain
constant defined by a preprocessor macro hidden deep, deep inside my C
library? Is using a function in this particular way going to be
undefined, or acceptable? What do you mean, getch() isn't a proper
function but getc() and getchar() are?

The implications of this false 'portability'
--------------------------------------------

Because C pretends to be portable, even professional C programmers can
be caught out by hardware and an unforgiving programming language;
almost anything like comparisons, character assignments, arithmetic, or
string output can blow up spectacularly for no apparent reason because
of endianness or because your particular processor treats all chars as
unsigned or silly, subtle, deadly traps like that.

Archaic, unexplained conventions
--------------------------------

In addition to the aforementioned problems, C also has various
idiosyncrasies (invariably unreported) which not even some teachers of
C are aware of:

* "Don't use fflush(stdin)."
* "gets() is evil."
* "main() must return an integer."
* "main() can only take one of three sets of arguments."
* "main() can only return either EXIT_SUCCESS or EXIT_FAILURE."
* "You mustn't cast the return value of malloc()."
* "fileno() isn't an ANSI compliant function."
* "A preprocessor macro oughtn't use any of its arguments more than
once."

....all these unnecessary and unmentioned quirks mean buggy code. Death
by a thousand cuts. Ironic when you consider that Kernighan thinks of
Pascal in the same way when C has just as many little gotchas that
bleed you to death gradually and painfully.

Blaming The Programmer
---------------------

Due to the fact that C is pretty difficult to learn and even harder to
actually use without breaking something in a subtle yet horrific way
it's assumed that anything which goes wrong is the programmer's fault.
If your program segfaults, it's your fault. If it crashes, mysteriously
returning 184 with no error message, it's your fault. When a single
condition you happened to forget about while coding screws up, it's
your fault.

Obviously the programmer has to shoulder most of the responsibility for
a broken program. But as we've already seen, C positively tries to make
the programmer fail. This increases the failure rate and yet for some
reason we don't blame the language when yet another buffer overflow is
discovered. C programmers try to cover up C's inconsistencies and
inadequacies by creating a culture of 'tua culpa'; if something's
wrong, it's your fault, not that of the compiler, linker, assembler,
specification, documentation, or hardware.

Compilers have to take some of the blame. Two reasons. The first is
that most compilers have proprietary extensions built into them. Let me
remind you that half of the point of using C is that it should be
portable and compile anywhere. Adding extensions violates the original
spirit of C and removes one of its advantages (albeit an already
diminished advantage).

The other (and perhaps more pressing) reason is the lack of anything
beyond minimal error checking which C compilers do. For every ten types
of errors your compiler catches, another fifty will slip through.
Beyond variable type and syntax checking the compiler does not look for
anything else. All it can do is give warnings on unusual behaviour,
though these warnings are often spurious. On the other hand, a single
error can cause a ridiculous cascade, or make the compiler fall over
and die because of a misplaced semicolon, or, more accurately and
incriminatingly, a badly constructed parser and grammar. And yet,
despite this, it's your fault.

To quote The Unix Haters' Handbook:

"If you make even a small omission, like a single semicolon, a C
compiler tends to get so confused and annoyed that it bursts into tears
and complains that it just can't compile the rest of the file since one
missing semicolon has thrown it off so much."

So C compilers may well give literally hundreds of errors stating that
half of your code is wrong if you miss out a single semicolon. Can it
get worse? Of course it can! This is C!

You see, a compiler will often not deluge you with error information
when compiling. Sometimes it will give you no warning whatsoever even
if you write totally foolish code like this:

#include <stdio.h>

int main(void)
{
    char *p;    /* never initialised */
    puts(p);    /* passes an indeterminate pointer to puts() */
    return 0;
}

When we compile this with our 'trusty' compiler gcc, we get no errors
or warnings at all. Even when using the '-W' and '-Wall' flags to make
it watch out for dangerous code it says nothing.

$ gcc -W -Wall stupid.c
$

In fact, no warning is ever given unless you try to optimise the
program with a '-O' flag. But what if you never optimise your program?
Well, you now have a dangerous program. And unless you check the code
again you may well never notice that error.

What this section (and entire document) is really about is the sheer
unfriendliness of C and how it is as if it takes great pains to be as
difficult to use as possible. It is flexible in the wrong way; it can
do many, many different things, but this makes it impossible to do any
single thing with it.

Trapped in the 1970s
--------------------

C is over thirty years old, and it shows. It lacks features that modern
languages have such as exception handling, many useful data types,
function overloading, optional function arguments and garbage
collection. This is hardly surprising considering that it was
constructed from an assembler language with just one data type on a
computer from 1970.

C was designed for the computer and programmer of the 1970s,
sacrificing stability and programmer time for the sake of memory.
Despite the fact that the most recent standard is just half a decade
old, C has not been updated to take advantage of increased memory and
processor power to implement such things as automatic memory
management. What for? The illusion of backward compatibility and
portability.

Yet more missing data types
---------------------------

Hash tables. Why was this so difficult to implement? C is intended for
the programming of things like kernels and system utilities, which
frequently use hash tables. And yet it didn't occur to C's creators
that maybe including hash tables as a type of array might be a good
idea when writing UNIX? Perl has them. PHP has them. With C you have to
fake hash tables, and even then it doesn't really work at all.

Multidimensional arrays. Before you tell me that you can do stuff like
int multiarray[50][50][50] I think that I should point out that that's
an array of arrays of arrays. Different thing. Especially when you
consider that you can also use it as a bunch of pointers. C programmers
call this "flexibility". Others call it "redundancy", or, more
accurately, "mess".

Complex numbers. They may be in C99, but how many compilers support
that? It's not exactly difficult to get your head round the concept of
complex numbers, so why weren't they included in the first place? Were
complex numbers not discovered back in 1989?

Binary strings. It wouldn't have been that hard just to make a
compulsory struct with a mere two members: a char * for the string of
bytes and a size_t for the length of the string. Binary strings have
always been around on Unix, so why wasn't C more accommodating?

Library size
------------

The actual core of C is admirably small, even if some of the syntax
isn't the most efficient or readable (case in point: the combined '? :'
statement). One thing that is bloated is the C library. The number of
functions in a full C library which complies with all significant
standards runs into four digit figures. There's a great deal of
redundancy, and code which really shouldn't be there.

This has knock-on effects, such as the large number of configuration
constants which are defined by the preprocessor (which shouldn't be
necessary), the size of libraries (the GNU C library almost fills a
floppy disk, and its documentation three of them) and inconsistently
named groups of functions, in addition to duplication.

For example, a function for converting a string to a long integer is
atol(). One can also use strtol() for exactly the same thing. Boom -
instant redundancy. Worse still, both functions are included in the
C99, POSIX and SUSv3 standards!

Can it get worse? Of course it can! This is C!

As a result it's only logical that there's an equivalent pair of atod()
and strtod() functions for converting a string to a double. As you've
probably guessed, this isn't true. They are called atof() and strtod().
This is very foolish. There are yet more examples scattered through the
standard C library like a dog's smelly surprises in a park.

The Single Unix Specification version three specifies 1,123 functions
which must be available to the C programmer of the compliant system. We
already know about the redundancies and unnecessary functions, but
across how many header files are these 1,123 functions spread out? 62.
That's right, on average a C library header will define approximately
eighteen functions. Even if you only need to use maybe one function
from each of, say, five libraries (a common occurrence) you may well
wind up including 90, 100 or even 150 function definitions you will
never need. Bloat, bloat, bloat. Python has the right idea; its import
statement allows you to define exactly the functions (and global
variables!) you need from each library if you prefer. But C? Oh, no.

Specifying structure members
----------------------------

Why does this need two operators? Why do I have to pick between '.' and
'->' for a ridiculous, arbitrary reason? Oh, I forgot; it's just yet
another of C's gotchas.

Limited syntax
--------------

A couple of examples should illustrate what I mean quite nicely. If
you've ever programmed in PHP for a substantial period of time, you're
probably aware of the 'break' keyword. You can use it to break out from
nested loops of arbitrary depth by using an integer, like so:

for ($i = 0; $i < 10; $i++) {

    for ($j = 0; $j < 10; $j++) {

        for ($k = 0; $k < 10; $k++) {
            break 2;
        }
    }

    /* breaks out to here */

}

There is no way of doing this in C. If you want to break out from a
series of nested for or while loops then you have to use a goto. This
is what is known as a crude hack.

In addition to this, there is no way to compare any non-numerical data
type using a switch statement. Not even strings. In the programming
language D, one can do:

char[] s;

switch (s) {

case "hello":
    /* something */
    break;

case "goodbye":
    /* something else */
    break;

case "maybe":
    /* another action */
    break;

default:
    /* something */
    break;

}

C does not allow you to use switch and case statements for strings. One
must use several variables to iterate through an array of case strings
and compare them to the given string with strcmp(). This reduces
performance and is just yet another hack.

In fact, this is an example of gratuitous library functions running
wild once again. Even comparing one string to another requires use of
the strcmp() function:

char string[] = "Blah, blah, blah\n";

if (strcmp(string, "something") == 0) {

    /* do something */

}

Flushing standard I/O
---------------------

A simple microcosm of the "you can do this, but not that" philosophy of
C; one has to do two different things to flush standard input and
standard output.

To flush the standard output stream, the fflush() function is used
(defined by <stdio.h>). One doesn't usually need to do this after every
bit of text is printed, but it's nice to know it's there, right?

Unfortunately, fflush() can't be used to flush the contents of standard
input. Some C standards explicitly define it as having undefined
behaviour, but this is so illogical that even textbook authors
sometimes mistakenly use fflush(stdin) in examples and some compilers
won't bother to warn you about it. One shouldn't even have to flush
standard input; you ask for a character with getchar(), and the program
should just read in the first character given and disregard the rest.
But I digress...

There is no 'real' way to flush standard input up to, say, the end of a
line. Instead one has to use a kludge like so:

int c;

do {

    errno = 0;
    c = getchar();

    if (errno) {
        fprintf(stderr,
                "Error flushing standard input buffer: %s\n",
                strerror(errno));
    }

} while ((c != '\n') && (!feof(stdin)));

That's right; you need a variable, a looping construct, a handful of
library functions and several lines of error-handling code just to
flush the standard input buffer.

Inconsistent error handling
---------------------------

A seasoned C programmer will be able to tell what I'm talking about
just by reading the title of this section. There are many incompatible
ways in which a C library function indicates that an error has
occurred:

* Returning zero.
* Returning nonzero.
* Returning EOF.
* Returning a NULL pointer.
* Setting errno.
* Requiring a call to another function.
* Outputting a diagnostic message to the user.
* Triggering an assertion failure.
* Crashing.

Some functions may actually use up to three of these methods. (For
instance, fread().) But the thing is that none of these are compatible
with each other and error handling does not occur automatically; every
time a C programmer uses a library function they must check manually
for an error. This bloats code which would otherwise be perfectly
readable without if-blocks for error handling and variables to keep
track of errors. In a large software project one must write a section
of code for error handling hundreds of times. If you forget, something
can go horribly wrong. For example, if you don't check the return value
of malloc() you may accidentally try to use a null pointer. Oops...

Commutative array subscripting
------------------------------

"Hey, Thompson, how can I make C's syntax even more obfuscated and
difficult to understand?"

"How about you allow 5[var] to mean the same as var[5]?"

"Wow; unnecessary and confusing syntactic idiocy! Thanks!"

"You're welcome, Dennis."

Yes, I understand that array subscripting is just a form of pointer
addition and so the operands should commute, but doesn't it seem just
a bit foolish to say that 5[var] is the same as var[5]? How on earth
do you take the var'th value of 5?

Variadic anonymous macros
-------------------------

In case you don't understand what variadic anonymous macros are,
they're macros (i.e. pseudofunctions defined by the preprocessor) which
can take a variable number of arguments. Sounds like a simple thing to
implement. I mean, it's all done by the preprocessor, right? And
besides, you can define proper functions with variable numbers of
arguments even in the original K&R C, right?

In that case, why can't I do:

#define error(...) fprintf(stderr, __VA_ARGS__)

without getting a warning from GCC?

warning: anonymous variadic macros were introduced in C99

That's right, folks. Not until late 1999, 30 years after development on
the C programming language began, have we been allowed to do such a
simple task with the preprocessor.

The C standards don't make sense
--------------------------------

Only one simple quote from the ANSI C standard - nay, a single footnote
- is needed to demonstrate the immense idiocy of the whole thing.
Ladies, gentlemen, and everyone else, I present to you...footnote 82:

All whitespace is equivalent except in certain situations.

I'd make a cutting remark about this, but it'd be too easy.

Too much preprocessor power
---------------------------

Rather foolishly, half of the actual C language is reimplemented in the
preprocessor. (This should be a concern from the start; redundancy
usually indicates an underlying problem.) We can #define fake
variables, fake conditions with #ifdef and #ifndef, and look, there's
even #if, #endif and the rest of the crew! How useful!

Erm, sorry, no.

Preprocessors are a good idea for a language like C. As has been
iterated, C is not portable. Preprocessors are vital to bridging the
gap between different computer architectures and libraries and allowing
a program to compile on multiple machines without having to rely on
external programs. The #define statement, in this case, can be used
perfectly validly to set 'flags' that can be used by a program to
determine all sorts of things: which C standard is being used, which
library, who wrote it, and so on and so forth.

Now, the situation isn't as bad as in C++, where the compile-time
machinery layered on top of the same preprocessor is so powerful that
one can calculate an arbitrary series of Fibonacci numbers during
compilation.
However, C comes dangerously close; it allows the programmer to define
fake global variables with wacky values which would not otherwise be
proper code, and then compare values of these variables. Why? It's not
needed; the C language of the Plan 9 operating system doesn't let you
play around with preprocessor definitions like this. It's all just
bloat.

"But what about when we want to use a constant throughout a program? We
don't want to have to go through the program changing the value each
time we want to change the constant!" some may complain. Well, there's
these things called global variables. And there's this keyword, const.
It makes a constant variable. Do you see where I'm going with this?

You can do search and replace without the preprocessor, too. In fact,
they were able to do it back in the seventies on the very first
versions of Unix. They called it sed. Need something more like cpp? Use
m4 and stop complaining. It's the Unix way.

Nov 14 '05

"Kelsey Bjarnason" <kb********@gmail.com> wrote in message
news:pa****************************@gmail.com...
Frankly, I think that's a load of hogwash; C makes coding no more or less
likely to produce bugs than any other language - if each is done by a
competent coder.

For example, in C, a competent coder knows you don't use gets, you use the
scanf family very carefully, or you use fgets with gay abandon, knowing
full well that short of a serious library bug, you simply aren't going to
have to cope with a buffer overflow on input using fgets.
And how do you become a competent coder? By making mistakes and doing
things because you don't know better. Look at this group, for instance:
so many beginners use gets. They will continue to use gets until 1)
somebody convinces them not to, or 2) they cause a buffer overrun in a
real application, making their life miserable for a while.

One of the great things about C is that you can become a true master in it,
precisely because it has many pitfalls.
Similarly, when handling strings, a competent coder knows you have to add
1 to the length, to hold the terminal \0; so either he does so by habit,
or he creates functions to do so automatically - dupstr, for example,
could duplicate a string and automatically handle the addition of the
extra byte's space, and the coder would never again have to worry about
adding the one or not.

Where I find you run into the most problems is with code that simply isn't
properly thought out, combined with coders who aren't quite as good as
they think they are.

The perhaps classic example of this is the ever-repeating idea that on
freeing a pointer, it should subsequently be "NULLed out"; that is, the
sequence of code should read free(p); p = NULL;. The idea is that you can
then tell, easily, if you're re-using a freed pointer.

This seems like a good idea until you realise there could have been an
intervening q = p; and that q will _not_ be NULL, but will also not be
valid; this cutesy "NULLing out" notion falls apart, but proper code
design would have rendered it irrelevant in the first place - if the code
is well designed, you don't need to worry whether p was freed or not, you
_know_ whether it was freed or not.

There are some other issues, such as using the wrong types, assuming
pointers and ints are the same size, or can be trivially converted, that
sort of thing, which may be specific to C, but competent coders generally
aren't going to make that mistake, any more than a competent electrician
is going to get himself zapped by poking a knife into a possibly live
socket; these are things he learns to avoid as a matter of habit.

This leaves other issues - algorithmic failures, for example - which are
not applicable to C alone, but to all languages. Forgetting to calculate
a checksum, or misdesigning an encryption function, or failing to check an
error code, these can happen in any language, not just C.

Sure, C has its pitfalls, but so do all languages. If you're any good at
the language, you know what the pitfalls are and they generally don't
affect you, because you simply avoid the situations where they'd arise, as
a matter of habit.

Nov 14 '05 #101
Kelsey Bjarnason wrote:
On Sun, 06 Feb 2005 09:50:44 -0800, evolnet.regular wrote:

[snips]
True, but my point is:

(1) C introduces entirely new classes of bugs
and
(2) C makes bugs more likely, regardless of the programmer's skill
and
(3) C makes it much harder to root out and fix bugs

.... snip ...
Where I find you run into the most problems is with code that
simply isn't properly thought out, combined with coders who aren't
quite as good as they think they are.
.... snip ...
This leaves other issues - algorithmic failures, for example -
which are not applicable to C alone, but to all languages.
Forgetting to calculate a checksum, or misdesigning an encryption
function, or failing to check an error code, these can happen in
any language, not just C.

Sure, C has its pitfalls, but so do all languages. If you're any
good at the language, you know what the pitfalls are and they
generally don't affect you, because you simply avoid the
situations where they'd arise, as a matter of habit.


Most of these things can be designed around by any competent
programmer. The two areas that are often impossibly awkward to
check are overflows and wild pointers. The wild pointer problem is
built into the language. The overflow problem is not, and maybe
implementors will start to trap any integer overflows.

--
"If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers." - Keith Thompson
Nov 14 '05 #102
"Servé La" <bl****@bestaatniet.nl> writes:
[...]
One of the great things about C is that you can become a true master in it,
precisely because it has many pitfalls.


Another way to put that is that you *have* to become a true master in
C, precisely because it has many pitfalls.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 14 '05 #103

"Keith Thompson" <ks***@mib.org> wrote in message
news:ln************@nuthaus.mib.org...
"Servé La" <bl****@bestaatniet.nl> writes:
[...]
One of the great things about C is that you can become a true master in it, precisely because it has many pitfalls.


Another way to put that is that you *have* to become a true master in
C, precisely because it has many pitfalls.


heh, no I don't agree with that. Nobody has to do anything. One can
always choose another language if they don't want to become good at C
Nov 14 '05 #104
"Servé La" <bl****@bestaatniet.nl> writes:
"Keith Thompson" <ks***@mib.org> wrote in message
news:ln************@nuthaus.mib.org...
"Servé La" <bl****@bestaatniet.nl> writes:
[...]
> One of the great things about C is that you can become a true
> master in it, precisely because it has many pitfalls.


Another way to put that is that you *have* to become a true master in
C, precisely because it has many pitfalls.


heh, no I don't agree with that. Nobody has to do anything. One can
always choose another language if they don't want to become good at C.


Yes, there are always choices. What I meant was that if you're going
to program in C, you can either become a "true master" in C, or you
can risk running into its many pitfalls.

This is, of course a gross oversimplification. A more accurate
statement might be that C is less friendly to beginners than many
other languages.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 14 '05 #105

Keith Thompson wrote:
"Servé La" <bl****@bestaatniet.nl> writes:
[...]
One of the great things about C is that you can become a true master in it, precisely because it has many pitfalls.


Another way to put that is that you *have* to become a true master in
C, precisely because it has many pitfalls.


Exactly right.

The only problem is that there's no longer a gradual transition period
between "total newbie" and "true master" as there is with most
languages. You have to use and study C intensely for years before you
can safely be let loose with it. As such, it's simply not a practical
language for the vast majority of people.

Nov 14 '05 #106
ev*************@gmail.com writes:
Keith Thompson wrote:
"Servé La" <bl****@bestaatniet.nl> writes:
[...]
> One of the great things about C is that you can become a true
> master in it, precisely because it has many pitfalls.


Another way to put that is that you *have* to become a true master in
C, precisely because it has many pitfalls.


Exactly right.


In another followup, I said that that was a gross oversimplification.

Incidentally, "evolnet.regular", you've never responded to the posts
in which several people pointed out that your original lengthy article
was identical to an article posted by someone else on Kuro5hin, and
that you didn't mention this or credit the (presumed) original author.
Would you care to explain?

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 14 '05 #107
ev*************@gmail.com wrote:

The only problem is that there's no longer a gradual transition period
between "total newbie" and "true master" as there is with most
languages.
Yes, there is. There's a big range of knowledge. Look in comp.lang.c
for evidence thereof.
You have to use and study C intensely for years before you
can safely be let loose with it. As such, it's simply not a practical
language for the vast majority of people.


No language is a practical language for the vast majority of people.
And you have to use and study any language intensely for years before
you can safely be let loose with it.

The fact that people /are/ let loose earlier than this is reflected
in the poor quality of so many programs for which the copyright
owners have the gall to charge money. Some people have no shame.
Nov 14 '05 #108
ev*************@gmail.com wrote:
The only problem is that there's no longer a gradual transition period
between "total newbie" and "true master" as there is with most
languages. You have to use and study C intensely for years before you
can safely be let loose with it.


This is true for all other programming languages as well; the greatest
difficulty of programming lies not in syntax and bookkeeping, but in
algorithmic design. The great advantage of C is that it makes it
immediately obvious whether one is a master or an apprentice.

Richard
Nov 14 '05 #109
Gladly.

I originally wrote this article. I figured it would seem
self-aggrandizing to credit myself with having written it so I didn't
originally mention it.

Hope this helps.

Nov 14 '05 #110
"Yes, there is. There's a big range of knowledge. Look in comp.lang.c
for evidence thereof. "
From what I've seen, it's more like somebody asking a relatively
innocent question, followed by 25 other people leaping in with a
barrage of "OH MY GOODNESS THAT IS NOT IN THE STANDARD LEAVE THIS
NEWSGROUP NOW". Two people subsequently get in a back and forth
argument over footnotes in a random ANSI/ISO/POSIX standard. This
doesn't happen with other languages. Why?
"No language is a practical language for the vast majority of people."

This is true, but irrelevant. There are also many people who are
entirely capable of programming if the language is reasonably well
designed.

"And you have to use and study any language intensely for years before
you can safely be let loose with it. "

Not really. All it takes is good documentation and some intelligence
and you can write safe and secure programs in most mid-level or
high-level languages such as Java, Python, PHP, Perl, and the like.
Just remember not to trust user input and to debug properly.

Why aren't there as many security holes in getmail as fetchmail, for
instance? Why are there so few security holes in major programs of
every other language put together compared to those written in C or C++?

Nov 14 '05 #111
On 11 Feb 2005 15:30:48 -0800, in comp.lang.c , ev*************@gmail.com
wrote:

(Stuff)

Troll. Alert.

--
Mark McIntyre
CLC FAQ <http://www.eskimo.com/~scs/C-faq/top.html>
CLC readme: <http://www.ungerhu.com/jxh/clc.welcome.txt>
Nov 14 '05 #112
If people don't want to hear the answer, they oughtn't ask the
question.

Nov 14 '05 #113
In article <11**********************@c13g2000cwb.googlegroups.com>,
<ev*************@gmail.com> wrote:
:From what I've seen, it's more like somebody asking a relatively
:innocent question, followed by 25 other people leaping in with a
:barrage of "OH MY GOODNESS THAT IS NOT IN THE STANDARD LEAVE THIS
:NEWSGROUP NOW".

That line has a grain of truth to it.

:Two people subsequently get in a back and forth
:argument over footnotes in a random ANSI/ISO/POSIX standard. This
:doesn't happen with other languages. Why?

Partly because there aren't that many ISO standardized languages.

There is:

ISO 1989:2002 Programming Language Cobol

ISO 1539-1:1997 "Fortran 95"
ISO 1539-1:2004 Fortran 2003
ISO 1539-2:2002 Fortran Standard part 2 (varying length strings)
ISO 1539-3:1998 Fortran Standard part 3 (conditional compilation)
ISO 7846:1985 Industrial real-time Fortran (withdrawn)

ISO 8652:1995 Ada
ISO 13813:1998 Generic Packages of Real and Complex Type Declarations
and Basic Operations for Ada
ISO 15291:1999 Ada Semantic Interface Specification
ISO 18009:1999 Conformity Assessment of an Ada Language Processor

ISO 13816:1997 Programming Language ISLISP

ISO 13211-1 Programming Language Prolog part 1 -- General Core

ISO 14882 Programming Language C++

ISO 7185:1990 Pascal
ISO 10206:1991 Extended Pascal

ISO 10514-1:1996 Modula-2 (Base Language)
ISO 10514-2:1998 Modula-2 (OO extension)
ISO 10514-3:1998 Modula-2 (Generic extension)

ISO 1538:1984 Programming Language Algol 60 (withdrawn)

ISO 10279:1991 Programming Language BASIC

ISO 4335 Programming Language PL/1

There is also an active working group on APL.

It is interesting that of the "serious" living languages,
only ISLISP, C, and C++ are specified by a single standard: the rest
require multiple standards.

Ada, of course, had the design goal that there would be NO Ada
variants, that -every- Ada implementation would be able to compile
-every- Ada program and that they would all behave exactly the same way.

Fortran... I can't say that it is obvious to me why one would need
an entire standard to deal with varying length strings.
What is of particular interest is that all of the languages that you
had submitted as being better than C are absent. No international
standard for Java, Python, Perl, Tcl. ksh is, I suppose, specified
as part of one of the POSIX standards (POSIX.2 maybe?), but it is
not easy to write meaningful programs in pure ksh without calling
upon outside programs such as sed or nawk or test.
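As an aside, the simplest uses of those external programs do have built-in equivalents in a POSIX shell; a minimal sketch (the filename string is made up):

```shell
#!/bin/sh
s="hello.tar.gz"

# Instead of:  echo "$s" | sed 's/\.gz$//'
# parameter expansion strips the suffix without spawning sed:
echo "${s%.gz}"            # prints hello.tar

# Instead of the external 'test' utility, [ is a shell
# built-in in every modern sh:
if [ -n "$s" ]; then
    echo "non-empty"
fi
```

Anything beyond suffix/prefix stripping and simple tests, though, still pushes you out to sed or awk, which is the poster's point.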

--
Oh, to be a Blobel!
Nov 14 '05 #114
ev*************@gmail.com wrote:

"Yes, there is. There's a big range of knowledge. Look in comp.lang.c
for evidence thereof. "
From what I've seen, it's more like somebody asking a relatively innocent question, followed by 25 other people leaping in with a
barrage of "OH MY GOODNESS THAT IS NOT IN THE STANDARD LEAVE THIS
NEWSGROUP NOW".


Then you don't understand the dynamics of the group.
Two people subsequently get in a back and forth
argument over footnotes in a random ANSI/ISO/POSIX standard. This
doesn't happen with other languages. Why?
Perhaps that's because, in comp.lang.c, we enjoy discussing the
dark corners of the language (often as a reminder to ourselves not
to use them!).
"No language is a practical language for the vast majority of people."

This is true, but irrelevant. There are also many people who are
entirely capable of programming if the language is reasonably well
designed.
Yes; DWIM will be very popular if it ever gets past the vapourware
stage.
"And you have to use and study any language intensely for years before
you can safely be let loose with it. "

Not really.
Oh dear oh dear oh dear.

All it takes is good documentation and some intelligence
and you can write safe and secure programs in most mid-level or
high-level languages such as Java, Python, PHP, Perl, and the like.
Just remember not to trust user input and to debug properly.
(a) that isn't all there is to it; (b) those last two points alone
take a very long time to get right.

Why aren't there as many security holes in getmail as fetchmail, for
instance?
If that's true (and I haven't studied the sources of the two programs
in question), it's probably because the fetchmail author put more
security holes into his program than the getmail author did into his.
Why are there so few security holes in major programs of
every other language put together compared to those written in C or C++?


(a) You don't actually know this for a fact, because nobody knows
how many security holes any large non-trivial program has, irrespective
of the language in which it's written.
(b) Security holes are easiest to exploit, and therefore -
eventually - to come to non-exploiters' attention, in networking
programs. Most non-trivial networking programs are written in C.
(Warning: about to invent some random numbers) Headline: C programs
responsible for 80% of security holes! Small print: 90% of programs
are written in C. It is easy to ignore the small print, and imagine
that we can cut 80% of security holes by banning C, but if you do
the numbers you'll find that this just isn't true.

In fact, experienced C programmers are well aware of the ways in
which security holes can occur. The "point, click, and see what
happens" approach to programming which you appear to advocate is
much /more/ likely to lead to security problems.
Nov 14 '05 #115
ev*************@gmail.com wrote:

If people don't want to hear the answer, they oughtn't ask the
question.


Which answer to what question?
Nov 14 '05 #116
ev*************@gmail.com writes:
Gladly.

I originally wrote this article. I figured it would seem
self-aggrandizing to credit myself with having written it so I didn't
originally mention it.

Hope this helps.


Ok, thanks for the clarification. So you're James A C Joyce?

I fail to see how it would be "self-aggrandizing" to claim credit for
something you wrote; if it hadn't appeared on Kuro5hin, we would have
assumed you wrote it anyway.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 14 '05 #117
"Joona I Palaste" <pa*****@cc.helsinki.fi> wrote in message
news:cu**********@oravannahka.helsinki.fi...
ev*************@gmail.com scribbled the following
on comp.lang.c:
In that case, why is it that there are so many buffer overflows in so many C programs written by presumably experienced coders and yet so few in programs written in *any other language*?
I find it hard to believe there are more buffer overflows in C
programs than there are in assembly programs. Unless you don't count
assembly as a language.


My axiom:
C is a Master's Language. Simple, Elegant, Powerful. Like a handgun. In
the right hands, it executes perfectly. In the wrong hands, it simply
executes.

My opinion:
The idea of why C bugs may be seen more often than "other" languages, is
simple. Everything you use is written in C. Your Perl translator was
written in C, as was your Visual Basic compiler. Whatever you think is
"safe" probably has a lot of C code in it. Certainly your C++ compiler
does (also, probably mostly straight C / ASM).

--
Mabden
Nov 14 '05 #118
"Mabden" <mabden@sbc_global.net> writes:
"Joona I Palaste" <pa*****@cc.helsinki.fi> wrote in message
news:cu**********@oravannahka.helsinki.fi...
ev*************@gmail.com scribbled the following
on comp.lang.c:
In that case, why is it that there are so many buffer overflows in so many C programs written by presumably experienced coders and yet so few in programs written in *any other language*?
I find it hard to believe there are more buffer overflows in C
programs than there are in assembly programs. Unless you don't count
assembly as a language.


My axiom:
C is a Master's Language. Simple, Elegant, Powerful. Like a handgun. In
the right hands, it executes perfectly. In the wrong hands, it simply
executes.


My problem with C is the following:
If you want to write robust, extensible and safe code, the code gets a
bit unreadable very fast. This problem is shared by all procedural
languages, although C has more concepts for writing such code than other
languages of this kind.
My opinion:
The idea of why C bugs may be seen more often than "other" languages, is
simple. Everything you use is written in C. Your Perl translator was
written in C, as was your Visual Basic compiler. Whatever you think is
"safe" probably has a lot of C code in it. Certainly your C++ compiler
does (also, probably mostly straight C / ASM).


That's true, but it is often better to use other languages than C to
get the result, for the reason I gave before.

Kind regards,
Nicolas
--
| Nicolas Pavlidis | Elvis Presly: |\ |__ |
| Student of SE & KM | "Into the goto" | \|__| |
| pa****@sbox.tugraz.at | ICQ #320057056 | |
|-------------------University of Technology, Graz----------------|
Nov 14 '05 #119
On 13 Feb 2005 14:29:38 +0100, Nicolas Pavlidis
<pa****@sbox.tugraz.at> wrote:
My problem with C is the following:
If you want to write robust, extensible and safe code, the code gets a
bit unreadable very fast.
That better describes Perl. Well, apart from the bits about robust and
safe code...
This problem is shared by all procedural
languages, although C has more concepts for writing such code than other
languages of this kind.


C is one of few languages which even has a standard. And it's one of
few modern languages which even has a compiler (Perl, Python and Java
are interpreted, which is why they are 'safe' in some fashion because
the interpreter can check for things like invalid accesses -- at the
cost of speed and complexity elsewhere). They can also be more
'extendable' that way (FORTH is the ultimate extendable language), but
that isn't always an advantage (I've heard FORTH -- and Perl --
programmers praising the language because they can "write things no one
else can understand" by using their own extensions, as though that is a
good thing).
My opinion:
The idea of why C bugs may be seen more often than "other" languages, is
simple. Everything you use is written in C. Your Perl translator was
written in C, as was your Visual Basic compiler. Whatever you think is
"safe" probably has a lot of C code in it. Certainly your C++ compiler
does (also, probably mostly straight C / ASM).


That's true, but it is often better to use other languages than C to
get the result, for the reason I gave before.


No, it is sometime better to use other languages for specific tasks
because those languages are better at those specific tasks. I wouldn't
write a web CGI program in C, I'd use PHP because it's designed for
that; if I wanted heavy number crunching I'd still use Fortran (or
assembler for the specific platform); if I want string and shell
handling I use AWK or Perl; etc.

No language is perfect. No editor is perfect, nor is any operating
system or CPU. They are all compromises and they are designed for and
best at certain tasks.

A good programmer should know many different programming languages, and
know the strengths and weaknesses of them for the programs required. It
is rare that a programmer gets a completely free hand (often the choice
of language is up to management) but there are often opportunities for a
programmer to choose the best for a specific job (at my work we have
some using Perl, some using Java, some using C and some using C++, for
different jobs; I use C for portable code, Awk and C++ (and Unix/POSIX
utilities) for local utilities and processing).

Chris C
Nov 14 '05 #120
In article <11**********************@c13g2000cwb.googlegroups.com>,
<ev*************@gmail.com> wrote:
"And you have to use and study any language intensely for years before
you can safely be let loose with it. "

Not really. All it takes is good documentation and some intelligence
and you can write safe and secure programs in most mid-level or
high-level languages such as Java, Python, PHP, Perl, and the like.
Just remember not to trust user input and to debug properly.
You missed designing and writing code properly. That will avoid a lot
more problems (safety-related or not) than even good debugging will
ever catch.
And all it takes is good documentation and some intelligence and you
can write safe and secure programs in C. Just remember not to trust
user input and to design, write, and debug your program properly.

Why are there so few security holes in major programs of
every other language put together compared to those written in C or C++?


Perhaps because there are so few major programs at all in every other
language?
dave

--
Dave Vandervies dj******@csclub.uwaterloo.ca
If spelling out "if" and "else" is still too verbose for your tastes,
you might consider APL.
--Keith Thompson in comp.lang.c
Nov 14 '05 #121
In article <cu**********@rumours.uwaterloo.ca>,
Dave Vandervies <dj******@csclub.uwaterloo.ca> wrote:
:In article <11**********************@c13g2000cwb.googlegroups .com>,
: <ev*************@gmail.com> wrote:

:>"And you have to use and study any language intensely for years before
:>you can safely be let loose with it. "

:>Not really. All it takes is good documentation and some intelligence
:>and you can write safe and secure programs in most mid-level or
:>high-level languages such as Java, Python, PHP, Perl, and the like.

:You missed designing and writing code properly.

S/he also missed the problem that Perl doesn't *have* good documentation.

Documentation it has, but there are so many different interactions
possible between the features that you need a fair bit of experience to
write solid maintainable perl programs that are also somewhat efficient.

The poster should try reading the 'perlgolf' contest entries sometime...

http://perlgolf.sourceforge.net/
--
Sub-millibarn resolution bio-hyperdimensional plasmatic space
polyimaging is just around the corner. -- Corry Lee Smith
Nov 14 '05 #122
In article <cu**********@canopus.cc.umanitoba.ca>, ro******@ibd.nrc-
cnrc.gc.ca says...
The poster should try reading the 'perlgolf' contest entries sometime...


http://terje.perlgolf.org/wsp/pgas/s...le=61&season=1

Gag.

--
Randy Howard (2reply remove FOOBAR)
"Making it hard to do stupid things often makes it hard
to do smart ones too." -- Andrew Koenig
Nov 14 '05 #123

In article <cu**********@canopus.cc.umanitoba.ca>, ro******@ibd.nrc-cnrc.gc.ca (Walter Roberson) writes:

ISO 1989:2002 Programming Language Cobol

It is interesting that of the "serious" living languages,
only ISLISP, C, and C++ are specified by a single standard: the rest
require multiple standards.


COBOL is very serious indeed. There are still thousands of active
COBOL programmers developing new applications, and the COBOL standard
(in various versions, but a single standard) is quite important.

--
Michael Wojcik mi************@microfocus.com

HTML is as readable as C. You can take this either way. -- Charlie Gibbs
Nov 14 '05 #124
In article <cu*********@news4.newsguy.com>,
Michael Wojcik <mw*****@newsguy.com> wrote:

:In article <cu**********@canopus.cc.umanitoba.ca>, ro******@ibd.nrc-cnrc.gc.ca (Walter Roberson) writes:

:> ISO 1989:2002 Programming Language Cobol

:> It is interesting that of the "serious" living languages,
:> only ISLISP, C, and C++ are specified by a single standard: the rest
:> require multiple standards.

:COBOL is very serious indeed. There are still thousands of active
:COBOL programmers developing new applications, and the COBOL standard
:(in various versions, but a single standard) is quite important.

Good point. Is COBOL still "living" in the sense of evolving in
response to real needs (not just for marketing or academic purposes)?
--
I don't know if there's destiny,
but there's a decision! -- Wim Wenders (WoD)
Nov 14 '05 #125

In article <cu**********@canopus.cc.umanitoba.ca>, ro******@ibd.nrc-cnrc.gc.ca (Walter Roberson) writes:
In article <cu*********@news4.newsguy.com>,
Michael Wojcik <mw*****@newsguy.com> wrote:

:COBOL is very serious indeed. There are still thousands of active
:COBOL programmers developing new applications, and the COBOL standard
:(in various versions, but a single standard) is quite important.

Good point. Is COBOL still "living" in the sense of evolving in
response to real needs (not just for marketing or academic purposes)?


This is a somewhat subjective question, but I'd say yes. The 2002
standard is not fully supported by any implementation, as far as I
know, but most of it is supported by major commercial implementations,
and they're moving toward full conformance. COBOL's (now standard) OO
features have not proven wildly popular with most COBOL developers, as
far as I can tell, but there are a significant number at least
experimenting with them - questions about OO COBOL pop up from time to
time on comp.lang.cobol.

More recent changes not yet covered by existing standards include
support for XML parsing and generation in the language. That's drawn
quite a lot of interest.

It's not my favorite language (even though it pays my salary), but
the modern version - with free-format source and other conveniences -
is definitely more palatable. COBOL's resistance to modularity (the
separation of data definition and code, for example, and the way the
language structure encourages huge subroutines broken up into
non-parameterized "sections") remains an issue, but in practice
developers have managed to deal with it.

--
Michael Wojcik mi************@microfocus.com

But I still wouldn't count out the monkey - modern novelists being as
unpredictable as they are at times. -- Marilyn J. Miller
Nov 14 '05 #126
Assembly is better than C++ in certain cases. If memory is an issue on
a microprocessor or some tiny device, then you would use Assembly
because it is more efficient in handling memory and registers. Assembly
can be more efficient and faster than C++ in some situations, depending
on the coder.

If you don't understand the importance of memory, then you're a fool
who writes inefficient programs of gibberish and doesn't realize it.

Nov 14 '05 #127
ap*****@gmail.com wrote:
Assembly is better than C++ in certain cases.


Who cares, for Pete's sake? Neither "assembly" nor C++ is C, but for
some strange reason you posted this vacuity to comp.lang.c

[remainder of useless crap deleted]

Nov 14 '05 #128
what is your problem, Martin Ambuhl? Are you illiterate?

Of course this is for C, but that doesn't mean we can't discuss and
compare the differences between C++, C, and other languages.

I was just following up on the original post, so before you start
making your ignorant comments stop to think for a second.

Nov 14 '05 #129
ap*****@gmail.com wrote:
what is your problem, Martin Ambuhl? Are you illiterate?
Well, I would have called Martin "knowledgeable" and not have
thought of him having a problem.
In fact, you seem to have a problem or two. For example with
quoting the message you are replying to (or parts thereof).

Of course this is for C, but that doesnt mean we can't discuss and
compare the difference between C++, C, and other languages.
Right, this is what we are doing here in comp.lang.compare... oh wait,
we are in comp.lang.c.
In fact, even in clc you can do your comparison as far as it makes
sense. However, I fail to see how a comparison between assembly
language and C++ is relevant in the least round here.

I was just following up on the original post, so before you start
making your ignorant comments stop to think for a second.


You were not. At least the message headers suggest that you
did not reply to the OP. learn.to/quote might help you communicate
in a more efficient manner. Afterwards, you just have to learn
manners and C.
-Michael
--
E-Mail: Mine is an /at/ gmx /dot/ de address.
Nov 14 '05 #130
Who are you to tell someone about manners? That's just rude. How hard is
it to get it into your thick skulls? He made a statement about assembly
and I was just making a comment about his false claim. How is it
wrong to make a defending statement about assembly? I didn't bring up
assembly in the first place. And is it a crime to have accidentally
posted in the wrong branch? Give it a rest.

Don't blame me. Why don't you blame him for bringing it up and stop
making idiotic accusations. For example: if I were to call you an idiot
as part of my post, and you tried to defend yourself and make a
response that proves your intelligence, if any. According to your
idiotic logic, you were wrong for trying to make a statement in your
defense, and then I would post that this isn't the correct group to
discuss intelligence and it's your fault for just trying to defend
yourself. (Note this is an extreme but similar case.)

And btw I already know C

Nov 14 '05 #131
"ap*****@gmail.com" wrote:

Who are you to tell someone about manners? Thats just rude.
It is?
How hard is
it to get it in your thick skulls?
And that isn't? What a strange idea of "rude" you have.

<snip>
And btw I already know C


You do?
Nov 14 '05 #132
I never said I wasn't rude. That comment was deliberately made. There's
no point in pointing out something so obvious.
You do?

Wow. Let's just post some more spam. Responding with these question-type
responses is pointless spam, just like this response and everything
that follows. Just drop it.

Nov 14 '05 #133
"ap*****@gmail.com" wrote:

I never said I wasn't rude. That comment was deliberately made. Theres
no point in point out something so obvious.


If you yourself are rude (which you seem to admit here), then you can
hardly complain if others are rude to you.

As for pointing out the obvious, most of the advice given here by
comp.lang.c's regulars is in fact obvious - to them. But not to the
people asking. It seemed to me that it wasn't obvious to you that
your own rudeness invalidated your complaint in the minds of any
disinterested party (such as myself).
You do?

Wow. Lets just post some more spam. Responding with these question type
responses is pointless and spam, just like this response and everything
the follows. Just drop it.


This isn't spam by my definition of the word. But you claimed to
know C. I haven't yet seen any evidence to support that claim,
which is why I asked the question.
Nov 14 '05 #134
Dude, give it a rest. Whatever makes you sleep better, I'll admit to.

Nov 14 '05 #135
