Bytes IT Community

Function declaring style

Hello,

In the short time I have spent reading this newsgroup, I have seen this
sort of declaration a few times:
int
func (string, number, structure)
char* string
int number
struct some_struct structure
Now, I am vaguely familiar with it; I did not read up on it because I
have read in an apparently misinformed book that such declarations were
very old and that no one used them any more. Can someone please explain
why they are still using this style and why the book said it was
obsolete? Is what I have written any different than:
int
func (char* string, int number, struct some_struct structure) ?


Sorry if I'm being too basic about this, but it's just something that
will probably never get mentioned in regular C courses.

Thanks in advance.

Feb 5 '06 #1
6 Replies


On 5 Feb 2006 15:28:27 -0800, in comp.lang.c , dn****@hotpop.com
wrote:
> In the short time I have spent reading this newsgroup, I have seen this
> sort of declaration a few times:

Actually, you mean definition.

(snip example of pre-ANSI function definition)

> Now, I am vaguely familiar with it; I did not read up on it because I
> have read in an apparently misinformed book that such declarations were
> very old and that no one used them any more.

The book is correct. However, legacy code is likely to still contain
such definitions, as will code written by people forced to use antique
compilers, or who are learning from very old books.

> Is what I have written any different than:
> int
> func (char* string, int number, struct some_struct structure) ?

Only slightly - this ISO/ANSI definition is also a prototype, and
gives the compiler more ability to check types, I believe.
Mark McIntyre
--
"Debugging is twice as hard as writing the code in the first place.
Therefore, if you write the code as cleverly as possible, you are,
by definition, not smart enough to debug it."
--Brian Kernighan

Feb 5 '06 #2

<dn****@hotpop.com> wrote
> int
> func (string, number, structure)
> char* string
> int number
> struct some_struct structure
>
> Now, I am vaguely familiar with it; I did not read up on it because I
> have read in an apparently misinformed book that such declarations were
> very old and that no one used them any more. Can someone please explain
> why they are still using this style and why the book said it was
> obsolete? Is what I have written any different than:

I'm still using Fortran 77.
I would guess that the style is borrowed from Fortran, and of course it
is messy and hard to read; the modern syntax is better.

However you should be familiar with it. I failed a job interview because I
wasn't, and was presented with some pre-ANSI code to debug. The company
still used it, for some reason.
Feb 6 '06 #3

dn****@hotpop.com writes:
> In the short time I have spent reading this newsgroup, I have seen this
> sort of declaration a few times:
> int
> func (string, number, structure)
> char* string
> int number
> struct some_struct structure
>
> Now, I am vaguely familiar with it; I did not read up on it because I
> have read in an apparently misinformed book that such declarations were
> very old and that no one used them any more. Can someone please explain
> why they are still using this style and why the book said it was
> obsolete? Is what I have written any different than:
> int
> func (char* string, int number, struct some_struct structure) ?


That old style of function definition has been basically obsolete
since the ANSI standard was approved in 1989. For several years after
that, there were still enough compilers in use that didn't support
prototypes (the superior alternative introduced by the ANSI standard)
that it was still sometimes necessary to use old-style definitions.
You'll still see a fair amount of old code that uses preprocessor
tricks to cater to pre-ANSI and ANSI compilers; there's also a tool
called "ansi2knr" that translates code using prototypes to code using
the old-style definitions. ("knr" refers to K&R, Kernighan &
Ritchie's _The C Programming Language_. The first edition describes
the pre-ANSI version of the language. The second edition describes
the newer language defined by the ANSI standard.)

The 1989 ANSI standard (or the equivalent 1990 ISO C standard) has
caught on almost universally. I'm sure there are still pre-ANSI
compilers in use somewhere, but I don't think they exist on any of the
systems I currently use. There's no longer any need to use old-style
function definitions unless you have a specific requirement to support
an ancient system.

But the old-style definitions are still supported by the newer
standards for backward compatibility.

The newer 1999 ISO C standard has not caught on as quickly, probably
because it's less of an improvement over the previous standard than
the 1989 ANSI C standard was over the (non-)standard that it replaced.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Feb 6 '06 #4

> In the short time I have spent reading this newsgroup, I have seen this
> sort of declaration a few times:
> int
> func (string, number, structure)
> char* string
> int number
> struct some_struct structure

(There should be semicolons after each of the three declarations.)

> Now, I am vaguely familiar with it; I did not read up on it because I
> have read in an apparently misinformed book that such declarations were
> very old and that no one used them any more. Can someone please explain
> why they are still using this style and why the book said it was
> obsolete?

There is no good reason to use the old style in new code today.

> Is what I have written any different than:
> int
> func (char* string, int number, struct some_struct structure) ?


Yes, there is a difference: when you use the modern "prototype"
syntax, the compiler does automatic type conversions on each argument
the same way as it does in an ordinary assignment (=) expression.
With the old syntax, the types have to match or the behavior is
undefined. So these lines:

answer = func("hello", 3.1414, strucked);
answer = func("hello", 3L, strucked);
answer = func("hello", 3, strucked);

are equivalent if func() was defined using a prototype, because the
double constant 3.1414 is safely converted to int. (If func() was
defined in a separate file, of course, you also need a prototyped
declaration in scope when you call it.)

If you used the old syntax, only the third form would work safely.
The others would cause undefined behavior -- in practice, what's
likely to happen is at least that some bits from the floating-point
value will be misinterpreted as an integer, and other arguments may
be misread as well. Similar issues arise if you use a long int
argument (3L) and int and long int are different sizes.

There are some other subtle differences relating to certain specific
types of arguments such as "float" and "short".
--
Mark Brader "If you design for compatibility with a
Toronto donkey cart, what you get is a donkey cart."
ms*@vex.net -- ?, quoted by Henry Spencer

My text in this article is in the public domain.
Feb 6 '06 #5


Mark Brader wrote:
> (snip quoted example of the old-style definition)
> (There should be semicolons after each of the three declarations.)

Sorry about that. And sorry for using incorrect terminology.

> (snip explanation of the differences between the two styles)

Thanks to everyone for valuable advice and insight.


Feb 6 '06 #6

On Sun, 05 Feb 2006 23:46:04 +0000, Mark McIntyre
<ma**********@spamcop.net> wrote:
> On 5 Feb 2006 15:28:27 -0800, in comp.lang.c , dn****@hotpop.com
> wrote:
>
> > Is [old-style function definition] any different than:
> > int
> > func (char* string, int number, struct some_struct structure) ?
>
> Only slightly - this ISO/ANSI definition is also a prototype, and
> gives the compiler more ability to check types I believe.


Almost. A prototype definition is also a (prototype) declaration and
the compiler _must_ diagnose any mismatch with calls made in the scope
of that declaration (which is the rest of the translation unit, i.e.,
intramodule calls) _and_ any mismatch with another/prior prototype
declaration (such as an interface in an #include'd .h file).

A nonprototype definition is also a nonprototype declaration and does
not require such checking*; but the definition does provide type
information that the compiler and/or linker _can_ check if they wish.

* If there is _another_ prototype declaration in scope _that_
declaration does require checking of calls -- but not matching with
the definition. This was (is?) a handy transition technique, because
it is easy to macroize a declaration to be either prototype or
oldstyle, but much harder to do this with the definition.

There is also a subtle difference for some parameter types, though not
the ones in the OP's case. If a parameter in a K&R1 definition is a
float or an integer type narrower (lower rank) than int, the argument
is actually passed as double or as (signed or unsigned) int
respectively, and then 'narrowed' back to the declared type in the
called body.
Thus to write a prototype declaration, and make prototyped calls, to a
K&R1-defined function with such parameter types, the prototype must
use the widened types.

- David.Thompson1 at worldnet.att.net
Feb 13 '06 #7

This discussion thread is closed; replies have been disabled.