Bytes IT Community

Interesting features of #define and typedef, correct me if I am wrong

Look at these two code snippets:

===================================
#define int_ptr int*
int_ptr a, b;
===================================

and

===================================
typedef int* int_ptr;
int_ptr a, b;
===================================

In the first example, only a is a pointer to an integer; b is a plain
integer.

In the second example, both a and b are pointers to an integer.

Do you guys concur with me?
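
For reference, here is a minimal, compilable sketch of the claim (the
renamed macro/typedef, main() and the printf call are just for
illustration). After macro expansion the first declaration reads
"int * a, b;", so only a gets the pointer declarator:

===================================
#include <stdio.h>

#define int_ptr_macro int *            /* plain textual substitution           */
typedef int *int_ptr_type;             /* a genuine alias for "pointer to int" */

int main(void)
{
    int x = 42;

    int_ptr_macro a, b;   /* expands to: int * a, b;  so a is int*, b is int */
    int_ptr_type  c, d;   /* both c and d are int*                           */

    a = &x;               /* a is a pointer      */
    b = x;                /* b is a plain int    */
    c = &x;               /* c is a pointer      */
    d = &x;               /* d is a pointer too  */

    printf("*a=%d  b=%d  *c=%d  *d=%d\n", *a, b, *c, *d);
    return 0;
}
===================================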

Feb 1 '07 #1
13 Replies


<fd*******@gmail.com> wrote in message:
> [snip: the OP's #define int_ptr vs. typedef example and the claim that
> only a is a pointer in the #define version]

That's one of the reasons you need typedef. The syntax for declaring
pointers is a bit unusual, though it makes sense in a formal-grammar sort
of way.
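
A typedef pays off most with more involved declarators such as function
pointers. A small sketch (the names cmp_raw and cmp_fn are invented for
the example):

===================================
/* Without a typedef: "pointer to function taking two const void*
   and returning int" -- the declarator is hard to read at a glance. */
int (*cmp_raw)(const void *, const void *);

/* With a typedef, the alias names the whole pointer type, so later
   declarations look like any other variable. */
typedef int (*cmp_fn)(const void *, const void *);

cmp_fn cmp_a, cmp_b;   /* both really are pointers to that function type */
===================================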
Feb 1 '07 #2

Malcolm McLean wrote:
> <fd*******@gmail.com> wrote in message:
>> [snip: the OP's #define vs. typedef example and question]
>
> That's one of the reasons you need typedef. The syntax for declaring
> pointers is a bit unusual, though it makes sense in a formal-grammar sort
> of way.

It's also a good reason for not declaring more than one variable per line.
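
For what it's worth, a quick sketch of that style (arbitrary names), which
sidesteps the question of what the '*' binds to:

===================================
int *a;   /* unmistakably a pointer to int */
int  b;   /* unmistakably a plain int      */
===================================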

--
Ian Collins.
Feb 1 '07 #3


"Ian Collins" <ia******@hotmail.comwrote in message
news:52*************@mid.individual.net...
Malcolm McLean wrote:
><fd*******@gmail.comwrote in message
[OP]
>>
That's one of the reasons you need typedef. The syntax for declaring
pointers is a bit unusual, though it makes sense in a formal grammary
sort
of way.
It's also a good reason for not declaring more than one variable per line.
This post is typical of a variety that I don't really understand. It uses
the idioms of C to do things that are ill-advised. Why? Because the result
is unclear and unnecessary. With preprocessor directives and type
definitions, the world is a better place when whitespace and newlines
delimit the statements liberally. What makes a #define statement
interesting is the content of what is being defined, not the statement
itself. LS
Feb 1 '07 #4

fd*******@gmail.com wrote:
> [snip: the #define int_ptr vs. typedef example]
>
> In the first example, only a is a pointer to an integer; b is a plain
> integer. In the second example, both a and b are pointers to an integer.

The base type is int. The '*' denotes the identifier as being a pointer to
that base type.

You're operating from the paradigm that the base type is pointer-to-int, which
it isn't - it's int.
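
A small sketch of that binding rule (the names are arbitrary); the '*'
belongs to each individual declarator, not to the base type:

===================================
int *a, b;    /* a is int*, b is int: the '*' applies only to a  */
int *c, *d;   /* each declarator has its own '*': both are int*  */
int  e, *f;   /* e is int, f is int*                             */
===================================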
Feb 2 '07 #5

fd*******@gmail.com said:
> [snip: the OP's #define vs. typedef example and question]

I don't understand why you think it's interesting.

--
Richard Heathfield
"Usenet is a strange place" - dmr 29/7/1999
http://www.cpax.org.uk
email: rjh at the above domain, - www.
Feb 2 '07 #6

On Feb 2, 11:36 am, Richard Heathfield <r...@see.sig.invalid> wrote:

<snip>

> I don't understand why you think it's interesting.

Is it true that typedef increases code size but #define does _NOT_,
since the latter is handled by the preprocessor?

Thanks,
Nishu

Feb 2 '07 #7

Nishu said:

<snip>
> Is it true that typedef increases code size but #define does _NOT_,
> since the latter is handled by the preprocessor?

It depends on what you mean by code size. You might conceivably mean the
source code, in which case of course they both increase it, just as any
language construct does, including whitespace.

But if you mean the object code, then no, it isn't true. For one thing,
typedefs needn't increase code size. For another, #defines definitely can.
Thirdly, #defines are not the preprocessor. They are merely expanded by the
preprocessor. "Expanded" should give you an extra clue here.

--
Richard Heathfield
"Usenet is a strange place" - dmr 29/7/1999
http://www.cpax.org.uk
email: rjh at the above domain, - www.
Feb 2 '07 #8

"Nishu" <na**********@gmail.comwrites:
[...]
> Is it true that typedef increases code size but #define does _NOT_,
> since the latter is handled by the preprocessor?

No. A typedef merely creates an alias for an existing type. It's a
notational convenience. I can't think of any reason why it would
increase code size. If you can explain what you mean (increase code
size compared to what?), perhaps we can explain further.
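
As a tiny illustration of "alias for an existing type" (the names here are
invented), a variable declared through the typedef has exactly the same
type as one declared directly:

===================================
typedef int *int_ptr;

void demo(void)
{
    int x = 42;
    int_ptr p = &x;   /* p has type int*                   */
    int    *q = p;    /* same type, no conversion involved */
    *q = 7;           /* modifies x through either name    */
}
===================================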

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Feb 2 '07 #9

On Feb 2, 12:58 pm, Keith Thompson <k...@mib.org> wrote:
> "Nishu" <naresh.at...@gmail.com> writes:
> [...]
>> Is it true that typedef increases code size but #define does _NOT_,
>> since the latter is handled by the preprocessor?
>
> No. A typedef merely creates an alias for an existing type. It's a
> notational convenience. I can't think of any reason why it would
> increase code size. If you can explain what you mean (increase code
> size compared to what?), perhaps we can explain further.

By increase in code size, I meant an increase in object code size (or in
the size of the release-mode library, which carries the fewest symbols)
from using typedef instead of the preprocessor.

My notion was that macros are simply replaced by the preprocessor before
the compiler/assembler generates the object code, but that for a typedef
there might be some implicit conversions, since it is not handled by the
preprocessor.

Thanks,
Nishu

Feb 2 '07 #10

Nishu wrote:
> [snip: the exchange with Keith Thompson about object code size, quoted in
> full above]

You are wrong. The object code size will be the same, since the assembly
instructions for using an int pointer and a typedef'd int pointer are
exactly the same.

Of course, if your compiler generates debug information, it will emit a
record for the typedef. But many compilers (lcc-win32, for instance) also
emit records for the #defines, noting whether each is a function-like
macro or just a simple replacement, etc. This is highly specific to the
debug information format, but I would guess the result is almost the same
either way.
Feb 2 '07 #11


"Nishu" <na**********@gmail.comwrote in message
>
>No. A typedef merely creates an alias for an existing type. It's a
notational convenience. I can't think of any reason why it would
increase code size. If you can explain what you mean (increase code
size compared to what?), perhaps we can explain further.

By increase in code size, I meant increase in object code size (or the
library size which includes least symbols(in release mode)) by using
typedef instead of preprocessor.

I've a notion that Preprocessors were simply replaced by Preprocessor
before compiler/assembler generates the object code but for typedef
there might be some implicit conversions since it is not a
preprocessor.
A compiler can do what it wants, so it is conceivable that it might add an
ASCII string representing the typedef to some intermediate file, or even
to the final executable. Normally, however, the typedef is resolved to its
underlying type by the front end, and the object code carries only the
plain type.
Feb 3 '07 #12


"Nishu" <na**********@gmail.comwrote in message
news:11**********************@v33g2000cwv.googlegr oups.com...
On Feb 2, 12:58 pm, Keith Thompson <k...@mib.orgwrote:
>"Nishu" <naresh.at...@gmail.comwrites:

[...]
Is it true that typedef increases code size but #define does_NOT_,
since latter being the preprocessor.

No. A typedef merely creates an alias for an existing type. It's a
notational convenience. I can't think of any reason why it would
increase code size. If you can explain what you mean (increase code
size compared to what?), perhaps we can explain further.

By increase in code size, I meant increase in object code size (or the
library size which includes least symbols(in release mode)) by using
typedef instead of preprocessor.

I've a notion that Preprocessors were simply replaced by Preprocessor
before compiler/assembler generates the object code but for typedef
there might be some implicit conversions since it is not a
preprocessor.
#define verbose damnit
I think it makes next to no sense to talk about code size with either a
#define or a typedef. You take all the room you need. LS
Feb 6 '07 #13

Lane Straatman wrote:
> I think it makes next to no sense to talk about code size with either a
> #define or a typedef. You take all the room you need. LS

Yes, and there is no reason why increased room would be needed either
way.

--
DPS
Feb 12 '07 #14
