
how to detect the compile is 32 bits or 64 bits?

I want to detect whether the compile is 32 bits or 64 bits in the
source code itself, so that different code can be compiled accordingly.
How can I do this?

Aug 5 '06 #1
15 Replies


steve yee wrote:
I want to detect whether the compile is 32 bits or 64 bits in the
source code itself, so that different code can be compiled accordingly.
How can I do this?
There is no portable way.

There *are* portable ways to check the ranges of arithmetic
types at compile time, if that's what your "different code" needs.
Use the macros defined in the <limits.h> header.
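
For instance, a minimal sketch of such a compile-time range check (the
typedef name fast32 is just an illustrative choice, nothing standard):

#include <limits.h>

#if INT_MAX >= 2147483647
typedef int fast32;   /* int has at least 32 value bits here */
#else
typedef long fast32;  /* long is guaranteed at least 32 value bits */
#endif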

--
Eric Sosman
es*****@acm-dot-org.invalid
Aug 5 '06 #2

steve yee posted:
I want to detect whether the compile is 32 bits or 64 bits in the
source code itself, so that different code can be compiled accordingly.
How can I do this?

You shouldn't need to if you're writing portable code.

Nonetheless, you could try something like:

#include <limits.h>

/* Number of bits in inttype_MAX, or in any (1<<k)-1 where 0 <= k < 3.2E+10 */
#define IMAX_BITS(m) ((m)/((m)%0x3fffffffL+1)/0x3fffffffL%0x3fffffffL*30 \
        + (m)%0x3fffffffL/((m)%31+1)/31%31*5 + 4-12/((m)%31+3))

#if IMAX_BITS(UINT_MAX) == 32
#define BIT32
#else
#if IMAX_BITS(UINT_MAX) == 64
#define BIT64
#else
#error "System must be either 32-Bit or 64-Bit."
#endif
#endif

int main(void)
{
/* Here's my non-portable code */
}

--

Frederick Gotham
Aug 5 '06 #3

steve yee wrote:
I want to detect whether the compile is 32 bits or 64 bits in the
source code itself, so that different code can be compiled accordingly.
How can I do this?
You invite bugs, as well as taking the question off the topic of
standard C, if you write source code which has to change. The usual
question is about the size of various pointer types. If your question is
simply about sizeof(int), you create problems by assuming it is
determined by whether you have a 32- or 64-bit platform. If you insist
on unions of pointers and ints, knowing whether it is 32 or 64 bits
won't save you. More so if you are one of those who writes code which
depends on sizeof(size_t).
Aug 5 '06 #4

Frederick Gotham wrote:
>steve yee posted:
>>I want to detect whether the compile is 32 bits or 64 bits in the
>>source code itself, so that different code can be compiled accordingly.
>>How can I do this?


>You shouldn't need to if you're writing portable code.
>
>Nonetheless, you could try something like:
>
>#include <limits.h>
>
>/* Number of bits in inttype_MAX, or in any (1<<k)-1 where 0 <= k < 3.2E+10 */
>#define IMAX_BITS(m) ((m)/((m)%0x3fffffffL+1)/0x3fffffffL%0x3fffffffL*30 \
>        + (m)%0x3fffffffL/((m)%31+1)/31%31*5 + 4-12/((m)%31+3))

#if UINT_MAX == 0xFFFFFFFF
#define BIT32
#elif UINT_MAX == 0xFFFFFFFFFFFFFFFF
#define BIT64
#else
#error "System must be either 32-Bit or 64-Bit."
#endif
Aug 5 '06 #5

In article <0f*****************@newssvr25.news.prodigy.net>,
Tim Prince <tp*****@nospammyrealbox.com> wrote:
>steve yee wrote:
>>I want to detect whether the compile is 32 bits or 64 bits in the
>>source code itself, so that different code can be compiled accordingly.
>>How can I do this?
>You invite bugs, as well as taking the question off the topic of
>standard C, if you write source code which has to change.
Not necessarily.
>The usual
>question is about the size of various pointer types. If your question is
>simply about sizeof(int), you create problems by assuming it is
>determined by whether you have a 32- or 64-bit platform. If you insist
>on unions of pointers and ints, knowing whether it is 32 or 64 bits
>won't save you. More so if you are one of those who writes code which
>depends on sizeof(size_t).
You appear to have read a fair bit into the poster's question
that I don't think is justified by what the poster wrote.

Suppose I have an algorithm, such as a cryptography algorithm, that
operates on chunks of bits at a time. The algorithm is the same
(except perhaps for a few constants) whether I'm computing with 32 or
64 bits, but the chunk size differs for the two cases. In such a case,
I -could- write the code using only the minimum guaranteed size,
but on most platforms it would be noticeably more efficient to use
the larger chunk size *if the platform supports it*. The constants
for the algorithm could probably be computed at run-time and a
generic algorithm used, but in the real world, having the constants
available at compile time increases compiler optimization opportunities
leading to faster code.

In C, it is not an error to write int x = 40000; for use on
platforms that have an int of at least 17 bits. It is not maximally
portable, but it is not an error -- and the OP was asking for a
compile-time method of selecting such code for platforms that allow it,
dropping back to smaller value assumptions when that is all the
platform supports.
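
A minimal sketch of that kind of compile-time selection (assuming a C99
preprocessor, which evaluates #if arithmetic in [u]intmax_t; the names
chunk_t and CHUNK_BITS are invented for the example):

#include <limits.h>

#if ULONG_MAX >= 18446744073709551615UL
typedef unsigned long chunk_t;  /* 64-bit chunks are available */
#define CHUNK_BITS 64
#else
typedef unsigned long chunk_t;  /* fall back to the guaranteed 32 bits */
#define CHUNK_BITS 32
#endif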
--
Okay, buzzwords only. Two syllables, tops. -- Laurie Anderson
Aug 5 '06 #6

Walter Roberson wrote:
>In article <0f*****************@newssvr25.news.prodigy.net>,
>Tim Prince <tp*****@nospammyrealbox.com> wrote:
>>steve yee wrote:
>>>I want to detect whether the compile is 32 bits or 64 bits in the
>>>source code itself, so that different code can be compiled accordingly.
>>>How can I do this?
>>You invite bugs, as well as taking the question off the topic of
>>standard C, if you write source code which has to change.
>
>Not necessarily.
>
>>The usual
>>question is about the size of various pointer types. If your question is
>>simply about sizeof(int), you create problems by assuming it is
>>determined by whether you have a 32- or 64-bit platform. If you insist
>>on unions of pointers and ints, knowing whether it is 32 or 64 bits
>>won't save you. More so if you are one of those who writes code which
>>depends on sizeof(size_t).
>
>You appear to have read a fair bit into the poster's question
>that I don't think is justified by what the poster wrote.
OP didn't explain what the reason for non-portable code might be.
Certainly, there wasn't anything about finding the largest efficient
data types.
>
>Suppose I have an algorithm, such as a cryptography algorithm, that
>operates on chunks of bits at a time. The algorithm is the same
>(except perhaps for a few constants) whether I'm computing with 32 or
>64 bits, but the chunk size differs for the two cases. In such a case,
>I -could- write the code using only the minimum guaranteed size,
>but on most platforms it would be noticeably more efficient to use
>the larger chunk size *if the platform supports it*. The constants
>for the algorithm could probably be computed at run-time and a
>generic algorithm used, but in the real world, having the constants
>available at compile time increases compiler optimization opportunities
>leading to faster code.
So, your interpretation of 32-bit vs 64-bit is based on whether there
is an efficient implementation of a 64-bit integer type. It's certainly
likely, but not assured, that a compiler would choose the size of int in
accordance with efficiency. For example, most Windows compilers have
32-bit int and long even on platforms which have efficient 64-bit and
(limited) 128-bit instructions, whereas Linux for the same platform
supports a 64-bit long.
If this is your goal, you need a configure script which checks the
existence of various candidate data types, which don't necessarily
appear in <stdint.h>, and tests their efficiency.
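
A configure script of that sort typically compiles and runs small probe
programs and records their output; a minimal sketch of one such probe
(long long assumes a C99 compiler):

#include <stdio.h>

int main(void)
{
    printf("int=%u long=%u long-long=%u void*=%u\n",
           (unsigned)sizeof(int), (unsigned)sizeof(long),
           (unsigned)sizeof(long long), (unsigned)sizeof(void *));
    return 0;
}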
Aug 5 '06 #7

"steve yee" <yi******@gmail.comwrites:
i want to detect if the compile is 32 bits or 64 bits in the source
code itself. so different code are compiled respectively. how to do
this?
The first step is to define exactly what you mean when you say "the
compile is 32 bits or 64 bits". There is no universal definition for
these terms.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Aug 5 '06 #8


Walter Roberson wrote:
>In article <0f*****************@newssvr25.news.prodigy.net>,
>Tim Prince <tp*****@nospammyrealbox.com> wrote:
>>steve yee wrote:
>>>I want to detect whether the compile is 32 bits or 64 bits in the
>>>source code itself, so that different code can be compiled accordingly.
>>>How can I do this?
>>You invite bugs, as well as taking the question off the topic of
>>standard C, if you write source code which has to change.
>
>Not necessarily.
>
>>The usual
>>question is about the size of various pointer types. If your question is
>>simply about sizeof(int), you create problems by assuming it is
>>determined by whether you have a 32- or 64-bit platform. If you insist
>>on unions of pointers and ints, knowing whether it is 32 or 64 bits
>>won't save you. More so if you are one of those who writes code which
>>depends on sizeof(size_t).
>
>You appear to have read a fair bit into the poster's question
>that I don't think is justified by what the poster wrote.
>
>Suppose I have an algorithm, such as a cryptography algorithm, that
>operates on chunks of bits at a time. The algorithm is the same
>(except perhaps for a few constants) whether I'm computing with 32 or
>64 bits, but the chunk size differs for the two cases. In such a case,
>I -could- write the code using only the minimum guaranteed size,
>but on most platforms it would be noticeably more efficient to use
>the larger chunk size *if the platform supports it*. The constants
>for the algorithm could probably be computed at run-time and a
>generic algorithm used, but in the real world, having the constants
>available at compile time increases compiler optimization opportunities
>leading to faster code.
Indeed. How does knowing "if the compile is 32 bits or 64 bits" help
with this, though? What you're interested in is how big the available
types are.
>In C, it is not an error to write int x = 40000; for use on
>platforms that have an int of at least 17 bits. It is not maximally
>portable, but it is not an error -- and the OP was asking for a
>compile-time method of selecting such code for platforms that allow it,
>dropping back to smaller value assumptions when that is all the
>platform supports.
If I'm building for my 17-bit-int machine, how is code which tells me
if the compile is 32 bit or 64 bit going to help, especially since the
answer is "no"? I need to know if ints have at least 17 value bits.

Knowing "if the compile is 32 bits or 64 bits" requires a lot of
subsequent non-portable assumptions for the information to be of any
use. Much better to check for the things you need to know rather than
making assumptions which will change from compiler to compiler. What
does the compile being 32 or 64 bit tell me about the size of a long,
for example?
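
A minimal sketch of testing the property that actually matters in the
40000 example above, rather than some overall 32/64-bit label:

#include <limits.h>

#if INT_MAX >= 40000
int x = 40000;    /* int is wide enough on this implementation */
#else
long x = 40000L;  /* long is always wide enough for this value */
#endif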

Aug 5 '06 #9

Keith Thompson wrote:
>"steve yee" <yi******@gmail.com> writes:
>>I want to detect whether the compile is 32 bits or 64 bits in the
>>source code itself, so that different code can be compiled accordingly.
>>How can I do this?
>
>The first step is to define exactly what you mean when you say "the
>compile is 32 bits or 64 bits". There is no universal definition for
>these terms.
Very true; that's why tools like GNU autoconf exist.

Information regarding platform capabilities belongs in a configuration
file.
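
For instance, autoconf's AC_CHECK_SIZEOF(long) records its result in the
generated config.h, which the source can then test with #if (the 8 below
is simply whatever configure measured on the build machine):

/* config.h, generated by configure */
#define SIZEOF_LONG 8

/* in the source */
#include "config.h"
#if SIZEOF_LONG == 8
/* 64-bit long code path */
#else
/* 32-bit long code path */
#endif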

--
Ian Collins.
Aug 5 '06 #10


J. J. Farrell wrote:
>Walter Roberson wrote:
>>In article <0f*****************@newssvr25.news.prodigy.net>,
>>Tim Prince <tp*****@nospammyrealbox.com> wrote:
>>>steve yee wrote:
>>>>I want to detect whether the compile is 32 bits or 64 bits in the
>>>>source code itself, so that different code can be compiled accordingly.
>>>>How can I do this?
>>>You invite bugs, as well as taking the question off the topic of
>>>standard C, if you write source code which has to change.
>>
>>Not necessarily.
>>
>>>The usual
>>>question is about the size of various pointer types. If your question is
>>>simply about sizeof(int), you create problems by assuming it is
>>>determined by whether you have a 32- or 64-bit platform. If you insist
>>>on unions of pointers and ints, knowing whether it is 32 or 64 bits
>>>won't save you. More so if you are one of those who writes code which
>>>depends on sizeof(size_t).
>>
>>You appear to have read a fair bit into the poster's question
>>that I don't think is justified by what the poster wrote.
>>
>>Suppose I have an algorithm, such as a cryptography algorithm, that
>>operates on chunks of bits at a time. The algorithm is the same
>>(except perhaps for a few constants) whether I'm computing with 32 or
>>64 bits, but the chunk size differs for the two cases. In such a case,
>>I -could- write the code using only the minimum guaranteed size,
>>but on most platforms it would be noticeably more efficient to use
>>the larger chunk size *if the platform supports it*. The constants
>>for the algorithm could probably be computed at run-time and a
>>generic algorithm used, but in the real world, having the constants
>>available at compile time increases compiler optimization opportunities
>>leading to faster code.
>
>Indeed. How does knowing "if the compile is 32 bits or 64 bits" help
>with this, though? What you're interested in is how big the available
>types are.
>
>>In C, it is not an error to write int x = 40000; for use on
>>platforms that have an int of at least 17 bits. It is not maximally
>>portable, but it is not an error -- and the OP was asking for a
>>compile-time method of selecting such code for platforms that allow it,
>>dropping back to smaller value assumptions when that is all the
>>platform supports.
>
>If I'm building for my 17-bit-int machine, how is code which tells me
>if the compile is 32 bit or 64 bit going to help, especially since the
>answer is "no"? I need to know if ints have at least 17 value bits.
>
>Knowing "if the compile is 32 bits or 64 bits" requires a lot of
>subsequent non-portable assumptions for the information to be of any
>use. Much better to check for the things you need to know rather than
>making assumptions which will change from compiler to compiler. What
>does the compile being 32 or 64 bit tell me about the size of a long,
>for example?
In fact, what I want to know is the size of the native long type of the
C compiler, at compile time. For example, the following code:

if (sizeof (long) == 8)
{
// handle 64-bit long type
}
else if (sizeof (long) == 4)
{
// handle 32-bit long type
}

but, you know, this is done at run time, not compile time. I would like
to write code like this:

#if sizeof (long) == 8
{
// handle 64-bit long type
}
#elif sizeof (long) == 4
{
// handle 32-bit long type
}
#endif

but this simply doesn't work.

Of course autoconf can help to achieve this, but I think the compiler
should provide a way to determine the size of each native type that it
supports. In fact, the sizeof operator is computed at compile time, but
the compiler does not allow the sizeof operator to be used in #if
preprocessor statements. It should not have this limitation. So I wonder
if there is an alternative way to do this.

BTW, on the Windows platform, autoconf is not generally available.

Aug 6 '06 #11

steve yee posted:
>#if sizeof (long) == 8
>{
>// handle 64-bit long type

Only if:

(1) 8 == CHAR_BIT

(2) "long" contains no padding bits.

>}
>#elif sizeof (long) == 4
>{
>// handle 32-bit long type

Similarly again.
>Of course autoconf can help to achieve this, but I think the compiler
>should provide a way to determine the size of each native type that it
>supports. In fact, the sizeof operator is computed at compile time, but
>the compiler does not allow the sizeof operator to be used in #if
>preprocessor statements. It should not have this limitation. So I wonder
>if there is an alternative way to do this.

There are the macros defined in <limits.h>.
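
For example (a sketch: the range tests assume long has no padding bits,
the 64-bit constant assumes a C99 preprocessor, and the typedef name is
invented; the negative-array-size trick makes a wrong size assumption
fail at compile time even though sizeof can't appear in #if):

#include <limits.h>

#if ULONG_MAX == 0xFFFFFFFFUL
/* long is 32 bits here */
#elif ULONG_MAX == 0xFFFFFFFFFFFFFFFFUL
/* long is 64 bits here */
#endif

/* compilation fails (negative array size) unless sizeof(long) is 8 */
typedef char assert_long_is_8_bytes[(sizeof(long) == 8) ? 1 : -1];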

Plus there's the macro entitled "IMAX_BITS" -- you'll find it if you do a
Google Groups search of comp.lang.c.
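
A sketch of how IMAX_BITS is then used, taking the macro exactly as
quoted earlier in the thread:

#include <limits.h>

#define IMAX_BITS(m) ((m)/((m)%0x3fffffffL+1)/0x3fffffffL%0x3fffffffL*30 \
        + (m)%0x3fffffffL/((m)%31+1)/31%31*5 + 4-12/((m)%31+3))

#if IMAX_BITS(ULONG_MAX) >= 64
/* unsigned long has at least 64 value bits */
#endif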

--

Frederick Gotham
Aug 6 '06 #12

steve yee wrote:
>J. J. Farrell wrote:
>>Walter Roberson wrote:
>>>In article <0f*****************@newssvr25.news.prodigy.net>,
>>>Tim Prince <tp*****@nospammyrealbox.com> wrote:
>>>>steve yee wrote:
>>>>>I want to detect whether the compile is 32 bits or 64 bits in the
>>>>>source code itself, so that different code can be compiled accordingly.
>>>>>How can I do this?
>>>>You invite bugs, as well as taking the question off the topic of
>>>>standard C, if you write source code which has to change.
>>>
>>>Not necessarily.
>>>
>>>>The usual
>>>>question is about the size of various pointer types. If your question is
>>>>simply about sizeof(int), you create problems by assuming it is
>>>>determined by whether you have a 32- or 64-bit platform. If you insist
>>>>on unions of pointers and ints, knowing whether it is 32 or 64 bits
>>>>won't save you. More so if you are one of those who writes code which
>>>>depends on sizeof(size_t).
>>>
>>>You appear to have read a fair bit into the poster's question
>>>that I don't think is justified by what the poster wrote.
>>>
>>>Suppose I have an algorithm, such as a cryptography algorithm, that
>>>operates on chunks of bits at a time. The algorithm is the same
>>>(except perhaps for a few constants) whether I'm computing with 32 or
>>>64 bits, but the chunk size differs for the two cases. In such a case,
>>>I -could- write the code using only the minimum guaranteed size,
>>>but on most platforms it would be noticeably more efficient to use
>>>the larger chunk size *if the platform supports it*. The constants
>>>for the algorithm could probably be computed at run-time and a
>>>generic algorithm used, but in the real world, having the constants
>>>available at compile time increases compiler optimization opportunities
>>>leading to faster code.
>>
>>Indeed. How does knowing "if the compile is 32 bits or 64 bits" help
>>with this, though? What you're interested in is how big the available
>>types are.
>>
>>>In C, it is not an error to write int x = 40000; for use on
>>>platforms that have an int of at least 17 bits. It is not maximally
>>>portable, but it is not an error -- and the OP was asking for a
>>>compile-time method of selecting such code for platforms that allow it,
>>>dropping back to smaller value assumptions when that is all the
>>>platform supports.
>>
>>If I'm building for my 17-bit-int machine, how is code which tells me
>>if the compile is 32 bit or 64 bit going to help, especially since the
>>answer is "no"? I need to know if ints have at least 17 value bits.
>>
>>Knowing "if the compile is 32 bits or 64 bits" requires a lot of
>>subsequent non-portable assumptions for the information to be of any
>>use. Much better to check for the things you need to know rather than
>>making assumptions which will change from compiler to compiler. What
>>does the compile being 32 or 64 bit tell me about the size of a long,
>>for example?
>
>In fact, what I want to know is the size of the native long type of the
>C compiler, at compile time. For example, the following code:
>
>if (sizeof (long) == 8)
>{
>// handle 64-bit long type
>}
>else if (sizeof (long) == 4)
>{
>// handle 32-bit long type
>}
>
>but, you know, this is done at run time, not compile time.
In theory, it is done at run time. In practice, it is done at compile
time; it's a very easy optimisation for compilers when the controlling
expression is constant. (Unlike #if, though, both branches must still
compile for every target.)
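
A sketch of that pattern in action (both arms must be valid C for every
target, but a constant test means the dead arm is normally discarded):

#include <stdio.h>

int main(void)
{
    if (sizeof (long) == 8)
        printf("64-bit long path\n");   /* folded away when long is 32 bits */
    else
        printf("32-bit long path\n");
    return 0;
}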

Aug 6 '06 #13


steve yee wrote:
>In fact, what I want to know is the size of the native long type of the
>C compiler, at compile time.
It's very likely that including <limits.h> and making appropriate use
of the LONG_MIN, LONG_MAX, and ULONG_MAX macros with #if will give you
what you need.

Aug 6 '06 #14

steve yee wrote:
I want to detect whether the compile is 32 bits or 64 bits in the
source code itself, so that different code can be compiled accordingly.
How can I do this?
#include <limits.h>
#include <stdio.h>

int main(void)
{

#if UINT_MAX == 65535U

printf(" 16 bit int \n");

#elif UINT_MAX == 4294967295UL

#if ULONG_MAX == 4294967295UL

printf(" 32 bit int and long \n");

#elif ULONG_MAX == 18446744073709551615ULL

printf(" 32 bit int and 64 bit long \n");

#else

printf(" 32 bit int but neither 32 or 64 bit long \n");

#endif

#elif UINT_MAX == 18446744073709551615ULL

printf(" 64 bit int \n");

#else

printf(" neither 16, 32 or 64 bit int \n");

#endif

return 0;
}
Aug 6 '06 #15

steve yee wrote:
>In fact, what I want to know is the size of the native long type of the
>C compiler, at compile time. [...]
You can discover the bounds of the `long' type easily
enough by using the <limits.h> macros:

#include <limits.h>
#if LONG_MAX == 0x7fffffff
/* 32-bit long, the minimum size */
#elif (LONG_MAX >> 32) >= 0x7fffffff
/* 64 or more bits. Note that the `>>32' is
* not attempted until we've already established
* that `long' has >32 bits.
*/
#else
/* somewhere in 32 < bits < 64 */
#endif

You could, of course, use additional range tests to refine
the result further.

However, none of this will tell you whether the long
type is "native" in any useful sense. You cannot tell by
examining the range or size of `long' whether it is directly
supported by the hardware or emulated in software -- observe
that even an 8-bit CPU must support a `long' of at least 32
bits. Some machines even use a combination of hardware and
emulation: A system might add, subtract, and multiply `long'
values in hardware but use software for division.

At run time, you might use clock() to time a large number
of `int' and `long' calculations and try to guess from the
difference in CPU time whether there's software emulation
going on -- but that's going to be an iffy business at best
(note that the 8-bit CPU must use emulation even for `int').
You'd get a far better answer by reading the implementation's
documentation.
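
A very rough sketch of such a timing experiment (deliberately crude: the
iteration count is arbitrary, unsigned arithmetic avoids overflow, and
volatile merely keeps the loops from being optimized away):

#include <stdio.h>
#include <time.h>

int main(void)
{
    volatile unsigned int vi = 1u;
    volatile unsigned long vl = 1ul;
    long i;
    clock_t t0, t1, t2;

    t0 = clock();
    for (i = 0; i < 100000000L; i++) vi = vi * 3u + 1u;
    t1 = clock();
    for (i = 0; i < 100000000L; i++) vl = vl * 3ul + 1ul;
    t2 = clock();

    printf("int : %.2f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("long: %.2f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}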

--
Eric Sosman
es*****@acm-dot-org.invalid

Aug 6 '06 #16
