
how to detect the compile is 32 bits or 64 bits?

I want to detect if the compile is 32 bits or 64 bits in the source
code itself, so that different code can be compiled for each case.
How can I do this?

Aug 5 '06 #1
steve yee wrote:
I want to detect if the compile is 32 bits or 64 bits in the source
code itself, so that different code can be compiled for each case.
How can I do this?
There is no portable way.

There *are* portable ways to check the ranges of arithmetic
types at compile time, if that's what your "different code" needs.
Use the macros defined in the <limits.h> header.
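
For instance, a minimal sketch of such a range check (illustrative
only, not from the original post):

#include <limits.h>

/* Branch on the guaranteed range of int rather than on any
 * notion of a "32-bit" or "64-bit" compile. */
#if INT_MAX >= 2147483647
/* int can represent at least 32-bit values here */
#else
/* int may be as narrow as 16 bits */
#endif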

--
Eric Sosman
es*****@acm-dot-org.invalid
Aug 5 '06 #2
steve yee posted:
I want to detect if the compile is 32 bits or 64 bits in the source
code itself, so that different code can be compiled for each case.
How can I do this?

You shouldn't need to if you're writing portable code.

Nonetheless, you could try something like:

#include <limits.h>

/* Number of bits in inttype_MAX, or in any (1<<k)-1 where 0 <= k < 3.2E+10 */
#define IMAX_BITS(m) ((m)/((m)%0x3fffffffL+1)/0x3fffffffL%0x3fffffffL*30 \
                      + (m)%0x3fffffffL/((m)%31+1)/31%31*5 + 4-12/((m)%31+3))

#if IMAX_BITS(UINT_MAX) == 32
#define BIT32
#else
#if IMAX_BITS(UINT_MAX) == 64
#define BIT64
#else
#error "System must be either 32-Bit or 64-Bit."
#endif
#endif

int main(void)
{
/* Here's my non-portable code */
}

--

Frederick Gotham
Aug 5 '06 #3
steve yee wrote:
I want to detect if the compile is 32 bits or 64 bits in the source
code itself, so that different code can be compiled for each case.
How can I do this?
You invite bugs, as well as taking the question off the topic of
standard C, if you write source code which has to change. The usual
question is about the sizeof of various pointer types. If your question
is simply about sizeof(int), you create problems by assuming it is
determined by whether you have a 32- or 64-bit platform. If you insist
on unions of pointers and ints, knowing whether it is 32 or 64 bits
won't save you. More so if you are one of those who writes code which
depends on sizeof(size_t).
Aug 5 '06 #4
Frederick Gotham wrote:
steve yee posted:
I want to detect if the compile is 32 bits or 64 bits in the source
code itself, so that different code can be compiled for each case.
How can I do this?


You shouldn't need to if you're writing portable code.

Nonetheless, you could try something like:

#include <limits.h>

/* Number of bits in inttype_MAX, or in any (1<<k)-1 where 0 <= k < 3.2E+10 */
#define IMAX_BITS(m) ((m)/((m)%0x3fffffffL+1)/0x3fffffffL%0x3fffffffL*30 \
                      + (m)%0x3fffffffL/((m)%31+1)/31%31*5 + 4-12/((m)%31+3))

Or, more directly:

#if UINT_MAX == 0xFFFFFFFF
#define BIT32
#elif UINT_MAX == 0xFFFFFFFFFFFFFFFF
#define BIT64
#else
#error "System must be either 32-Bit or 64-Bit."
#endif
Aug 5 '06 #5
In article <0f*****************@newssvr25.news.prodigy.net>,
Tim Prince <tp*****@nospammyrealbox.com> wrote:
>steve yee wrote:
>I want to detect if the compile is 32 bits or 64 bits in the source
>code itself, so that different code can be compiled for each case.
>How can I do this?
>You invite bugs, as well as taking the question off the topic of
standard C, if you write source code which has to change.
Not necessarily.
>The usual question is about the sizeof of various pointer types. If your
>question is simply about sizeof(int), you create problems by assuming it
>is determined by whether you have a 32- or 64-bit platform. If you insist
>on unions of pointers and ints, knowing whether it is 32 or 64 bits
>won't save you. More so if you are one of those who writes code which
>depends on sizeof(size_t).
You appear to have read a fair bit into the poster's question
that I don't think is justified by what the poster wrote.

Suppose I have an algorithm, such as a cryptography algorithm, that
operates on chunks of bits at a time. The algorithm is the same
(except perhaps for a few constants) whether I'm computing with 32 or
64 bits, but the chunk size differs for the two cases. In such a case,
I -could- write the code using only the minimum guaranteed size,
but on most platforms it would be noticeably more efficient to use
the larger chunk size *if the platform supports it*. The constants
for the algorithm could probably be computed at run-time and a
generic algorithm used, but in the real world, having the constants
available at compile time increases compiler optimization opportunities
leading to faster code.

In C, it is not an error to write int x = 40000; for use on
platforms that have an int of at least 17 bits. It is not maximally
portable, but it is not an error -- and the OP was asking for a
compile-time method of selecting such code for platforms that allow it,
dropping back to smaller value assumptions when that is all the
platform supports.
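
As an illustration of that kind of compile-time selection, here is a
minimal sketch; the chunk_t name, the CHUNK_BITS macro, and the exact
test are invented for the example, not taken from the post:

#include <limits.h>

/* Pick the widest chunk unsigned long supports. The division keeps
 * the test free of overflow in preprocessor arithmetic: it is true
 * exactly when ULONG_MAX >= 2^64 - 1, i.e. at least 64 value bits. */
#if ULONG_MAX / 0xFFFFFFFFUL >= 0xFFFFFFFFUL
typedef unsigned long chunk_t;
#define CHUNK_BITS 64
#else
typedef unsigned long chunk_t;  /* still guaranteed at least 32 bits */
#define CHUNK_BITS 32
#endif
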
--
Okay, buzzwords only. Two syllables, tops. -- Laurie Anderson
Aug 5 '06 #6
Walter Roberson wrote:
In article <0f*****************@newssvr25.news.prodigy.net>,
Tim Prince <tp*****@nospammyrealbox.com> wrote:
>steve yee wrote:
>>I want to detect if the compile is 32 bits or 64 bits in the source
>>code itself, so that different code can be compiled for each case.
>>How can I do this?
>You invite bugs, as well as taking the question off the topic of
standard C, if you write source code which has to change.

Not necessarily.
>The usual question is about the sizeof of various pointer types. If your
>question is simply about sizeof(int), you create problems by assuming it
>is determined by whether you have a 32- or 64-bit platform. If you insist
>on unions of pointers and ints, knowing whether it is 32 or 64 bits
>won't save you. More so if you are one of those who writes code which
>depends on sizeof(size_t).

You appear to have read a fair bit into the poster's question
that I don't think is justified by what the poster wrote.
The OP didn't explain what the reason for non-portable code might be.
Certainly, there wasn't anything about finding the largest efficient
data types.
>
Suppose I have an algorithm, such as a cryptography algorithm, that
operates on chunks of bits at a time. The algorithm is the same
(except perhaps for a few constants) whether I'm computing with 32 or
64 bits, but the chunk size differs for the two cases. In such a case,
I -could- write the code using only the minimum guaranteed size,
but on most platforms it would be noticeably more efficient to use
the larger chunk size *if the platform supports it*. The constants
for the algorithm could probably be computed at run-time and a
generic algorithm used, but in the real world, having the constants
available at compile time increases compiler optimization opportunities
leading to faster code.
So, your interpretation of 32-bit vs 64-bit is based on whether there
is an efficient implementation of a 64-bit integer type. It's certainly
likely, but not assured, that a compiler would choose sizeof(int) in
accordance with efficiency. For example, most Windows compilers have
32-bit int and long even on platforms which have efficient 64-bit and
(limited) 128-bit instructions, whereas Linux for the same platform
supports a 64-bit long. If this is your goal, you need a configure
script which checks the existence of various candidate data types,
which don't necessarily appear in <stdint.h>, and tests their efficiency.
Aug 5 '06 #7
"steve yee" <yi******@gmail.comwrites:
I want to detect if the compile is 32 bits or 64 bits in the source
code itself, so that different code can be compiled for each case.
How can I do this?
The first step is to define exactly what you mean when you say "the
compile is 32 bits or 64 bits". There is no universal definition for
these terms.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Aug 5 '06 #8

Walter Roberson wrote:
In article <0f*****************@newssvr25.news.prodigy.net>,
Tim Prince <tp*****@nospammyrealbox.com> wrote:
steve yee wrote:
I want to detect if the compile is 32 bits or 64 bits in the source
code itself, so that different code can be compiled for each case.
How can I do this?
You invite bugs, as well as taking the question off the topic of
standard C, if you write source code which has to change.

Not necessarily.
The usual question is about the sizeof of various pointer types. If your
question is simply about sizeof(int), you create problems by assuming it
is determined by whether you have a 32- or 64-bit platform. If you insist
on unions of pointers and ints, knowing whether it is 32 or 64 bits
won't save you. More so if you are one of those who writes code which
depends on sizeof(size_t).

You appear to have read a fair bit into the poster's question
that I don't think is justified by what the poster wrote.

Suppose I have an algorithm, such as a cryptography algorithm, that
operates on chunks of bits at a time. The algorithm is the same
(except perhaps for a few constants) whether I'm computing with 32 or
64 bits, but the chunk size differs for the two cases. In such a case,
I -could- write the code using only the minimum guaranteed size,
but on most platforms it would be noticeably more efficient to use
the larger chunk size *if the platform supports it*. The constants
for the algorithm could probably be computed at run-time and a
generic algorithm used, but in the real world, having the constants
available at compile time increases compiler optimization opportunities
leading to faster code.
Indeed. How does knowing "if the compile is 32 bits or 64 bits" help
with this, though? What you're interested in is how big the available
types are.
In C, it is not an error to write int x = 40000; for use on
platforms that have an int of at least 17 bits. It is not maximally
portable, but it is not an error -- and the OP was asking for a
compile-time method of selecting such code for platforms that allow it,
dropping back to smaller value assumptions when that is all the
platform supports.
If I'm building for my 17-bit-int machine, how is code which tells me
if the compile is 32 bit or 64 bit going to help, especially since the
answer is "no"? I need to know if ints have at least 17 value bits.

Knowing "if the compile is 32 bits or 64 bits" requires a lot of
subsequent non-portable assumptions for the information to be of any
use. Much better to check for the things you need to know rather than
making assumptions which will change from compiler to compiler. What
does the compile being 32 or 64 bit tell me about the size of a long,
for example?
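
A sketch of checking exactly what is needed, applied to the earlier
int x = 40000; example (illustrative only, not from the original post):

#include <limits.h>

#if INT_MAX >= 40000
int x = 40000;    /* int is known, at compile time, to hold 40000 */
#else
long x = 40000L;  /* fall back to long, which is guaranteed to hold it */
#endif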

Aug 5 '06 #9
Keith Thompson wrote:
"steve yee" <yi******@gmail.comwrites:
>>I want to detect if the compile is 32 bits or 64 bits in the source
>>code itself, so that different code can be compiled for each case.
>>How can I do this?


The first step is to define exactly what you mean when you say "the
compile is 32 bits or 64 bits". There is no universal definition for
these terms.
Very true, that's why tools like GNU autoconf exist.

Information regarding platform capabilities belongs in a configuration
file.
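
For example, a sketch of how C code might consume such a configuration
file, assuming an autoconf-style config.h in which AC_CHECK_SIZEOF(long)
has defined SIZEOF_LONG:

#include "config.h"  /* generated by configure; defines SIZEOF_LONG */

#if SIZEOF_LONG == 8
/* code for a 64-bit long */
#elif SIZEOF_LONG == 4
/* code for a 32-bit long */
#else
#error "unexpected size of long"
#endif
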

--
Ian Collins.
Aug 5 '06 #10

J. J. Farrell wrote:
[...]

Indeed. How does knowing "if the compile is 32 bits or 64 bits" help
with this, though? What you're interested in is how big the available
types are.
In C, it is not an error to write int x = 40000; for use on
platforms that have an int of at least 17 bits. It is not maximally
portable, but it is not an error -- and the OP was asking for a
compile-time method of selecting such code for platforms that allow it,
dropping back to smaller value assumptions when that is all the
platform supports.

If I'm building for my 17-bit-int machine, how is code which tells me
if the compile is 32 bit or 64 bit going to help, especially since the
answer is "no"? I need to know if ints have at least 17 value bits.

Knowing "if the compile is 32 bits or 64 bits" requires a lot of
subsequent non-portable assumptions for the information to be of any
use. Much better to check for the things you need to know rather than
making assumptions which will change from compiler to compiler. What
does the compile being 32 or 64 bit tell me about the size of a long,
for example?
In fact, what I want to know is the size of the native long type of the
C compiler, at compile time. For example, the following code:

if (sizeof (long) == 8)
{
// handle 64 bits long type
}
else if (sizeof (long) == 4)
{
// handle 32 bits long type
}

but, you know, this is done at run time, not at compile time. I would
like to write code like this:

#if sizeof (long) == 8
{
// handle 64 bits long type
}
#elif sizeof (long) == 4
{
// handle 32 bits long type
}
#endif

but this simply doesn't work.

Of course autoconf can help to achieve this, but I think the compiler
should provide a way to determine the size of each native type it
supports. The sizeof operator is in fact computed at compile time, but
the compiler does not allow sizeof to be used in a #if preprocessor
directive. It shouldn't have this limitation, so I wonder if there's an
alternative way to do this.

By the way, on the Windows platform, autoconf is not generally available.

Aug 6 '06 #11
steve yee posted:
#if sizeof (long) == 8
{
// handle 64 bits long type

Only if:

(1) 8 == CHAR_BIT

(2) "long" contains no padding bits.

}
#elif sizeof (long) == 4
{
// handle 32 bits long type

Similarly again.
Of course autoconf can help to achieve this, but I think the compiler
should provide a way to determine the size of each native type it
supports. The sizeof operator is in fact computed at compile time, but
the compiler does not allow sizeof to be used in a #if preprocessor
directive. It shouldn't have this limitation, so I wonder if there's an
alternative way to do this.

There are the macros defined in <limits.h>.

Plus there's the macro entitled "IMAX_BITS" -- you'll find it if you do a
Google Groups search of comp.lang.c.
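
One further alternative worth noting (a sketch, not from the original
reply): although sizeof cannot appear in #if, a constant expression can
still make the build fail when an assumption is wrong:

/* The array size is negative -- a compile error -- whenever long is
 * neither 4 nor 8 bytes, so the assumption is checked at compile time
 * even though no #if is involved. */
typedef char long_is_4_or_8_bytes[(sizeof(long) == 4 || sizeof(long) == 8) ? 1 : -1];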

--

Frederick Gotham
Aug 6 '06 #12
steve yee wrote:
[...]

In fact, what I want to know is the size of the native long type of the
C compiler, at compile time. For example, the following code:

if (sizeof (long) == 8)
{
// handle 64 bits long type
}
else if (sizeof (long) == 4)
{
// handle 32 bits long type
}

but, you know, this is done at run time, not compile time.
In theory, it is done at run time. In practice, it is done at compile
time. It's a very easy optimisation for compilers when the expression
is constant.

Aug 6 '06 #13

steve yee wrote:
>
In fact, what I want to know is the size of the native long type of the
C compiler, at compile time.
It's very likely that including <limits.h> and making appropriate use
of LONG_MIN, LONG_MAX, and ULONG_MAX with #if will give you what
you need.
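
A minimal sketch of that approach (the LONG_WIDER_THAN_32_BITS name is
invented for the example):

#include <limits.h>

/* 2147483647 is the smallest LONG_MAX the standard permits. */
#if LONG_MAX > 2147483647L
#define LONG_WIDER_THAN_32_BITS  /* long exceeds the 32-bit minimum here */
#endif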

Aug 6 '06 #14
steve yee wrote:
I want to detect if the compile is 32 bits or 64 bits in the source
code itself, so that different code can be compiled for each case.
How can I do this?
#include <limits.h>
#include <stdio.h>

int main(void)
{
#if UINT_MAX == 65535U
    printf(" 16 bit int \n");
#elif UINT_MAX == 4294967295UL
#if ULONG_MAX == 4294967295UL
    printf(" 32 bit int and long \n");
#elif ULONG_MAX == 18446744073709551615ULL
    printf(" 32 bit int and 64 bit long \n");
#else
    printf(" 32 bit int but neither 32 or 64 bit long \n");
#endif
#elif UINT_MAX == 18446744073709551615ULL
    printf(" 64 bit int \n");
#else
    printf(" neither 16, 32 or 64 bit int \n");
#endif

    return 0;
}
Aug 6 '06 #15
steve yee wrote:
>
In fact, what I want to know is the size of the native long type of the
C compiler, at compile time. [...]
You can discover the bounds of the `long' type easily
enough by using the <limits.h> macros:

#include <limits.h>
#if LONG_MAX == 0x7fffffff
    /* 32-bit long, the minimum size */
#elif (LONG_MAX >> 32) >= 0x7fffffff
    /* 64 or more bits. Note that the `>>32' is
     * not attempted until we've already established
     * that `long' has >32 bits.
     */
#else
    /* somewhere in 32 < bits < 64 */
#endif

You could, of course, use additional range tests to refine
the result further.

However, none of this will tell you whether the long
type is "native" in any useful sense. You cannot tell by
examining the range or size of `long' whether it is directly
supported by the hardware or emulated in software -- observe
that even an 8-bit CPU must support a `long' of at least 32
bits. Some machines even use a combination of hardware and
emulation: A system might add, subtract, and multiply `long'
values in hardware but use software for division.

At run time, you might use clock() to time a large number
of `int' and `long' calculations and try to guess from the
difference in CPU time whether there's software emulation
going on -- but that's going to be an iffy business at best
(note that the 8-bit CPU must use emulation even for `int').
You'd get a far better answer by reading the implementation's
documentation.

--
Eric Sosman
es*****@acm-dot-org.invalid

Aug 6 '06 #16
