Bytes IT Community

Defining a 32 Bit Integer on every platform?

Hi,

I want to write a chat program based on a self-written network
protocol. It should run on any platform, which raises one problem: on
one platform an integer is 32 bits, on another 64. But I need an
exactly-32-bit one (for compatibility reasons). It should also be
possible to add, subtract, etc. with this type. So how can I define
such a type?
Thanks, Frank
Jul 22 '05 #1
7 Replies


Frank Tombe wrote:
[question about defining an exactly-32-bit integer type snipped]


What about byte order?
Anyhow, the usual way to solve this is with a typedef that is defined
for each platform.

There are also tricks to make the byte ordering transparent; if
you're interested, just ask.
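A minimal sketch of that typedef approach (the platform checks here are illustrative assumptions, not an exhaustive list; adjust them for the compilers you actually target):

```cpp
#include <climits>

// Pick a type that is exactly 32 bits wide on each supported platform.
// The macros tested below are common but compiler-specific assumptions.
#if defined(_MSC_VER)
typedef unsigned __int32 uint32;   // MSVC's exact-width extension
#elif UINT_MAX == 0xFFFFFFFF
typedef unsigned int uint32;       // int is 32 bits on this platform
#elif ULONG_MAX == 0xFFFFFFFF
typedef unsigned long uint32;      // long is 32 bits on this platform
#else
#error "no exactly-32-bit unsigned type found; add a case for this platform"
#endif

// Compile-time sanity check: the array size is negative (a compile
// error) if the chosen typedef is not exactly 32 bits.
typedef char uint32_must_be_32_bits[sizeof(uint32) * CHAR_BIT == 32 ? 1 : -1];
```

The `#error` fallback is the important part: on a platform where neither `int` nor `long` is 32 bits, it is better to fail the build loudly than to ship a protocol with the wrong field widths.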

Jul 22 '05 #2

On 8 May 2004 08:29:22 -0700 in comp.lang.c++, nc******@freequote.net
(Frank Tombe) wrote,
[question snipped]


Does your 64-bit platform offer any 32-bit integer data type at all?
There is no reason why it has to, but perhaps that platform is
especially unsuited to your requirements and should not be used.

Out of curiosity, what is it?

Compare <boost/cstdint.hpp> from http://www.boost.org
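For reference, Boost's <boost/cstdint.hpp> supplies the C99 exact-width names (`boost::uint32_t` and friends) even on pre-C99 toolchains; on compilers that ship <stdint.h> the standard names work directly. A small sketch of arithmetic with such a type (the checksum is a toy example, not part of any real protocol):

```cpp
#include <stdint.h>   // or <boost/cstdint.hpp> with boost::uint32_t
#include <stddef.h>

// A toy additive checksum: uint32_t wraps modulo 2^32 identically on
// every platform, which is exactly the property a wire protocol needs.
uint32_t checksum(const unsigned char* data, size_t len)
{
    uint32_t sum = 0;
    for (size_t i = 0; i < len; ++i)
        sum += data[i];
    return sum;
}
```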

Jul 22 '05 #3

Gianni Mariani posted:
Frank Tombe wrote:
[question snipped]

There are some tricks to make the byte-ordering also transparent, if
you're interested, just ask.

I myself have done the following to get out of the byte-order problemo:

union Numbr
{
    struct {
        unsigned __int8 a;
        unsigned __int8 b;
        unsigned __int8 c;
        unsigned __int8 d;
    } bytes;

    unsigned __int32 ThirtyTwoBit;
};
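Note that `__int8` and `__int32` are MSVC extensions. A hedged sketch of how such a union might actually be used for a fixed wire order, rewritten with the portable <stdint.h> names (`to_wire` is a hypothetical helper, and reading a union member other than the one last written is a compiler extension in C++, though gcc and MSVC both document it):

```cpp
#include <stdint.h>

union Numbr {
    struct { uint8_t a, b, c, d; } bytes;
    uint32_t ThirtyTwoBit;
};

// Emit the four bytes of a value in big-endian wire order regardless
// of the host's byte order, using the union to probe endianness.
void to_wire(uint32_t value, uint8_t out[4])
{
    Numbr probe;
    probe.ThirtyTwoBit = 1;
    bool little_endian = (probe.bytes.a == 1);  // low byte stored first?

    Numbr n;
    n.ThirtyTwoBit = value;
    if (little_endian) {
        out[0] = n.bytes.d; out[1] = n.bytes.c;
        out[2] = n.bytes.b; out[3] = n.bytes.a;
    } else {
        out[0] = n.bytes.a; out[1] = n.bytes.b;
        out[2] = n.bytes.c; out[3] = n.bytes.d;
    }
}
```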
Jul 22 '05 #4

JKop wrote:
[quoted context and union code snipped]


I posted an answer a while ago; here is the Google Groups cache of the
post.

http://tinyurl.com/2ffdw

Jul 22 '05 #5

Ian
JKop wrote:
[nested quotes and union code snipped]

What's wrong with using uint32_t and htonl/ntohl and friends?

Ian
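A minimal sketch of that uint32_t plus htonl/ntohl approach, assuming a POSIX system (on Windows the same functions come from <winsock2.h>):

```cpp
#include <stdint.h>
#include <arpa/inet.h>  // htonl/ntohl; POSIX header

// Convert to network (big-endian) order before sending and back after
// receiving; on a big-endian host both calls are no-ops.
uint32_t to_network(uint32_t host_value)  { return htonl(host_value); }
uint32_t from_network(uint32_t net_value) { return ntohl(net_value); }
```

The round trip is the invariant that matters: whatever the host's byte order, `from_network(to_network(x)) == x`, and the bytes on the wire are always big-endian.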
Jul 22 '05 #6

Ian wrote:
What's wrong with using uint32_t and htonl/ntohl and friends?


htonl/ntohl only work with 32- or 16-bit numbers, and you have to cast
back and forth for types other than unsigned. class NetworkOrder (see
http://tinyurl.com/2ffdw ) works with any byte-order-specific type and
does the magic of applying the endian translations for you, not to
mention that on a platform that already has the right endianness it
will be pretty fast compared to a function call.
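The linked post isn't reproduced here, but the general shape of such a wrapper can be sketched (`NetOrder32` is a hypothetical name, not necessarily the class from the link): keep the value as big-endian bytes and convert on assignment and read, so a struct member of this type is always wire-ready.

```cpp
#include <stdint.h>

// Hypothetical sketch of a byte-order-transparent 32-bit wrapper.
// The bytes are always stored in big-endian (network) order, and the
// operators shift bytes explicitly, so the same code is correct on
// every host without #ifdefs or htonl calls at each use site.
class NetOrder32 {
    uint8_t b[4];  // always big-endian
public:
    NetOrder32& operator=(uint32_t v) {
        b[0] = uint8_t(v >> 24); b[1] = uint8_t(v >> 16);
        b[2] = uint8_t(v >> 8);  b[3] = uint8_t(v);
        return *this;
    }
    operator uint32_t() const {
        return (uint32_t(b[0]) << 24) | (uint32_t(b[1]) << 16)
             | (uint32_t(b[2]) << 8)  |  uint32_t(b[3]);
    }
};
```

On a big-endian host an optimizer can reduce both operators to plain loads and stores, which is roughly the "pretty fast" point made above.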

Jul 22 '05 #7

Ian
Gianni Mariani wrote:
[NetworkOrder explanation snipped]

On a platform with the right endianness, htonl/ntohl will be #defined to
do nothing, which is faster still.

Never mind, I see your point.

Ian

Jul 22 '05 #8
