forcing 32-bit long on 64-bit machine?

Hi folks,

I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.
I've discovered however that gcc on this machine automatically
makes longs 64 bits.

Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?

Thanks.

Nov 23 '05 #1
CoffeeGood wrote:
I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.
I've discovered however that gcc on this machine automatically
makes longs 64 bits.

Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?

Ask the gcc folks at gnu.gcc; this is a compiler-specific question.

If you do decide to convert the longs rather than change the platform
(which is what you're trying to do), #include <stdint.h> and convert
them to (u)int32_t. That's an (unsigned) integer type with exactly 32
bits, if it exists, and solves the problem once and for all. Well, other
than rewriting the code to make it not use such assumptions in the first
place.
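
For instance, a minimal sketch (assuming a C99 <stdint.h>/<inttypes.h> is
available; the exact-width types are optional and need not exist on every
platform):

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    uint32_t mask = 0xFFFFFFFFu; /* exactly 32 bits, if the type exists */
    int32_t delta = -42;         /* exactly 32 bits, two's complement   */

    printf("mask  = %" PRIu32 "\n", mask);
    printf("delta = %" PRId32 "\n", delta);
    return 0;
}

The format macros from <inttypes.h> also save you from guessing the right
printf conversion for these types.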

S.
Nov 23 '05 #2
CoffeeGood said:
Hi folks,

I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.
Bad code.
I've discovered however that gcc on this machine automatically
makes longs 64 bits.
Good compiler. Better compiler, because it exposes bad code!
Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?


Have you considered asking in a gcc newsgroup? gnu.gcc.help might be worth a
go.

--
Richard Heathfield
"Usenet is a strange place" - dmr 29/7/1999
http://www.cpax.org.uk
email: rjh at above domain (but drop the www, obviously)
Nov 23 '05 #3
CoffeeGood wrote
(in article
<11*********************@g14g2000cwa.googlegroups.com>):
Hi folks,

I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.
Okay. Does that assumption imply that it will break if they are
bigger? Sometimes apps like this have a minimum requirement for
data size, but if it's bigger it will work. Have you made sure
that isn't the case here?
I've discovered however that gcc on this machine automatically
makes longs 64 bits.
You can usually tell the compiler to generate a 32-bit binary, even when
running on a 64-bit kernel, if the machine is (as I suspect) part of the
AMD/Intel x86-64 family. See the man page for gcc, or even more
appropriately, take this up in one of the gcc-specific newsgroups.
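(For what it's worth, on x86-64 that switch is usually -m32, assuming a
multilib-capable toolchain is installed; "oldprog" is of course just a
placeholder name:

gcc -m32 -o oldprog oldprog.c

but check the gcc documentation for your platform, since the details are
off-topic here.)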
Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?


I suppose fixing the code so that it didn't make unwarranted
assumptions about data type sizes never came up as an option?
--
Randy Howard (2reply remove FOOBAR)
"The power of accurate observation is called cynicism by those
who have not got it." - George Bernard Shaw

Nov 23 '05 #4
Randy Howard wrote:
CoffeeGood wrote
(in article
<11*********************@g14g2000cwa.googlegroups.com>):

Hi folks,

I have a 64-bit machine and on this machine I want to run
an old program that was written to assume that longs are 32 bits.

[snip]

Rather than go through the code and convert all longs
to ints (which gcc makes 32 bits), is there a switch to make gcc
define longs as 32 bits?

I suppose fixing the code so that it didn't make unwarranted
assumptions about data type sizes never came up as an option?

If you don't, then what will you do about pointer sizes which will also
have increased by default from 32 bits to 64 bits?

Robert
Nov 23 '05 #5
>I suppose fixing the code so that it didn't make unwarranted
assumptions about data type sizes never came up as an option?


I don't know, I merely inherited the code from others.

Methinks what C needs is to do away with vague
terms like int, short, long and require the use of
terms like u8, u16, u32, s64. Then if people still want
to use "long", let that be a typedef or #define.

Nov 23 '05 #6
In article <11**********************@g49g2000cwa.googlegroups.com>,
CoffeeGood <fb***@yahoo.com> wrote:
Methinks what C needs is to do away with vague
terms like int, short, long and require the use of
terms like u8, u16, u32, s64. Then if people still want
to use "long", let that be a typedef or #define.


And if one is trying to program on a machine whose integer size is 36
bits, but one is trying to write a program that would also work on
a machine whose integer size is 32?

--
I was very young in those days, but I was also rather dim.
-- Christopher Priest
Nov 23 '05 #7
>And if one is trying to program on a machine whose integer size is 36 bits...

typedef int s36;
typedef unsigned u36;

Nov 23 '05 #8
CoffeeGood wrote:
And if one is trying to program on a machine whose integer size is 36 bits...

typedef int s36;
typedef unsigned u36;

When replying, please retain attribution, and don't cut off sentences
when they have relevant information.

Your answer is rendered bogus by omission of "but one is trying to write
a program that would also work on a machine whose integer size is 32?"

Suppose I want to write a program that needs an integer type of at least
32 bits? Well, you'd use standard C's "long". No typedefs needed.

Your approach means I must have typedefs if I *don't* care about the
exact size of integers. ANSI C's approach is that you must have typedefs
if you *do* care about the exact size of integers. The latter is more
portable, even if it also makes life harder for programmers. Assuming
exact sizes makes some things easier. It also makes other things
impossible. The tradeoff is warranted.

Exact-size integer types are provided by <stdint.h>, where available.
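
A short sketch of the contrast (uint32_t is optional; unsigned long is not):

#include <stdint.h>

unsigned long counter;  /* at least 32 bits on every conforming implementation */
uint32_t wire_word;     /* exactly 32 bits, but only where such a type exists  */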

S.
Nov 23 '05 #9
"CoffeeGood" <fb***@yahoo.com> writes:
Methinks what C needs is to do away with vague
terms like int, short, long and require the use of
terms like u8, u16, u32, s64. Then if people still want
to use "long", let that be a typedef or #define.


Most of the time I don't care what size my variables really are.
I just need them to be some minimum size. C's types work fine
for that. When I do need a specific size, there's always
<stdint.h>.

(You could always use Java if you want fixed-size types.)
--
"In My Egotistical Opinion, most people's C programs should be indented six
feet downward and covered with dirt." -- Blair P. Houghton
Nov 23 '05 #10
Robert Harris wrote
(in article <BN*******************@fe1.news.blueyonder.co.uk>):
If you don't, then what will you do about pointer sizes which will also
have increased by default from 32 bits to 64 bits?


Sorry, but if you have code that assumes the size of a pointer, it is
broken. Badly.
--
Randy Howard (2reply remove FOOBAR)
"The power of accurate observation is called cynicism by those
who have not got it." - George Bernard Shaw

Nov 23 '05 #11
Randy Howard <ra*********@FOOverizonBAR.net> writes:
Robert Harris wrote
(in article <BN*******************@fe1.news.blueyonder.co.uk>):
If you don't, then what will you do about pointer sizes which will also
have increased by default from 32 bits to 64 bits?


Sorry, but if you have code that assumes the size of a pointer, it is
broken. Badly.


[u]intptr_t can be helpful here (although it isn't guaranteed to
exist).
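
For example (a sketch, assuming the optional uintptr_t exists on your
implementation):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int x = 42;
    void *p = &x;
    uintptr_t bits = (uintptr_t)p; /* pointer -> integer, no size assumed          */
    void *q = (void *)bits;        /* converting back yields a pointer equal to p  */

    printf("%d\n", *(int *)q);
    return 0;
}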
--
"To get the best out of this book, I strongly recommend that you read it."
--Richard Heathfield
Nov 23 '05 #12

"Randy Howard" <ra*********@FOOverizonBAR.net> wrote

Sorry, but if you have code that assumes the size of a pointer, it is
broken. Badly.

Windows allows you to set a "user long" to the window.
It is extremely tempting to make this long into a pointer, to hang
arbitrary data on your window. In fact I don't know of any other good way of
achieving the same thing.
Nov 23 '05 #13
Malcolm wrote:
"Randy Howard" <ra*********@FOOverizonBAR.net> wrote
Sorry, but if you have code that assumes the size of a pointer, it is
broken. Badly.


Windows allows you to set a "user long" to the window.
It is extremely tempting to make this long into a pointer, to hang
arbitrary data on your window. In fact I don't know of any other good way of
achieving the same thing.

<OT>I do: use SetWindowLongPtr(), which supersedes SetWindowLong() for
*exactly this reason*: a long cannot portably be assumed to have the
same size as a pointer (and this will not work on 64-bit Windows).

In other words, you can certainly write code that makes this assumption,
but it's not a good idea to actually do so, because your code won't go
far. Unfortunately for you, this is true even if your environment forces
you to do this...
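
Roughly, and untested (a sketch only; consult the Platform SDK documentation
for the real details, and note that my_data is just a made-up example struct):

#include <windows.h>
#include <stdlib.h>

struct my_data { int clicks; };

/* Attach per-window data; LONG_PTR is pointer-sized on both 32- and 64-bit Windows. */
static void attach_data(HWND hwnd)
{
    struct my_data *d = malloc(sizeof *d);
    if (d != NULL) {
        d->clicks = 0;
        SetWindowLongPtr(hwnd, GWLP_USERDATA, (LONG_PTR)d);
    }
}

/* Retrieve it later, e.g. inside the window procedure. */
static struct my_data *get_data(HWND hwnd)
{
    return (struct my_data *)GetWindowLongPtr(hwnd, GWLP_USERDATA);
}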

S.
Nov 23 '05 #14
"Malcolm" <re*******@btinternet.com> wrote:
"Randy Howard" <ra*********@FOOverizonBAR.net> wrote

Sorry, but if you have code that assumes the size of a pointer, it is
broken. Badly.

Windows allows you to set a "user long" to the window.
It is extremely tempting to make this long into a pointer, to hang
arbitrary data on your window. In fact I don't know of any other good way of
achieving the same thing.


Sadly, the MS Windows API is replete with this sort of errant
sub-hackery. Whenever I look into the declarations of types under MS
Windows and the trouble they had to go to in order to make Win32s, Win98 and
WinNT _almost_ compatible, the importance of properly portable C code is
pressed upon me again.

Richard
Nov 23 '05 #15
>Most of the time I don't care what size my variables really are.
I just need them to be some minimum size.


That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.

Nov 23 '05 #16
CoffeeGood wrote:
Most of the time I don't care what size my variables really are.
I just need them to be some minimum size.


That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.


No.
Ben is talking about portable programming.
If you need an unsigned type with at least 32 bits,
then unsigned long is the portable choice.

--
pete
Nov 23 '05 #17
CoffeeGood wrote
(in article
<11*********************@f14g2000cwb.googlegroups.com>):
Most of the time I don't care what size my variables really are.
I just need them to be some minimum size.


That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.


Incorrect. For a lot of algorithms and/or program needs, this
is perfectly reasonable.

--
Randy Howard (2reply remove FOOBAR)
"The power of accurate observation is called cynicism by those
who have not got it." - George Bernard Shaw

Nov 23 '05 #18
CoffeeGood wrote:
Most of the time I don't care what size my variables really are.
I just need them to be some minimum size.

That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.

Err.... why? The topic of this ng is standard C. Writing standard C in
fact *requires* that you only rely on guaranteed minimum sizes.

"Heresy" would be assuming that a short is 16 bits, an int 32 bits, that
you can safely convert pointers of all types to ints and back, that a
null pointer is represented by all-bits-zero... that sort of thing.

I think you're confusing a minimum size with an exact size. That is, if
you do something like this:

typedef char int8_t;
typedef short int16_t;
typedef int int32_t;

with the intention that intN_t is a signed integer type of exactly N
bits. This renders code unportable on all platforms that don't meet the
assumptions this implies: that a char is 8 bits (and plain char is
signed), that a short is 16 bits, an int 32 bits.

If you isolate these typedefs and make it clear they are to be
customized for every platform, you're still relying on the assumption
that the types you want exist at all. Some platforms will simply not
*have* signed integer types of exactly 8, 16 or 32 bits.

If, as is usual, you actually don't need *exact* sizes (since this is
mostly interesting for interfacing with low-level bits that don't need
much in the way of portability anyway) but are content with minimally
adequate types, portability is easier. C89 already gives you standard
integer types of at least 8, 16 and 32 bits: char, int and long.

C99 makes this even easier with <stdint.h>, which includes the intN_t
typedefs above (where the corresponding types exist), as well as uintN_t
for the unsigned counterparts, and (u)int_leastN_t for integers of at
least a certain size. C99 furthermore requires that (u)int_leastN_t
types exist for N = 8, 16, 32 or 64, and defines macros for the minimum
and maximum values of all of these types.

This makes writing portable code that much easier. You can do, for example:

#include <stdint.h>
typedef uint16_t word;

for poorly written or platform-specific code that requires exact 16-bit
quantities. This code will fail to compile on platforms that don't have
such a type, but will not require any modification for those platforms
that do. You can even do:

#include <stdint.h>

#ifndef INT_LEAST24_MIN
typedef int_least32_t int_least24_t;
#define INT_LEAST24_MIN INT_LEAST32_MIN
#define INT_LEAST24_MAX INT_LEAST32_MAX
#define INT24_C(x) INT32_C(x)
#endif

This gives you an integer type of at least 24 bits, using the platform's
least-32-bit type where no native 24-bit type exists.
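Afterwards you can portably write, say (a hypothetical usage line):

int_least24_t rgb = INT24_C(0xFF00FF); /* holds at least 24 bits wherever it compiles */

whether or not the implementation supplies int_least24_t itself.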

S.
Nov 23 '05 #19
On 2005-11-23 08:25:27 -0500, "CoffeeGood" <fb***@yahoo.com> said:
Most of the time I don't care what size my variables really are.
I just need them to be some minimum size.


That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.


You've actually got it backwards. Writing portable code implies *only*
relying on the guarantees offered by the standard, including the
minimum sizes for integers.

The problem is when people make unwarranted assumptions like:

"int is always at least 32-bits"
"long is always 32-bits"
"char is always 8-bits"
etc.

None of which are true.
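
If code genuinely does depend on one of these, it's better to make the
dependency explicit so the build fails loudly instead of the program
misbehaving quietly. A sketch using nothing beyond <limits.h>:

#include <limits.h>

#if CHAR_BIT != 8
#error "this code assumes 8-bit chars"
#endif

#if LONG_MAX != 2147483647L
#error "this code assumes 32-bit longs"
#endif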

--
Clark S. Cox, III
cl*******@gmail.com

Nov 23 '05 #20

"CoffeeGood" <fb***@yahoo.com> wrote in message
news:11*********************@f14g2000cwb.googlegroups.com...
Most of the time I don't care what size my variables really are.
I just need them to be some minimum size.


That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.


No, you've got that bass ackwards. Saying the opposite
is what might be called hair a C.

-Mike
Nov 23 '05 #21
"CoffeeGood" <fb***@yahoo.com> writes:
Most of the time I don't care what size my variables really are.
I just need them to be some minimum size.


That's reasonable to me, but in this newsgroup saying
such a thing could be considered unforgivable heresy.


Are you inserting the quoted text manually? I think Google, if you
use it properly, prefixes each quoted line with "> ", not just ">".
It also adds correct attribution lines, which your followups often
lack.

Surely you've been here long enough to see these instructions:

If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Nov 23 '05 #22
