Hi,
Is there some kind of canonical list, or would someone like to give a
brief rundown, as to:
sizeof(int)
sizeof(long int)
sizeof(long long)
etc
or perhaps even the vector types, for current hardware (Pentium
4/Athlon/G4/G5/Playstation 3/etc) used with current compilers? Or
perhaps a list of what size int_t you can define and expect the
processor to handle natively?
"Richard Cavell" <ri***********@mail.com> wrote... Is there some kind of canonical list, or would someone like to give a brief rundown, as to:
sizeof(int) sizeof(long int) sizeof(long long) etc
sizeof(int) is the size of an int object expressed in bytes.
sizeof(long int) is the size of a long int object.
sizeof(long long) is a syntax error since C++ has no "long long" type.
I am not sure what "canonical list" you're talking about.
> or perhaps even the vector types, for current hardware (Pentium 4/Athlon/G4/G5/Playstation 3/etc) used with current compilers? Or perhaps a list of what size int_t you can define and expect the processor to handle natively?
If you need something compiler-specific, please ask in a newsgroup
dedicated to that compiler. C++ discussed here is compiler-independent.
If you need something hardware-specific, please ask in a newsgroup
dedicated to that hardware. C++ is a hardware-independent language.
V
On 27/2/05 3:03 AM, Victor Bazarov wrote:
> C++ is a hardware-independent language.
No, it ain't.
I have to write my program differently depending on what these sizeofs
are. So it's not hardware-independent at all.
"Richard Cavell" <ri***********@mail.com> wrote... On 27/2/05 3:03 AM, Victor Bazarov wrote:
C++ is a hardware-independent language.
No, it ain't.
I have to write my program differently depending on what these sizeofs are. So it's not hardware-independent at all.
It's not the language, silly. It's your algorithm, it's what you want
to do, that makes it hardware-dependent.
"Richard Cavell" <ri***********@mail.com> wrote in message
news:cv**********@nnrp.waia.asn.au... Hi,
Is there some kind of canonical list, or would someone like to give a brief rundown, as to:
sizeof(int) sizeof(long int) sizeof(long long) etc
or perhaps even the vector types, for current hardware (Pentium 4/Athlon/G4/G5/Playstation 3/etc) used with current compilers? Or perhaps a list of what size int_t you can define and expect the processor to handle natively?
usually an int is the size of the register used, on the 286 this was 16
bits, a long was 32.
on 386 protected, 486, int is 32 bits, and long int, is 32 bits as well,
Double Word is 64 bits.
A char is always 8 bits. The ratio of char size to int size will tell you
the hardware type, i.e. the register width.
DHOLLINGSWORTH2 wrote:
> A char is always 8 bits.
Why can't a char be 16 bits?
On 27/2/05 2:28 PM, Victor Bazarov wrote:
> "Richard Cavell" <ri***********@mail.com> wrote...
>> On 27/2/05 3:03 AM, Victor Bazarov wrote:
>>> C++ is a hardware-independent language.
>> No, it ain't.
>> I have to write my program differently depending on what these sizeofs are. So it's not hardware-independent at all.
> It's not the language, silly. It's your algorithm, it's what you want to do, that makes it hardware-dependent.
That makes no sense at all. If C++ ran inside a virtual machine then it
wouldn't matter to me whether my processor could do 16-bit integers at a
time, or 32, or whatever. But my program actually does different things
based on sizeof(int) whether I want it to or not.
Richard Cavell wrote:
> On 27/2/05 2:28 PM, Victor Bazarov wrote:
>> It's not the language, silly. It's your algorithm, it's what you
>> want to do, that makes it hardware-dependent.
> That makes no sense at all. If C++ ran inside a virtual machine then
> it wouldn't matter to me whether my processor could do 16-bit integers
> at a time, or 32, or whatever. But my program actually does different
> things based on sizeof(int) whether I want it to or not.
What exactly is your program trying to do? A portable program should
not need to depend on sizeof(int) being a certain size, unless you're
doing some form of externalization.
Richard Cavell wrote:
> On 27/2/05 2:28 PM, Victor Bazarov wrote:
>> "Richard Cavell" <ri***********@mail.com> wrote...
>>> On 27/2/05 3:03 AM, Victor Bazarov wrote:
>>>> C++ is a hardware-independent language.
>>> No, it ain't.
>>> I have to write my program differently depending on what these sizeofs are. So it's not hardware-independent at all.
>> It's not the language, silly. It's your algorithm, it's what you want to do, that makes it hardware-dependent.
> That makes no sense at all. If C++ ran inside a virtual machine then it wouldn't matter to me whether my processor could do 16-bit integers at a time, or 32, or whatever. But my program actually does different things based on sizeof(int) whether I want it to or not.
The following is a hardware-dependent program, in your sense:
#include <iostream>
int main()
{
if (sizeof(int) == 19)
std::cout << "hello 19 world\n";
else
std::cout << "hello non-19 world\n";
}
Is this what bothers you? You don't want your program to be able to detect
anything that varies from system to system?
It would be more worthwhile to ask about ways to write programs which work on a
variety of platforms. There are lots of ways to do it.
Jonathan
Richard Cavell wrote:
> That makes no sense at all. If C++ ran inside a virtual machine then it wouldn't matter to me whether my processor could do 16-bit integers at a time, or 32, or whatever. But my program actually does different things based on sizeof(int) whether I want it to or not.
Then I suggest you test sizeof(int) in your program, such as:
if( sizeof(int) == 2 )
{
// oh schnitt, a 16 bit machine!!!
}
else if( sizeof(int) == 4 )
{
// ahh, mainstream life...
}
else if( sizeof(int) == 8 )
{
// where the {expletive} am I now?
}
If you're curious on your machine, just dump them:
#include <iostream>
int main( int argc, char* argv[] )
{
std::cout << "int " << sizeof(int) << std::endl;
std::cout << "long " << sizeof(long) << std::endl;
return 0;
}
Shezan Baig wrote:
> DHOLLINGSWORTH2 wrote:
>> A char is always 8 bits.
> Why can't a char be 16 bits?
It can be, but it is unlikely. Unfortunately one of the
great defects in C++ is that char has a double role: that
of the basic character AND that of the smallest addressable
storage unit. Ideally, bytes and characters should be disassociated.
"Ron Natalie" <ro*@sensor.com> wrote... Shezan Baig wrote: DHOLLINGSWORTH2 wrote:
A char is always 8 bits. Why can't a char be 16 bits? It can be, but it is unlikely. Unfortunately one of the great defects in C++ is that char has a double role: that of the basic character AND that of the smallest addressable storage unit. Ideally, bytes and characters should be dissassociated.
I think you misunderstand the role of a char WRT the basic character set.
Nothing prevents a 32-bit char from containing an element of the basic
character set. The requirement is only that the char type be large
enough for that. There is no requirement that it be no larger
than necessary to contain the basic characters. There is no defect in
the language, only in your understanding of it.
V
"DHOLLINGSWORTH2" <DH*************@cox.net> wrote in message news:<gpcUd.18605$yr.10961@okepread05>... "Richard Cavell" <ri***********@mail.com> wrote in message news:cv**********@nnrp.waia.asn.au...
[snip] A char is always 8 bits. The ratio of Char size to int size will tell you the hardware type, ie the register width.
Not always. A char is at least 8 bits, but on the hardware I program
it is 16 bits (as is int, so sizeof(int)==1). There are other chip
manufacturers in the world than Intel you know.
Richard Cavell wrote:
[ ... ]
> That makes no sense at all. If C++ ran inside a virtual machine then
> it wouldn't matter to me whether my processor could do 16-bit integers
> at a time, or 32, or whatever. But my program actually does different
> things based on sizeof(int) whether I want it to or not.
If you need something you're certain is at least 32 bits, use long. If
you need something that must be exactly 16 bits, use a
bit-field.
In the end, you're right: running in a VM could provide greater
isolation from the hardware, and with it greater portability. OTOH, you
need to use SOME language to implement the VM, and the operating system
it runs on, and the device drivers IT uses to talk to the hardware --
and right now, the languages of choice for all those things UNDER the
VM are C and C++. If you insist on them running inside of the VM, you
have to come up with something else to implement all the other things
under the VM.
Based on experience, I'd put the chances at roughly 99% that if you
attempted to create a language to replace C and C++ in those roles,
you'd end up with something that did NOT provide as good a tradeoff
between portability and access to the hardware as C and C++ provide
right now. Of all the gazillions of attempts at such programming
languages, there's only about _one_ that's really competitive with C or
C++ for these tasks.
--
Later,
Jerry.
The universe is a figment of its own imagination.
Richard Cavell wrote:
> That makes no sense at all. If C++ ran inside a virtual machine then it wouldn't matter to me whether my processor could do 16-bit integers at a time, or 32, or whatever. But my program actually does different things based on sizeof(int) whether I want it to or not.
I do not think the VM makes the difference you are thinking it does. A
VM is a machine as its name implies with its own assembly language.
For example, .NET is a CLI VM and here is a book about .NET's (CLI) VM
assembly language: http://www.amazon.com/exec/obidos/tg...glance&s=books
CLI assembly language is defined in the freely available CLI standard,
so you can see in the specification what it looks like: http://www.ecma-international.org/pu...s/Ecma-335.htm
--
Ioannis Vranos http://www23.brinkster.com/noicys
"Ron Natalie" <ro*@sensor.com> wrote in message
news:42***********************@news.newshosting.co m... Shezan Baig wrote: DHOLLINGSWORTH2 wrote:
A char is always 8 bits. Why can't a char be 16 bits? It can be, but it is unlikely. Unfortunately one of the great defects in C++ is that char has a double role: that of the basic character AND that of the smallest addressable storage unit. Ideally, bytes and characters should be dissassociated.
You are correct, I should have said "Byte".
"DHOLLINGSWORTH2" <DH*************@cox.net> wrote... "Ron Natalie" <ro*@sensor.com> wrote in message news:42***********************@news.newshosting.co m... Shezan Baig wrote: DHOLLINGSWORTH2 wrote:
A char is always 8 bits. Why can't a char be 16 bits? It can be, but it is unlikely. Unfortunately one of the great defects in C++ is that char has a double role: that of the basic character AND that of the smallest addressable storage unit. Ideally, bytes and characters should be dissassociated.
You are correct, I should have said "Byte".
In C++ terms 'char' and 'byte' are interchangeable. You should have
said "octet", maybe.
V

> I think you misunderstand the role of a char WRT basic character set.
I understand it perfectly.
> Nothing prevents a 32-bit char from containing an element from the basic character set.
And who cares? The issue is not the maximum size. The issue is that if
you choose to use a larger character size (imagine UNICODE as the basic
character set), you can't make the char 32 bits if you want to still be
able to address 8 bit pieces of memory somewhere.
> The requirement is only that the char type be large enough for that. There is no requirement that it has to be no larger than necessary to contain the basic characters. There is no defect in the language, only in your understanding of it.
My understanding is fine; your understanding of the limitations is what
is deficient.
On Sat, 26 Feb 2005 23:20:45 -0700, Phil Staite <ph**@nospam.com>
wrote in comp.lang.c++:
> Richard Cavell wrote:
>> That makes no sense at all. If C++ ran inside a virtual machine then it wouldn't matter to me whether my processor could do 16-bit integers at a time, or 32, or whatever. But my program actually does different things based on sizeof(int) whether I want it to or not.
> Then I suggest you test sizeof(int) in your program, such as:
> if( sizeof(int) == 2 ) { // oh schnitt, a 16 bit machine!!! }
> else if( sizeof(int) == 4 ) { // ahh, mainstream life... }
> else if( sizeof(int) == 8 ) { // where the {expletive} am I now? }
else if (sizeof(int) == 1)
{
// either a 16, 24, 32 bit platform (probably DSP)
// or a Cray???
}
> If you're curious on your machine, just dump them:
> #include <iostream>
> int main( int argc, char* argv[] ) { std::cout << "int " << sizeof(int) << std::endl; std::cout << "long " << sizeof(long) << std::endl; return 0; }
--
Jack Klein
Home: http://JK-Technology.Com
FAQs for
comp.lang.c http://www.eskimo.com/~scs/C-faq/top.html
comp.lang.c++ http://www.parashift.com/c++-faq-lite/
alt.comp.lang.learn.c-c++ http://www.contrib.andrew.cmu.edu/~a...FAQ-acllc.html
On Sun, 27 Feb 2005 16:13:54 +1100, Richard Cavell
<ri***********@mail.com> wrote in comp.lang.c++:
> On 27/2/05 2:28 PM, Victor Bazarov wrote:
>> "Richard Cavell" <ri***********@mail.com> wrote...
>>> On 27/2/05 3:03 AM, Victor Bazarov wrote:
>>>> C++ is a hardware-independent language.
>>> No, it ain't.
>>> I have to write my program differently depending on what these sizeofs are. So it's not hardware-independent at all.
>> It's not the language, silly. It's your algorithm, it's what you want to do, that makes it hardware-dependent.
> That makes no sense at all. If C++ ran inside a virtual machine then it wouldn't matter to me whether my processor could do 16-bit integers at a time, or 32, or whatever. But my program actually does different things based on sizeof(int) whether I want it to or not.
Then you are using int when you shouldn't be.
--
Jack Klein
Home: http://JK-Technology.Com
FAQs for
comp.lang.c http://www.eskimo.com/~scs/C-faq/top.html
comp.lang.c++ http://www.parashift.com/c++-faq-lite/
alt.comp.lang.learn.c-c++ http://www.contrib.andrew.cmu.edu/~a...FAQ-acllc.html
On Sat, 26 Feb 2005 22:55:40 -0600, "DHOLLINGSWORTH2"
<DH*************@cox.net> wrote in comp.lang.c++:
> "Richard Cavell" <ri***********@mail.com> wrote in message news:cv**********@nnrp.waia.asn.au...
>> Hi,
>> Is there some kind of canonical list, or would someone like to give a brief rundown, as to:
>> sizeof(int) sizeof(long int) sizeof(long long) etc
>> or perhaps even the vector types, for current hardware (Pentium 4/Athlon/G4/G5/Playstation 3/etc) used with current compilers? Or perhaps a list of what size int_t you can define and expect the processor to handle natively?
What do all of these references to x86 compatible and other desktop
processors have to do with anything? They comprise something like 15%
of the processors made each year.
> usually an int is the size of the register used, on the 286 this was 16 bits, a long was 32.
Um, there are literally billions of 8-bit processors sold every year,
where sizeof int is 16, but the size of a register is 8.
> on 386 protected, 486, int is 32 bits, and long int, is 32 bits as well, Double Word is 64 bits. A char is always 8 bits. The ratio of Char size to int size will tell you the hardware type, ie the register width.
No, it's not. There are platforms where char has 16 bits. At least
one with 24 bits, and several where char has 32 bits.
You seem to think that C++ exists to standardize the various modes of
the x86 and desktop style processors. This is an incorrect notion.
--
Jack Klein
Home: http://JK-Technology.Com
FAQs for
comp.lang.c http://www.eskimo.com/~scs/C-faq/top.html
comp.lang.c++ http://www.parashift.com/c++-faq-lite/
alt.comp.lang.learn.c-c++ http://www.contrib.andrew.cmu.edu/~a...FAQ-acllc.html
Shezan Baig wrote:
> DHOLLINGSWORTH2 wrote:
>> A char is always 8 bits.
> Why can't a char be 16 bits?
it can. Most compilers for a given processor will declare it as 8 bits,
if the processor supports accesses on 8-bit entities. However please be
aware that there *are* processors that cannot access 8-bit entities, so
a char is *not* guaranteed to be exactly 8 bits (sizeof(char) is always 1,
but CHAR_BIT may be larger than 8).
David
On Sun, 27 Feb 2005 09:53:01 -0500, Ron Natalie <ro*@sensor.com> wrote
in comp.lang.c++:
> Shezan Baig wrote:
>> DHOLLINGSWORTH2 wrote:
>>> A char is always 8 bits.
>> Why can't a char be 16 bits?
> It can be, but it is unlikely. Unfortunately one of the great defects in C++ is that char has a double role: that of the basic character AND that of the smallest addressable storage unit. Ideally, bytes and characters should be disassociated.
Interesting. Just today I was working with an architecture where char
has 16 bits, which is indeed the smallest addressable storage unit.
What possible gain from defining a char as 8 bits when a byte is 16
bits? You can only store one char per byte, unless you use bit fields
or bit-wise operators to merge and separate them.
And there are platforms where char and in fact all the integer types
are 32 bits. Period.
--
Jack Klein
Home: http://JK-Technology.Com
FAQs for
comp.lang.c http://www.eskimo.com/~scs/C-faq/top.html
comp.lang.c++ http://www.parashift.com/c++-faq-lite/
alt.comp.lang.learn.c-c++ http://www.contrib.andrew.cmu.edu/~a...FAQ-acllc.html
Victor Bazarov wrote:
> "DHOLLINGSWORTH2" <DH*************@cox.net> wrote...
>> "Ron Natalie" <ro*@sensor.com> wrote...
>>> It can be, but it is unlikely. Unfortunately one of the great defects in C++ is that char has a double role: that of the basic character AND that of the smallest addressable storage unit. Ideally, bytes and characters should be disassociated.
>> You are correct, I should have said "Byte".
> In C++ terms 'char' and 'byte' are interchangeable. You should have said "octet", maybe.
on *most* processors this is true. As I mentioned some processors cannot
access data in byte-sized chunks. That is why the standard describes the
relationship of the various integral types in relative terms rather than
giving specific sizes.
"David Lindauer" <ca*****@bluegrass.net> wrote...
Victor Bazarov wrote:
"DHOLLINGSWORTH2" <DH*************@cox.net> wrote... > > "Ron Natalie" <ro*@sensor.com> wrote in message > news:42***********************@news.newshosting.co m... >> Shezan Baig wrote: >>> DHOLLINGSWORTH2 wrote: >>> >>>>A char is always 8 bits. >>> >>> >>> >>> Why can't a char be 16 bits? >>> >> It can be, but it is unlikely. Unfortunately one of the >> great defects in C++ is that char has a double role: that >> of the basic character AND that of the smallest addressable >> storage unit. Ideally, bytes and characters should be >> dissassociated. > > You are correct, I should have said "Byte". In C++ terms 'char' and 'byte' are interchangeable. You should have said "octet", maybe.
on *most* processors this is true.
On most processors WHAT is true? And WTF does it have to do with
processors?
C++ Standard defines a byte as the smallest addressable unit of computer
memory and says that sizeof(char) is 1 byte. Where, on what processors, I
ask, would it NOT be true?
> As I mentioned some processors cannot access data in byte-sized chunks.
Here we go again. You're still falling into the trap that "byte" means 8
bits.
It does NOT. If your hardware allows a C++ implementation to exist on it
(or *for* it), that means that _bytes_ *in the C++ sense* can be addressed. Period.
> That is why the standard describes the relationship of the various integral types in relative terms rather than giving specific sizes.
Pardon? The Standard describes everything in terms of _bytes_ because some
processors cannot address "byte-sized chunks"? Am I reading this wrong or
have you made a few wrong turns yourself?
V
Jack Klein wrote:
> else if (sizeof(int) == 1)
> {
> // either a 16, 24, 32 bit platform (probably DSP)
> // or a Cray???
> }
Crays I know about. The Cray really has two data sizes: 24 bit
pointers and 64 everything else, that's hardware. The C compiler
in fact emulates 8 bit chars in software (by packing them into 64
containers) and sizeof() every other numeric type AND pointers are
8 (64 bits). The char pointer is made up by shoving the word pointer
into the low 24 bits and the byte within word offset in the high three bits.
Jack Klein wrote:
> Interesting. Just today I was working with an architecture where char has 16 bits, which is indeed the smallest addressable storage unit.
> What possible gain from defining a char as 8 bits when a byte is 16 bits? You can only store one char per byte, unless you use bit fields or bit-wise operators to merge and separate them.
> And there are platforms where char and in fact all the integer types are 32 bits. Period.
I was imagining a case where the minimum addressable storage unit is
8 bits, but the basic character size is 16 bits (or larger). You
can't do this in C++.
On 2005-02-27 23:07:20 -0500, David Lindauer <ca*****@bluegrass.net> said:
> Victor Bazarov wrote:
>> In C++ terms 'char' and 'byte' are interchangeable. You should have said "octet", maybe.
> on *most* processors this is true. As I mentioned some processors cannot access data in byte-sized chunks.
*bzzzzzt*
ITYM: "some processors cannot access data in *octet*-sized chunks".
As far as C++ is concerned, every processor can access byte sized chunks.
--
Clark S. Cox, III
cl*******@gmail.com