
Maximum Array int is 330 M, not 2.1 billion

The maximum int for an array on my machine (a Pentium IV with 2 GB
RAM) is < 330 Million...before you get an "out of memory" exception.
I simply filled an array of this size with ints...I got as far as 320
M. So, myArray[320000000] is as big as I can get.

In theory a 32 bit int is 2.1 billion, but in practice, you cannot
fill an array having anywhere near that number of elements. I suspect
that it's because every element of an array takes up space equal to:
2.1E9 / 320E6 = 6.56 ≈ 7 bytes of 32-bit memory

RL
Oct 19 '08 #1
raylopez99 wrote:
The maximum int for an array on my machine (a Pentium IV with 2 GB
RAM) is < 330 Million...before you get an "out of memory" exception.
I simply filled an array of this size with ints...I got as far as 320
M. So, myArray[320000000] is as big as I can get.
I assume you can make your array 320 million integers before you run
out of memory?
In theory a 32 bit int is 2.1 billion,
Not exactly. Assume 2GB is 2 billion BYTES, then it is only 0.5 billion
INTEGERS.

--
Rudy Velthuis http://rvelthuis.de

"Hell is paved with good samaritans." -- William M. Holden
Oct 19 '08 #2
Rudy Velthuis wrote:
raylopez99 wrote:
The maximum int for an array on my machine (a Pentium IV with 2 GB
RAM) is < 330 Million...before you get an "out of memory" exception.
I simply filled an array of this size with ints...I got as far as
320 M. So, myArray[320000000] is as big as I can get.

I assume you can make your array 320 million integers before you run
out of memory?
In theory a 32 bit int is 2.1 billion,

Not exactly. Assume 2GB is 2 billion BYTES, then it is only 0.5
billion INTEGERS.
And well, even with swapping and virtual memory and all that, the OS
and other programs (including VS, I guess), also need some of your
memory.

--
Rudy Velthuis http://rvelthuis.de

"Only a free and unrestrained press can effectively expose
deception in government."
-- Hugo Black, Supreme Court Justice
Oct 19 '08 #3
Rudy Velthuis wrote:
Rudy Velthuis wrote:
>raylopez99 wrote:
>>The maximum int for an array on my machine (a Pentium IV with 2 GB
RAM) is < 330 Million...before you get an "out of memory" exception.
I simply filled an array of this size with ints...I got as far as
320 M. So, myArray[320000000] is as big as I can get.
I assume you can make your array 320 million integers before you run
out of memory?
>>In theory a 32 bit int is 2.1 billion,
Not exactly. Assume 2GB is 2 billion BYTES, then it is only 0.5
billion INTEGERS.

And well, even with swapping and virtual memory and all that, the OS
and other programs (including VS, I guess), also need some of your
memory.
That does not count.

The limit is on virtual memory. And on 32 bit Windows your app has
2 GB of virtual address space to use, but Windows has another 2 GB
of virtual address space. And VS will be running in its own
2 GB process space.

Arne
Oct 19 '08 #4
raylopez99 wrote:
The maximum int for an array on my machine (a Pentium IV with 2 GB
RAM) is < 330 Million...before you get an "out of memory" exception.
I simply filled an array of this size with ints...I got as far as 320
M. So, myArray[320000000] is as big as I can get.

In theory a 32 bit int is 2.1 billion,
If you're talking about the maximum value of a signed 32-bit int
(2147483647), that's not relevant here, as managed allocation size isn't
restricted to that. What is relevant is the address space limit of your
application, which happens to be 2 GiB on 32-bit Windows, which happens to
be the maximum value of a signed 32-bit int as well -- but this is
coincidental. You can make Windows run with a 3 GiB virtual address space
for each application, for example, and 64-bit applications have a (for now)
unfathomably bigger limit that I can't recall right now, but you'll hit the
limits of your physical memory far sooner than that.
but in practice, you cannot fill an array having anywhere near that
number of elements. I suspect that it's because every element of an array
takes up space equal to: 2.1E9/320E6 = 6.56 = 7 bytes of 32 bit memory
You suspect wrongly. There is no extra per-element overhead for managed
arrays (there is some small per-object overhead, but that's for the array as
a whole). So an array of 330 million integers takes up very close to 330e6 *
4 = 1.32 billion bytes, which is 1.23 GiB.

This is apparently the maximum amount of contiguous memory in your process'
address space. You can probably squeeze more out of it by allocating smaller
arrays to fill in the noncontiguous parts. You won't reach 2.0 GiB exactly
in any case, because the runtime obviously needs memory too (and a small
part of the address space is invalidated by the kernel), and you will never
be able to exceed 2.0 GiB no matter how much physical memory you have.
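For reference, the size arithmetic above can be sketched directly (a minimal illustration, not code from the thread):

```csharp
using System;

class ArraySizeMath
{
    // Memory needed for the array elements alone: count * sizeof(int).
    // The small fixed per-object header is ignored here.
    public static long BytesForIntArray(long elementCount)
    {
        return elementCount * sizeof(int); // sizeof(int) == 4
    }

    static void Main()
    {
        long bytes = BytesForIntArray(330000000);
        // 1,320,000,000 bytes, i.e. about 1.23 GiB -- most of a 2 GiB
        // 32-bit address space once the runtime's own use is counted.
        Console.WriteLine("{0} bytes = {1:F2} GiB", bytes, bytes / (double)(1L << 30));
    }
}
```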

--
J.
Oct 19 '08 #5
"Jeroen Mostert" <jm******@xs4all.nl> wrote in message
news:48*********************@news.xs4all.nl...
If you're talking about the maximum value of a signed 32-bit int
(2147483647), that's not relevant here, as managed allocation size isn't
restricted to that. What is relevant is the address space limit of your
application, which happens to be 2 GiB on 32-bit Windows, which happens to
be the maximum value of a signed 32-bit int as well -- but this is
coincidental. You can make Windows run with a 3 GiB virtual address space
for each application, for example, and 64-bit applications have a (for
now) unfathomably bigger limit that I can't recall right now, but you'll
hit the limits of your physical memory far sooner than that.
Interesting... I can remember when (in DOS days) no data structure could be
more than 64k bytes. It really cramped our style...
Oct 19 '08 #6
raylopez99 <ra********@yahoo.com> wrote:
The maximum int for an array on my machine (a Pentium IV with 2 GB
RAM) is < 330 Million...before you get an "out of memory" exception.
I simply filled an array of this size with ints...I got as far as 320
M. So, myArray[320000000] is as big as I can get.

In theory a 32 bit int is 2.1 billion, but in practice, you cannot
fill an array having anywhere near that number of elements. I suspect
that it's because every element of an array takes up space equal to:
2.1E9/320E6 = 6.56 = 7 bytes of 32 bit memory
Others have explained why your suspicion is wrong. Here's a hint to get
more than 320M entries though: try a byte[] instead of an int[]. You
still won't get up to int.MaxValue, but you'll get a lot higher.
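To illustrate the hint: with a fixed budget of address space, the element count scales inversely with the element size (a sketch; the ~1.3 GB figure is simply the limit observed in this thread):

```csharp
using System;

class ElementBudget
{
    // How many whole elements of a given size fit in a fixed byte budget.
    public static long MaxElements(long budgetBytes, int elementSize)
    {
        return budgetBytes / elementSize;
    }

    static void Main()
    {
        long budget = 1320000000; // ~1.3 GB, as observed by the OP
        Console.WriteLine("int[]  holds {0} elements", MaxElements(budget, sizeof(int)));  // 330 million
        Console.WriteLine("byte[] holds {0} elements", MaxElements(budget, sizeof(byte))); // 1.32 billion
    }
}
```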

--
Jon Skeet - <sk***@pobox.com>
Web site: http://www.pobox.com/~skeet
Blog: http://www.msmvps.com/jon.skeet
C# in Depth: http://csharpindepth.com
Oct 19 '08 #7
Arne Vajhøj wrote:
That does not count.

The limit is on virtual memory. And on 32 bit Windows your app has
2 GB of virtual address space to use, but Windows has another 2 GB
of virtual address space. And VS will be running in its own
2 GB process space.
What I meant to say is that 320 million integers is not such a bad deal
at all.

--
Rudy Velthuis http://rvelthuis.de

"Heaven is an American salary, a Chinese cook, an English house,
and a Japanese wife. Hell is defined as having a Chinese salary,
an English cook, a Japanese house, and an American wife."
-- James H. Kabbler III.
Oct 19 '08 #8
On Oct 19, 12:06 am, Jon Skeet [C# MVP] <sk...@pobox.com> wrote:
Others have explained why your suspicion is wrong. Here's a hint to get
more than 320M entries though: try a byte[] instead of an int[]. You
still won't get up to int.MaxValue, but you'll get a lot higher.
OK, but how do I count in bytes? I am trying to count to a maximum or
a minimum value which could be as high as 3 billion*.... I guess I
could use Decimal, which allows you to count to a very high number,
no?

RL

*PS--actually for my purposes 300M will do, but I'm curious for future
reference...let's say I was trying to count the number of stars, that
kind of thing.
Oct 19 '08 #9
Thanks to you and Rudy Velthuis I have noted this.

RL
Oct 19 '08 #10
"raylopez99" <ra********@yahoo.com> wrote in message
On Oct 19, 12:06 am, Jon Skeet [C# MVP] <sk...@pobox.com> wrote:
Others have explained why your suspicion is wrong. Here's a hint to get
more than 320M entries though: try a byte[] instead of an int[]. You
still won't get up to int.MaxValue, but you'll get a lot higher.
OK, but how do I count in bytes? I am trying to count to a maximum or
a minimum value which could be as high as 3 billion*.... I guess I
could use Decimal, which allows you to count to a very high number,
no?
Use a long instead of an int; it's a 64-bit integer. Or, if your range
doesn't extend into the negative, you could use uint, which tops out at
4 billion-ish at the expense of starting at 0.
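For reference, the ranges in question (values are the C# built-in limits):

```csharp
using System;

class IntegerRanges
{
    static void Main()
    {
        Console.WriteLine(int.MaxValue);  // 2147483647          (~2.1 billion)
        Console.WriteLine(uint.MaxValue); // 4294967295          (~4.3 billion)
        Console.WriteLine(long.MaxValue); // 9223372036854775807 (~9.2 quintillion)

        // Counting to 3 billion fits in a uint, but only long (or ulong)
        // also leaves room to go negative or far higher.
        long count = 3000000000;
        Console.WriteLine(count < long.MaxValue); // True
    }
}
```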
--
Anthony Jones - MVP ASP/ASP.NET

Oct 19 '08 #11
raylopez99 <ra********@yahoo.com> wrote:
Others have explained why your suspicion is wrong. Here's a hint to get
more than 320M entries though: try a byte[] instead of an int[]. You
still won't get up to int.MaxValue, but you'll get a lot higher.

OK, but how do I count in bytes? I am trying to count to a maximum or
a minimum value which could be as high as 3 billion*.... I guess I
could use Decimal, which allows you to count to a very high number,
no?
There's a big difference between *counting* and storing in an array.
You could count to more than 320M using plain "int" - int.MaxValue is
2,147,483,647. If you want more than that, you could use long or
decimal.

Creating an *array* of those large sizes, however, is a very different
matter. Which are you trying to do?

--
Jon Skeet - <sk***@pobox.com>
Web site: http://www.pobox.com/~skeet
Blog: http://www.msmvps.com/jon.skeet
C# in Depth: http://csharpindepth.com
Oct 19 '08 #12
raylopez99 wrote:
OK, but how do I count in bytes? I am trying to count to a maximum or
a minimum value which could be as high as 3 billion*.... I guess I
could use Decimal, which allows you to count to a very high number,
no?
long would be obvious.

Arne
Oct 19 '08 #13
raylopez99 wrote:
On Oct 19, 12:06 am, Jon Skeet [C# MVP] <sk...@pobox.com> wrote:
>Others have explained why your suspicion is wrong. Here's a hint to get
more than 320M entries though: try a byte[] instead of an int[]. You
still won't get up to int.MaxValue, but you'll get a lot higher.

OK, but how do I count in bytes? I am trying to count to a maximum or
a minimum value which could be as high as 3 billion*.... I guess I
could use Decimal, which allows you to count to a very high number,
no?

RL

*PS--actually for my purposes 300M will do, but I'm curious for future
reference...let's say I was trying to count the number of stars, that
kind of thing.
What do you mean by "count"?

If you want a data type that can hold a very large number, it doesn't
have to be very large. A 1024 bit number for example could probably be
used to count the number of atoms in the universe, and that only uses
128 bytes. Not 128 MiB or 128 kiB, but 128 bytes.
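To make the 128-byte point concrete, here is a hypothetical sketch of a 1024-bit counter incremented in place (a BigInteger type was not yet in the framework at the time, so this is hand-rolled):

```csharp
using System;

class BigCounter
{
    // Increment a little-endian 1024-bit unsigned counter by one.
    public static void Increment(byte[] counter)
    {
        for (int i = 0; i < counter.Length; i++)
        {
            if (++counter[i] != 0) return; // no carry: done
            // the byte wrapped around to 0, so carry into the next byte
        }
    }

    static void Main()
    {
        byte[] counter = new byte[128]; // 1024 bits in just 128 bytes
        for (int i = 0; i < 300; i++)
            Increment(counter);
        Console.WriteLine(counter[0] + 256 * counter[1]); // 300
    }
}
```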

--
_____
http://www.guffa.com
Oct 19 '08 #14
On Oct 19, 2:51 pm, Göran Andersson <gu...@guffa.com> wrote:
What do you mean by "count"?

If you want a data type that can hold a very large number, it doesn't
have to be very large. A 1024 bit number for example could probably be
used to count the number of atoms in the universe, and that only uses
128 bytes. Not 128 MiB or 128 kiB, but 128 bytes.
Aha! You and Jon Skeet hit the nail on the head; the others seem to
have missed this or not understood.
The distinction is counting versus storing. I can count to 4 billion
using uint, and if I don't want to *store* every number counted, then I
can do this, no problem. Just count as you go.

So two questions:

(1) how to count to, say, 10 billion integers (note 10 billion > 4
billion, so uint is out).

(2) how to *store*, 10 billion integers on a PC (Pentium IV) with 2 GB
RAM. Perhaps it is not possible unless you split the integers into
chunks and then serialize them?

RL

Oct 20 '08 #15
raylopez99 <ra********@yahoo.com> wrote:
So two questions:

(1) how to count to, say, 10 billion integers (note 10 billion > 4
billion, so uint is out).
long or ulong.
(2) how to *store*, 10 billion integers on a PC (Pentium IV) with 2 GB
RAM. Perhaps it is not possible unless you split the integers into
chunks and then serialize them?
Well you certainly won't be able to store them all in memory - not
unless you have some extremely efficient compression system, which
won't work for all possible combinations.

I suggest you look at storing them on disk.
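A minimal sketch of the disk approach, writing in modest chunks so no single in-memory buffer grows large (the file name and payload here are made up for illustration):

```csharp
using System;
using System.IO;

class DiskBackedInts
{
    // Write 'count' ints to disk, 'chunkSize' at a time (4 bytes per int).
    public static void WriteSequence(string path, long count, int chunkSize)
    {
        using (var writer = new BinaryWriter(File.Create(path)))
        {
            long written = 0;
            while (written < count)
            {
                int n = (int)Math.Min(chunkSize, count - written);
                for (int i = 0; i < n; i++)
                    writer.Write((int)(written + i)); // demo payload
                written += n;
            }
        }
    }

    static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(), "ints.bin");
        WriteSequence(path, 1000000, 10000);
        Console.WriteLine(new FileInfo(path).Length); // 4,000,000 bytes
        File.Delete(path);
    }
}
```

Ten billion ints would be about 40 GB on disk, so this works where memory cannot.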

--
Jon Skeet - <sk***@pobox.com>
Web site: http://www.pobox.com/~skeet
Blog: http://www.msmvps.com/jon.skeet
C# in Depth: http://csharpindepth.com
Oct 20 '08 #16
Arne Vajhøj wrote:
Rudy Velthuis wrote:
>Rudy Velthuis wrote:
>>raylopez99 wrote:
The maximum int for an array on my machine (a Pentium IV with 2 GB
RAM) is < 330 Million...before you get an "out of memory"
exception. I simply filled an array of this size with ints...I got
as far as 320 M. So, myArray[320000000] is as big as I can get.
I assume you can make your array 320 million integers before you run
out of memory?

In theory a 32 bit int is 2.1 billion,
Not exactly. Assume 2GB is 2 billion BYTES, then it is only 0.5
billion INTEGERS.

And well, even with swapping and virtual memory and all that, the OS
and other programs (including VS, I guess), also need some of your
memory.

That does not count.

The limit is on virtual memory. And on 32 bit Windows your app has
2 GB of virtual address space to use, but Windows has another 2 GB
of virtual address space. And VS will be running in its own
2 GB process space.
No, but the CLR is running in your process, and does use up part of that
2 GB of user address space.

Oct 20 '08 #17
Jon Skeet [C# MVP] wrote:
raylopez99 <ra********@yahoo.com> wrote:
>So two questions:

(1) how to count to, say, 10 billion integers (note 10 billion > 4
billion, so uint is out).

long or ulong.
>(2) how to *store*, 10 billion integers on a PC (Pentium IV) with 2
GB RAM. Perhaps it is not possible unless you split the integers
into chunks and then serialize them?

Well you certainly won't be able to store them all in memory - not
unless you have some extremely efficient compression system, which
won't work for all possible combinations.
Well, there's just one chance... if each integer is limited to two
possibilities, maybe {0, 1}, then I guess you could store 32 * 330M ~ 10
billion of them at once.
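That packing idea can be sketched with plain shifts, or with the framework's BitArray, which does the same bookkeeping:

```csharp
using System;
using System.Collections;

class BitPacking
{
    // Set and test single-bit flags packed 32 to a 32-bit int.
    public static void Set(int[] packed, int index)
    {
        packed[index / 32] |= 1 << (index % 32);
    }

    public static bool Get(int[] packed, int index)
    {
        return (packed[index / 32] & (1 << (index % 32))) != 0;
    }

    static void Main()
    {
        int[] packed = new int[10]; // room for 320 flags in 40 bytes
        Set(packed, 37);
        Console.WriteLine(Get(packed, 37)); // True
        Console.WriteLine(Get(packed, 38)); // False

        // BitArray is the built-in equivalent:
        var bits = new BitArray(320);
        bits[37] = true;
        Console.WriteLine(bits[37]); // True
    }
}
```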
I suggest you look at storing them on disk.

Oct 20 '08 #18
raylopez99 wrote:
On Oct 19, 2:51 pm, Göran Andersson <gu...@guffa.com> wrote:
>What do you mean by "count"?

If you want a data type that can hold a very large number, it doesn't
have to be very large. A 1024 bit number for example could probably be
used to count the number of atoms in the universe, and that only uses
128 bytes. Not 128 MiB or 128 kiB, but 128 bytes.

Aha! You and Jon Skeet hit the nail on the head; the others seem to
have missed this or not understood.
The distinction is counting versus storing. I can count to 4 billion
using uint, and if I don't want to *store* every number counted, then I
can do this, no problem. Just count as you go.

So two questions:

(1) how to count to, say, 10 billion integers (note 10 billion > 4
billion, so uint is out).
As Jon already suggested, a long or an ulong would easily cope with that.
(2) how to *store*, 10 billion integers on a PC (Pentium IV) with 2 GB
RAM. Perhaps it is not possible unless you split the integers into
chunks and then serialize them?
There are only just under 4.3 billion possible values for an int, which
means that you have duplicate values. If there are a lot of duplicate
values, a Dictionary<int, int> could be used to keep track of how many
of each there are.
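A sketch of that dictionary idea, tallying duplicates instead of storing each observation:

```csharp
using System;
using System.Collections.Generic;

class FrequencyCount
{
    // Tally how often each value occurs; memory grows with the number of
    // *distinct* values, not with the number of observations.
    public static Dictionary<int, int> Count(IEnumerable<int> values)
    {
        var counts = new Dictionary<int, int>();
        foreach (int v in values)
        {
            int c;
            counts.TryGetValue(v, out c); // c is 0 if the key is absent
            counts[v] = c + 1;
        }
        return counts;
    }

    static void Main()
    {
        var counts = Count(new[] { 5, 7, 5, 5, 7, 9 });
        Console.WriteLine("{0} {1} {2}", counts[5], counts[7], counts[9]); // 3 2 1
    }
}
```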

Otherwise you could make a class that uses some sophisticated way of
keeping track of the number of integers in a range. If the numbers are
evenly distributed over the possible values, only a few bits are needed
to hold the count for each number. If they are unevenly distributed,
some counters require more space, and you will have ranges of numbers
where all the counters are zero, which hardly needs any space at all to
keep track of.

If you keep the size of the objects below 85 KB, they will not be
allocated on the Large Object Heap, which allows you to use more of the
memory space before you run into a situation where you can't allocate
more memory.
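A hypothetical sketch of that approach: a jagged array of 20,000-int chunks (80,000 bytes each, under the 85,000-byte threshold) behind a flat indexer:

```csharp
using System;

class ChunkedIntArray
{
    const int ChunkElements = 20000; // 20,000 * 4 bytes = 80,000 bytes

    readonly int[][] chunks;

    public ChunkedIntArray(long length)
    {
        long chunkCount = (length + ChunkElements - 1) / ChunkElements;
        chunks = new int[chunkCount][];
        for (long i = 0; i < chunkCount; i++)
            chunks[i] = new int[ChunkElements]; // each chunk stays off the LOH
    }

    public int this[long index]
    {
        get { return chunks[index / ChunkElements][index % ChunkElements]; }
        set { chunks[index / ChunkElements][index % ChunkElements] = value; }
    }

    static void Main()
    {
        var big = new ChunkedIntArray(1000000);
        big[999999] = 42;
        Console.WriteLine(big[999999]); // 42
    }
}
```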
