Bytes IT Community

Is this a bug in .NET garbage collection?

I found some very strange behavior while writing a C# Windows application. If I allocate a huge chunk of memory in an array, the memory is never released by .NET.

The problem can be demonstrated as follows:

1. Just create the simplest Windows Forms project and add a button and handler like this:

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for (int i = 0; i < aaa.Length; i++)
        aaa[i] = 10;
}

2. After executing the above code, I observed the memory in Task Manager. The Commit Charge Total jumped by 300 MB, and Mem Usage showed the same thing. The problem is that whatever you do, the memory usage never drops back. I have tried opening many other applications, and NOTHING makes the memory get released.

3. The variable aaa has gone out of scope, so it makes no sense for .NET to keep holding the memory. Now the performance of the whole system is degraded.

4. The only way I can make the memory go back to Windows is to call GC.Collect(). Is this the only way to release that memory? Am I supposed to do this?
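
To be concrete, the workaround I mean in point 4 amounts to something like the following sketch (the second button and its handler are just illustrative, not part of my actual project):

```csharp
// Illustrative second button whose handler forces a full garbage
// collection - the only thing I have found that returns the memory
// to Windows after the big array goes out of scope.
private void button2_Click(object sender, System.EventArgs e)
{
    // Collects all generations, including the large object heap
    // where the 300,000,000-byte array was allocated.
    GC.Collect();
    GC.WaitForPendingFinalizers();
}
```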

Can anybody confirm whether this is a bug in .NET? Surely it makes no sense to hold the memory FOREVER when nobody is using it.

Thanks,

Yang.
Jul 21 '05 #1
24 Replies


What version of the framework? If memory serves, there is a bug where the 1.0 runtime won't free large objects in certain circumstances.

Jul 21 '05 #2

The garbage collector works well in .NET 1.0 (SP1+) and .NET 1.1, but you cannot always predict when it will reclaim your unused array. I am pretty sure, though, that if you click the button twice the memory will still be 300 MB rather than 600 MB.
If you would like to really understand how the garbage collector works, I recommend downloading the shared source
http://www.microsoft.com/downloads/d...displaylang=en
and stepping through the garbage collection source code (e.g. gcXXX.cpp).
Aleksey Nudelman,
csharpcomputing.com

Jul 21 '05 #3

The 1.1 framework seems to work fine. The GC performs a collection and releases all the memory very soon (within a second) after the array goes out of scope. If you're using 1.0, then it could very well be what Daniel mentioned.

Imran.

Jul 21 '05 #4

Thanks for the reply. I am using .NET 1.1 SP1 (the very latest update). You are right that if I click twice the memory will still be 300 MB. But the problem is that this 300 MB is there FOREVER. I also tried increasing the array size to 900 MB on my machine with 512 MB of RAM. That 900 MB is there FOREVER too! And my machine has become very slow now...


Jul 21 '05 #5

Thanks for the reply, but I don't think that is true in my case. I am using .NET 1.1 with the latest SP1. I also left the application running overnight and opened many other applications. Basically, whatever I do, the memory is never released.



Jul 21 '05 #6

Thanks for the reply. Please see my other replies. I am using .NET 1.1 SP1 (NOT 1.0 SP1). The About box in Visual Studio .NET shows version 1.1.4322 SP1.



Jul 21 '05 #7


Then it sounds like you are holding onto the memory somewhere. Can you show
a small example program that exhibits the problem?


Jul 21 '05 #8

Daniel,

Thanks for the fast response. I showed the sample code in my original post; let me repeat it here:

Just create the simplest Windows Forms project and add a button and handler as follows. This is all you need to reproduce the problem. Do you think the variable aaa is being held somewhere? I don't think so...

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for (int i = 0; i < aaa.Length; i++)
        aaa[i] = 10;
}



Jul 21 '05 #9

Hi Imran

Task Manager is a bad way to determine how much memory your .NET application is using. Task Manager reports the working set, which is not necessarily the same thing.

To verify the GC is working, use GC.GetTotalMemory() or the CLR Profiler, and you should find the memory does in fact get collected.

Thanks
-Chris
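
A minimal sketch of that check (written as a console program here rather than a Forms app, purely for brevity):

```csharp
using System;

class GcCheck
{
    static void Allocate()
    {
        // Same allocation as the button handler in the original post.
        byte[] aaa = new byte[300000000];
        aaa[0] = 10;
    }

    static void Main()
    {
        Allocate(); // the array is unreachable once this returns

        // false: report the managed heap size without forcing a
        // collection; the large array may still be counted.
        Console.WriteLine("before: {0}", GC.GetTotalMemory(false));

        // true: force a full collection before measuring; the array
        // should now be gone from the managed heap, even though Task
        // Manager's working-set number may not have dropped.
        Console.WriteLine("after:  {0}", GC.GetTotalMemory(true));
    }
}
```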


--

This posting is provided "AS IS" with no warranties, and confers no rights. Use of included script samples are subject to the terms specified at
http://www.microsoft.com/info/cpyright.htm

Note: For the benefit of the community-at-large, all responses to this message are best directed to the newsgroup/thread from which they originated.

Jul 21 '05 #10

Chris,

I hope you can write a couple of lines of code and reproduce this problem on
your computer.

I am very sure this is a problem, because I examined it with the .NET Memory
Profiler, GC.GetTotalMemory() and Task Manager.

Here is the .NET Memory Profiler I used: http://www.scitech.se/memprofiler/

Two of my coworkers got the same result on their own computers.

One thing is obvious - you don't even need any tools to verify this: after you
allocate such a big chunk of memory (I tried 900 MB), your whole system
becomes slow and never comes back until you close the app.

AGAIN, I HOPE EVERYBODY WILL REPEAT MY TEST BEFORE REPLYING. Tell me whether the
same problem happens on your machine.


Jul 21 '05 #11

I tried running the test myself.

Calling GC.GetTotalMemory(false) after the method exits shows the array still in memory. This is expected, since a full collection of the large object heap has not yet been performed. Calling GC.GetTotalMemory(true) shows the memory has been collected. This is using v1.1 SP1.

The amount of virtual memory in use after the collection is the large object heap itself. It expands to fit your large objects but doesn't shrink when it's emptied. This is similar to the way Windows manages its own virtual memory.

See http://msdn.microsoft.com/library/de...ml/DBGch02.asp for more information

-Chris


Jul 21 '05 #12

Thanks Chris,

I noticed the same thing: GC.GetTotalMemory(true) and GC.Collect() can
release the memory to Windows.

However, I still think the algorithm here is not optimal. Say my application
occupied 900 MB of memory; other applications running on the same computer cannot
get that memory. The reason is just that my application allocated the memory
ONCE and never uses it any more.

I think it is good for .NET to cache that big block of memory for performance, but it
should be smart enough to release it when my application no longer needs it
while other applications do.

In my sample code, do you think I should call GC.Collect()? I think if I
do that, it will cause other problems which have been explained in many
articles; for example, it may take a long time to collect memory across all
generations...

One more question: since in this case I know the memory is not used,
is it possible to call some function to release just this particular
object?

Thanks again.

""Chris Lyon [MSFT]"" wrote:
I tried running the test myself.

Calling GC.GetTotalMemory(false) after the method exits shows the array still in memory. This is expected, since a full collection of the large object heap has not yet been
performed. Calling GC.GetTotalMemory(true) shows the memory has been collected. This is using v1.1 SP1.

The amount of vitural memory being used after the collection is the large object heap itself. It expands to fit your large objects, but doesn't shrink when it's emptied. This is
similar to the way Windows manages its virtual memory.

See http://msdn.microsoft.com/library/de...ml/DBGch02.asp for more information

-Chris

--------------------

Chris,

I hope you can write a couple lines of code and reproduce this problem on
your computer.

I am very sure this is a problem, because I examed it by .NET Memory
Profiler, GC.GetTotalMemory() and Task Manager.

Here is the .Net Memory Profiler I used : http://www.scitech.se/memprofiler/

Also 2 of my coworkers got the same result on their computers respectively.

One thing is obvious - you even don't need any tools to verify - after you
allocate such a big thunk of memory (I tried 900MB), your whole system
becomes slow and never come back until you close the app.

AGAIN, I HOPE EVERYBODY REPEAT MY TEST BEFORE REPLYING. Tell me if the same
problem happens on your machine or not.

""Chris Lyon [MSFT]"" wrote:
Hi Imran

Task manager is a bad way to determine how much memory your .NET application is using. Task Manager reports the working set, which is not necessary the same thing.

To verify the GC is working, use GC.GetTotalMemory() or the CLR Profiler, and you should find the memory does in fact get collected.

Thanks
-Chris

--------------------

>
>Thanks for the reply. But I think it is not true in my case. I am using .NET
>1.1 with the latest SP1. I also left the application overnight and opened
>many other applications. Basically whatever I do, the memory has never been
>released.
>
>"Imran Koradia" wrote:
>
>> 1.1 framework seems to work fine. The GC performs a collect and releases all
>> the memory very soon (a second) after the array runs out of scope. If you're
>> using 1.0 then it could very well be what Daniel mentioned.
>>
>> Imran.
>>
>>
>> "Yang" <Ya**@discussions.microsoft.com> wrote in message
>> news:4E**********************************@microsof t.com...
>> > [quoted original message snipped]
>> 
>
--

This posting is provided "AS IS" with no warranties, and confers no rights. Use of included script samples are subject to the terms specified at
http://www.microsoft.com/info/cpyright.htm

Note: For the benefit of the community-at-large, all responses to this message are best directed to the newsgroup/thread from which they originated.

Jul 21 '05 #13

P: n/a
Hi Yang

I think the issue is that once your large object has been collected, the CLR doesn't know if you'll ever need that much memory again. It's cheaper to keep it once it's been
allocated instead of reallocating a huge chunk every time, and possibly failing that allocation. The GC bases its heuristics on your past allocation pattern, hoping it will be the
same in the future.

You are right, you don't want to call GC.Collect(). The large object will get collected the next time a full (expensive) collection occurs, so you're better off waiting, unless you've
profiled and found performance is better with your Collect.

No, there's no way to force the GC to collect individual objects.

-Chris
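For anyone wanting to reproduce this, here is a minimal console sketch (illustrative only; exact numbers vary by machine and CLR version) showing that the array is reclaimable once out of scope, but is only actually reclaimed by a full collection:

```csharp
using System;

class LohDemo
{
    static void Allocate()
    {
        // ~300 MB goes on the large object heap
        // (objects of roughly 85,000 bytes or more).
        byte[] aaa = new byte[300000000];
        aaa[0] = 10; // touch it so the allocation is really used
    }

    static void Main()
    {
        Allocate();
        // May still report the array: no full collection has run yet.
        Console.WriteLine("Before: {0:N0} bytes", GC.GetTotalMemory(false));
        // Passing true forces a collection first; the array is gone.
        Console.WriteLine("After:  {0:N0} bytes", GC.GetTotalMemory(true));
    }
}
```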

--------------------

Thanks Chris,

I noticed the same thing that GC.GetTotalMemory(true) and GC.Collect() can
release the memory to Windows.

However, I still think the algorithm here is not optimal. Say my application
occupies 900MB of memory; other applications running on the same computer
cannot get that memory. The only reason is that my application allocated
that memory ONCE and never uses it any more.

I think it is good for .NET to cache that big memory for performance, but it
should be smart enough to release it if the situation is my application does
not need it while other applications need it.

In my sample code, do you think I should call GC.Collect()? I think if I
do that, it will cause other problems which have been explained in many
articles; for example, it may take a long time to collect memory across all
generations...

One more question: since in this case I know that memory is not used,
is it possible to call some function to release just this particular
object?

Thanks again.

""Chris Lyon [MSFT]"" wrote:
I tried running the test myself.

Calling GC.GetTotalMemory(false) after the method exits shows the array still in memory. This is expected, since a full collection of the large object heap has not yet been
performed. Calling GC.GetTotalMemory(true) shows the memory has been collected. This is using v1.1 SP1.

The amount of virtual memory in use after the collection is the large object heap itself. It expands to fit your large objects, but doesn't shrink when it's emptied. This is
similar to the way Windows manages its virtual memory.

See http://msdn.microsoft.com/library/de...ml/DBGch02.asp for more information

-Chris

--------------------
>
>Chris,
>
>I hope you can write a couple lines of code and reproduce this problem on
>your computer.
>
>I am very sure this is a problem, because I examined it with .NET Memory
>Profiler, GC.GetTotalMemory() and Task Manager.
>
>Here is the .Net Memory Profiler I used : http://www.scitech.se/memprofiler/
>
>Also 2 of my coworkers got the same result on their computers respectively.
>
>One thing is obvious - you don't even need any tools to verify it - after you
>allocate such a big chunk of memory (I tried 900MB), your whole system
>becomes slow and never recovers until you close the app.
>
>AGAIN, I HOPE EVERYBODY REPEAT MY TEST BEFORE REPLYING. Tell me if the same
>problem happens on your machine or not.
>
>""Chris Lyon [MSFT]"" wrote:
>
>> Hi Imran
>>
>> Task manager is a bad way to determine how much memory your .NET application is using. Task Manager reports the working set, which is not necessarily the same thing.
>>
>> To verify the GC is working, use GC.GetTotalMemory() or the CLR Profiler, and you should find the memory does in fact get collected.
>>
>> Thanks
>> -Chris
>>
>> --------------------
>>
>> > [earlier messages snipped]
>


Jul 21 '05 #14

P: n/a
Chris,

Thanks for the fast response. Please see my comments below:

""Chris Lyon [MSFT]"" wrote:
Hi Yang

I think the issue is that once your large object has been collected, the CLR doesn't know if you'll ever need that much memory again. It's cheaper to keep it once it's been
allocated instead of reallocating a huge chunk every time, and possibly failing that allocation.
I understand this part. But I am talking about an extreme
situation where that saved memory degrades the entire system's performance.
My point is that in this extreme situation, .NET should release that memory to
Windows and allow other applications to use it.
You are right, you don't want to call GC.Collect(). The large object will get collected the next time a full (expensive) collection occurs, so you're better off waiting, unless you've
profiled and found performance is better with your Collect.
This confuses me. From my observation, if I don't call GC.Collect() to force
.NET to release the memory, then every time I allocate a bigger block of memory .NET
keeps that memory in its heap. In other words, the total memory occupied
by the application will be the maximum amount of memory it has ever allocated
in its history.

In my example, I allocated 300M of memory; can you tell me under what conditions
that 300M will be reduced? From my observation, if I allocate 10M
thereafter, the 300M is still there. If I allocate 400M, my application
then holds 400M! Basically, the memory never goes down.

[earlier messages snipped]

Jul 21 '05 #15

P: n/a
Hi Yang
I understand this part. But I think I am talking about some extreme
situation when that saved memory downgrades the entire system's performance.
My point is in this extreme situation, .NET should release that memory to
Windows and allow other application to use it.
Have you found this to be true with a non-trivial .NET application? The GC tunes itself to your application. If all your app does is allocate a huge chunk of memory, then exits, it
won't try to reduce its size in memory.
This confused me. From my observation, if I don't call GC.Collect() to force
.NET release the memory, everytime I allocate a bigger block of memory .NET
will keep this memory in its heap. In other word, the total memory occupied
by the application will be the maximum size of memory you have allocated in
the history.
I think you're confusing objects in memory and the size of the large object heap. GC.GetTotalMemory and profilers will tell you that the object has been collected, but the size of
the reserved large object heap remains the same.
In my example, I allocated 300M memory, can you tell me in what condition
that 300M will be reduced. From my observation, if I allocation 10M
thereafter, the 300M is still there. If I allocate 400M, then my application
holds this 400M! Basically, the memory never goes down.
According to the article I pointed you to, the large object heap doesn't shrink. Again, if this is a problem in a real-world application, and not a trivial test, let me know. If that's the
case, then you probably need to redesign your application with memory usage in mind. I can point you to several resources if you need.

Thanks
-Chris
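The distinction drawn above - the object is collected, but the reserved large object heap keeps its size - can be made visible by comparing the managed heap total against the process's private bytes. This is a sketch only: `PrivateMemorySize64` is the later 2.0-era name for this property; on 1.x the int-valued `PrivateMemorySize` would be used instead.

```csharp
using System;
using System.Diagnostics;

class HeapVsProcess
{
    static void Main()
    {
        byte[] big = new byte[300000000];
        big[0] = 10;
        big = null; // drop the only reference

        // Forces a full collection and returns the managed heap size:
        // small again, because the array has been collected.
        long managed = GC.GetTotalMemory(true);

        // The process's private bytes can remain high, because the
        // large object heap segment stays reserved by the CLR.
        long priv = Process.GetCurrentProcess().PrivateMemorySize64;

        Console.WriteLine("Managed heap:  {0:N0} bytes", managed);
        Console.WriteLine("Private bytes: {0:N0} bytes", priv);
    }
}
```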

[earlier messages snipped]

Jul 21 '05 #16

P: n/a
Hi Chris,

I totally understand what you are talking about, and I don't think I am confusing
anything. Actually, the reason I found this problem is the REAL
project I am currently working on. In this project we need to run a
simulation that involves very intensive math and a huge array.
That is how we found this problem.

You might be right that we need to redesign the project because .NET does
not handle huge arrays well. If I interpret your point about redesigning
correctly, you are telling us NOT to use huge arrays.
But in our calculation it is almost impossible not to use a huge array. Even if
someday we could change our simulation algorithm to avoid the huge array,
it would make the algorithm very complex and degrade performance. BUT -
since my machine has 1GB of memory, why can't I use it? Why do I
have to close my application to release its memory to other applications?

I have described the algorithm for handling this situation ideally in
my previous posts. You simply do not want to admit I am right, and you
do not want to say .NET needs improvement - this is the part that really
confuses me...

Yang.

Jul 21 '05 #17

P: n/a
Yang Lu <Ya****@discussions.microsoft.com> wrote:
[quoted message snipped]


But do you only need this huge array once, and your application will
keep running for a long time after it's no longer used, never
allocating large objects? That's the situation in which it becomes a
problem, as I understand it.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Jul 21 '05 #18

P: n/a
> But do you only need this huge array once, and your application will
keep running for a long time after it's no longer used, never
allocating large objects? That's the situation in which it becomes a
problem, as I understand it.
Our application controls an irrigation system. One of its features is
simulating how much water will be used. If users simulate one day's
water usage, it consumes little memory, but if they simulate water usage
for a couple of months, it needs a huge amount of memory. Users can also
run the simulation at any time: maybe once a day, or maybe once a week...

What our QA team observed was that the memory used by this application
can increase after a simulation, but never decreases. After researching for
a few days, I found our situation can be generalized by the sample code I
posted here.



Jul 21 '05 #19

P: n/a
Hi Yang
You might be right that we need to redesign the project becasue .NET does
not have the ability to handle huge arrays correctly. If I interpret your
saying about redesign, I would say you are telling us NOT to use huge arrays.
I don't think I ever said .NET doesn't handle huge arrays correctly :)
What I mean is, the way you should design a managed application, with respect to memory allocations, is very different from unmanaged. In C++ you can allocate a huge array,
and delete it as soon as you're done with it, and the memory is freed. Obviously with a GC you can't do that, and large objects are handled differently, since they are so expensive to
create, deallocate, re-create, etc.

Instead of a huge contiguous chunk of memory, is it possible for you to use an ArrayList, or some other dynamic structure? That way it won't all be allocated into the LOH, and be
collected earlier, possibly in pieces. It's hard to give concrete advice without seeing your application, but in general you want to avoid huge memory allocations in .NET because
it limits what the GC can do with it (mainly for perf reasons). See Rico Mariani's blog for more on GC and Perf: http://weblogs.asp.net/ricom/
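As a rough sketch of that suggestion (illustrative only, not drop-in code): the same logical capacity can be built from many small arrays, each below the ~85,000-byte large-object threshold, so the pieces live in the normal generational heap and can be collected and compacted individually:

```csharp
using System;

// A "chunked" byte array: same logical capacity as one huge byte[],
// but made of many small arrays that stay off the large object heap.
class ChunkedBytes
{
    const int ChunkSize = 64 * 1024; // 64 KB, safely below the LOH threshold
    readonly byte[][] chunks;

    public ChunkedBytes(long length)
    {
        int count = (int)((length + ChunkSize - 1) / ChunkSize);
        chunks = new byte[count][];
        for (int i = 0; i < count; i++)
            chunks[i] = new byte[ChunkSize];
    }

    public byte this[long index]
    {
        get { return chunks[(int)(index / ChunkSize)][(int)(index % ChunkSize)]; }
        set { chunks[(int)(index / ChunkSize)][(int)(index % ChunkSize)] = value; }
    }
}
```

Dropping the ChunkedBytes reference then lets each 64 KB piece be reclaimed in ordinary collections, instead of leaving one 300 MB region sitting in the LOH.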
I have describle the algorithm about how to handle this situation ideally in
my previous posts. You just simply do not want to admit I am right and you
just do not want to say .NET needs improvement - this is the part really
confused me...


Of course .NET needs improvement, that's why I have a job ;)
But I think your algorithm doesn't take into account the other issues involved in a general garbage collector's implementation. There's an excellent book on garbage collection
technology by Jones and Lins that I would recommend, if you're interested in learning more.

-Chris


Jul 21 '05 #20

P: n/a
Thanks, Chris. I think it might make sense to try ArrayList... I will try it.
This post is helpful.

Yang.

""Chris Lyon [MSFT]"" wrote:
[quoted message snipped]

Jul 21 '05 #21

JT
I have found that running your own forced garbage collection thread in the
background can help, even though it takes a little CPU. You can play
around with the sleep time.

static void Main()
{
    Thread GCThread = new System.Threading.Thread(new ThreadStart(RunGC));
    GCThread.Start();
    Application.Run(new Form1());
    GCThread.Abort();
}

static private void RunGC()
{
    while (true)
    {
        GC.Collect();
        Thread.Sleep(new TimeSpan(0, 0, 0, 10)); // sleep 10 seconds between forced collections
    }
}

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for (int i = 0; i < aaa.Length; i++)
        aaa[i] = 10;
}
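For contrast, a one-shot alternative to the polling thread above: if an explicit collection is used at all, tying a single GC.Collect() to the point where the known large object is released is less disruptive than collecting every 10 seconds. A hedged sketch, reusing the same hypothetical button handler:

```csharp
private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for (int i = 0; i < aaa.Length; i++)
        aaa[i] = 10;

    // Drop the only reference, then collect once at a known point
    // instead of polling from a background thread. Even this is
    // rarely advisable; normally the GC should be left alone.
    aaa = null;
    GC.Collect();
    GC.WaitForPendingFinalizers();
}
```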

"Yang" wrote:
I found a very strange behavior while writing a C# Windows application. If I
allocate a huge chunk of memory as an array, the memory is never
released by .NET.

The problem can be demonstrated as follows:

1. Create the simplest Windows Forms project and add a button with a
handler like this:

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for (int i = 0; i < aaa.Length; i++)
        aaa[i] = 10;
}

2. After executing the above code, I observed the memory in Task Manager.
The Commit Charge Total jumped to 300 MB and MEM Usage shows the same
thing. The problem is that whatever you do, the memory usage never drops
back. I have tried opening many other applications and NOTHING makes the
memory get released.

3. The variable aaa has gone out of scope, so it does not make any sense
that .NET still holds the memory. Now the performance of the whole system
is degraded.

4. The only way I can get the memory released back to Windows is to call
GC.Collect(). Is this the only way to release that memory? Am I supposed
to do this?

Can anybody confirm whether this is a bug in .NET? Apparently it does not
make sense to hold the memory FOREVER even if nobody uses it.

Thanks,

Yang.

Jul 21 '05 #22

JT <JT@discussions.microsoft.com> wrote:
I have found that running your own forced garbage collection thread in the
background can help, even though it takes a little CPU. You can play
around with the sleep time.


The garbage collector doesn't really run in a background thread - it
needs to stop *all* managed threads while it runs.

I would personally advise against forcing garbage collections in almost
all situations.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Jul 21 '05 #23

JT,

I can't stress enough that what you suggested is a BAD IDEA. Not only are you hurting performance, but you're also interfering with the GC's own collection scheme. By
continuously calling GC.Collect(), you are forcing promotion of objects from generation 0 to generation 2 that would otherwise be collected.

The GC tunes itself to your memory allocation patterns and determines the best times to Collect to minimize performance hits, and maximize memory usage. There are very
few times where GC.Collect() should be called in production code.

For more information about GC collection perf hits, see Rico Mariani's blog entry:
http://weblogs.asp.net/ricom/archive.../04/41281.aspx
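The point about forced promotion can be observed with GC.GetGeneration. A small hypothetical console sketch (not from the thread); on a typical workstation GC it prints generation 0, then 1, then 2 as the forced collections promote the still-referenced object:

```csharp
using System;

class PromotionDemo
{
    static void Main()
    {
        // A freshly allocated small object starts in generation 0.
        byte[] data = new byte[1000];
        Console.WriteLine(GC.GetGeneration(data));

        // Each forced collection the object survives promotes it one
        // generation, until it reaches generation 2.
        GC.Collect();
        Console.WriteLine(GC.GetGeneration(data));

        GC.Collect();
        Console.WriteLine(GC.GetGeneration(data));

        // Once in generation 2, the object is reclaimed only by full
        // collections - the expensive promotion described above.
        GC.KeepAlive(data);
    }
}
```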

-Chris
--------------------

I have found that running your own forced garbage collection thread in the
background can help, even though it takes a little CPU. You can play
around with the sleep time.

static void Main()
{
    Thread GCThread = new System.Threading.Thread(new ThreadStart(RunGC));
    GCThread.Start();
    Application.Run(new Form1());
    GCThread.Abort();
}

static private void RunGC()
{
    while (true)
    {
        GC.Collect();
        Thread.Sleep(new TimeSpan(0, 0, 0, 10)); // sleep 10 seconds between forced collections
    }
}

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for (int i = 0; i < aaa.Length; i++)
        aaa[i] = 10;
}

"Yang" wrote:
I found a very strange behavior while writing a C# Windows application. If I
allocate a huge chunk of memory as an array, the memory is never
released by .NET.

The problem can be demonstrated as follows:

1. Create the simplest Windows Forms project and add a button with a
handler like this:

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for (int i = 0; i < aaa.Length; i++)
        aaa[i] = 10;
}

2. After executing the above code, I observed the memory in Task Manager.
The Commit Charge Total jumped to 300 MB and MEM Usage shows the same
thing. The problem is that whatever you do, the memory usage never drops
back. I have tried opening many other applications and NOTHING makes the
memory get released.

3. The variable aaa has gone out of scope, so it does not make any sense
that .NET still holds the memory. Now the performance of the whole system
is degraded.

4. The only way I can get the memory released back to Windows is to call
GC.Collect(). Is this the only way to release that memory? Am I supposed
to do this?

Can anybody confirm whether this is a bug in .NET? Apparently it does not
make sense to hold the memory FOREVER even if nobody uses it.

Thanks,

Yang.


Jul 21 '05 #24

JT
Yes, I know, "maximize memory usage" is the problem... Joking.

""Chris Lyon [MSFT]"" wrote:
JT,

I can't stress enough that what you suggested is a BAD IDEA. Not only are you hurting performance, but you're also interfering with the GC's own collection scheme. By
continuously calling GC.Collect(), you are forcing promotion of objects from generation 0 to generation 2 that would otherwise be collected.

The GC tunes itself to your memory allocation patterns and determines the best times to Collect to minimize performance hits, and maximize memory usage. There are very
few times where GC.Collect() should be called in production code.

For more information about GC collection perf hits, see Rico Mariani's blog entry:
http://weblogs.asp.net/ricom/archive.../04/41281.aspx

-Chris
--------------------

I have found that running your own forced garbage collection thread in the
background can help, even though it takes a little CPU. You can play
around with the sleep time.

static void Main()
{
    Thread GCThread = new System.Threading.Thread(new ThreadStart(RunGC));
    GCThread.Start();
    Application.Run(new Form1());
    GCThread.Abort();
}

static private void RunGC()
{
    while (true)
    {
        GC.Collect();
        Thread.Sleep(new TimeSpan(0, 0, 0, 10)); // sleep 10 seconds between forced collections
    }
}

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for (int i = 0; i < aaa.Length; i++)
        aaa[i] = 10;
}

"Yang" wrote:
I found a very strange behavior while writing a C# Windows application. If I
allocate a huge chunk of memory as an array, the memory is never
released by .NET.

The problem can be demonstrated as follows:

1. Create the simplest Windows Forms project and add a button with a
handler like this:

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for (int i = 0; i < aaa.Length; i++)
        aaa[i] = 10;
}

2. After executing the above code, I observed the memory in Task Manager.
The Commit Charge Total jumped to 300 MB and MEM Usage shows the same
thing. The problem is that whatever you do, the memory usage never drops
back. I have tried opening many other applications and NOTHING makes the
memory get released.

3. The variable aaa has gone out of scope, so it does not make any sense
that .NET still holds the memory. Now the performance of the whole system
is degraded.

4. The only way I can get the memory released back to Windows is to call
GC.Collect(). Is this the only way to release that memory? Am I supposed
to do this?

Can anybody confirm whether this is a bug in .NET? Apparently it does not
make sense to hold the memory FOREVER even if nobody uses it.

Thanks,

Yang.


Jul 21 '05 #25
