Bytes IT Community

What good is this automatic garbage collector?

joe
I have a simple .NET application with two or three ListViews which are
filled with icons; when the user clicks on an item, the related image is
displayed. I set "image = null;" for all images that have been used and
are about to be closed. This, however, in no way reduces the memory
consumption. I have noticed, using the task manager, that the garbage
collector doesn't actually do any collections unless the computer becomes
low on memory. This is very foolish. What good is a garbage collector
which doesn't collect disposed objects when they aren't needed anymore?

Besides, calling GC.Collect() is usually avoided for performance reasons.
What else can I do?

PS: it won't hurt you to read this:
http://www.cs.tut.fi/~warp/MicrosoftComparingLanguages/
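
Setting the reference to null only makes the object eligible for collection; the unmanaged GDI+ bitmap behind an Image is released promptly only by calling Dispose. A minimal sketch of that idea (the PictureBox and method names here are illustrative, not from the original code):

```csharp
using System.Drawing;
using System.Windows.Forms;

class ImageSwapper
{
    // Sketch: detach and dispose the old image before showing a new one,
    // so the GDI+ bitmap memory is freed immediately rather than waiting
    // for a garbage collection. "pictureBox" is an illustrative control.
    public static void ShowImage(PictureBox pictureBox, string path)
    {
        Image old = pictureBox.Image;
        pictureBox.Image = Image.FromFile(path);
        if (old != null)
        {
            old.Dispose();   // releases the unmanaged bitmap now
        }
    }
}
```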

Jul 21 '05 #1
28 Replies


>I have noticed , using the task manager, that garbage
collector doesn't actually do any collections unless the computer becomes
low on memory. This is very foolish, and what good is a garbage collector
which doesn't collect the disposed objects when they aren't needed
anymore?

Although the task manager is not the right instrument to check this with.

I am curious: what is foolish about this? Every garbage collection uses
processor time.

To use a metaphor: do you empty your trash bin every time you throw in a
piece of paper?

Cor
Jul 21 '05 #2

Have you tried to call Dispose ?

Though the point of a garbage collector is that there is no need to reclaim
memory if you have no use for it, it's worth keeping in mind that the GC
has its roots in the managed world, and that the Dispose or Close methods
still allow you to reclaim unmanaged memory immediately...

Please let us know...

Patrice

--

"joe" <jo*****@rot.ofm.net> a écrit dans le message de
news:uE****************@TK2MSFTNGP14.phx.gbl...

Jul 21 '05 #3

joe
> To use a metaphor: do you empty your trash bin every time you throw in a
> piece of paper?


Can you recycle disposed objects in a programming language? If you can, then
why do you put them in the recycle bin?
Just curious.

Jul 21 '05 #4

joe

"Patrice" <no****@nowhere.com> wrote in message
news:eZ*************@TK2MSFTNGP15.phx.gbl...
Have you tried to call Dispose ?
Dispose is not supported on all objects. In my case, calling GC.Collect()
does help a little, though.
> Though the point of a garbage collector is that there is no need to
> reclaim memory if you have no use for it, it's worth keeping in mind that
> the GC has its roots in the managed world, and that the Dispose or Close
> methods still allow you to reclaim unmanaged memory immediately...


Yes, in a perfect managed world where no one wants to struggle with memory
allocation and release, this could be a good suggestion. But in the same
managed world there are times when an object is used only once and then
thrown away forever, or used again only after a long time, or maybe in the
next program launch. Why should its memory still be occupied by the program?
I mean, there should be a way to have more control over this kind of memory
management (something like a half-automatic GC, until the day the GC becomes
really smart and intelligent).

I can also use SetProcessWorkingSetSize(-1, -1); however, it is not usually
recommended.
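
For reference, calling SetProcessWorkingSetSize from .NET requires a P/Invoke declaration. A hedged sketch (note this only pages the working set out; the virtual memory stays committed, which is one reason it is not usually recommended):

```csharp
using System;
using System.Runtime.InteropServices;

class WorkingSetTrimmer
{
    [DllImport("kernel32.dll")]
    static extern IntPtr GetCurrentProcess();

    // Passing -1 for both sizes asks Windows to trim the process's
    // working set; the pages are swapped out, not freed.
    [DllImport("kernel32.dll")]
    static extern bool SetProcessWorkingSetSize(
        IntPtr hProcess, int dwMinimumWorkingSetSize, int dwMaximumWorkingSetSize);

    public static void Trim()
    {
        SetProcessWorkingSetSize(GetCurrentProcess(), -1, -1);
    }
}
```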

Jul 21 '05 #5

> I have noticed , using the task manager, that garbage
collector doesn't actually do any collections unless the computer becomes
low on memory. This is very foolish, and what good is a garbage collector
which doesn't collect the disposed objects when they aren't needed anymore?

Why is this foolish? Why would the GC need to use additional processing
power for dynamic cleanup?
What would the unused memory be needed for until the system gets low on
memory?

I like the way it is implemented. It only cleans up when new memory is
needed and when there is no memory left.
This way the GC is faster, because it can do the job in one go, and your
code executes faster because it does not lose time cleaning up every time
you release memory; and once the GC does start cleaning up, it can do the
whole job faster.

The only annoying thing that could cause problems is in games: a full
collection produces a hiccup, because it takes a little longer than
incremental cleanup would. But overall, a GC that runs only when memory is
low will actually free up more processing power. And I believe it also
speeds up your program, since the compiled code for a function is smaller
(no cleanup code in it), and thus the chance that the function resides
completely in your processor cache could speed things up dramatically.

One thing that might be implemented is to run the GC when the processor is
idle. But then again, you could regard that as using processor cycles
unnecessarily, which drains a laptop battery faster because it needs more
power.
Jul 21 '05 #6



joe wrote:

[snip]

> PS: it won't hurt you to read this:
> http://www.cs.tut.fi/~warp/MicrosoftComparingLanguages/


I disagree. While some aspects of the article may be at least
partially correct, it does more harm than good. For example, the
author completely missed the point of the original claim about reduced
memory leaks. Similarly, I thought the other points were
misinterpreted as well, albeit to a lesser degree.

Brian

Jul 21 '05 #7

If "Dispose" is not supported you should have then only the managed object.
In most cases it shouldn't a problem (ie. the bitmap portion of an image
object is likely much more big than the members of the managed class).

Another option I can think of, for voluminous data structures, is to change
their current size (for example, resizing an array to a smaller size).
That could perhaps help (not sure; give it a try and let us know, please).
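
For instance, .NET 2.0 adds Array.Resize, which copies into a smaller array so the old, larger one becomes collectible. A sketch (the buffer and sizes are illustrative):

```csharp
using System;

class BufferShrink
{
    static void Main()
    {
        byte[] buffer = new byte[10 * 1024 * 1024];  // illustrative large buffer
        // ... use the buffer, then keep only what is still needed ...
        Array.Resize(ref buffer, 1024);  // the old 10 MB array is now garbage
        Console.WriteLine(buffer.Length);  // 1024
    }
}
```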

On the other hand, what is the problem with having something kept around
that doesn't do any harm? If you really wanted to clean up objects
immediately, we would have to go back to reference counting...

For now I would first check whether this causes an actual problem, and
wouldn't bother if it doesn't...

Patrice

--

"joe" <jo*****@rot.ofm.net> a écrit dans le message de
news:ub**************@tk2msftngp13.phx.gbl...

"Patrice" <no****@nowhere.com> wrote in message
news:eZ*************@TK2MSFTNGP15.phx.gbl...
Have you tried to call Dispose ?
Dispose is not supported on all objects. In my case, calling GC.collect()
does a little help though.
Though the point of a garbage collector is that there is no need to
reclaim
memory if you have no use for it, it's worth to keep in mind that it have its root in the managed world and that the Dispose or Close methods should still allows to reclaim unmanaged memory immediately...


Yes, in a perfect managed world where no one wants to struggle with memory
allocation and release, this could be good suggestion. But in the same
managed world there are times when an object is used only "once", and then
thrown away forever or used after a long time or maybe in next program
launch. Why should its memory be still occupied by the program?
I mean, there should a way to have more control over this kind of memory
management. ( sth like a half-automatic GC, until the day GC becomes

really smart and intelligence)

i can also use SetProcessWorkingSetSize(-1,-1) , however it is not usually
recommended.

Jul 21 '05 #8

> PS: it won't hurt you to read this:
> http://www.cs.tut.fi/~warp/MicrosoftComparingLanguages/

This commentary is a case of one inexperienced programmer criticizing a
feature that they do not understand, with poor understanding of the forces
that underlie the fundamental reasoning. Memory management is easy in
C++... if every developer were free of mistakes and if code were not
complex. This is not the case in the real world. The author of that
article completely failed to recognize the reality of memory leaks in a
production system of substantial size.

One reason for the success of BOTH Java and .Net languages like C# is that
this problem is solved for you. You may not agree with the way in which it
is solved, but it is solved for you. That is a huge step up and a major
boon for software development.

The problem isn't the computers or their languages... it is the limitations
of the humans who use them.
--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.
--
"joe" <jo*****@rot.ofm.net> wrote in message
news:uE****************@TK2MSFTNGP14.phx.gbl...I have a simple .NET application with two or three listViews which are
filled with icons and when the user click on the proper item, they display
the related images. I use "image = null ; " for all images that have been
used and are going to be closed. This is how ever no way to reduce the
memory consumption. I have noticed , using the task manager, that garbage
collector doesn't actually do any collections unless the computer becomes
low on memory. This is very foolish, and what good is a garbage collector
which doesn't collect the disposed objects when they aren't needed
anymore?

Besides, calling CG.Collect() is usually avoided for performance and
speed.
What else can i do?

PS: i wont hurt you to read this:
http://www.cs.tut.fi/~warp/MicrosoftComparingLanguages/

Jul 21 '05 #9

joe
Nick,
Your reasoning is respected and welcome. I don't hate or love things
blindly; I simply point out the weak points.

It is true that the GC is a great boon in C# and Java, and helps the
programmer concentrate on more critical aspects of his work; however, when
it comes to page faults, it shows its weak points.

You might say that because allocation or garbage collection is done only
when the computer becomes low on memory, you gain much higher performance in
your application. This can be true for small programs; however, when some
really big programs are executed on computers with little memory (something
like 256 MB), the amount of OS page swapping and virtual memory reading and
writing really slows down the entire system. Just think: why is your OS much
faster after a clean restart than after long hours of running applications
and playing with the virtual memory?

Well, at the least a stricter algorithm could be implemented, one that would
force the GC to come into play much sooner than it does now on low-memory
PCs.
"Nick Malik [Microsoft]" <ni*******@hotmail.nospam.com> wrote in message
news:pI********************@comcast.com...

Jul 21 '05 #10

The GC runs more often than you think. Just watch the CLR performance
counters while your program is running. But the GC isn't meant to collect
unmanaged resources like file handles, DB connections and unmanaged memory,
and it will also not collect non-garbage, that is, objects that still have
live references. Unmanaged resources can be explicitly and deterministically
released by calling Dispose or Close or whatever method is provided to do
so. The FCL uses the Dispose pattern to dispose of unmanaged resources, and
the languages (C# since v1 and VB in v2.0) help you automate this pattern by
means of the using statement. If you fail to apply this design pattern, or
if you hold references to objects alive when you should not, there is
nothing the GC can do to reclaim the memory; but this is no different from
the unmanaged world.
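
The using statement mentioned above compiles down to a try/finally that calls Dispose, so unmanaged resources are released deterministically even if an exception is thrown. A small sketch ("photo.png" is an illustrative file name):

```csharp
using System.Drawing;

class UsingSketch
{
    static void Load()
    {
        // Dispose runs when the block exits, normally or via exception;
        // only the small managed wrapper is left for the GC.
        using (Image img = Image.FromFile("photo.png"))
        {
            // ... work with img ...
        }
    }
}
```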
Note, there will be no swapping caused by non-referenced objects that
haven't been collected yet. The OS will trigger an event that forces the GC
to collect when there is memory pressure. Note also that the garbage
collector runs more often than you think.
Willy.

"joe" <jo*****@rot.ofm.net> wrote in message
news:OM**************@TK2MSFTNGP10.phx.gbl...

Jul 21 '05 #11

Hello Joe,

I appreciate that you are taking a reasoned approach.

I also agree that a case could be made for fine-tuning the GC. That said,
Willy's comment is correct... the GC really does run more often than you
think. Also page swapping, as caused by .Net apps, is probably less than
you think. Note that a page is only swapped "in" when memory on that page
is referenced. If the page is completely empty of active objects, then it
will not swap in until the GC frees it. (This actually slows the system
down if you do this too often).

You refer to how much faster the system runs after a clean restart. This is
completely true. However, it is also often the case that the culprit for
this kind of memory fragmentation is unmanaged objects that don't drop
their references and therefore create gaps in the memory space that are
difficult for the heap allocator to reuse.

As for low-memory systems, I would assert that you are using the OS at or
near the smallest amount of memory that it supports. Normal OS operations
will fragment the memory, all by themselves, without any help from your app.
In this case, .Net is not able to really improve the situation.

Note that the OS teams have made real strides over the years in improving
how memory is used. However, as memory has become so much less expensive,
it is not unreasonable to expect that users will occasionally upgrade their
memory when they install a new OS to get new features like better security,
better handling of multimedia, more efficient file systems and (strictly for
the non-server environments) more advanced games.

I hope this helps. I understand your frustration. (One of my systems at
home has 256 MB of RAM, too... I'm just too lazy to run down to Fry's and
get a memory module for it.)

--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.
--
"joe" <jo*****@rot.ofm.net> wrote in message
news:OM**************@TK2MSFTNGP10.phx.gbl...


Jul 21 '05 #12

I really love the GC. :-)
Not because I have a lot of memory leaks in my program, but because my
programming gets far simpler: I do not have to keep track of which list
created an object, or in what order I need to release it to prevent an
access violation if it is referenced by another list.

This way I am guaranteed that the object is only released when both lists
are no longer referencing it.
In the non-GC way I have to track that myself, creating additional code
that might even be slower, because the GC's bookkeeping is far more
optimized than the code I would write. The closest equivalent to the GC is
what COM objects do: when the reference counter reaches zero, the object
disposes of itself. But with the GC you do not have to call AddRef and
Release yourself.

The only negative side of the GC is that it sometimes starts to collect,
and for a game that means a momentary freeze, which is deadly if you were
about to kill the bad guy online. ;-) The same goes for a program that must
do something in real time, like capturing video, which might drop frames.
Jul 22 '05 #13

On Fri, 24 Jun 2005 14:18:13 +0430, joe wrote:


This might be a futile suggestion, unhelpful for your situation... but I
have noticed that the GC is more aggressive when the project is compiled
for release rather than debug.
Jul 22 '05 #14

Have you verified the "no GC unless low memory" using the GC perf counters? Task Manager doesn't show you what the GC is doing.

And agreed: in a Release build the GC's notion of object liveness is much more aggressive.

Regards

Richard Blewett - DevelopMentor
http://www.dotnetconsult.co.uk/weblog
http://www.dotnetconsult.co.uk
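
For anyone who wants to check, the collection counts live under the ".NET CLR Memory" performance counter category. A hedged sketch (the instance name "MyApp" is illustrative; it is normally the process name):

```csharp
using System;
using System.Diagnostics;

class GcCounterSample
{
    static void Main()
    {
        // Reads how many Gen 0 collections the target process has done,
        // which Task Manager cannot show you.
        using (PerformanceCounter gen0 = new PerformanceCounter(
                   ".NET CLR Memory", "# Gen 0 Collections", "MyApp"))
        {
            Console.WriteLine("Gen 0 collections: " + gen0.NextValue());
        }
    }
}
```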


Jul 22 '05 #15

Like most things there are tradeoffs involved. One of the most
difficult parts of developing large software systems, especially among
multiple developers, is defining and maintaining an object life cycle.
You must define who creates the object, and who has the responsibility
for the storage at any time. In some applications it is fairly
straight forward, in other cases it is not.

C# and the garbage collector remove this burden (for the most part) from
the design and implementation process. However, they do so at an expense.
First, there is a performance penalty versus what can be done in native
C++. This penalty can be relatively small and will only come into play in
the most performance-critical of applications. Second, there is a memory
footprint issue: there will be a small penalty for garbage collection.

Finally, there is a point of confusion in the original post. Most memory
schemes, whether they use garbage collection as in Java or C#, or basic
malloc/free as in C, do not return memory to the operating system. Memory
that is freed is made available for the next request in that process, but
the process will almost never give memory back to the OS. Therefore,
looking at the memory size of the process will show it only getting larger
as it runs, never smaller. If you want to really characterize memory
usage, you need to do it within the process, with the cooperation of the
memory system. There are ways to do this in Visual C++ and C# as well.
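
One managed way to measure from inside the process is GC.GetTotalMemory. A sketch (the allocation size is illustrative):

```csharp
using System;

class HeapProbe
{
    static void Main()
    {
        // Passing true forces a collection first, so the figure reflects
        // live objects rather than uncollected garbage.
        long before = GC.GetTotalMemory(true);
        byte[] data = new byte[1024 * 1024];  // illustrative allocation
        long after = GC.GetTotalMemory(false);
        Console.WriteLine("Heap grew by roughly {0} bytes", after - before);
        GC.KeepAlive(data);
    }
}
```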
Just my $0.02 worth
Butch

Jul 22 '05 #16

On 28 Jun 2005 20:14:39 -0700, bu****@comcast.net wrote:
> Memory that is free'ed is made available for the next request in that
> process, but the process will almost never return memory to the OS.
Yes, but you can tell the difference between a lazy GC and an aggressive
GC: the process running the aggressive GC will not grab as much memory from
the OS in the first place, because it recycles memory from freed objects,
whereas the lazy GC will just grab more memory until the system runs low.

Jul 22 '05 #17

The answer is that in most cases, no, you can't recycle disposed objects.
You need to create a new instance instead.
"joe" <jo*****@rot.ofm.net> wrote in message
news:uS**************@tk2msftngp13.phx.gbl...

Jul 22 '05 #18

Why exactly are you trying to reclaim memory when there is still plenty of
memory for your application to run?
"joe" <jo*****@rot.ofm.net> wrote in message
news:ub**************@tk2msftngp13.phx.gbl...

"Patrice" <no****@nowhere.com> wrote in message
news:eZ*************@TK2MSFTNGP15.phx.gbl...
Have you tried to call Dispose ?


Dispose is not supported on all objects. In my case, calling GC.collect()
does a little help though.
Though the point of a garbage collector is that there is no need to
reclaim
memory if you have no use for it, it's worth to keep in mind that it have
its root in the managed world and that the Dispose or Close methods
should
still allows to reclaim unmanaged memory immediately...


Yes, in a perfect managed world where no one wants to struggle with memory
allocation and release, this could be good suggestion. But in the same
managed world there are times when an object is used only "once", and then
thrown away forever or used after a long time or maybe in next program
launch. Why should its memory be still occupied by the program?
I mean, there should a way to have more control over this kind of memory
management. ( sth like a half-automatic GC, until the day GC becomes
really
smart and intelligence)

i can also use SetProcessWorkingSetSize(-1,-1) , however it is not usually
recommended.

Jul 22 '05 #19

The GC is a lot of work! If you set a certain environment variable, the GC
can run all the time, but that is too slow... Of course, if you want to
find memory problems (such as leaks), it is a good idea.
GC.Collect() is not a good idea; you'd better NOT call this method in your
code.

"Brian Gideon" wrote:



Jul 22 '05 #20

> If you set a certain environment variable, the GC can run all the time,
> but that is too slow...

Exactly what setting is that?

--
Regards,
Alvin Bruney - ASP.NET MVP

[Shameless Author Plug]
The Microsoft Office Web Components Black Book with .NET
Now available @ www.lulu.com/owc, Amazon.com etc
"juqiang" <ju*****@discussions.microsoft.com> wrote in message
news:23**********************************@microsof t.com... GC is a big work! if u set some enviroment variable , GC can occured every
time, but is's too slowly...Of course, if u want to find some memory
problems(such as leak) ,this is a good idea.
GC.Collect() is not a good idea, u'd better NOT call this method in your
code.

"Brian Gideon" wrote:


joe wrote:
>
> [snip]
>
> PS: i wont hurt you to read this:
> http://www.cs.tut.fi/~warp/MicrosoftComparingLanguages/


I disagree. While some aspects of the article may be at least
partially correct, it does more harm than good. For example, the
author completely missed the point of the original claim about reduced
memory leaks. Similarly, I thought the other points were
misinterpreted as well, albiet, to a lesser degree.

Brian

Jul 22 '05 #21

"Cor Ligthert" wrote:
> To use a metaphor: do you empty your trash bin every time you throw in a
> piece of paper?


Actually, a better metaphor would be that the act of throwing your paper in
the trash is the act of garbage collection. Just as the system leaves objects
in memory until they are collected, you would leave paper you've finished
with lying on your desk. Eventually, you'll get to a point where you have so
much scrap paper around that you have a tidy-up.

Me, when I finish with a paper, I scrunch it up and chuck it there and then.
Makes my working environment much cleaner and tidier... but then, I do work
in C++ :)

Jul 22 '05 #22

Well, apart from issues like system performance, where the cache size is
reduced because some app is hogging all available RAM, we have had issues
with ASP.NET apps that keep using more and more memory, causing other apps
to run into difficulties.

"Scott M." wrote:
Why exactly are you trying to reclaim memory when there is still plenty of
memory for you application to run?
"joe" <jo*****@rot.ofm.net> wrote in message
news:ub**************@tk2msftngp13.phx.gbl...

"Patrice" <no****@nowhere.com> wrote in message
news:eZ*************@TK2MSFTNGP15.phx.gbl...
Have you tried to call Dispose ?


Dispose is not supported on all objects. In my case, calling GC.Collect()
does help a little, though.
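
For objects that do implement IDisposable - the Image/Bitmap classes behind
ListView icons and picture displays among them - a pattern like the following
sketch (illustrative names, not code from this thread) releases the unmanaged
GDI+ memory immediately instead of waiting for a collection:

```csharp
using System.Drawing;
using System.Windows.Forms;

class ImageSwapper
{
    // Replace the image shown in a PictureBox, disposing the old one.
    // Setting the old reference to null alone keeps its unmanaged GDI+
    // memory alive until some future garbage collection; Dispose frees
    // it right away.
    public static void ShowImage(PictureBox box, string path)
    {
        Image old = box.Image;
        box.Image = Image.FromFile(path);
        if (old != null)
        {
            old.Dispose();
        }
    }
}
```

As a side benefit, Image.FromFile keeps the source file locked until the
image is disposed, which is another reason not to rely on the GC's timing.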
Though the point of a garbage collector is that there is no need to
reclaim
memory if you have no use for it, it's worth to keep in mind that it have
its root in the managed world and that the Dispose or Close methods
should
still allows to reclaim unmanaged memory immediately...


Yes, in a perfect managed world where no one wants to struggle with memory
allocation and release, this could be a good suggestion. But in the same
managed world there are times when an object is used only "once" and then
thrown away forever, or used again only after a long time, or maybe in the
next program launch. Why should its memory still be occupied by the program?
I mean, there should be a way to have more control over this kind of memory
management (something like a half-automatic GC, until the day the GC becomes
really smart and intelligent).

I can also use SetProcessWorkingSetSize(-1,-1); however, it is not usually
recommended.
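
For completeness, a rough sketch of that call via P/Invoke. Hedged: this only
pages the working set out to disk, it frees nothing, and the pages fault
straight back in when touched - which is why it is rarely recommended:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class WorkingSetTrimmer
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr hProcess,
                                                IntPtr dwMinimumWorkingSetSize,
                                                IntPtr dwMaximumWorkingSetSize);

    // Passing (-1, -1) asks Windows to trim this process's working set.
    public static void Trim()
    {
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle,
                                 new IntPtr(-1), new IntPtr(-1));
    }
}
```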


Jul 22 '05 #23

P: n/a
Nick,
The notion of memory getting cheap is really relative. You buy the best
possible machine required for your business. By the time you need more memory,
the processor and the memory technology are outdated. You don't always get
what you want. The end user of software wants a return on his investment, and
the time frame for each update is very short.
I am currently facing a problem where the guys in the field have PIII machines
with 256 MB RAM, and the start-up time for the framework is killing the app.
Also, the memory requirements of the reporting tool are becoming a performance
bottleneck. My client distributes the software to get the business from
agents in the field. He can't control the hardware beyond a point. He has to
support hardware that came onto the market 3 years back. The guy in the field
has simply not recovered his investment, or does not require additional H/W
for his other work.
Not all guys get enough money to throw their PCs, or worse, laptops, in the
dustbin every 2 years.

"Nick Malik [Microsoft]" wrote:
Hello Joe,

I appreciate that you are taking a reasoned approach.

I also agree that a case could be made for fine-tuning the GC. That said,
Willy's comment is correct... the GC really does run more often than you
think. Also page swapping, as caused by .Net apps, is probably less than
you think. Note that a page is only swapped "in" when memory on that page
is referenced. If the page is completely empty of active objects, then it
will not swap in until the GC frees it. (This actually slows the system
down if you do this too often).

You refer to how much faster the system runs after a while. This is
completely true. However, it is also often the case that the culprit for
this kind of memory fragmentation is the use of unmanaged objects that don't
drop their references and therefore create gaps in the memory space that is
difficult for the heap allocation system to reuse.

As for low-memory systems, I would assert that you are using the OS at or
near the smallest amount of memory that it supports. Normal OS operations
will fragment the memory, all by themselves, without any help from your app.
In this case, .Net is not able to really improve the situation.

Note that the OS teams have made real strides over the years in improving
how memory is used. However, as memory has become so much less expensive,
it is not unreasonable to expect that users will occasionally upgrade their
memory when they install a new OS to get new features like better security,
better handling of multimedia, more efficient file systems and (strictly for
the non-server environments) more advanced games.

I hope this helps. I understand your frustration. (One of my systems at
home has 256M of RAM, too... I'm just too lazy to run down to Fry's and get
a memory module for it :-(.

--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.
--
"joe" <jo*****@rot.ofm.net> wrote in message
news:OM**************@TK2MSFTNGP10.phx.gbl...
Nick,
Your reasoning is respected and welcomed. I don't hate or love something
like an idiot; I simply point out its weak points.

It is true that the GC is a great miracle in C# and Java, and helps the
programmer to concentrate on more critical aspects of his work; however,
when it comes to page faults, it exhibits its weak points.

You might say that because allocation or garbage collection is done only
when the computer becomes low on memory, you gain much higher performance
in your application. This can be true for small programs; however, when
some really big programs are executed on computers with low memory (something
like 256 MB), the amount of OS page swapping and virtual memory
reading and writing really slows down the entire system. Just think: why
is your OS much faster after a clean restart than after it has run
long hours of executing applications and playing with the virtual memory?

Well, at least a stricter algorithm could be implemented, which would
force the GC to come into play much sooner than it does now on low-memory
PCs.
"Nick Malik [Microsoft]" <ni*******@hotmail.nospam.com> wrote in message
news:pI********************@comcast.com...

PS: it won't hurt you to read this:
http://www.cs.tut.fi/~warp/MicrosoftComparingLanguages/
This commentary is a case of one inexperienced programmer criticizing a
feature that they do not understand, with poor understanding of the
forces that underlie the fundamental reasoning. Garbage collection is
easy in C++... if every developer were free of mistakes and if code were
not complex. This is not the case in the real world. The author of that
article completely failed to recognize the reality of memory leaks in a
production system of substantial size.

One reason for the success of BOTH Java and .Net languages like C# is
that this problem is solved for you. You may not agree with the way in
which it is solved, but it is solved for you. That is a huge step up and
a major boon for software development.

The problem isn't the computers or their languages... it is the
limitations of the humans who use them.
--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.



Jul 22 '05 #24

P: n/a
Hemant,

I think that if you write about history you have to be correct.

I am currently facing a problem where the guys in field have PIII machines
with 256 MB RAM and ... <snip> ... He has to support hardware that came onto the market 3 years back.


http://arstechnica.com/articles/paed...entium-1.ars/3

I hope this helps,

Cor
Jul 22 '05 #25

P: n/a
Hi HermanT,

Cor has a point. Two of my home systems are business-class PIII with 256MB of
RAM: they were purchased new by my dot-com in 1999... six years ago.

I don't know if you work in the USA or not. In the USA, the federal
government allows a business to write off 100% of the depreciation cost of a
new computer in the same year in which it is purchased. So your business
recovered about a third of the cost of the machines that year in tax
offsets.

My business-class machines cost about $1500 new (I know because part of my
job was to approve their purchase). At today's prices, my two
business-class systems will cost me about $100 each to upgrade to 512MB.
http://www.memorydealers.com/dixpst5.html

Both machines are perfectly serviceable and I have no intention of getting
rid of either any time soon, although I'll have to replace a monitor this
year, and I pulled a power supply in a third (newer) machine about four
months ago. I probably put about $80 per year per machine into upkeep, not
including my time spent under the hood.

So, you won't hear a lot of pity from me when discussing a business that is
too cheap to purchase an inexpensive upgrade five or six years after the
computer was purchased. Even including the overhead, spending $300 per year
to keep up a $1500 investment isn't asking a lot.

Now back to software,

According to you, your application is slow starting up because of the
framework loading up in a memory constrained environment. Consider this
option: put a service in their machine that starts up, sets a timer for five
minutes, and goes to sleep... waking up every five minutes to go back to
sleep. Now, when your app loads, the framework is in memory. It's a
kludge, granted, but it may save your business a whopping $100 per machine.
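
That kludge might look something like this minimal sketch (all names
illustrative; assumes .NET 1.x-era APIs, and that the process is installed
to start with Windows or as a service):

```csharp
using System;
using System.Threading;

class FrameworkWarmer
{
    // Do nothing every five minutes; the point is only that the CLR and
    // framework assemblies stay loaded, so the real app starts faster.
    static void Tick(object state)
    {
        // deliberately empty
    }

    static void Main()
    {
        Timer keepAlive = new Timer(new TimerCallback(Tick), null,
                                    TimeSpan.Zero, TimeSpan.FromMinutes(5));
        // Keep the process alive indefinitely.
        Thread.Sleep(Timeout.Infinite);
    }
}
```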

--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.
--
"Hemant" <He****@discussions.microsoft.com> wrote in message
news:03**********************************@microsoft.com...
[quoted text snipped]

Jul 22 '05 #26

P: n/a
Unfortunately, the USA does not understand the problems of the rest of the
world. Currently we are working for a client in South Africa. Also, the
service option does not work for Windows 98, which is still one of the most
popular OSes in the rest of the world. The .NET Framework is supposed to be
world-ready. That is why it has Unicode.

The PIII machine with 866MHz and 128MB RAM is about 4-5 years old in this
part of the world, and $100 when converted to other currencies is not a small
amount for many people. The client is asking the users to upgrade and will
stop supporting these systems. But that is still one year away. And although
not all are like this, 13% of the users are still using these systems. This
is not a small fraction.

Hope you realize the problems in the field. I am not putting any dream in
front of you. This is reality. The parameters by which you upgrade the
systems are not always valid in the field.

Regards and good luck for your work.
"Nick Malik [Microsoft]" wrote:
[quoted text snipped]

Jul 22 '05 #27

P: n/a

"andy" <an**@discussions.microsoft.com> wrote in message
news:C3**********************************@microsoft.com...
Well, apart from issues like system performance - where the cache size is
reduced because some app is hogging all available RAM,
But some app wouldn't be hogging all available RAM because that's exactly
when the GC would kick in.

we have had issues with ASP.net apps that keep on using more and more
memory causing other apps
to run into difficulties.
I would think that this may be because of incorrect usage and cleanup of
objects and their unmanaged resources.
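
The usual way to get that cleanup right is the standard Dispose pattern,
sketched here with hypothetical names for a class wrapping an unmanaged
resource:

```csharp
using System;

class NativeResourceHolder : IDisposable
{
    IntPtr handle;   // stands in for some unmanaged allocation
    bool disposed;

    public void Dispose()
    {
        Dispose(true);
        // The finalizer is now redundant; tell the GC to skip it.
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposed)
            return;
        if (disposing)
        {
            // Dispose other managed IDisposable fields here.
        }
        // Release the unmanaged handle here.
        handle = IntPtr.Zero;
        disposed = true;
    }

    // Safety net if the caller forgets to call Dispose.
    ~NativeResourceHolder()
    {
        Dispose(false);
    }
}
```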

"Scott M." wrote:
[quoted text snipped]


Jul 22 '05 #28

P: n/a
Hmmm. I think the first metaphor is a bit closer to what is actually
happening. When you put something into a garbage can, it isn't actually
gone yet. It is when you empty the trash that the item is gone forever.
"Andy Bolstridge" <Andy Bo********@discussions.microsoft.com> wrote in
message news:B9**********************************@microsoft.com...
[quoted text snipped]

Jul 22 '05 #29
