Bytes IT Community

Disposing of brushes

I am supposed to manually dispose of some instances, such as Brushes, right?

I have a couple of questions:

1. I have the following code, and it works just fine:

penarea.DrawString(selectedslot.drawString, new Font("Arial", fontsize),
    new SolidBrush(selectedslot.colour),
    new Point(0, this.Height / 2 - (int)scale));
break;

Just how am I supposed to dispose of the new SolidBrush () ?
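
One common way (a sketch, assuming penarea is a System.Drawing.Graphics and
selectedslot has the members used above) is to create the font and brush in
using blocks, so both get disposed deterministically even if DrawString throws:

```csharp
// Sketch: dispose the Font and SolidBrush with using blocks.
// Assumes penarea is a System.Drawing.Graphics and selectedslot has the
// drawString and colour members from the snippet above.
using (Font font = new Font("Arial", fontsize))
using (SolidBrush brush = new SolidBrush(selectedslot.colour))
{
    penarea.DrawString(selectedslot.drawString, font, brush,
        new Point(0, this.Height / 2 - (int)scale));
}
// Both Dispose calls run here, even if DrawString throws.
```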

2. I create lots of brushes deep down in lots of places, often in loops. How
worried should I be that I am creating brushes but not disposing of them? Is
there some way to enumerate brushes at run time so I can do my own GC when I
know the system is otherwise idle? Can I at least tell in the IDE that I
don't have thousands of brushes being created but never disposed? Or is the
only way to examine the code?

3. Am I missing something with respect to disposing of things like brushes
and graphics? (Obviously I am, not least why they aren't GCed like
everything else.) Disposing of them seems a bit of a pain in the arse, but
is seldom discussed on this newsgroup.
Feb 21 '08 #1
4 Replies

> 2. I create lots of brushes deep down in lots of places, often in loops.
You might also like to consider creating one brush and changing its
characteristics (colour etc.) for each paint, then disposing of that one
brush after painting. You will find your code runs a lot quicker than with
the continual creation and disposal of IDisposable objects.
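
As a sketch of that suggestion (Slot, slots, g and font are hypothetical
names, not from the original code):

```csharp
// Sketch: create one SolidBrush up front, recolour it per item, and dispose
// it once after the loop, instead of new-ing a brush on every iteration.
// Slot, slots, g and font are hypothetical names for illustration.
using (SolidBrush brush = new SolidBrush(Color.Black))
{
    foreach (Slot slot in slots)
    {
        brush.Color = slot.colour;   // change its characteristics in place
        g.DrawString(slot.drawString, font, brush, slot.location);
    }
}   // one Dispose here instead of one per iteration
```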

Thank you,

Christopher Ireland

"You can't reason someone out of a position they didn't reason themselves
into."
Author Unknown
Feb 21 '08 #2

> I'd consider that a "soft" leak - but if failing to dispose of a Brush
> *never ever* released its Windows handle, that would count as a "hard"
> leak in my view.
>
> And no, I don't have hard and fast definitions of "soft" leak vs
> "hard" leak but I hope they make some kind of sense here :)
I think so. A "soft" leak causes inflationary memory consumption during the
lifetime of the application, whereas a "hard" leak causes an increase in
memory consumption after the application has been closed, relative to the
memory consumption before the application was run.

I think we can agree that in an ideal .NET GC context, "hard" leaks never
occur :-)
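
The "soft" case corresponds to the standard dispose pattern, where a
finalizer acts as a safety net for the native handle. A generic sketch (not
GDI+'s actual implementation; CreateHandle and ReleaseHandle are hypothetical
placeholders):

```csharp
// Generic sketch of the dispose pattern, not GDI+'s actual implementation.
// If the caller forgets Dispose(), the finalizer eventually frees the native
// handle anyway (a "soft" leak); if the finalizer also failed to free it,
// the handle would be lost for the life of the process (a "hard" leak).
class NativeResource : IDisposable
{
    private IntPtr handle = CreateHandle();   // hypothetical native allocation

    public void Dispose()
    {
        ReleaseHandle(handle);                // deterministic cleanup
        GC.SuppressFinalize(this);            // the safety net is no longer needed
    }

    ~NativeResource()                         // runs only if Dispose was missed
    {
        ReleaseHandle(handle);
    }

    private static IntPtr CreateHandle() { /* acquire native resource */ return IntPtr.Zero; }
    private static void ReleaseHandle(IntPtr h) { /* free native resource */ }
}
```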

Thank you,

Christopher Ireland

"If you cannot find the truth right where you are, where else do you expect
to find it?"
Dogen Zenji
Feb 21 '08 #3

> Okay, we mostly agree - except that in my view the worse problem is
> that you use up handles.
It is interesting that you should say that. I hadn't appreciated that there
was a finite set of GDI handles. Would you be so kind as to send me a link
to something I can read about this? It is very relevant to the area in which
I work.
> After the application has been closed it's up to the OS to clean
> things up, not the GC.
>
> But in the scenario I was considering, the GC had called the finalizer
> but the finalizer had failed to free the GDI handle.
Yes, I see. One scenario would be a defect in the finalizer code, and the
other would be a defect in the GC code that calls the finalizer.
> No - the GC isn't even *running* after the application has been
> closed. The GC is part of the application.
I guess this is the greyest area for me. Say there was a defect in the GC,
say one of the two scenarios described above, such that the GC wasn't
releasing memory. From what I understand of what you've written, you seem to
be suggesting that even in this case it is the responsibility of the OS to
reclaim the memory once the application has closed.

This goes against what I've understood of how memory allocation works in
environments which don't have managed memory capabilities. I believe I've
seen cases where applications use up a chunk of memory, making it unavailable
to the OS: the memory use after running the application was greater than the
memory use before running it. In those cases, therefore, it seemed to me that
the OS was not responsible for, and was even incapable of, recovering such
memory, and that it was the responsibility of the application's programmer to
make sure such things didn't happen.

Thank you for helping me deepen my understanding on this subject!

Thank you,

Christopher Ireland

"When I was 10, my pa told me never to talk to strangers. We haven't spoken
since."
Steven Wright
Feb 21 '08 #4

> has some detail. I'm afraid I don't know much about it really - just
> that it's a potential issue.
Thanks for the link, Jon! I didn't know that.
> The OS is responsible for memory allocation to individual processes,
> and for noticing when processes die etc - and releasing appropriate
> resources. To take a non-memory example, if an application (managed or
> otherwise) has a file open and locked, but then the application dies,
> the file handle should be released by the OS, and the file unlocked.
I see. I had thought it was the responsibility of the programmer (in
non-managed memory environments) or the GC (in .NET) to make sure all memory
and resources used by the application were released before the application
closed. Glad to hear I can now blame the OS ;-)
> I'm just sorry I can't give more links about this.
This is the problem. If more accurate information about this area were
available, I'm pretty sure I wouldn't need to use up the time of kind people
such as yourself by asking them to help me clarify it!

Thank you,

Christopher Ireland

"The only good is knowledge and the only evil is ignorance."
Feb 21 '08 #5
