When you minimize an application, the OS trims the process down to its minimal
working set (on the assumption that you won't be using the application for a
"long time"): modified read/write data pages are written to the paging file,
and read-only pages are simply discarded. When your application is activated
again, you incur a lot of hard page faults, because the data pages must be
brought back into memory as they are needed. The resulting working set will in
general be smaller than before, because some pages are no longer needed, for
instance pages that were only used to initialize a context.
Now, it is possible to reduce the working set of YOUR own application, but
this is really a very bad idea. Under memory pressure (a lack of free memory),
the system will trim the WS of all applications, and it does so far better
than when you force the OS to trim the WS of a single application.
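For reference, the Win32 call involved is SetProcessWorkingSetSize in
kernel32.dll; passing -1 for both sizes asks the OS to trim the working set,
which is essentially what happens on minimize. A minimal C# sketch, purely
for illustration and not a recommendation:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class WorkingSetTrim
{
    // P/Invoke declaration for the Win32 call that adjusts a process's
    // working set limits.
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr hProcess,
                                                IntPtr dwMinimumWorkingSetSize,
                                                IntPtr dwMaximumWorkingSetSize);

    static void Main()
    {
        // -1 for both minimum and maximum tells the OS to trim the
        // working set of the current process, forcing hard page faults
        // later when the discarded pages are touched again.
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle,
                                 (IntPtr)(-1), (IntPtr)(-1));
    }
}
```

This only shrinks the Task Manager "Mem Usage" number; it does not free any
memory, and the pages come straight back (via page faults) as soon as they are
needed.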
So please don't try to play the role of the Memory Manager: you simply don't
know how large your WS should be, and you can't control which pages get
removed.
Willy.
"Bhargavan" <bh********@yahoo.co.in> wrote in message
news:uW*************@TK2MSFTNGP15.phx.gbl...
Hey Group,
I just found that the memory usage of my .NET Windows application drops
significantly whenever I minimize the application. I also noticed that when I
maximize the application again, the memory usage increases, but it is
still a lot less than what it was initially (when the program first
loaded). I would like to know the reason behind this behavior. Also I
would like to know whether it would be possible to mimic this behavior
through user code. Any suggestions/advice in this will be greatly
appreciated.
Thanks,
Bhargavan