Cross-posting this question on the recommendation of an [MVP].
I have a .NET application that I am developing in C#. I load information into
a dataset and then push the dataset to a grid or other controls, depending on
the particular form. The application is set up with one MDI parent calling
MDI children, with the exception of one modal form (the report viewer).
When I run the application and open one of the screens that pulls the dataset
information, I see in Task Manager (I know, I know, just hear me out) the
memory grow from 25 MB to 60-70 MB. If I close this screen (I am calling all
of the Dispose and Close methods I can find for all of my objects), the
memory remains at the same threshold, and in some cases climbs higher after
I call a Dispose method.
I understand that the GC may be running in the background (or will at some
point) to free this memory, but what I observe, and keep hearing in these
newsgroups, is that the GC will clean up these objects but leave the memory
allocated to the application.
The twist here is that this application will be deployed to a terminal
server and may have as many as 35 users running it. 35 * 70 MB = 2.45 GB,
which doesn't leave much memory room for other applications, so I am a bit
concerned about how the GC will handle this situation.
When I run an application profiler, I see that even after disposal (maybe I
am just not waiting long enough) there are assemblies still loaded for forms
that I have closed (third-party .NET controls). They seem to be taking up
quite a bit of room.
So, after this novel of a description, my summary questions are:
1.) How will the GC handle this situation in the terminal server
environment? If it doesn't return the memory to the OS before the
application is closed, it could be a problem for me.
2.) Is there any way to unload the assemblies that I still see loaded even
after I have disposed the form?
3.) Is there any way to make the GC return the memory to the OS?
4.) Does anyone else have experience with custom .NET applications on
terminal server, and if so, what have you learned?
Sorry for the lengthy question, and thanks in advance to all who can help!
Justin
Hi,
I think you selected the wrong application type for this application.
First of all, how many licences do you have for having 35 users connected to
the same server?
My advice is to either transform this into a web app (very time consuming)
or create a client/server app.
You could have a web service reading the info from the DB and interacting
with Windows clients running on the local machines. You could design your
Windows app to be stored on a network share and run from there by all the
clients; you just need to keep any local information on the client machines.
But I would definitely change your current structure.
cheers,
--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation
"Justin Lazanowski" <ju****@nospam.southeastmilk.org> wrote in message
news:%2****************@TK2MSFTNGP10.phx.gbl...
First off, to conclude that 35 instances will consume 35 times the working
set (if this is what you are looking at in Task Manager) is not correct.
The working set includes a shared part, that is, memory that is shared
between processes. On the other hand, JITted code is not sharable, which
might be a problem in TS scenarios, but the bulk of the WS pages consumed by
this application are mostly due to the objects allocated on the GC heap, and
that's the price you pay when designing in an OO (and managed) world.
To answer your questions:
Q1. The GC has nothing to do with this. The OS will reclaim memory when it
needs pages, and processes will return pages to the OS or to the paging file
when their WS gets trimmed by the OS, which is bad performance-wise.
Q2. Assembly unloading has nothing to do with disposing Forms or anything
else; assemblies are unloaded when their AppDomain is unloaded. It's very
unlikely that your application uses additional AppDomains, right?
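For what it's worth, the unload mechanism Willy describes looks roughly like
this. A minimal sketch, assuming a hypothetical reporting assembly; the
"ReportDomain" and "ReportLib" names are made up for illustration:

```csharp
using System;

class AppDomainSketch
{
    static void Main()
    {
        // Assemblies loaded into the default domain stay loaded for the
        // life of the process. The only way to unload them is to load them
        // into a separate AppDomain and later unload that whole domain.
        AppDomain reportDomain = AppDomain.CreateDomain("ReportDomain");
        try
        {
            // Hypothetical reporting type living in ReportLib.dll; it would
            // need to derive from MarshalByRefObject so it runs in the other
            // domain instead of being marshaled back by value:
            // reportDomain.CreateInstanceAndUnwrap("ReportLib", "ReportLib.ReportRunner");
            Console.WriteLine(reportDomain.FriendlyName);
        }
        finally
        {
            // Unloading the domain is what actually releases the assemblies
            // that were loaded into it.
            AppDomain.Unload(reportDomain);
        }
    }
}
```

Note this only pays off if the heavyweight assemblies are loaded exclusively
into the second domain; anything also touched by the default domain stays put.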
Q3. No, and there is no need to; this is taken care of by the OS and the
CLR. The CLR will trim its WS when there is memory pressure, so it will
return VAS segments to the system (when possible).
Q4. Yep. What I have learned is to pay attention to memory resource
consumption (something you also need to do when developing unmanaged
applications that have to run on TS, but to a lesser extent).
Be careful when selecting your containers; as an example, prefer fixed-size
arrays over ArrayList when possible. Create your objects as late as
possible, and free your objects and call Dispose as early as possible, but
don't call GC.Collect() yourself. Profile your application and look for
references cached in containers that might be kept alive longer than
necessary.
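The "create late, dispose early" advice is usually expressed with using
blocks. A minimal sketch; the connection string and the "selEquipment"
procedure name are placeholders, not anything from the original post:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class DisposeEarly
{
    // The using blocks guarantee Dispose runs the moment the objects are
    // no longer needed, instead of their unmanaged resources lingering
    // until the GC eventually finalizes them.
    static DataSet LoadEquipment(string connectionString)
    {
        DataSet ds = new DataSet();
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("selEquipment", conn))
        using (SqlDataAdapter da = new SqlDataAdapter(cmd))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            da.Fill(ds, "Equipment"); // Fill opens/closes the connection itself
        } // conn, cmd and da are disposed here, even if an exception is thrown
        return ds;
    }
}
```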
Be careful when creating additional (excessive) threads; each thread
reserves 1 MB of contiguous virtual address space, so keep the count as low
as possible.
Be careful with interop (COM or native code); avoid it when possible in TS
scenarios.
If your code base is large, try to NGEN your assemblies; NGEN'd assemblies
can share their code. But again, don't expect too much from this: JITted
code is in general rather compact, as not all methods of a loaded module are
necessarily compiled.
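For reference, NGEN is driven from the command line. A sketch, assuming the
.NET 2.0 tool syntax (1.x used a slightly different command line) and a
hypothetical install path:

```shell
# Precompile an assembly with the Native Image Generator so its code pages
# can be shared across sessions (path is hypothetical).
ngen install "C:\Program Files\MyApp\MyApp.exe"

# Remove the native image again if it doesn't pay off.
ngen uninstall "C:\Program Files\MyApp\MyApp.exe"
```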
What I've noticed is that a carefully designed application needs ~10%-20%
more memory in TS scenarios; your mileage may vary, of course.
Willy.
Maybe I don't have the right definitions, but I think this is a
client/server application.
Windows Application - Client
SQL Server Database - Server
Our current environment here runs 90% of our users on published Citrix
desktops; we are licensed to run these users on terminal services.
"Ignacio Machin ( .NET/ C# MVP )" <ignacio.machin AT dot.state.fl.us> wrote
in message news:%2****************@TK2MSFTNGP10.phx.gbl...
Hi,
I was referring to a Windows client app that connects to a server component
(i.e. a web service) that in turn interacts with the DB.
Now, in your case I would try to optimize the code. Why are you loading that
much data? Are you using it all at the same time?
If you have a table with 1K rows but only show 25 in the interface, just
load the 25 you will show; this will decrease the memory usage.
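One hedged way to load only the visible rows with ADO.NET is the Fill
overload that takes a start record and a row count. The table and column
names below are hypothetical, and note the caveat in the comment:

```csharp
using System.Data;
using System.Data.SqlClient;

class PagedLoad
{
    // Load only one page of rows instead of the whole table.
    // Connection string, query and names are placeholders.
    static DataSet LoadPage(string connectionString, int pageIndex, int pageSize)
    {
        DataSet ds = new DataSet();
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlDataAdapter da = new SqlDataAdapter(
            "SELECT EquipmentID, Name FROM Equipment ORDER BY EquipmentID", conn))
        {
            // This Fill overload skips pageIndex*pageSize rows and reads at
            // most pageSize rows into the DataSet. The skipped rows are still
            // streamed from the server and discarded client-side, so for big
            // tables a WHERE clause that pages on the server scales better.
            da.Fill(ds, pageIndex * pageSize, pageSize, "Equipment");
        }
        return ds;
    }
}
```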
Even in this scenario you could use my suggestion above: the web service is
then the one that holds the single in-memory copy of the data, and every
client requests exactly the subset it needs to show. This way you have only
one copy of the data instead of 35 copies.
Cheers,
--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation
"Justin Lazanowski" <ju****@nospam.southeastmilk.org> wrote in message
news:%2****************@tk2msftngp13.phx.gbl...
Willy,
I appreciate the input. If I may, I have a couple of follow-up questions.
Please bear with me; I am very new to .NET and trying my best to get
acquainted with it.
1.) We have another application that is set up the same way this one will be
(I didn't write the other one), and I see 100-200 MB per user running that
application (again in Task Manager, to be fair). That application isn't run
by as many users as this one will be, but it clearly concerns me.
2.) Since I don't know what AppDomains are, I am going to conclude that no,
I am not using AppDomains. However, if I profile my application before I run
a Crystal Report and then again after, I see a bunch of Crystal assemblies
loaded. After I close the Crystal form (its Closed event calls
this.Dispose()), I still see these assemblies loaded, and they are taking
quite a bit of space. I thought that unloading them might help me reclaim
some of this.
3.) Can you recommend a good application profiler that would help me find
any references that I may not be closing?
4.) When you say NGEN'd, what exactly are you referring to?
5.) Application design. I am trying to make this application as reusable and
flexible as possible. I would like to describe something I am doing now;
if you don't mind, let me know whether this is a good or bad way of doing
things. (Remember, I am new, so be gentle. :) )
My current application logic looks something like this.
Main Form: this launches the MDI children; no data access is done on this
form. A configuration class and a security class are called from an outside
project (MyApplicationClasses).
The MDI child loads its respective controls, for instance a data grid. It
instances a new strongly typed dataset, and then instances a class that
exists explicitly for data access (the Data Access Layer).
The Data Access Layer loads with no constructor. All select statements are
in this class and look a little like this:
public System.Data.DataSet selectSomeInfo(int locID)
{
    // set the parameter on the SqlCommand
    sqlSelectEquipment.Parameters["@LocationCode"].Value = locID;
    // point the SqlDataAdapter at the SqlCommand (the stored procedure I want to use)
    sqlCmd.SelectCommand = sqlSelectEquipment;
    // ds is untyped and created as a protected member of the class;
    // RunSQL is a function in the class that runs the Fill method of the data adapter
    ds = RunSQL("Equipment");
    return ds;
}
So my MDI child calls its instance of the strongly typed dataset and then
calls DAL.selectSomeInfo; it looks like this:
dsEquipment1.Merge(Selects.selectSomeInfo(1));
dsEquipment1 is bound to my data grid.
When I close the MDI form, I dispose my Selects object and call the Dispose
method on the form.
Am I heading down the wrong track here, or is this an OK way to do this? I
figured it would be better than writing a million try/catch blocks inside
every different form, and I could reuse that select command somewhere else
if needed.
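The calling pattern described above can be sketched as follows. This is a
minimal, hypothetical sketch: the Selects class and selectSomeInfo mirror
the post, but the bodies are stubs, and making the DAL IDisposable is a
suggestion rather than anything the post confirms:

```csharp
using System;
using System.Data;

// The DAL implements IDisposable so the MDI child can deterministically
// release the connection, command and adapter the DAL holds.
class Selects : IDisposable
{
    private DataSet ds = new DataSet(); // untyped, a member of the class as in the post

    public DataSet selectSomeInfo(int locID)
    {
        // The real version sets parameters and calls RunSQL("Equipment");
        // this stub just creates the table so Merge has something to copy.
        ds.Tables.Add("Equipment");
        return ds;
    }

    public void Dispose()
    {
        // Dispose the SqlConnection/SqlCommand/SqlDataAdapter members here.
        ds.Dispose();
    }
}

class MdiChildSketch
{
    static void Main()
    {
        DataSet dsEquipment1 = new DataSet(); // stands in for the typed dataset
        using (Selects selects = new Selects())
        {
            // Merge the DAL's result into the form-owned dataset, which
            // would be bound to the grid.
            dsEquipment1.Merge(selects.selectSomeInfo(1));
        } // the DAL is disposed here, as the form's close handler would do
    }
}
```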
I appreciate the input.
"Willy Denoyette [MVP]" <wi*************@telenet.be> wrote in message
news:ON****************@TK2MSFTNGP14.phx.gbl...
Actually, what is most puzzling is the fact that I am not pulling that much
data: 293 rows in this case. Normally it would be much less, more on the
order of 25-50 rows at the max.
The pig that eats up the most memory is when I call a Crystal Report. As for
your suggestion to have something fetch the subset of data: if I have a
Windows form query it, wouldn't it just read that same subset of data into
the Windows app anyway?
"Ignacio Machin ( .NET/ C# MVP )" <ignacio.machin AT dot.state.fl.us> wrote
in message news:uv**************@TK2MSFTNGP10.phx.gbl...
Willy,
In my opinion, again a great answer. My compliments.
Cor
Justin,
Do this experiment. Open your application on a desktop and keep using it.
Spike up the memory by using the windows you think cause it. Keep an eye on
Task Manager for memory use. Now minimize your application. What you should
see is that your application releases all the memory it should not have
kept. Is that what you see?
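Minimizing the main window makes Windows trim the process's working set; the
same trim can be requested directly via P/Invoke. A hedged sketch; note the
caveat in the comment, this only pages memory out, it does not free anything:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class TrimWorkingSet
{
    // SetProcessWorkingSetSize asks the OS to trim the working set; the
    // effect is cosmetic in the sense that committed memory is unchanged,
    // it mostly makes the Task Manager "Mem Usage" column look smaller.
    [DllImport("kernel32.dll")]
    static extern bool SetProcessWorkingSetSize(IntPtr hProcess,
                                                IntPtr minSize, IntPtr maxSize);

    static void Main()
    {
        // Passing -1 for both sizes requests a trim to the minimum,
        // the same thing that happens when the window is minimized.
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle,
                                 new IntPtr(-1), new IntPtr(-1));
    }
}
```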
--
Po
"Justin Lazanowski" <ju****@nospam.southeastmilk.org> wrote in message
news:%2****************@TK2MSFTNGP10.phx.gbl...
Justin,
See inline.
Willy.
"Justin Lazanowski" <ju****@nospam.southeastmilk.org> wrote in message
news:%2****************@tk2msftngp13.phx.gbl... Willy,
I appreciate the input; if I may, I have a couple of follow-up questions. Please bear with me, I am very new to .NET and trying my best to get acquainted with it.
1.) We have another application that is set up the same way this one will be (I didn't write the other one), and I see 100-200 MB per user running that application (again in Task Manager, to be fair). That application isn't run by as many users as this one will be, but it clearly concerns me.
Guess this is the same application that integrates Crystal Reports, please
correct me if I'm wrong.
Now, I'm not 100% sure, but I think that CR isn't "designed" for and/or
tested in TS/Citrix scenarios (I never used it in TS); it's simply a
desktop class application, and when used in a TS scenario it would be
'classified' as a server application. The difference between a TS scenario
and a desktop is that on a desktop you don't have multiple applications
actively running in parallel, so CR is designed such that it's free to
consume all (well, most) of the available VAS, while this isn't true for a
(TS) server class application, where available memory must be shared
amongst multiple applications/sessions. I highly doubt that CR was
optimized for memory consumption.
2.) Since I don't know what AppDomains are, I am going to have to conclude that no, I am not using AppDomains. However, when I profile my application, if I run the profiler before I run a Crystal Report and then again after, I see a bunch of Crystal assemblies loaded. After I close the Crystal form (its Closed event calls this.Dispose()) I still see these assemblies loaded, and they are taking quite a bit of space. I thought that this might help me reclaim some of this memory.
Application domains are something like lightweight processes, but instead
of being OS-isolated execution units, they share the same OS process and
are managed by the CLR (or simply the VM).
Each managed application starts its life in an AppDomain initially created
by the CLR, the so-called "default" AppDomain. All modules (assemblies)
loaded by the application are bound to that AD, and they can only be
unloaded when the AD unloads. An application can, however, create
additional AD's, and load assemblies and run their code in these AD's.
These AD's can be unloaded, and as a result unload the loaded assemblies -
which is the only way to unload assemblies and release their memory
resources. So a classic scenario is that the default domain creates a new
AD, loads some application/add-in, and runs the code. When done, the
default AD unloads the additional AD and as such releases the code and
assemblies from the containing process.
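As a rough illustration of that classic scenario (the add-in name here is
hypothetical, and error handling is omitted):

```csharp
using System;

class AppDomainSketch
{
    static void Main()
    {
        // Create a secondary AppDomain beside the default one.
        AppDomain ad = AppDomain.CreateDomain("AddInDomain");

        // Run code in the new domain; the assemblies it needs are
        // loaded into that domain, not into the default one.
        // ("MyAddIn.exe" is a hypothetical add-in with a Main entry point.)
        ad.ExecuteAssembly("MyAddIn.exe");

        // Unloading the domain unloads every assembly it loaded --
        // the only way to release those assemblies from the process.
        AppDomain.Unload(ad);
    }
}
```

Note the default domain itself can never be unloaded short of ending the
process; only the additional domains can.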
3.) Can you recommend a good application profiler that would help me find any references that I may not be closing?
Well, I won't recommend any, but here is a list of what's worth looking at: http://blogs.msdn.com/brada/archive/...01/398060.aspx
Note that you must take care before you jump on this bandwagon; try to
understand the basics of .NET, its execution environment, and its (garbage
collected) memory model before you start using such tools. Sometimes they
can be lifesavers, but you can also waste a lot of time with them, and they
all tend to be somewhat intrusive. A better tool to start with is perfmon;
the CLR publishes a number of counters that can be used to watch your
memory allocation pattern.
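The same CLR counters perfmon shows can also be sampled from code via
System.Diagnostics.PerformanceCounter; a small sketch (it assumes the
counter instance name matches your process name, which is the usual case):

```csharp
using System;
using System.Diagnostics;

class ClrMemoryWatch
{
    static void Main()
    {
        // The ".NET CLR Memory" category is published by the runtime;
        // the instance name is normally the process name.
        string instance = Process.GetCurrentProcess().ProcessName;
        PerformanceCounter heapBytes = new PerformanceCounter(
            ".NET CLR Memory", "# Bytes in all Heaps", instance);
        Console.WriteLine("GC heap bytes: {0}", heapBytes.NextValue());
    }
}
```

Watching "# Bytes in all Heaps" alongside Task Manager's "Mem Usage" makes
the difference between GC heap size and working set visible.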
4.) When you say Ngen'd, what exactly are you referring to?
In short: NGEN is the native code generator - ngen.exe, part of the
framework.
Check this article for more detailed info. Note it talks about v2.0, which
is a highly improved version of what's available in v1.x, but the
principles basically remain the same: http://msdn.microsoft.com/msdnmag/is...n/default.aspx
5.) Application design. I am trying to make this application as reusable and flexible as possible. I would like to list something that I am doing now, and if you don't mind, let me know if this is a good or bad way of doing things. (Remember, I am new, so be gentle :) )
My current application logic looks something like this.
Main Form (this launches the MDI children; no data access is used on this
form, and a configuration and security class are called from an outside
project, MyApplicationClasses). An MDIChild loads its respective controls,
for instance a data grid. It instantiates a new strongly typed dataset,
then instantiates a class that exists explicitly for data access (the Data
Access Layer). The Data Access Layer loads with no constructor. All select
statements are in this class and look a little like this:

public System.Data.DataSet selectSomeInfo(int locID)
{
    // set the parameter on the SqlCommand
    sqlSelectEquipment.Parameters["@LocationCode"].Value = locID;
    // point the SqlDataAdapter at the SqlCommand (the stored procedure to use)
    sqlCmd.SelectCommand = sqlSelectEquipment;
    // ds is untyped and created as a protected member of the class;
    // RunSQL is a function in the class that runs the data adapter's Fill method
    ds = RunSQL("Equipment");
    return ds;
}
So my MDI child would take its instance of the strongly typed dataset and
then call DAL.selectSomeInfo; it would look like this:
dsEquipment1.Merge(Selects.selectSomeInfo(1));
dsEquipment1 would be bound to my datagrid. When I close the MDI form, I
dispose my Selects object and call the Dispose method on the form.
Am I heading down the wrong track here, or is this an OK way to do this? I
figured it would be better than calling a million try/catch blocks inside
every different form, and I could reuse that select command somewhere else
if needed.
Well, I don't see anything wrong with this high-level view of the design;
it's just that I'm missing some more details, so it's really hard to make a
sensible comment.
I appreciate the input.
"Pohihihi" <po******@hotmail.com> wrote in message
news:Ne****************@tornado.socal.rr.com... Justin,
<snip>
Why? It's not relevant in a TS scenario, and it's wrong to assume the
memory is released; it's not. The only thing that gets done is to swap out
the excess data pages from the WS to the page file, with as a result a lot
of hard page faults (and increased IO load) when the application resumes.
If this were possible on a TS, it would be a disaster for the applications
that stay maximized, wouldn't it?
Willy.
Just reading this over, I noticed the tid-bit I have left quoted below.
Why doesn't this leave a lot of room?
You can have up to 4 GB on a "typical" server, so that still leaves a lot,
and if you're doing a lot more on this server (which, honestly, I wouldn't
recommend), you can get a server that supports the 32 GB maximum instead.
No one process would be able to exceed the 2 GB maximum, but it doesn't
sound like you're worried about that anyway.
For your reading entertainment: http://members.shaw.ca/bsanders/Wind...ileEtc.htm#2_4
--
Reginald Blue
"I have always wished that my computer would be as easy to use as my
telephone. My wish has come true. I no longer know how to use my
telephone."
- Bjarne Stroustrup (originator of C++) [quoted at the 2003
International Conference on Intelligent User Interfaces]
Justin Lazanowski wrote: The twist here is that this application will be deployed to a terminal server and may have as many as 35 users running this application. 35 * 70MB = 2.45GB this doesn't leave much memory room for other applications, so I am a bit concerned how GC will handle this situation.
If it's a big swap, then won't it go back to the previous memory level
after a restore? Should we see any change in committed memory? Because I do
not see any change in page faults, commit charge, paged memory, or IO (on
minimize or restore). I guess I might be missing some big concept here, but
a few apps we have running on TS behave this way. They are .NET apps. They
just keep on using memory up to a big limit (sometimes one instance takes
25% of total memory). The bad part is that they do not release memory when
it is needed by other apps. We have done many code reviews, and it is
frustrating that we can't see what is wrong.
"Willy Denoyette [MVP]" <wi*************@telenet.be> wrote in message
news:ej**************@tk2msftngp13.phx.gbl...
<snip>
Willy.
Willy,
Thank you so much for your help with this. I really appreciate the input,
you have been a valuable resource.
If I may ask one more quick question.
If I load the report into a new AppDomain and then destroy that AppDomain
when the user closes the report, what you're saying is that any Crystal
assemblies that were loaded should be unloaded; is that the case?
Is there a better way to run reports that may not kill so much memory? With
the applications that you have done on TS, how have you handled reporting,
or was there no reporting in that situation?
Again I really appreciate the input
Justin
"Willy Denoyette [MVP]" <wi*************@telenet.be> wrote in message
news:uQ**************@tk2msftngp13.phx.gbl...
<snip>
"Pohihihi" <po******@hotmail.com> wrote in message
news:NE**************@tornado.socal.rr.com...
<snip>
No, it won't go back to the previous level. When you minimize a Windows
application, the OS starts trimming the working set (the WS is "Mem Usage"
in Task Manager); note that this doesn't mean that pages are necessarily
swapped out to the paging file - they are just marked as 'standby' or
'modified'.
After a 'maximize' the process gets reactivated and starts running code (a
redraw of the main window, to start with); this leads to page faulting
because required pages might no longer be mapped in the WS. Now, there are
two kinds of page faults, hard and soft. If the pages are still in memory
but not mapped in the WS, the process incurs a soft page fault and the page
is remapped into the WS; however, if the page is no longer in memory, it
incurs a hard page fault, which leads to a remap from the paging file or
the load library file (EXE or DLL or ...).
Now let's look at a managed application running in a number of TS sessions.
First, a managed Windows program has a higher initial WS than a native
Windows application; the reason is that at program start a number of FCL
libraries get loaded and a lot of code gets JITted just to initialize the
runtime and the Forms environment. When such an application gets minimized,
the WS will drop drastically - as a sample, say from 14 MB to 500 KB. When
restored, the WS will grow to something like 2 MB (as a result of page
faulting; watch the perf counter Process\Page Faults/sec), but this is only
the number of pages that was needed to restore the application's passive
state. The pages used to initialize the Forms environment are no longer
mapped, as they aren't required (yet); whenever you start executing
functions by activating UI elements, you'll see the WS grow again.
Each process consists of a number of code pages and a number of data pages.
Code pages are pages read from a load library (a native code DLL or EXE,
including NGEN'd native code assemblies); data pages are pages read from
the load library (a native DLL's or EXE's data segment, and a managed
assembly's MSIL and metadata) plus pages from the heaps (including the GC
heaps). Data pages are also called private pages because they aren't shared
across processes; code pages are sharable by definition. In a TS scenario,
where you might have several instances of the same application running, the
sharing of code pages is a big plus - that's why you certainly have to NGEN
your application assemblies when running in TS. Now say you are running 20
instances and one of them gets minimized: the effect on memory consumption
is nil. The WS of your minimized application will still drop to 500 KB, but
the shared (code) pages will stay in RAM because they are certainly used by
other instances, and your data pages will stay in RAM as long as there is
sufficient free RAM. When there is memory pressure, your minimized
application's data pages are the first to be swapped out to the paging
file, and you will incur a large IO overhead to bring them back into the
WS; this overhead affects all programs running in all sessions. The latter
is the reason why I said don't use this minimize trick - you are just
fooling yourself. When there is insufficient memory to run a certain number
of instances, you have a few choices: run fewer instances, add additional
RAM, or optimize your WS.
As to your questions about page faults and committed memory: you should see
page faults when minimizing and maximizing; just look at the Process\Page
Faults/sec counter in perfmon. The committed charge should not change
between minimize and maximize, and neither should the page file bytes. The
committed charge varies with process activity (growth of heaps and stacks),
and the page file bytes vary accordingly. The "IO process" counters are
used when executing process-initiated IO; page file IO is initiated by the
memory manager in the OS kernel.
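Those page faults can also be sampled without opening perfmon, using the
same Process counter from code (a sketch; the instance name is assumed to
be the process name, which is the usual case):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class PageFaultWatch
{
    static void Main()
    {
        string instance = Process.GetCurrentProcess().ProcessName;
        // "Page Faults/sec" counts both soft and hard faults.
        PerformanceCounter faults = new PerformanceCounter(
            "Process", "Page Faults/sec", instance);
        faults.NextValue();          // first call primes the rate counter
        Thread.Sleep(1000);          // sample over one second
        Console.WriteLine("Page faults/sec: {0}", faults.NextValue());
    }
}
```

Minimize and restore the application while this samples and you should see
the fault rate spike on restore, as described above.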
Willy.
Justin,
inline
Willy.
"Justin Lazanowski" <ju****@nospam.southeastmilk.org> wrote in message
news:eu**************@TK2MSFTNGP14.phx.gbl... Willy,
Thank you so much for your help with this. I really appreciate the input, you have been a valuable resource.
If I may ask one more quick question.
If I load the report into a new AppDomain and then destroy that AppDomain when the user closes the report, what you're saying is that any Crystal assemblies that were loaded should be unloaded; is that the case?
Yes, these are unloaded when the AppDomain unloads.
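A rough sketch of that idea, assuming a MarshalByRefObject wrapper (every
name here is hypothetical). The wrapper lives in its own small assembly
that references the Crystal assemblies; the wrapper assembly loads in both
domains, but since ShowReport only executes inside the report domain via
the proxy, the Crystal assemblies load only there:

```csharp
using System;

// In a separate assembly, e.g. ReportHost.dll, which references Crystal.
public class ReportHost : MarshalByRefObject
{
    public void ShowReport(string reportName)
    {
        // ... create the Crystal viewer form here and ShowDialog() it ...
    }
}

public class ReportLauncher
{
    public static void Run(string reportName)
    {
        AppDomain reportDomain = AppDomain.CreateDomain("ReportDomain");
        ReportHost host = (ReportHost)reportDomain.CreateInstanceAndUnwrap(
            "ReportHost", "ReportHost");
        host.ShowReport(reportName);
        // Unloading the domain unloads the Crystal assemblies with it.
        AppDomain.Unload(reportDomain);
    }
}
```

The cross-domain calls go through remoting, so keep the wrapper's interface
coarse (pass the report name, not live objects).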
Is there a better way to run reports that may not kill so much memory? With the applications that you have done on TS how have you done the reporting, or was there no reporting in that situation?
CR is designed for small-scale businesses and the desktop; the problem you
have is that CR-integrated applications run on TS. Again, that's a worst
case scenario, as CR was not designed for this. (Not sure, but I think
there might be a licensing issue too, as it might be considered a server
scenario - one application shared by multiple users...)
A large customer I know runs "Crystal Server", which is designed for large
enterprises, on a W2K3 TS for 100+ users. I suggest you contact BI for
details on the product and pricing.
Again I really appreciate the input
Justin
"Willy Denoyette [MVP]" <wi*************@telenet.be> wrote in message news:uQ**************@tk2msftngp13.phx.gbl... Justin,
<snip>
Thanks Willy.
"Willy Denoyette [MVP]" <wi*************@telenet.be> wrote in message
news:Oo**************@TK2MSFTNGP14.phx.gbl... "Pohihihi" <po******@hotmail.com> wrote in message news:NE**************@tornado.socal.rr.com... if a big swap then won't it go back to previous memory level after restore? should we see any change in Committed memory? cause I do not see any change in pagefault/CommitCharge/paged memory or IO increase (min or restore). I guess I might be missing some big concept here but few apps we have are running on TS and behave this way. They are dotNet app. They just keep on using memory up to a big limit (some time 1 instance 25% of tot mem). Bad part is they do not release memory if needed by other apps. We have done many code reviews and it is frustrating that can't see what is wrong.
No, it won't go back to the previous level. When you minimize a windows application, the OS starts trimming the Working Set (WS is Mem. Usage in Taskman), note that this doesn't mean that pages are necessarily swapped out to the paging file, they are just marked as 'standby' or 'modified'. After a 'maximize' the process gets reactivated and starts running code (a redraw of the main window to start with), this will lead to page faulting because required pages might no longer be mapped in the WS. Now there are two kind of page faults, hard and soft. In the case that the pages are still in memory but not mapped in the WS, the process incurs a soft page fault and the page is remapped in the WS, however, if the page is no longer in memory it incurs a hard page fault which leads to a remap from the paging file or the load library file (exe or dll or ...). Now lets look at a managed application running in a number of TS sessions. First, a managed windows program has a higher initials WS than a native windows application, the reason is that at program start a number of FCL libraries get loaded and a lot of code gets JITt'd just to initialize the runtime and the Forms environment. When such application gets minimized the WS will drop drastically, as a sample say from 14MB to 500 KB, when restored the WS will grow to something like 2MB (as a result of page faulting, watch perf counter process\page faults per sec.), but this is only the number of pages that was needed to restore the applications passive state, the pages used to initialize the Forms environment are no longer mapped as they aren't required (yet), whenever you start executing function by activating UI elements you'll see the WS grow again. Each process consists of a number of code pages and a number of data pages. 
Code pages are pages read from a load library ( a native code DLL or EXE including DLL's including NGEN'd native code assemblies ), data pages are pages read from the load library (native and EXE's data segment and managed assembly MSIL and metadata) and pages from the heaps (including the GC heaps). Data pages are also called private pages because they aren't shared across processes, code pages are sharable by definition. In a TS scenario where you might have several instances running of the same application, the sharing of code pages is a big plus, that's why you certainly have to NGEN your application assemblies when running in TS. Now if say you are running 20 instances and one of them gets minimized, the effect on the memory consumption is nil, the WS of your minimized application will still drop to 500Kb, but the shared (code) pages will stay in RAM because they are certainly used by other instances, and your data pages will stay in RAM as long as there is sufficient free RAM. When there is memory pressure, your minimized application datapages are the first to be swapped out to the paging file and you will incur a large IO overhead to bring them back in the WS, this overhead affects all programs running in all sessions. That latter is the reason why I said don't use this minimize trick, you are just fooling yourself, when there is insufficient memory to run a certain number of instances you have a few chooses; run less instances, add additional RAM or optimize your WS.
As to your questions about page faults and committed memory: you should see page faults when minimizing and maximizing; just look at Process\Page Faults/sec in Perfmon. The committed charge should not change between minimize and maximize, and neither should the page file bytes. The committed charge varies with process activity (growth of heaps and stacks), and the page file bytes vary accordingly. The "IO" process counters only count process-initiated IO; page file IO is initiated by the memory manager in the OS kernel.
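If you'd rather read those same counters from code than from Perfmon, System.Diagnostics.PerformanceCounter exposes them. A minimal sketch; the instance name "MyApp" is a placeholder for your own process name:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class CounterWatch
{
    static void Main()
    {
        // The Process category instance name is the process name
        // without ".exe"; "MyApp" is a placeholder.
        string instance = "MyApp";

        using (PerformanceCounter faults =
                   new PerformanceCounter("Process", "Page Faults/sec", instance))
        using (PerformanceCounter ws =
                   new PerformanceCounter("Process", "Working Set", instance))
        {
            for (int i = 0; i < 10; i++)
            {
                // Rate counters need two samples, so the first
                // Page Faults/sec reading will always be 0.
                Console.WriteLine("faults/sec = {0}, WS = {1} KB",
                                  faults.NextValue(), ws.NextValue() / 1024);
                Thread.Sleep(1000);
            }
        }
    }
}
```

Minimize and restore the application while this runs and you should see a burst of page faults accompany the WS growth, with no change in committed bytes.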
Willy.