Bytes IT Community

Memory usage issues

SDS
I am writing an ASP.NET application (in C#) that, as part of a particular response, populates a MemoryStream object with binary data collected from a Process object's StandardOutput. This data can be between 60MB and 100MB under normal circumstances, and I write it to the response stream.

I'm noticing that calls to this particular page cause W3WP to use upwards of 60MB. When W3WP starts up, it's using around 24MB or so. Calls to simpler pages do not increase memory usage significantly at all, so it makes sense to me that this memory usage is directly related to those stream objects.

In order to get memory usage back down, I've simplified the code and tried everything I can think of: calling .Close() on the stream, calling .Close() on the BinaryWriter, and casting the stream to IDisposable in order to call .Dispose() on it. I've even made a call to GC.Collect() as a last resort to see if it had any effect. Nothing is working. I have a high level of confidence that my code is releasing resources as it should (unless, of course, there is something I can do beyond the methods I described above).
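For reference, a minimal sketch of the cleanup pattern being described, with the `using` statement standing in for the explicit Close()/Dispose()/cast steps (names and signatures here are illustrative, not the poster's actual code):

```csharp
using System;
using System.IO;

static class Cleanup
{
    // Hypothetical reconstruction of the cleanup described above (names are
    // illustrative). 'using' guarantees Dispose(), which also closes the
    // stream and writer, even if an exception is thrown, so the explicit
    // Close()/Dispose()/IDisposable-cast steps become unnecessary.
    public static byte[] Collect(Stream processOutput)
    {
        using (var buffer = new MemoryStream())
        using (var writer = new BinaryWriter(buffer))
        {
            processOutput.CopyTo(buffer); // gather the binary output
            writer.Flush();
            return buffer.ToArray();      // note: this copies the data yet again
        }
    }
}
```

Note that deterministic disposal like this frees the object for collection but, as discussed below, does not by itself shrink the process's memory footprint.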

The desired result is to see W3WP drop back down to (or at least somewhere close to) the 24MB it initially started with. Are my expectations unreasonable? What am I missing here? Is there something I don't understand about the way garbage collection works?

Incidentally, subsequent calls to this same page do not cause the
memory usage to double/triple/etc. It seems to be holding tight at the
60MB mark or so (I can see the memory usage increase briefly and then
drop back down). So, for some reason, after the initial call to the
page in question, there is always this 40MB or so chunk that just
refuses to go away.

Thanks in advance! =)

Nov 17 '05 #1
5 Replies


Inline

Willy.

"SDS" <ss*******@gmail.com> wrote in message
news:11********************@g14g2000cwa.googlegroups.com...

The desired result is to see the W3WP drop back down to (or somewhere
close at least) the 24MB it initially started up with. Are my
expectations unreasonable? What am I missing here? Is there something
I don't understand about the way garbage collection works?

No, this is the normal behavior and a result of how the CLR manages its memory pool (the managed heaps). Whenever you create such a large object (or a large number of smaller objects), the CLR needs to grab additional memory from the process heap in order to store the object data in the large object heap. After the object is released and collected by the GC, that extra allocated memory is not returned to the OS as long as the OS is not requesting the CLR to do so. The task of the GC is only to remove garbage from the managed heaps and to compact the objects inside the heaps, not to manage the heaps themselves; that is the task of the CLR in cooperation with the OS.
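The distinction can be observed directly: GC.GetTotalMemory reports the managed heap, which does shrink after a full collection, while the process working set (what Task Manager shows for w3wp.exe) is not necessarily handed back to the OS at the same time. A small console sketch, with illustrative sizes:

```csharp
using System;

static class HeapDemo
{
    // Illustrative sketch: a ~60MB allocation lands on the large object heap;
    // after the reference is dropped and a full collection runs, the managed
    // heap shrinks, but the process working set often stays high because the
    // CLR keeps the reserved memory unless the OS asks for it back.
    public static (long during, long after) Run()
    {
        var big = new byte[60 * 1024 * 1024]; // ~60MB array, goes to the LOH
        big[0] = 1;                           // touch it so it is really committed
        long during = GC.GetTotalMemory(false);

        big = null;                           // drop the only reference
        long after = GC.GetTotalMemory(true); // force a full, blocking collection

        Console.WriteLine($"managed: {during} -> {after}");
        Console.WriteLine($"working set: {Environment.WorkingSet}"); // may stay high
        return (during, after);
    }
}
```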

Nov 17 '05 #2

SDS
Very cool, I was expecting an answer along these lines. =)

So, my dev box has 2GB of RAM, I believe. Obviously 60MB is not a concern to me personally, but my concern is more that this will be installed by others on boxes with substantially less RAM (like 256MB, for example, where 60MB *is* a big deal). Would it be safe to say that the behavior I am observing may vary from system to system, depending on the hardware configuration of that system?

Also, if watching the process memory utilization is not an indication
of whether or not I am cleaning up resources properly, what would be a
good way to monitor such things?

Thanks!

Nov 17 '05 #3

SDS,

You might want to check the performance counters for .NET. They will give you information such as the number of GCs, bytes reclaimed per GC, and a whole slew of other information (there are 12 categories alone for .NET, 2 of which are specific to data providers and the other 10 related to the CLR, with individual counters in each category).
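Those counters can also be read programmatically via System.Diagnostics.PerformanceCounter. A sketch only (Windows / .NET Framework; the category and counter names below are the standard ".NET CLR Memory" ones, but verify them in perfmon on your machine):

```csharp
using System;
using System.Diagnostics;

static class ClrCounters
{
    // Dumps a few ".NET CLR Memory" counters for the current process.
    // The instance name for a performance counter is the process name,
    // e.g. "w3wp" for the IIS worker process.
    public static void Dump()
    {
        string instance = Process.GetCurrentProcess().ProcessName;
        string[] names =
        {
            "# Gen 0 Collections", "# Gen 1 Collections", "# Gen 2 Collections",
            "# Bytes in all Heaps", "Large Object Heap size"
        };
        foreach (var name in names)
        {
            using (var c = new PerformanceCounter(".NET CLR Memory", name, instance, true))
                Console.WriteLine($"{name}: {c.NextValue():N0}");
        }
    }
}
```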

Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"SDS" <ss*******@gmail.com> wrote in message
news:11*********************@z14g2000cwz.googlegro ups.com...
Very cool, I was expecting an answer along these lines. =)

So, my dev box has 2GB of RAM I believe. Obviously 60MB is not a
concern to me personally, but my concerns were more along the lines of
this being installed by others on boxes with substantially less RAM
(like 256MB for example, where 60MB *is* a big deal). Would it be safe
to say that the behavior I am observing may vary from system to system,
depending on the hardware configuration of that system?

Also, if watching the process memory utilization is not an indication
of whether or not I am cleaning up resources properly, what would be a
good way to monitor such things?

Thanks!

Nov 17 '05 #4

Hi,
"SDS" <ss*******@gmail.com> wrote in message
news:11*********************@z14g2000cwz.googlegroups.com...
Would it be safe to say that the behavior I am observing may vary from system to system, depending on the hardware configuration of that system?


It will vary, of course; if you have less memory, the OS may reclaim the memory sooner. What you can be sure of is that if you have a MemoryStream of 60MB, you will have that consumption no matter where you run it, so the OS would need to page part of the memory used by other programs out to disk.
Cheers,

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation

Nov 17 '05 #5


"SDS" <ss*******@gmail.com> wrote in message
news:11*********************@z14g2000cwz.googlegro ups.com...
Very cool, I was expecting an answer along these lines. =)

So, my dev box has 2GB of RAM I believe. Obviously 60MB is not a
concern to me personally, but my concerns were more along the lines of
this being installed by others on boxes with substantially less RAM
(like 256MB for example, where 60MB *is* a big deal). Would it be safe
to say that the behavior I am observing may vary from system to system,
depending on the hardware configuration of that system?

Also, if watching the process memory utilization is not an indication
of whether or not I am cleaning up resources properly, what would be a
good way to monitor such things?

Thanks!


Well, honestly, running ASP.NET on a 256MB box isn't exactly what I would suggest anyway.

Another thing you shouldn't do is use a webserver as a means of transferring huge chunks of data like this. In general, in .NET you should try NOT to allocate such huge "objects" at all. The reason is that they end up on the LOH, which is never compacted by the GC, so the danger exists that you'll end up with a LOH so badly fragmented that you get memory allocation failures while there is still sufficient (total) memory available, just not as one contiguous block.
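A sketch of that alternative: copying in small, fixed-size chunks so no single buffer ever approaches the LOH threshold. Here `destination` is a stand-in for the page's Response.OutputStream, and the buffer size is illustrative:

```csharp
using System;
using System.IO;

static class StreamingCopy
{
    // Instead of buffering the whole 60-100MB in a MemoryStream (which lands
    // on the LOH), copy the process output to the response in small, reusable
    // chunks. Peak managed allocation stays at the buffer size.
    public static long Copy(Stream source, Stream destination)
    {
        var buffer = new byte[64 * 1024]; // 64KB: well below the LOH threshold
        long total = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }
}
```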
Also note that ASP.NET has a protection mechanism to prevent excessive memory consumption by badly behaving web applications; the memory limit can be set to a certain percentage of the total memory in the machine config file.
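The knob in question is the memoryLimit attribute of the processModel element in machine.config (IIS 5 process model; the value shown here is illustrative and expresses a percentage of physical memory at which ASP.NET recycles the worker process):

```xml
<!-- machine.config fragment (illustrative value) -->
<system.web>
  <processModel enable="true" memoryLimit="60" />
</system.web>
```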
Monitoring memory resources can best be done using perfmon; with it you can watch the OS counters for process memory and the CLR counters for GC memory (both the normal and large object heaps).

Willy.


Nov 17 '05 #6
