Bytes IT Community

Memory Problem Again

OK, let's try this again.

I have a program that searches all disk drives for certain file types.
When it finds a file it writes a record to a Firebird database. The program
normally uses 40-44 MB. However, after running for about a minute,
memory usage suddenly jumps to 440-450 MB for about 15-20 seconds
and then instantly drops back to normal. While the memory load is at
its peak the program stops scanning, and it resumes when memory usage
goes back to normal. Any ideas what could be going on?

I downloaded the SciTech memory profiler, and according to it the memory
being allocated is not in any of my classes. While Sysinternals Process
Explorer reports memory usage jumping from 40 MB to 450 MB, the SciTech
profiler shows no jump in memory usage internal to my program at all.
Something is happening, however. The SciTech profiler does show my program
going into a steady state in which no memory is being allocated or freed,
i.e. no processing within my code. Any Microsoft guys here have any idea?
Nov 17 '05 #1
8 Replies


I don't think Process Explorer is going to help you in this case,
as it gives you the same information that Task Manager does: it is
showing you the working set, which is NOT an accurate indicator of
the memory that your program is allocating.

So the SciTech profiler is showing that memory is being allocated
outside of your code. Well, this is not so surprising. I haven't seen your
code, but perhaps you have a few large data sets, or you are constantly
fetching large amounts of data. Those data sets add up in memory and are
eventually freed by the GC, which could be exactly what you are seeing.

The really important question here is: how is this affecting the
performance of your program? This kind of behavior is actually perfectly
normal (and as you admit, the memory goes back down). It's part of the price
you pay for a memory-managed system.

Or are you just worried because you see the number go up?
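The distinction between the two numbers can be seen directly from inside the process. A minimal sketch using standard .NET APIs (this is illustrative, not the original poster's code):

```csharp
using System;
using System.Diagnostics;

class MemorySnapshot
{
    static void Main()
    {
        // Working set: physical memory the OS currently has mapped to the
        // process, including the CLR itself, JITted code, and any unmanaged
        // allocations. This is what Task Manager / Process Explorer report.
        long workingSet = Process.GetCurrentProcess().WorkingSet64;

        // Managed heap: bytes currently allocated by managed code.
        // Passing false means "estimate without forcing a collection".
        long managedHeap = GC.GetTotalMemory(false);

        Console.WriteLine("Working set:  {0:N0} bytes", workingSet);
        Console.WriteLine("Managed heap: {0:N0} bytes", managedHeap);
    }
}
```

If the working set spikes while the managed-heap figure stays flat, the extra memory is coming from outside the managed heap.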
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"TheB" <*****@xxx.com**> wrote in message
news:Xn******************@63.218.45.254...

Nov 17 '05 #2

Thanks for your response. I am not using large data sets, and I did suspect
the GC. I put in a call to System.GC.Collect() that I am probably calling
more often than needed. SciTech does show the results of calling the GC:
memory allocated and used stays very consistent, which is what I would
expect. Given that, I would be very surprised if the GC made memory usage
jump from 40-50 MB to 450 MB. That does not seem right. It affects my
program drastically, because when the memory jumps to 450 MB my program
stops processing for about 15-20 seconds. This is verified by SciTech:
no memory is being allocated or freed, and no objects are being created or
destroyed. Also, this behavior is totally unacceptable for deployment on my
target customers' machines, which typically have 512 MB of RAM. If this
program does this, the machine will grind to a halt.

This kind of behavior is not normal. A program does not go from using
40-50 MB to ten times that and stay there for 15-20 seconds.

The real question, of course, is what on earth can cause that to occur?

"Nicholas Paldino [.NET/C# MVP]" <mv*@spam.guard.caspershouse.com> wrote
in news:eI**************@TK2MSFTNGP12.phx.gbl:
I don't think that process explorer is going to help you in this
case,
as it gives you the same information that task manager does, that is
to say, it is showing you the working set, which is NOT an (accurate)
indicator of the memory that your program is allocating.

So the scitech profiler is showing that memory is being allocated
outside of your code. Well, this is not so surprizing. I haven't
seen your code, but perhaps you have a few large data sets, or you are
constantly fetching large amounts of data. Those data sets add up in
memory, and then they are freed by the gc eventually, which could be
exactly what you are seeing.

The question that is really important here is how is this
affecting the
performance of your program? This kind of behavior is actually
perfectly normal (and you admit, the memory goes back down). It's
part of the price you pay for a memory managed system.

Or, are you just worried because you see the number go up?


Nov 17 '05 #3



If the SciTech profiler shows no excessive GC heap allocations while the
memory jumps to 440-450 MB, that means the memory is not being allocated
by the CLR. Instead, it must be allocated by an unmanaged component outside
the control of the GC, or, simply put, by an "unmanaged resource". This is
why I said you should take a look at what the code is doing when the memory
jumps sky-high. Are you sure you are correctly disposing of your disposable
objects in a timely manner? It looks like you (or the Firebird DB code) are
relying on the finalizer to clean up. Anyway, without you showing some code
it's really hard to say where you should start looking. Also, I would like
to know how you are scanning your drives for file types; this seems like a
very expensive kind of action to perform on a user's system if not done well!
Willy.
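For reference, the deterministic-disposal pattern being described looks roughly like this. This is a sketch only: it assumes the Firebird ADO.NET provider's FbConnection/FbCommand classes, and the connection string, table, and column names are hypothetical; the exact parameter-adding call may differ by provider version.

```csharp
using FirebirdSql.Data.FirebirdClient;

class FoundFileWriter
{
    // Hypothetical connection string; substitute your own.
    const string ConnString = "Database=scan.fdb;User=SYSDBA;Password=masterkey";

    public static void WriteRecord(string filePath)
    {
        using (FbConnection conn = new FbConnection(ConnString))
        {
            conn.Open();
            using (FbCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText =
                    "INSERT INTO found_files (path) VALUES (@path)";
                cmd.Parameters.Add("@path", filePath);
                cmd.ExecuteNonQuery();
            } // command disposed here, as soon as it is no longer needed
        }     // connection closed and disposed here, not at finalization
    }
}
```

In practice the connection would be opened once (or drawn from a pool) rather than per record; the point of the pattern is that Dispose runs at the end of each using block instead of whenever the finalizer eventually gets to it.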

Nov 17 '05 #4

That's the thing. You say it isn't normal, but you offer no code and
no details of what you are doing (other than using the Firebird data
provider, which is vague at best) to support that claim.

It's very possible that this could be normal.

As Willy said, are you properly disposing of everything that implements
IDisposable? The Firebird DB provider is the only clue you gave as to what
you are doing, and data providers have the potential to use unmanaged code;
they also provide connections, which should be disposed of in a timely
manner.

--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com


Nov 17 '05 #5

Thanks for the responses. I have been experimenting and have narrowed it
down to the Firebird .NET interface. I am disposing of everything; that was
the first thing I thought of. I am surprised that you would accept as normal
a program instantly going from 40 MB to over ten times that memory usage,
with all processing stopping for over 20 seconds.



Nov 17 '05 #6

Why not? Anything is possible. And again, because I don't know what
YOU are doing with the Firebird provider, I couldn't tell you.

For all I know, you might be selecting a few million rows into a data
set. That would definitely cause a spike, one would think.

But I don't know that, because all you have indicated you were doing is
that you were using the Firebird provider.

It's your application; we know nothing about it beyond what you tell us,
which up to this point has not been much. So when you say you are
surprised that I accept it as normal: I am not necessarily saying that.
What I am saying is that, given the limited set of inputs, it is very
feasible that this would be normal for your app.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com


Nov 17 '05 #7

> Why not? Anything is possible. And again, because I don't know what
> YOU are doing with the Firebird provider, I couldn't tell you.

Actually, I did explain that it was a tight recursive loop searching for
files, and when it found one it wrote out a record. As I said, it does the
same thing over and over.

> For all I know, you might be selecting a few million rows into a data
> set. That would definitely cause a spike, one would think.

If you had read the post you would have known I wasn't selecting a million
records. I'm writing out one record at a time.

> But I don't know that, because all you have indicated you were doing is
> that you were using the Firebird provider.

You don't know that because you apparently didn't read the post.

> It's your application; we know nothing about it beyond what you tell us,
> which up to this point has not been much. So when you say you are
> surprised that I accept it as normal: I am not necessarily saying that.
> What I am saying is that, given the limited set of inputs, it is very
> feasible that this would be normal for your app.

Not given what it is doing and what I explained: it is quite surprising for
memory to all of a sudden jump to 10x normal usage when the program is doing
the same thing over and over.

In any case, thanks for your input.
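The tight recursive scan-and-record loop described above can be sketched as follows (names are hypothetical, not the poster's actual code; the record-writing step is passed in as a callback):

```csharp
using System;
using System.IO;

class DriveScanner
{
    // Recursively scan a directory tree for files matching a pattern,
    // invoking a callback (e.g. "write one record to the database")
    // for each match.
    public static void Scan(string dir, string pattern, Action<string> onFound)
    {
        string[] files;
        try
        {
            files = Directory.GetFiles(dir, pattern);
        }
        catch (UnauthorizedAccessException)
        {
            return; // skip directories the process is not allowed to read
        }

        foreach (string file in files)
            onFound(file); // one record per found file, not a bulk select

        foreach (string sub in Directory.GetDirectories(dir))
            Scan(sub, pattern, onFound); // recurse into subdirectories
    }
}
```

Nothing in a loop like this accumulates managed memory on its own, which is consistent with the suspicion that the spike originates in the database layer rather than the scan.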
Nov 17 '05 #8

Yeah, it's something in their code. You are right, scanning drives
looking for file types can be expensive. It's actually working pretty
well, since there are certain directories we can exclude. We are dealing
with users who have no idea where things are, so we have to find them for
them. Thanks for your input; it helped confirm what I had already
suspected.
Nov 17 '05 #9

This discussion thread is closed

Replies have been disabled for this discussion.