Bytes | Developer Community
Large swap allocation bug

Hi,

I am running some code compiled with g++ on Linux and have found
that after some time, the program allocates a huge amount of swap
space: 250MB on my machine, which has 512MB of physical RAM, and
700MB on another server with 1GB.

I have used vmstat to track swap usage over time and observed that
the memory is not being "thrashed"; there is simply a large amount of
data that has been swapped out. This still slows down my PC and is
therefore a big problem.

There don't appear to be any memory leaks (I've checked the run-time
behaviour using mpatrol), and the amount of swap does not increase
linearly; rather, it grows in a single large step.

My code runs a clustering algorithm that iteratively clusters a data
set with an increasing number of clusters (represented by a class
CCluster that I wrote to implement the algorithm), so it makes sense
that the program would use slightly more memory over time - but not
the instantaneous leap that I am observing.

Has anybody seen this sort of behaviour before? Can anybody suggest
where I should begin looking for a bug?

Cheers,
Rob
Jul 22 '05 #1
Robert May wrote:
[snip]

1) This is off topic for c.l.c++, as it's not even remotely a question
about the C++ language.

2) If I understand you correctly, it's likely that the problem is not so
much a *bug* as an unfortunate interaction with the underlying allocator
on your system (nothing wrong with the allocator, it just sounds like
you've hit a pessimal case). It goes something like this:

You allocate a chunk of N bytes.
You free that chunk of N bytes (which are now available for further
allocation).
You allocate a chunk of N + m bytes (so those original N bytes are
useless; that chunk is too small).
You free that chunk of N + m bytes (...)
You allocate a chunk of N + m + l bytes...

I guess you get the picture... ;-)

Most likely the best solution would be custom allocation -- e.g.
reserve one buffer big enough for the worst case up front and
construct your objects into it with placement new.

HTH,
--ag

--
Artie Gold -- Austin, Texas

"If you don't think it matters, you're not paying attention."
Jul 22 '05 #2
