
Can I improve efficiency of 50,000 "new"'ed objects?

Are there features or containers in standard C++ or the STL which will
assist in minimizing memory fragmentation when creating a huge
number of objects on the heap?

Currently my application allocates 50,000 objects using "new". I can
store the pointers to those objects in a container/collection class, and
indeed this assists in processing (sort, delete etc.), but is there any
way for me to make my "new"'ed objects reside in some previously
allocated memory chunk, to improve performance (allocate in bigger
chunks) and avoid memory fragmentation at the same time?

Just one example of the inefficiency: when I delete my objects, I have
to loop through them all and "delete" each one instead of simply deleting
one big chunk. Obviously I cannot use "delete[]", as this would only
delete my array and leave me with a massive memory leak.

Any pointers greatly appreciated!
/Casper
Jul 22 '05 #1
4 Replies


"Casper" <ca****@jbr.dk> wrote in message
news:Xo*********************@weber.videotron.net.. .
Is there features/containers of standard C++ or the STL which will
assist in minimizing memmory fragmentation upon creation of a huge
ammount of objects on the heap?

Currently my application allocates 50.000 objects using "new". I can
store the pointers to those objects in a container/collection class and
indeed this assists in processing (sort, delete etc.) but is there any
way for me to make my "new"'ed object resist in some previously
allocated memmory chunk to improve performance (allocate in bigger
chunks) and avoid memmory fragmentation at the same time?

Just one example of the inefficiency is when I delete my objects, I have
to loop through all and "delete" instead of simply delete in one big
chunk. Obviously I can not use "delete[]" as this would only delete my
array and leave me with a massive memmory leak.


What are these objects? Are these objects all of the same type? If they
are, then a std::vector can store them contiguously. If not, then things
might be more difficult.
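
A minimal sketch of the contiguous-storage idea, assuming all objects are
of one type (the Record type, its fields, and the byKey comparator are
placeholders for illustration, not from the original post):

#include <algorithm>
#include <vector>

// Placeholder type standing in for the poster's object.
struct Record {
    int key;
    double value;
};

// Comparison function used for sorting by key.
bool byKey(const Record& a, const Record& b) { return a.key < b.key; }

int main() {
    // One vector owns all 50,000 objects in a single contiguous block,
    // so there is no per-object new/delete at all.
    std::vector<Record> records;
    records.reserve(50000);                    // one up-front allocation
    for (int i = 0; i < 50000; ++i) {
        Record r = { i, i * 0.5 };
        records.push_back(r);
    }

    // Sorting works directly on the objects, no pointer indirection.
    std::sort(records.begin(), records.end(), byKey);

    // All objects are released in one step when the vector is destroyed
    // (or explicitly via records.clear()).
    return 0;
}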

--
David Hilsee
Jul 22 '05 #2

"Casper" <ca****@jbr.dk> wrote in message
news:Xo*********************@weber.videotron.net.. .
Is there features/containers of standard C++ or the STL which will
assist in minimizing memmory fragmentation upon creation of a huge
ammount of objects on the heap?

Currently my application allocates 50.000 objects using "new". I can
store the pointers to those objects in a container/collection class
and indeed this assists in processing (sort, delete etc.) but is
there any way for me to make my "new"'ed object resist in some
previously allocated memmory chunk to improve performance (allocate
in bigger chunks) and avoid memmory fragmentation at the same time?

Just one example of the inefficiency is when I delete my objects, I
have to loop through all and "delete" instead of simply delete in
one big chunk. Obviously I can not use "delete[]" as this would only
delete my array and leave me with a massive memmory leak.


In his book Modern C++ Design, Andrei Alexandrescu implements a
"small-object allocator". I recommend buying the book, but you can also
download Loki (the library in which he implements everything from the
book) from SourceForge. You can probably modify it to suit your needs,
though it may well be good enough as it is.
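
For reference, the general idea behind a small-object allocator is to give
the class its own operator new/delete backed by a pool, so individual
allocations never hit the general-purpose heap. A rough sketch of that
technique follows (this is not Loki's actual interface; the Widget class
and every name in it are made up for illustration):

#include <cstddef>
#include <new>
#include <vector>

class Widget {
public:
    void* operator new(std::size_t size) {
        if (size != sizeof(Widget))        // e.g. a derived class: fall back
            return ::operator new(size);
        if (freeList_.empty())
            grow();
        void* slot = freeList_.back();
        freeList_.pop_back();
        return slot;
    }

    void operator delete(void* p, std::size_t size) {
        if (size != sizeof(Widget)) { ::operator delete(p); return; }
        freeList_.push_back(p);            // recycle the slot, no heap call
    }

private:
    // Allocate slots in large blocks instead of one at a time.
    static void grow() {
        const std::size_t slots = 1024;
        char* block = static_cast<char*>(::operator new(slots * sizeof(Widget)));
        blocks_.push_back(block);          // blocks are kept for the program's
                                           // lifetime in this simple sketch
        for (std::size_t i = 0; i < slots; ++i)
            freeList_.push_back(block + i * sizeof(Widget));
    }

    static std::vector<void*> freeList_;
    static std::vector<char*> blocks_;

    int data_;                             // placeholder payload
};

std::vector<void*> Widget::freeList_;
std::vector<char*> Widget::blocks_;

Loki packages the same idea in a reusable form, so using it directly saves
reinventing this machinery.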

Vladimir Ciobanu
Jul 22 '05 #3

Search for "pool allocation".
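
One common form of pool allocation, matching what the original post asks
for, is to grab one big chunk up front, construct the objects into it with
placement new, and give the memory back in a single step. A minimal sketch,
assuming all objects have the same type (the Item type is a placeholder):

#include <cstddef>
#include <new>        // placement new, ::operator new/delete
#include <vector>

struct Item {
    explicit Item(int v) : value(v) {}
    int value;
};

int main() {
    const std::size_t count = 50000;

    // One allocation for the whole batch instead of 50,000 small ones.
    char* chunk = static_cast<char*>(::operator new(count * sizeof(Item)));

    std::vector<Item*> items;
    items.reserve(count);
    for (std::size_t i = 0; i < count; ++i) {
        // Construct each object inside the pre-allocated chunk.
        Item* p = new (chunk + i * sizeof(Item)) Item(static_cast<int>(i));
        items.push_back(p);
    }

    // ... sort/process through the pointer container as before ...

    // Destructors still run one by one (trivial here), but the memory
    // itself is released with a single call.
    for (std::size_t i = 0; i < count; ++i)
        items[i]->~Item();
    ::operator delete(chunk);
    return 0;
}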

Thierry Miceli
www.ideat-solutions.com

Jul 22 '05 #4

IIRC there is a pool allocator in Boost that can be useful too.
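
A sketch using Boost.Pool's object_pool, assuming Boost is available (the
Node type is a placeholder):

#include <boost/pool/object_pool.hpp>

struct Node {
    explicit Node(int v) : value(v) {}
    int value;
};

int main() {
    boost::object_pool<Node> pool;

    for (int i = 0; i < 50000; ++i) {
        Node* n = pool.construct(i);   // allocate + construct from the pool
        (void)n;                       // ...normally you would keep these pointers...
    }

    // No per-object delete is required: when the pool is destroyed it runs
    // the destructors of any remaining objects and frees its chunks in bulk.
    return 0;
}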

VH
Jul 22 '05 #5
