Dilip wrote in message ...
If I reserve 500, I take it that space for _at least_ 500 elements is
created. So if I put an upper limit of 500 in my code to do some
processing and clear the vector post-processing, ideally there should
be no reason why the vector will ever have a need to re-adjust its
memory, right?
Y'know, sometimes you can read 'till you are blue in the face and still not
get a clear picture. Do some tests to solidify it in your mind:
#include <iostream>
#include <vector>

int main() {
    std::vector<int> VecInt(10);
    std::cout << " size=" << VecInt.size() << " cap="
              << VecInt.capacity() << std::endl;
    // VecInt.push_back( 1 );
    for (int i(0); i < 11; ++i) {
        VecInt.push_back(i);
    }
    std::cout << " size=" << VecInt.size() << " cap="
              << VecInt.capacity() << std::endl;
    for (int i(0); i < 50; ++i) {
        VecInt.push_back(i);
    }
    std::cout << " size=" << VecInt.size() << " cap="
              << VecInt.capacity() << std::endl;
    // etc.
    return 0;
}
// - output -
// size=10 cap=10
// size=21 cap=40
// size=71 cap=80
Notice how every time it exceeds capacity it doubles the capacity[1]?
Now try your own experiment. reserve() 500, then fill the vector with 501
elements and see if it doesn't go to cap=1000. Add 500 more elements; what do
you get for capacity()?
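Something along these lines, pasted into the same test program (the exact
capacity after each growth is implementation-specific, so your numbers may
differ):

std::vector<int> VecRsv;
VecRsv.reserve(500);
std::cout << " size=" << VecRsv.size() << " cap="
          << VecRsv.capacity() << std::endl;   // size=0 cap=500
for (int i(0); i < 501; ++i) {                 // one element past the reservation
    VecRsv.push_back(i);
}
std::cout << " size=" << VecRsv.size() << " cap="
          << VecRsv.capacity() << std::endl;   // size=501 cap=1000 on a doubling implementation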
If the vector keeps doubling, eventually it will run out of memory, whereas if
you had set the capacity in the beginning, it might have fit (memory gets
'fragmented' sometimes, and the vector's allocator won't be able to find a
big enough contiguous chunk to continue).
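To tie it back to your original question: clear() destroys the elements but
(on every implementation I've seen) leaves the capacity alone, so if you
reserve(500) once and never push past 500, the vector never has to re-allocate
between processing passes. A rough sketch, same test program, names made up
for illustration:

std::vector<int> Work;
Work.reserve(500);
for (int pass(0); pass < 3; ++pass) {
    for (int i(0); i < 500; ++i) {   // never exceeds the reserved 500
        Work.push_back(i);
    }
    // ... do the processing ...
    Work.clear();                    // size back to 0, capacity still 500
    std::cout << " size=" << Work.size() << " cap="
              << Work.capacity() << std::endl;
}
// prints " size=0 cap=500" three times - no re-allocation, no doubling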
That help any?
[1] - (on my compiler implementation. YMMV)
--
Bob R
POVrookie