Bytes IT Community

Inheriting iterators

I am trying to define a series of iterators for a container I am
building, and since forward, bidirectional and random access iterators
are very similar, I thought it would be handy to put them into a
hierarchy, much like the code below:

struct MyForwardIterator {
    typedef std::forward_iterator_tag iterator_category;
    // forward iterator functions
};

struct MyBidirIterator : public MyForwardIterator {
    typedef std::bidirectional_iterator_tag iterator_category;
    // extra bidirectional functions
};

struct MyRandomAccessIterator : public MyBidirIterator {
    typedef std::random_access_iterator_tag iterator_category;
    // extra random access functions
};

While this appears to work fine, I feel a little nervous about
hiding the typedefs in the different classes. Should I be worried, and
are there any unpleasant errors that could arise from this kind of thing?

PS: I know that if I can provide a random access iterator then I don't
have to provide bidirectional and forward iterators
Jul 22 '05 #1
5 Replies



"Azumanga" <az******@bubblescope.net> wrote in message
news:cf**********@pump1.york.ac.uk...
I am trying to define a series of iterators for a container I am
building, and as forward, bidirectional and random access iterators are very similar I thought it would be handy to put them into a heirachy, much like the code below:

struct MyForwardIterator {
    typedef forward_iterator_tag iterator_category;
    // forward iterator functions
};

struct MyBidirIterator : public MyForwardIterator {
    typedef bidirectional_iterator_tag iterator_category;
    // extra bidir functions
};
<snip>
While this appears to work fine, I feel a little nervous about
hiding the typedefs in the different classes. Should I be worried,
and are there any unpleasant errors that could arise from this kind
of thing?

I wouldn't worry about hiding the typedefs from the base classes since
users will see the types of the iterators as container::iterator and
won't know about the base classes.

I would worry about writing conforming iterators, however, since it's
tricky. I'd recommend at least looking at the Boost iterator library
before writing iterators from scratch.
(http://www.boost.org/libs/iterator/doc/index.html)

Jonathan
Jul 22 '05 #2

"Azumanga" <az******@bubblescope.net> wrote in message
I am trying to define a series of iterators for a container I am
building, and as forward, bidirectional and random access iterators are
very similar I thought it would be handy to put them into a heirachy,
much like the code below:

struct MyForwardIterator {
    typedef forward_iterator_tag iterator_category;
    // forward iterator functions
};

struct MyBidirIterator : public MyForwardIterator {
    typedef bidirectional_iterator_tag iterator_category;
    // extra bidir functions
};

struct MyRandomAccessIterator : public MyBidirIterator {
    typedef random_access_iterator_tag iterator_category;
    // extra random access functions
};

While this appears to work fine, I feel a little nervous about
hiding the typedefs in the different classes. Should I be worried, and
are there any unpleasant errors that could arise from this kind of thing?


It seems OK to me to hide static functions and types (i.e. nested classes
and typedefs) in a derived class.

The problem with hiding non-virtual member functions is that when you
call the function through a pointer to the base class versus a pointer to
the derived class, you get different behavior, which is not intuitive. In
principle, the same can be said for static functions and types.

(Though as an aside, I think it's OK to hide a non-virtual function if it
returns the same object in a covariant manner, so the base class returns a
std::auto_ptr<Base> and the derived class returns the same object as a
std::auto_ptr<Derived>.)

However, because C++ is a statically typed language, the argument does not
hold much sway for static types. Sure, the derived class may name a
different type. But when you write

template <class Container>
typename Container::value_type product(const Container& c) {
    if (c.empty()) throw ContainerHasZeroSize("product");
    typename Container::value_type result = c.front();  // was c.first(), which is not a standard member
    typedef typename Container::const_iterator Iter;
    Iter iter = c.begin();
    const Iter end = c.end();
    for (++iter; iter != end; ++iter) result *= *iter;
    return result;
}

the template function 'product' will not be part of the compiled object file
or executable. It is only when you call product(const std::vector<int>&)
that the compiler instantiates the template function 'product' for that
particular type (namely Container=std::vector<int>), and the specialization
becomes part of the compiled object file.

Now at compile time we know the exact type of Container, so we can just call
the template function product(const Container&), and the compiler will
instantiate the function for us at compile time, using the correct semantics
of Container, Container::value_type, Container::const_iterator, and so on.
It will use the correct sizeof, substitute the correct operator*, and so
forth.

Had C++ been a dynamically typed language, with template instantiation
occurring at runtime after we instantiated a particular Container, then
having virtual static types (i.e. nested classes and typedefs) would make
sense.

Does this make sense?
Jul 22 '05 #3

P: n/a
Azumanga <az******@bubblescope.net> wrote:
PS: I know that if I can provide a random access iterator then I don't
have to provide bidirectional and forward iterators


I would put it quite differently: if you can provide random access
iterators for a sequence, it is quite useless to also provide more
restrictive iterators for that sequence! In fact, handing less powerful
iterators to some algorithms would actually be harmful!
Just because e.g. 'copy()' accepts input and output iterators it
doesn't mean that it can't benefit from random access iterators.

Always provide the most powerful iterator possible for your sequences.
Do not provide less powerful ones, because all that achieves is preventing
possibilities for optimizations.
--
<mailto:di***********@yahoo.com> <http://www.dietmar-kuehl.de/>
<http://www.contendix.com> - Software Development & Consulting
Jul 22 '05 #4

P: n/a
> Always provide the most powerful iterator possible for your sequences.
Do not provide less powerful ones, because all that achieves is preventing
possibilities for optimizations.


I entirely agree with you; however, the whole purpose of my creating these
iterators is to test algorithms on multiple types of iterators, precisely
to find out which optimisations are possible and actually performed :)

Chris
Jul 22 '05 #5

P: n/a
Siemel Naran wrote:
<snip>
Does this make sense?


Yes it does, and it sets my mind at rest. Thanks :)
Jul 22 '05 #6

This discussion thread is closed