Bytes IT Community

need help on generator... (re)

thanks to all who replied to my post (2005-01-21) - i cannot post
inside the original thread as i get "Unable to retrieve message
cs*************" from googlenews :(
Do you mean:
[1,2], [2,3], [3,4], [1,2,3], [2,3,4], [1,3,4]
(E.g. all elements in the power set except the empty set, the sets with
one element and the sets with all elements.)
Otherwise, please describe your desired sets verbally - I cannot see
the point.
yes, that was what i wished for - handling the other cases (the empty
set, the one-element sets and the full set) wasn't much trouble;
actually i wanted to understand more.
Here is an (untested) variant that accepts any iterable while trying to
remain memory-efficient. This makes it necessary to shuffle the order of
the output a bit.

from itertools import tee, islice

def gen(iterable, start, end):
    it = iter(iterable)
    while True:
        it, a = tee(it)
        a = tuple(islice(a, end-1))  # buffer at most end-1 adjacent items
        if len(a) < start:           # too few items left for another window
            break
        for sz in xrange(start, len(a)+1):
            yield a[:sz]
        it.next()                    # slide the window one item forward

if __name__ == "__main__":
    print list(gen(range(1, 5), 2, 4))

please, this one looks interesting - could you explain a bit how it
works and why it "remains memory-efficient"?

Jul 18 '05 #1
1 Reply

Joh wrote:

please, this one looks interesting - could you explain a bit how it
works and why it "remains memory-efficient"?

If you have a generator of the form

def gen1(iterable):
    for i in iterable:
        yield f(i)
    for i in iterable:
        yield g(i)

and then use it

for item in gen1(range(huge_number)):

it will only work when you feed it with something you can iterate over
multiple times, e.g. a list, not a generator that reads data on every call
of the next() method. That means you must store the data for (to keep it
simple) the lifetime of gen1(). If you can modify the generator to

def gen2(iterable):
    for i in iterable:
        yield f(i)
        yield g(i)

for item in gen2(xrange(huge_number)): # switched from range() to xrange()

there is no such restriction to the iterable. All data can be read,
processed and garbage-collected immediately.
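The difference can be demonstrated concretely (a sketch in Python 3
syntax; f and g here are stand-in doubling and tripling functions, not
anything from the thread):

```python
def f(i):
    return i * 2

def g(i):
    return i * 3

def gen1(iterable):
    # needs two passes over iterable
    for i in iterable:
        yield f(i)
    for i in iterable:
        yield g(i)

def gen2(iterable):
    # a single pass suffices
    for i in iterable:
        yield f(i)
        yield g(i)

one_shot = iter([1, 2, 3])            # can only be traversed once
print(list(gen1(one_shot)))           # [2, 4, 6] - the g() pass sees an exhausted iterator
print(list(gen1([1, 2, 3])))          # [2, 4, 6, 3, 6, 9] - a list supports two passes
print(list(gen2(iter([1, 2, 3]))))    # [2, 3, 4, 6, 6, 9] - works even on a one-shot iterator
```

With gen1(), the second for-loop silently yields nothing when fed a
one-shot iterator; gen2() never needs a second pass, so the source can be
consumed and garbage-collected item by item.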

The gen() generator is a bit more complex in that it has to store a few
adjacent items instead of only one and allows for an arbitrary number of
functions (inlined as yield of an n-tuple) instead of always two functions,
but the idea is the same.
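To make that concrete, here is a sketch of the same sliding-window idea
in Python 3 syntax (not from the thread; range() is already lazy there,
and next(it) replaces it.next()), with an explicit termination check and
the window advance spelled out:

```python
from itertools import tee, islice

def gen(iterable, start, end):
    it = iter(iterable)
    while True:
        it, a = tee(it)                   # fork the stream; only the buffered
        a = tuple(islice(a, end - 1))     # end-1 adjacent items are kept alive
        if len(a) < start:                # too few items left for another window
            break
        for sz in range(start, len(a) + 1):
            yield a[:sz]
        next(it)                          # slide the window one item forward

# works on a generator that can be traversed only once
print(list(gen((x for x in range(1, 5)), 2, 4)))
# [(1, 2), (1, 2, 3), (2, 3), (2, 3, 4), (3, 4)]
```

Only the end-1 items of the current window are buffered by tee()/islice(),
so items behind the window can be garbage-collected no matter how long the
input stream is.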


Jul 18 '05 #2

This discussion thread is closed