Bytes IT Community

Why don't generators execute until first yield?

Hi!

First a bit of context.

Yesterday I spent a lot of time debugging the following method in a
rather slim database abstraction layer we've developed:

,----
| def selectColumn(self, table, column, where={}, order_by=[], group_by=[]):
|     """Performs a SQL select query returning a single column
|
|     The column is returned as a list. An exception is thrown if the
|     result is not a single column."""
|     query = build_select(table, [column], where, order_by, group_by)
|     result = DBResult(self.rawQuery(query))
|     if result.colcount != 1:
|         raise QueryError("Query must return exactly one column", query)
|     for row in result.fetchAllRowsAsList():
|         yield row[0]
`----

I'd just rewritten the method as a generator rather than returning a
list of results. The following test then failed:

,----
| def testSelectColumnMultipleColumns(self):
|     res = self.fdb.selectColumn('db3ut1', ['c1', 'c2'],
|                                 {'c1': (1, 2)}, order_by='c1')
|     self.assertRaises(db3.QueryError, self.fdb.selectColumn,
|                       'db3ut1', ['c1', 'c2'], {'c1': (1, 2)}, order_by='c1')
`----

I expected this to raise a QueryError due to the result.colcount != 1
constraint being violated (as was the case before), but that isn't the
case. The constraint is not violated until I get the first result from
the generator.

Now to the main point. When a generator function is run, it immediately
returns a generator, and it does not run any code inside the generator.
Not until generator.next() is called is any code inside the generator
executed, giving it traditional lazy evaluation semantics. Why don't
generators follow the usual eager evaluation semantics of Python and
immediately execute up until right before the first yield instead?
Giving generators special case semantics for no good reason is a really
bad idea, so I'm very curious if there is a good reason for it being
this way. With the current semantics it means that errors can pop up at
unexpected times rather than the code failing fast.
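The lazy behaviour described above is easy to observe directly; a minimal sketch (the names here are illustrative, not from the code above):

```python
log = []

def make_gen(values):
    # Nothing in this body runs when make_gen() is called;
    # it only runs once next() is called on the generator.
    log.append("body started")
    for v in values:
        yield v

g = make_gen([1, 2, 3])
assert log == []                  # creating the generator ran none of the body
assert next(g) == 1               # the first next() runs up to the first yield
assert log == ["body started"]
```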

Martin
Jun 27 '08 #1
13 Replies


On Wed, May 7, 2008 at 2:29 AM, Martin Sand Christensen <ms*@es.aau.dk> wrote:
> Now to the main point. When a generator function is run, it immediately
> returns a generator, and it does not run any code inside the generator.
> Not until generator.next() is called is any code inside the generator
> executed, giving it traditional lazy evaluation semantics. Why don't
> generators follow the usual eager evaluation semantics of Python and
> immediately execute up until right before the first yield instead?
> Giving generators special case semantics for no good reason is a really
> bad idea, so I'm very curious if there is a good reason for it being
> this way. With the current semantics it means that errors can pop up at
> unexpected times rather than the code failing fast.
Isn't lazy evaluation sort of the whole point of replacing a list with
an iterator? Besides which, running up to the first yield when
instantiated would make the generator's first iteration inconsistent
with the remaining iterations. Consider this somewhat contrived
example:

def printing_iter(stuff):
    for item in stuff:
        print item
        yield item

Clearly, the idea here is to create a generator that wraps another
iterator and prints each item as it yields it. But using your
suggestion, this would instead print the first item at the time the
generator is created, rather than when the first item is actually
iterated over.

If you really want a generator that behaves the way you describe, I
suggest doing something like this:

def myGenerator(args):
    immediate_setup_code()

    def generator():
        for item in actual_generator_loop():
            yield item
    return generator()
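Applied to a case like the selectColumn method in the original post, the pattern might look as follows. This is only a sketch: select_column, rows and colcount are stand-ins for illustration, not the thread's actual API.

```python
def select_column(rows, colcount):
    # Eager part: runs at call time, so a malformed query fails fast.
    if colcount != 1:
        raise ValueError("query must return exactly one column")

    def generate():
        # Lazy part: rows are only consumed as the caller iterates.
        for row in rows:
            yield row[0]
    return generate()

assert list(select_column([(1,), (2,)], colcount=1)) == [1, 2]
```

With this split, calling select_column with a bad column count raises immediately, while the rows themselves are still fetched lazily.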
Jun 27 '08 #2

Martin Sand Christensen <ms*@es.aau.dk> wrote:
> Now to the main point. When a generator function is run, it immediately
> returns a generator, and it does not run any code inside the generator.
> Not until generator.next() is called is any code inside the generator
> executed, giving it traditional lazy evaluation semantics. Why don't
> generators follow the usual eager evaluation semantics of Python and
> immediately execute up until right before the first yield instead?
You mean you expect the semantics of generators to be that when you
create them, or every time you call next() they run until they hit yield
and then (except for the initial run) return the result that was yielded
the time before? It is easy enough to implement that, but could be a bit
confusing for the user.
>>> def greedy(fn):
...     def greedygenerator(*args, **kw):
...         def delayed():
...             it = iter(fn(*args, **kw))
...             try:
...                 res = it.next()
...             except StopIteration:
...                 yield None
...                 return
...             yield None
...             for value in it:
...                 yield res
...                 res = value
...             yield res
...         it = delayed()
...         it.next()
...         return it
...     return greedygenerator

>>> @greedy
... def mygen(n):
...     for i in range(n):
...         print i
...         yield i

>>> x = mygen(3)
0
>>> list(x)
1
2
[0, 1, 2]
>>> x = mygen(0)
>>> list(x)
[]
>>>
Now try:

for command in getCommandsFromUser():
    print "the result of that command was", execute(command)

where getCommandsFromUser is a greedy generator that reads from stdin,
and see why generators don't work that way.

Jun 27 '08 #3

>>>>> "Ian" == Ian Kelly <ia*********@gmail.com> writes:
Ian> Isn't lazy evaluation sort of the whole point of replacing a list
Ian> with an iterator? Besides which, running up to the first yield when
Ian> instantiated would make the generator's first iteration
Ian> inconsistent with the remaining iterations.

That wasn't my idea, although that may not have come across quite
clearly enough. I wanted the generator to immediately run until right
before the first yield so that the first call to next() would start with
the first yield.

My objection is that generators _by default_ have different semantics
than the rest of the language. Lazy evaluation as a concept is great for
all the benefits it can provide, but, as I've illustrated, strictly lazy
evaluation semantics can be somewhat surprising at times and lead to
problems that are hard to debug if you don't constantly bear the
difference in mind. In this respect, it seems to me that my suggestion
would be an improvement. I'm not any kind of expert on languages,
though, and I may very well be missing a part of the bigger picture that
makes it obvious why things should be as they are.

As for code to slightly change the semantics of generators, that doesn't
really address the issue as I see it: if you're going to apply such code
to your generators, you're probably doing it exactly because you're
aware of the difference in semantics, and you're not going to be
surprised by it. You may still want to change the semantics, but for
reasons that are irrelevant to my point.

Martin
Jun 27 '08 #4

>>>>> "Duncan" == Duncan Booth <du**********@invalid.invalid> writes:
[...]
Duncan> Now try:
Duncan>
Duncan>     for command in getCommandsFromUser():
Duncan>         print "the result of that command was", execute(command)
Duncan>
Duncan> where getCommandsFromUser is a greedy generator that reads from stdin,
Duncan> and see why generators don't work that way.

I don't see a problem unless the generator isn't defined where it's
going to be used. In other similar input bound use cases, such as the
generator iterating over a query result set in my original post, I see
even less of a problem. Maybe I'm simply daft and you need to spell it
out for me. :-)

Martin
Jun 27 '08 #5

Martin Sand Christensen <ms*@es.aau.dk> wrote:
> >>>>> "Duncan" == Duncan Booth <du**********@invalid.invalid> writes:
> [...]
> Duncan> Now try:
> Duncan>
> Duncan>     for command in getCommandsFromUser():
> Duncan>         print "the result of that command was", execute(command)
> Duncan>
> Duncan> where getCommandsFromUser is a greedy generator that reads
> Duncan> from stdin, and see why generators don't work that way.
>
> I don't see a problem unless the generator isn't defined where it's
> going to be used. In other similar input bound use cases, such as the
> generator iterating over a query result set in my original post, I see
> even less of a problem. Maybe I'm simply daft and you need to spell it
> out for me. :-)
It does this:
>>> @greedy
... def getCommandsFromUser():
...     while True:
...         yield raw_input('Command?')

>>> for cmd in getCommandsFromUser():
...     print "that was command", cmd

Command?hello
Command?goodbye
that was command hello
Command?wtf
that was command goodbye
Command?

Traceback (most recent call last):
  File "<pyshell#56>", line 1, in <module>
    for cmd in getCommandsFromUser():
  File "<pyshell#42>", line 11, in delayed
    for value in it:
  File "<pyshell#53>", line 4, in getCommandsFromUser
    yield raw_input('Command?')
KeyboardInterrupt
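The same one-ahead behaviour can be reproduced without a terminal by feeding the eager generator from a list instead of stdin. The sketch below uses a simplified eager wrapper in the spirit of the greedy decorator, not Duncan's exact code:

```python
def greedy(fn):
    # Simplified eager wrapper: advance the underlying generator once at
    # creation time, then hand out each value one step late.
    def wrapper(*args, **kw):
        it = iter(fn(*args, **kw))
        def delayed():
            try:
                res = next(it)
            except StopIteration:
                return
            yield None          # placeholder, consumed at creation time
            for value in it:
                yield res
                res = value
            yield res
        d = delayed()
        next(d, None)           # run eagerly up to the placeholder yield
        return d
    return wrapper

consumed = []

@greedy
def commands(source):
    for c in source:
        consumed.append(c)      # stands in for prompting the user
        yield c

g = commands(["hello", "goodbye", "wtf"])
assert consumed == ["hello"]    # one command is consumed before any iteration
assert next(g) == "hello"
assert consumed == ["hello", "goodbye"]   # always one ahead of what we receive
```

This is exactly the off-by-one echo in the transcript above: the wrapper must consume the next command before it can yield the previous one.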
Jun 27 '08 #6

Duncan Booth wrote:
> It does this:
> >>> @greedy
> ... def getCommandsFromUser():
> ...     while True:
> ...         yield raw_input('Command?')
>
> >>> for cmd in getCommandsFromUser():
> ...     print "that was command", cmd
>
> Command?hello
> Command?goodbye
> that was command hello
> Command?wtf
> that was command goodbye
> Command?

Not here..

In [7]: def getCommandsFromUser():
   ...:     while True:
   ...:         yield raw_input('Command?')
   ...:

In [10]: for cmd in getCommandsFromUser(): print "that was command", cmd
   ....:
Command?hi
that was command hi
Command?there
that was command there
Command?wuwuwuw
that was command wuwuwuw
Command?
Jun 27 '08 #7

Marco Mariani wrote:
> Not here..
Oh, sorry, I obviously didn't see the @greedy decorator amongst all the
quoting levels.

Anyway, the idea doesn't make much sense to me :)
Jun 27 '08 #8

Marco Mariani <ma***@sferacarta.com> wrote:
> Duncan Booth wrote:
>> It does this:
>> >>> @greedy
>> ... def getCommandsFromUser():
>> ...     while True:
>> ...         yield raw_input('Command?')
>>
>> >>> for cmd in getCommandsFromUser():
>> ...     print "that was command", cmd
>> Command?hello
>> Command?goodbye
>> that was command hello
>> Command?wtf
>> that was command goodbye
>> Command?
>
> Not here..
>
> In [7]: def getCommandsFromUser():
>    ...:     while True:
>    ...:         yield raw_input('Command?')
>    ...:
>
> In [10]: for cmd in getCommandsFromUser(): print "that was command", cmd
>    ....:
> Command?hi
> that was command hi
> Command?there
> that was command there
> Command?wuwuwuw
> that was command wuwuwuw
> Command?
Perhaps if you'd copied all of my code (including the decorator that was
the whole point of it)...

Jun 27 '08 #9

Duncan Booth wrote:
> Perhaps if you'd copied all of my code (including the decorator that was
> the whole point of it)...
Sure, I missed the point. Python's symbols become quoting levels and
mess up messages.

Anyway, I would loathe to start execution of a generator before starting
to iterate through it. Especially when generators are passed around.
The current behavior makes perfect sense.

Jun 27 '08 #10

On May 7, 7:37 am, Marco Mariani <ma...@sferacarta.com> wrote:
> Duncan Booth wrote:
>> Perhaps if you'd copied all of my code (including the decorator that was
>> the whole point of it)...
>
> Sure, I missed the point. Python's symbols become quoting levels and
> mess up messages.
>
> Anyway, I would loathe to start execution of a generator before starting
> to iterate through it. Especially when generators are passed around.
> The current behavior makes perfect sense.
Question:
>>> def f():
...     print 0
...     while 1:
...         yield 1
...
>>> g = f()
>>> g.next()
0
1
>>> g.next()
1
>>> g.next()
1

This might fit the bill:

>>> def dropfirst(h):
...     h.next()
...     return h
...
>>> g = dropfirst(f())
0
>>> g.next()
1
>>> g.next()
1
>>> g.next()
1

However, as dropfirst is dropping a value, both caller -and- callee have
to agree on that exception. Suppose generators are better "first-dropped",
and suppose 'next' inherently causes side effects: @greedy (from earlier)
frees the caller of a responsibility/obligation.

What can follow without a lead?

The definitions may lean harder on the 'generation' as prior to the
'next': generators inherently don't cause side effects.

Or suppose first-dropped is no exception:
>>> special = object()
>>> def f():
...     print 0
...     yield special
...     while 1:
...         yield 1
...
>>> g = f()
>>> g.next()
0
<object object at 0x00980470>
>>> g.next()
1
>>> g.next()
1
>>> g.next()
1
Jun 27 '08 #11

> Now to the main point. When a generator function is run, it immediately
> returns a generator, and it does not run any code inside the generator.
> Not until generator.next() is called is any code inside the generator
> executed, giving it traditional lazy evaluation semantics. Why don't
> generators follow the usual eager evaluation semantics of Python and
> immediately execute up until right before the first yield instead?
> Giving generators special case semantics for no good reason is a really
> bad idea, so I'm very curious if there is a good reason for it being
> this way. With the current semantics it means that errors can pop up at
> unexpected times rather than the code failing fast.
The semantics of a generator are very clear: on .next(), run until the next
yield is reached and then return the yielded value. Plus, of course, the
StopIteration handling.

Your scenario would introduce a special-case for the first run, making it
necessary to keep additional state around (possibly introducing GC-issues
on the way), just for the sake of it. And violate the lazyness a generator
is all about. Think of a situation like this:

def g():
    while True:
        yield time.time()

Obviously you want to yield the time at the moment of .next() being called,
not something stored from ages ago. If any setup for the generator should
be done immediately, it's easy enough:

def g():
    first_result = time.time()
    def _g():
        yield first_result
        while True:
            yield time.time()
    return _g()

Diez
Jun 27 '08 #12

Martin Sand Christensen wrote:
> Why don't
> generators follow the usual eager evaluation semantics of Python and
> immediately execute up until right before the first yield instead?
A great example of why this behavior would defeat some of the purpose of
generators can be found in this amazing PDF presentation:

http://www.dabeaz.com/generators/Generators.pdf
> Giving generators special case semantics for no good reason is a really
> bad idea, so I'm very curious if there is a good reason for it being
> this way. With the current semantics it means that errors can pop up at
> unexpected times rather than the code failing fast.
Most assuredly they do have good reason. Consider the cases in the PDF
I just mentioned. Building generators that work on the output of other
generators allows assembling entire pipelines of behavior. A very
powerful feature that would be impossible if the generators had the
semantics you describe.
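The pipeline idea can be sketched in a few lines; this is an illustration of the general technique, not an example taken from the PDF, and the stage names are made up:

```python
def numbers(limit):
    # Source stage: lazily produce values.
    for i in range(limit):
        yield i

def squares(items):
    # Transform stage: consumes another generator one item at a time.
    for x in items:
        yield x * x

def evens(items):
    # Filter stage: values flow through without ever being materialized.
    for x in items:
        if x % 2 == 0:
            yield x

# Assemble the pipeline; no stage executes until we actually iterate.
pipeline = evens(squares(numbers(10)))
result = list(pipeline)   # [0, 4, 16, 36, 64]
```

Had each stage run eagerly up to its first yield at construction time, assembling the pipeline itself would already consume input, which is exactly the behaviour the posters above object to.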

If you want generators to behave as you suggest they should, then a
conventional for x in blah approach is likely the better way to go.

I use a generator anytime I want to be able to iterate across something
that has a potentially expensive cost, in terms of memory or cpu, to do
all at once.
Jun 27 '08 #13

On May 7, 4:51 pm, Michael Torrie <torr...@gmail.com> wrote:
> Martin Sand Christensen wrote:
> > Why don't
> > generators follow the usual eager evaluation semantics of Python and
> > immediately execute up until right before the first yield instead?
>
> A great example of why this behavior would defeat some of the purpose of
> generators can be found in this amazing PDF presentation:
>
> http://www.dabeaz.com/generators/Generators.pdf
>
> > Giving generators special case semantics for no good reason is a really
> > bad idea, so I'm very curious if there is a good reason for it being
> > this way. With the current semantics it means that errors can pop up at
> > unexpected times rather than the code failing fast.
>
> Most assuredly they do have good reason. Consider the cases in the PDF
> I just mentioned. Building generators that work on the output of other
> generators allows assembling entire pipelines of behavior. A very
> powerful feature that would be impossible if the generators had the
> semantics you describe.
>
> If you want generators to behave as you suggest they should, then a
> conventional for x in blah approach is likely the better way to go.
>
> I use a generator anytime I want to be able to iterate across something
> that has a potentially expensive cost, in terms of memory or cpu, to do
> all at once.
The amount of concentration you can put into a program in a sitting
(a fixed amount of time) is kind of limited. Sounds like @greedy was
the way to go. The recall implementation may have a shot in the
future, but isn't functools kind of full? Has wraptools been
written? Is it any different?

The naming of @greedy also comes into question. My humble opinion
muscles glom on to @early vs. @late; @yieldprior; @dropfirst;
@cooperative. Thesaurus.com adds @ahead vs. @behind.

Jun 27 '08 #14
