Bytes IT Community

A great Alan Kay quote

In an interview at http://acmqueue.com/modules.php?name...owpage&pid=273
Alan Kay said something I really liked, and I think it applies
equally well to Python as well as the languages mentioned:

I characterized one way of looking at languages in this
way: a lot of them are either the agglutination of features
or they're a crystallization of style. Languages such as
APL, Lisp, and Smalltalk are what you might call style
languages, where there's a real center and imputed style to
how you're supposed to do everything.

I think that "a crystallization of style" sums things up nicely.
The rest of the interview is pretty interesting as well.

--
Grant Edwards   grante at visi.com
Yow! Look!! Karl Malden!
Jul 18 '05 #1
38 Replies


Surely

"Perl is another example of filling a tiny, short-term need, and then
being a real problem in the longer term."

is better lol ;)
On Wed, 09 Feb 2005 11:00:32 -0800 (PST), Grant Edwards <gr****@visi.com> wrote:
> In an interview at http://acmqueue.com/modules.php?name...owpage&pid=273
> Alan Kay said something I really liked, and I think it applies
> equally well to Python as well as the languages mentioned: [...]
>
> I think that "a crystallization of style" sums things up nicely.
> The rest of the interview is pretty interesting as well.

Jul 18 '05 #2

On 2005-02-09, James <sp*****@gmail.com> wrote:
> Surely
>
> "Perl is another example of filling a tiny, short-term need, and then
> being a real problem in the longer term."
>
> is better lol ;)

That was the other one I really liked, and Perl was the first
language I thought of when I saw the phrase "agglutination of
features". C++ was the second one.

--
Grant Edwards   grante at visi.com
Yow! -- In 1962, you could buy a pair of SHARKSKIN SLACKS, with a
"Continental Belt," for $10.99!!
Jul 18 '05 #3

"""
Today he is Senior Fellow at Hewlett-Packard Labs and president of Viewpoints
Research Institute, a nonprofit organization whose goal is to change how
children are educated by creating a sample curriculum with supporting media
for teaching math and science. This curriculum will use Squeak as its media,
and will be highly interactive and constructive. Kay’s deep interests in
children and education have been the catalysts for many of his ideas over the
years.
"""

I love him.

It's also interesting to see GUIs with windows, mouse (etc.), which apparently
find their origin in is mind, probably comes from the desire to introduce
computers to children.

Francis Girard

On Wednesday, 9 February 2005 at 20:29, Grant Edwards wrote:
> That was the other one I really liked, and Perl was the first
> language I thought of when I saw the phrase "agglutination of
> features". C++ was the second one.

Jul 18 '05 #4

Grant Edwards wrote:
> I characterized one way of looking at languages in this
> way: a lot of them are either the agglutination of features
> or they're a crystallization of style. [...]
>
> I think that "a crystallization of style" sums things up nicely.


Then Perl is an "agglutination of styles", while Python might
be considered a "crystallization of features"...

-Peter
Jul 18 '05 #5

On 2005-02-09, Peter Hansen <pe***@engcorp.com> wrote:
> Then Perl is an "agglutination of styles", while Python might
> be considered a "crystallization of features"...


Exactly.

--
Grant Edwards   grante at visi.com
Yow! NOW, I'm supposed to SCRAMBLE two, and HOLD th' MAYO!!
Jul 18 '05 #6

[Peter Hansen]
Then Perl is an "agglutination of styles", while Python might
be considered a "crystallization of features"...


Grosso modo, yes. Yet, we should recognise that Python has agglutinated
a few crystals in recent years. :-)

It gave up some of its purity for practical reasons. We got rather far
from the "There is only one way to do it!" that was once Python's motto.

--
François Pinard http://pinard.progiciels-bpi.ca
Jul 18 '05 #7

François Pinard wrote:
[Peter Hansen]

Then Perl is an "agglutination of styles", while Python might
be considered a "crystallization of features"...

Grosso modo, yes. Yet, we should recognise that Python agglutinated
a few crystals in the recent years. :-)

It gave up some of its purity for practical reasons. We got rather far
from the "There is only one way to do it!" that once was Python motto.


I would call a "pure" language one that had a crystallized style.

Python, on the other hand, is just plain practical. Thus my
half-humorous attempt at defining it in terms of its features
(with its wide-ranging library and extension modules) rather
than in terms of its style (which as you know can range
from procedural to functional, stopping briefly at object
oriented and "newbie" along the way ;-) ).

-Peter
Jul 18 '05 #8

has
Grant Edwards wrote:
> I characterized one way of looking at languages in this
> way: a lot of them are either the agglutination of features
> or they're a crystallization of style.
I'd say Python is somewhere in the middle, though moving slowly towards
'agglutination' in the last couple years.

> The rest of the interview is pretty interesting as well.


Excellent link, thanks.

Jul 18 '05 #9

On Wed, 09 Feb 2005 15:57:10 -0800, has wrote:
> I'd say Python is somewhere in the middle, though moving slowly towards
> 'agglutination' in the last couple years.


But it feels really badly about that and promises to kick the habit
somewhere around the year 3000.
Jul 18 '05 #10

Francis Girard wrote:
> ...
> It's also interesting to see GUIs with windows, mouse (etc.), which apparently
> find their origin in is mind, probably comes from the desire to introduce
> computers to children.
OK, presuming "origin in is mind" was meant to say "origin in his mind,"
I'd like to stick up for Doug Engelbart (holds the patent on the mouse)
here. I interviewed with his group at SRI in the ancient past, when
they were working on the "Augmentation Research" project -- machine
augmentation of human intelligence. They, at the time, were working on
input pointing devices and hadn't yet settled. The helmet that read
brain waves was doing astoundingly well (90% correct on up, down, left,
right, don't move), but nowhere near well enough to use for positioning
on edits. This work produced the mouse, despite rumors of Xerox Parc or
Apple inventing the mouse.

Xerox Parc did, as far as I understand, do the early development on
interactive graphic display using a mouse for positioning on a
graphics screen. Engelbart's mouse navigated on a standard 80x24
character screen.

Augment did real research on what might work, with efforts to measure
ease of use and reliability. They did not simply start with a good
(or great) guess and charge forward. They produced the mouse, and the
earliest "linked" documents that I know of.

http://sloan.stanford.edu/MouseSite/1968Demo.html

--Scott David Daniels
Sc***********@Acm.Org
Jul 18 '05 #11

In article <42********@nntp0.pdx.net>,
Scott David Daniels <Sc***********@Acm.Org> wrote:
Jul 18 '05 #12

On Wed, 9 Feb 2005 21:23:06 +0100, Francis Girard <fr************@free.fr> wrote:
> I love him.

I don't.

> It's also interesting to see GUIs with windows, mouse (etc.), which apparently
> find their origin in is mind, probably comes from the desire to introduce
> computers to children.

Alfred Bork, now
Professor Emeritus
Information and Computer Science
University of California, Irvine 92697

had written an article in 1980 called

"Interactive Learning" which began

"We are at the onset of a major revolution in education, a revolution
unparalleled since the invention of the printing press. The computer
will be the instrument of this revolution."

In 2000 he published:

"Interactive Learning: Twenty Years Later"

looking back on his original article and its optimistic predictions and
admitting "I was not a very good prophet".

What went wrong?

Among other things he points (probably using a pointing device) at the
pointing device:

"""
Another is the rise of the mouse as a computer device. People had the
peculiar idea that one could deal with the world of learning purely by
pointing.

"""
The articles can be found here:

http://www.citejournal.org/vol2/iss4/seminal.cfm

One does not need to agree or disagree, it seems to me, about this or
that point on interface, or influence, or anything else. What one does
need to do is separate hope from actuality, and approach the entire
subject area with some sense of what is at stake, and with some true
sense of the complexity of the issues, in such a way that at this
stage of the game the only authentic stance is one of humility.

Kay fails the humility test, dramatically. IMO.

Art
Jul 18 '05 #13

jfj
Peter Hansen wrote:
> Grant Edwards wrote:
> > I characterized one way of looking at languages in this
> > way: a lot of them are either the agglutination of features
> > or they're a crystallization of style. [...]
>
> Then Perl is an "agglutination of styles", while Python might
> be considered a "crystallization of features"...


Bah. My impression from the interview was "there are no good
languages anymore. In my time we made great languages, but today
they all suck. Perl for example...."
I got the impression that the interview is as bad for Python
as for Perl and any of the languages of the 90s and 00s.

From the interview:
""" You could think of it as putting a low-pass filter on some of the
good ideas from the 60s and 70s, as computing spread out much, much
faster than educating unsophisticated people can happen. In the last 25
years or so, we actually got something like a pop culture, similar to
what happened when television came on the scene and some of its
inventors thought it would be a way of getting Shakespeare to the
masses. But they forgot that you have to be more sophisticated and have
more perspective to understand Shakespeare. What television was able to
do was to capture people as they were.

So I think the lack of a real computer science today, and the lack of
real software engineering today, is partly due to this pop culture.
"""

So, let's not be so self-important <winkus>, and see this interview
as one that bashes Perl and admires Python. It ain't. Python is pop
culture according to Mr Kay. I'll leave the rest to slashdot..
jfj
Jul 18 '05 #14

jfj wrote:
> Bah. My impression from the interview was "there are no good
> languages anymore. In my time we made great languages, but today
> they all suck. Perl for example...."


That was kind of what I took from it as well. Don't get me wrong, I've
a lot of respect for Kay's contributions...he just doesn't understand
that there's *more* to a language than its adherence to his ideas of
'best'. His arguments are literally academic.

Decrying contemporary choices for their "pop" nature kinda sounds like
the ugly kid devaluing the importance of the school dance.

It just wasn't fit enough to survive, Alan. Let it go.

- alex23

Jul 18 '05 #15

PA

On Feb 10, 2005, at 19:43, Francis Girard wrote:
> I think he's a bit nostalgic.


Steve Wart about "why Smalltalk never caught on":

http://hoho.dyndns.org/~holger/smalltalk.html

Cheers

--
PA, Onnay Equitursay
http://alt.textdrive.com/
Jul 18 '05 #16

On Thursday, 10 February 2005 at 04:37, Arthur wrote:
> [...]
> One does not need to agree or disagree, it seems to me, about this or
> that point on interface, or influence, or anything else. What one does
> need to do is separate hope from actuality, and approach the entire
> subject area with some sense of what is at stake, and with some true
> sense of the complexity of the issues, in such a way that at this
> stage of the game the only authentic stance is one of humility.
>
> Kay fails the humility test, dramatically. IMO.

I think I've been enthusiastic too fast. While reading the article I grew
more and more uncomfortable with sayings like:

- Intel and Motorola don't know how to do micro-processors and did not
understand anything in our own architecture
- Languages of today are feature-filled doggy bags
- Java failed where I succeeded
- I think beautifully like a mathematician while the rest is pop culture
- etc.

I'm not sure at all he likes Python. Python is too pragmatic for him. And its
definition does not hold in the palm of his hand.

I think he's a bit nostalgic.

Francis Girard
Jul 18 '05 #17

Thank you.

Francis Girard

On Thursday, 10 February 2005 at 02:48, Scott David Daniels wrote:
> OK, presuming "origin in is mind" was meant to say "origin in his mind,"
> I'd like to stick up for Doug Engelbart (holds the patent on the mouse)
> here. [...]
>
> http://sloan.stanford.edu/MouseSite/1968Demo.html

Jul 18 '05 #18

On Thu, 10 Feb 2005 03:08:11 GMT, rumours say that cl****@lairds.us (Cameron
Laird) might have written:
> I entirely agree that Engelbart deserves full recognition for his
> achievements. At the same time, I think we also should note that
> Ted Nelson was publishing articles about "hypertext" in '65, and
> Vannevar Bush lucidly explained his vision for textual linking in
> '45. With a little provocation, I can push the ideas of "mechanical"
> or "machine" referencing back at least to the Enlightenment, and
> arguably much farther.


Like
--
TZOTZIOY, I speak England very best.
"Be strict when sending and tolerant when receiving." (from RFC1958)
I really should keep that in mind when talking with people, actually...
Jul 18 '05 #19

On Thu, 10 Feb 2005 03:08:11 GMT, rumours say that cl****@lairds.us (Cameron
Laird) might have written:

[more snipping]
> With a little provocation, I can push the ideas of "mechanical"
> or "machine" referencing back at least to the Enlightenment, and
> arguably much farther.


Please ignore my earlier post, since it was mistakenly sent incomplete.

about "arguably much farther":

http://www.ancient-mysteries.com/gre...anti-main.html

and

http://www.smith.edu/hsc/museum/anci.../battery2.html

and

http://www.smith.edu/hsc/museum/anci...amengine2.html

Nice page, this:

http://www.smith.edu/hsc/museum/anci...ns/hsclist.htm
--
TZOTZIOY, I speak England very best.
"Be strict when sending and tolerant when receiving." (from RFC1958)
I really should keep that in mind when talking with people, actually...
Jul 18 '05 #20

On Thursday, 10 February 2005 at 19:47, PA wrote:
> Steve Wart about "why Smalltalk never caught on":
>
> http://hoho.dyndns.org/~holger/smalltalk.html


!!!!!

Francis Girard

Jul 18 '05 #21

On 2005-02-10, Francis Girard <fr************@free.fr> wrote:
> I think I've been enthusiastic too fast. While reading the article I
> grew more and more uncomfortable with sayings like:

<snip>

Yes, you may have grown uncomfortable because what you "read" has, at best,
only the most tenuous of relations with what was written. There is no way
in God's frigid hell that your "sayings" (which were never uttered by Alan
Kay) can be construed as anything other than a hopefully transient
psychotic episode by anyone who read the interview with his head in a place
other than where the moon doesn't shine.

Please be so kind as to free your own from the breathless confines of your
own fundamental delirium.
Jul 18 '05 #22

On Friday, 11 February 2005 at 21:45, Curt wrote:
> Yes, you may have grown uncomfortable because what you "read" has, at best,
> only the most tenuous of relations with what was written. [...]
>
> Please be so kind as to free your own from the breathless confines of your
> own fundamental delirium.


Wow ! Peace. I apologize. Didn't want to upset anyone. Of course it was my own
ad lib interpretation of what Alan Kay said. That's what I meant by
"sayings". But I should have been clearer. Anyway, it only concerns myself.

I live at a place where it rains most of the time and my head is indeed in a
place where the moon doesn't shine, which may give a good explanation of my
own fundamental delirium.

For another fundamental delirium (which I certainly enjoy), see :

Steve Wart about "why Smalltalk never caught on":

http://hoho.dyndns.org/~holger/smalltalk.html

as someone named "Petite abeille" (a name I also certainly do find full of
flavour) suggested to me.

My deepest apologies,

Francis Girard

Jul 18 '05 #23

On Sun, 13 Feb 2005 18:48:03 +0100, Francis Girard wrote:
> My deepest apologies,
>
> Francis Girard


Sorry if I helped get you into this, Francis.

I have read and seen enough of Kay and his visions to find him as a
bug where *my* moon don't shine. When the appropriate opportunity
comes, I find it hard not to mention the fact.

All for reasons I have gone into on other forums, but won't repeat
here.

His loyalists are strongly loyal.

FWIW, my issues have nothing to do with Smalltalk. They have to do
with Kay as a public personage. He may, for all I know, be a
sweetheart, privately. And this most recent interview is one of the
more reasonable I have heard from him. Perhaps he is touching down
in reality more often.

But at this point isn't it fair to consider that Kay has forked from
Smalltalk? I'd probably have more problems with him, not less, if I
were a committed Smalltalk guy.

Art

Jul 18 '05 #24

Hi,

I wrote simple dummy examples to teach myself iterators and generators.

I think that the iteration protocol creates a real confusion between the
iterator and what it iterates upon (i.e. the iteratable). This is outlined in
"example 3" of my dummy examples.

What are your feelings about it ?

Regards

Francis Girard

==== BEGINNING OF EXAMPLES ====

from itertools import tee, imap, izip, islice
import sys
import traceback

sEx1Doc = \
"""
================================================================================
Example 1:
================================================================================

An ""iteratable"" class is a class supporting the __iter__ method, which should
return an ""iterator"" instance, that is, an instance of a class supporting
the "next" method.

An iteratable, strictly speaking, doesn't have to support the "next" method,
and an "iterator" doesn't have to support the "__iter__" method (but this
breaks the iteration protocol, as we will later see).

The ""for ... in ..."" construct now expects an iteratable instance in its
second argument place. It first invokes its __iter__ method and then repeatedly
invokes the ""next"" method on the object returned by ""__iter__"" until the
""StopIteration"" exception is raised.

The design goal is to cleanly separate a container class from the way we
iterate over its internal elements.
"""

class Ex1_IteratableContClass:
    def __init__(self):
        print "Ex1_IteratableContClass.__init__"
        self._vn = range(0, 10)

    def elAt(self, nIdx):
        print "Ex1_IteratableContClass.elAt"
        return self._vn[nIdx]

    def __iter__(self):
        print "Ex1_IteratableContClass.__iter__"
        return Ex1_IteratorContClass(self)

class Ex1_IteratorContClass:
    def __init__(self, cont):
        print "Ex1_IteratorContClass.__init__"
        self._cont = cont
        self._nIdx = -1

    def next(self):
        print "Ex1_IteratorContClass.next"
        self._nIdx += 1
        try:
            return self._cont.elAt(self._nIdx)
        except IndexError:
            raise StopIteration

print
print sEx1Doc
print
for n in Ex1_IteratableContClass():
    print n,
    sys.stdout.flush()
sEx2Doc = \
"""
================================================================================
Example 2:
================================================================================

Let's say that we want to give two ways to iterate over the elements of our
example container class. The default way is to iterate in direct order, and we
want to add the possibility to iterate in reverse order. We simply add another
""iterator"" class. We do not want to modify the container class as, precisely,
the goal is to decouple the container from iteration.
"""

class Ex2_IteratableContClass:
    def __init__(self):
        print "Ex2_IteratableContClass.__init__"
        self._vn = range(0, 10)

    def elAt(self, nIdx):
        print "Ex2_IteratableContClass.elAt"
        return self._vn[nIdx]

    def __iter__(self):
        print "Ex2_IteratableContClass.__iter__"
        return Ex1_IteratorContClass(self)

class Ex2_RevIteratorContClass:
    def __init__(self, cont):
        print "Ex2_RevIteratorContClass.__init__"
        self._cont = cont
        self._nIdx = 0

    def next(self):
        print "Ex2_RevIteratorContClass.next"
        self._nIdx -= 1
        try:
            return self._cont.elAt(self._nIdx)
        except IndexError:
            raise StopIteration

print
print sEx2Doc
print
print "Default iteration works as before"
print
for n in Ex2_IteratableContClass():
    print n,
    sys.stdout.flush()

print
print "But reverse iterator fails with an error :"
print

cont = Ex2_IteratableContClass()
try:
    for n in Ex2_RevIteratorContClass(cont):
        print n,
        sys.stdout.flush()
except:
    traceback.print_exc()

sEx3Doc = \
"""
================================================================================
Example 3.
================================================================================

The previous example fails with the "iteration over non-sequence" error because
the "Ex2_RevIteratorContClass" iterator class doesn't support the __iter__
method. We therefore have to supply one, even at the price of only returning
""self"".

I think this is ugly because it baffles the distinction between iterators and
iteratables, and we artificially have to add an ""__iter__"" method to the
iterator itself, which should return ... well, an iterator, which it already
is!

I presume that the rationale behind this is to keep the feature that the second
argument place of the ""for ... in ..."" should be filled by a container (i.e.
an iteratable), not an iterator.

Another way that Python might have done this without breaking existing code is
to make the ""for ... in ..."" construct invoke the __iter__ method if it
exists, and otherwise directly call the ""next"" method on the supplied object.

But this is only a minor quirk, as the decoupling of the iterator from the
iteratable is nonetheless achieved at the (small) price of adding an "almost"
useless method.

So here it is.
"""

class Ex3_RevIteratorContClass:
    def __init__(self, cont):
        print "Ex3_RevIteratorContClass.__init__"
        self._cont = cont
        self._nIdx = 0

    def __iter__(self):
        print "Ex3_RevIteratorContClass.__iter__"
        return self ## zut !

    def next(self):
        print "Ex3_RevIteratorContClass.next"
        self._nIdx -= 1
        try:
            return self._cont.elAt(self._nIdx)
        except IndexError:
            raise StopIteration

print
print sEx3Doc
print
cont = Ex2_IteratableContClass()
for n in Ex3_RevIteratorContClass(cont):
    print n,
    sys.stdout.flush()
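For what it's worth, the ""__iter__ returns self"" boilerplate complained about here disappears once generators are available (they are introduced in Example 5): a generator function gets both protocol methods for free. A minimal sketch in modern Python 3 syntax; `rev_iter` and the stripped-down `Cont` container are illustrative stand-ins for the classes above, not part of the original script:

```python
def rev_iter(cont):
    # Walk the container backwards through its elAt() accessor,
    # starting at index -1. A generator supports both __iter__ and
    # the "next" protocol automatically, so no "return self" method
    # (the "zut !" line above) is needed.
    nIdx = -1
    while True:
        try:
            yield cont.elAt(nIdx)
        except IndexError:
            return  # ends the iteration (same effect as StopIteration)
        nIdx -= 1

class Cont:
    # Minimal stand-in for the Ex2_IteratableContClass container.
    def __init__(self):
        self._vn = list(range(10))

    def elAt(self, nIdx):
        return self._vn[nIdx]

print(list(rev_iter(Cont())))  # [9, 8, 7, 6, 5, 4, 3, 2, 1, 0]
```

The "for n in rev_iter(cont)" form then works directly, with no extra iterator class at all.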

sEx4Doc = \
"""
================================================================================
Example 4.
================================================================================

Since version 2.2, a new built-in function, ""iter"", is offered that simply
takes an iteratable as argument and returns an iterator by calling its
"__iter__". As long as only iteratables/iterators are involved, this is no big
deal, as ""iter(anIteratable)"" is strictly equivalent to
""anIteratable.__iter__()"".

The real advantage in using this function is that it also accepts containers
supporting the sequence protocol (the __getitem__() method with integer
arguments starting at 0). The ""iter"" function also returns an iterator in
this case. We can therefore write a function accepting a container as argument
that, with the same code, can support either the iteration protocol or the
sequence protocol.
"""

class Ex4_IteratableContClass:
    def __init__(self):
        print "Ex4_IteratableContClass.__init__"
        self._vn = range(0, 10)

    def elAt(self, nIdx):
        print "Ex4_IteratableContClass.elAt"
        return self._vn[nIdx]

    def __iter__(self):
        print "Ex4_IteratableContClass.__iter__"
        return Ex4_IteratorContClass(self)

class Ex4_IteratorContClass:
    def __init__(self, cont):
        print "Ex4_IteratorContClass.__init__"
        self._cont = cont
        self._nIdx = -1

    def __iter__(self):
        print "Ex4_IteratorContClass.__iter__"
        return self ## zut !

    def next(self):
        print "Ex4_IteratorContClass.next"
        self._nIdx += 1
        try:
            return self._cont.elAt(self._nIdx)
        except IndexError:
            raise StopIteration

class Ex4_SequenceContClass:
    def __init__(self):
        print "Ex4_SequenceContClass.__init__"
        self._vn = range(0, 10)

    def elAt(self, nIdx):
        print "Ex4_SequenceContClass.elAt"
        return self._vn[nIdx]

    def __getitem__(self, key):
        print "Ex4_SequenceContClass.__getitem__"
        return self.elAt(key)

def Ex4_FuncPrintContainerEl(cont):
    for n in iter(cont):
        print n,
        sys.stdout.flush()

print
print sEx4Doc
print
print
print "Applying Ex4_FuncPrintContainerEl to an iteratable container"
print
Ex4_FuncPrintContainerEl(Ex4_IteratableContClass())

print
print "Applying Ex4_FuncPrintContainerEl to a sequence container"
print
Ex4_FuncPrintContainerEl(Ex4_SequenceContClass())
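The sequence-protocol fallback described above can be seen in isolation with a bare-bones class (a sketch in modern Python 3 syntax; the `Seq` name is illustrative): iter() builds an iterator that calls __getitem__ with 0, 1, 2, ... until IndexError is raised.

```python
class Seq:
    # Supports only the old sequence protocol: __getitem__ with
    # integer indices starting at 0, and no __iter__ method at all.
    def __init__(self, data):
        self._data = data

    def __getitem__(self, nIdx):
        return self._data[nIdx]  # raises IndexError past the end

it = iter(Seq([10, 20, 30]))  # iter() falls back to __getitem__
print(list(it))               # [10, 20, 30]
```

This is why the same Ex4_FuncPrintContainerEl code handles both kinds of container.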
sEx5Doc = \
"""
================================================== ==============================
Example 5.
================================================== ==============================

Introducing Generators

The "generator" term itself is frequently used with two different meanings.
It is sometimes used to mean "generator function" and sometimes to mean
"generator-iterator". The PEP255 and python mailing list are the two only
places
where I could read something that clearly distinguishes the two notions.

A generator-function can be first seen as only a notational artefact to
generate
an iterator. The generated iterator is called a generator-iterator and it is,
by its intimate nature, both an iterator and an iteratable. (This might very
well be another reason why the iteration protocol baffles the distinction
betweeen iterators and iteratables.) Typically (but not necessarily), the
generator-iterator is such that the elements do not pre-exist but are
"generated" as needed while iteration is going on.

Here is an example of a generator-function and the equivalent iterator class.
The function ""Ex5_GenList"" can be seen as just a notational shortcut to the
equivalent Ex5_GenListIteratorClass class.
"""

def Ex5_GenList():
print "Enters Ex5_GenList"
i = 0
while i < 10:
print "Ex5_GenList yields -- Equivalent to have a it.next() return"
yield i
i += 1
print "Ex5_GenList exits, equivalent to : raise StopIteration"

class Ex5_GenListIteratorClass:

def __init__(self):
print "Ex5_GenListIteratorClass.__init__"
self._i = 0

def __iter__(self):
print "Ex5_GenListIteratorClass.__iter__"
return self

def next(self):
print "Ex5_GenListIteratorClass.next"
nReturn = self._i
if nReturn > 9:
print "Ex5_GenListIteratorClass.next raises the StopIteration exception"
raise StopIteration
self._i += 1
return nReturn

print
print sEx5Doc
print
print "The type of Ex5_GenList() is : %s" % type(Ex5_GenList())
print "Although what is really meant is generator-iterator"
print "As you can see, calling Ex5_GenList() didn't enter the function since our"
print "trace \"Enters Ex5_GenList\" had not been printed."
print "It only produced the generator-iterator."
print
print "for n in Ex5_GenList(): gives:"
print
for n in Ex5_GenList():
    print n,
    sys.stdout.flush()

print
print "The type of Ex5_GenList is : %s" % type(Ex5_GenList)
print "Although what is really meant is generator-function"
print
print "for n in Ex5_GenList: gives:"
print
try:
    for n in Ex5_GenList:
        print n,
        sys.stdout.flush()
except:
    traceback.print_exc()

print
print "The type of Ex5_GenList().__iter__() is : %s" % type(Ex5_GenList().__iter__())
print
print "for n in Ex5_GenList().__iter__(): gives:"
print
for n in Ex5_GenList().__iter__():
    print n,
    sys.stdout.flush()

print
print "it = Ex5_GenList()"
print "for n in it(): gives:"
print
it = Ex5_GenList()
try:
    for n in it():
        print n,
        sys.stdout.flush()
except:
    traceback.print_exc()
print "The iterator produced by Ex5_GenList() is obviously not itself callable"

print
print "for n in Ex5_GenListIteratorClass(): gives:"
print
for n in Ex5_GenListIteratorClass():
    print n,
    sys.stdout.flush()

sEx6Doc = \
"""
================================================================================
Example 6
================================================================================

After having distinguished iteratable from iterator and generator-function
from generator-iterator, and having seen how the generator-iterator makes
iterable and iterator indistinguishable, we are now ready for some real work.
There is a real vocabulary problem in the Python community, but I think it is
now clear enough.

Our first pseudo-real-world example is to produce the Fibonacci sequence using
a simple generator.

Ex6_Fibonacci() is a bit special as it never stops. This is one way to
implement something similar to infinite lists in Python.
"""
def Ex6_Fibonacci():
    a = 1
    b = 1
    yield a
    yield b
    while True:
        bb = a + b
        a = b
        b = bb
        yield bb

print
print sEx6Doc
print
it = Ex6_Fibonacci()
for i in xrange(0,10):
    print it.next()

sEx7Doc = \
"""
================================================================================
Example 7
================================================================================
A beautiful algorithm to produce the Fibonacci sequence relies on noticing
that:

fib(i) = 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, ...
fib(i+1) = 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, ...
fib(i)+fib(i+1) = 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, ...

Generators now enable us to "run after our tail". We will exploit the
Fibonacci sequence property shown above by defining a function that produces
and returns a "list" while the production algorithm supposes the list as
having been already produced by recursively calling itself. In order to be
able to do this, we need to initiate the process by explicitly giving the
first values. We also need the list to be produced "lazily"; i.e. the values
must be produced only as needed ... well, needed to produce the next element
in the "list" in our case.

Here is a first try that will almost work.
"""

## Without the traces:
##
## def Ex7_Fibonacci():
##     yield 1
##     yield 1
##     itTail = Ex7_Fibonacci()
##     itTail.next() # Skip the first one
##     for n in (head + tail for (head, tail) in izip(Ex7_Fibonacci(), itTail)):
##         yield n

N_STACK_DEPTH = 0
def Ex7_Fibonacci():
    global N_STACK_DEPTH
    N_STACK_DEPTH += 1
    nStackDepth = N_STACK_DEPTH

    print "[%d] Entering Ex7_Fibonacci" % nStackDepth
    print "[%d] Yielding 1" % nStackDepth
    yield 1
    print "[%d] Yielding 1" % nStackDepth
    yield 1
    itTail = Ex7_Fibonacci()
    itTail.next() # Skip the first one
    for n in (head + tail for (head, tail) in izip(Ex7_Fibonacci(), itTail)):
        print "[%d] Yielding %d" % (nStackDepth, n)
        yield n

print
print sEx7Doc
print
print list(islice(Ex7_Fibonacci(), 5))
sEx8Doc = \
"""
================================================================================
Example 8
================================================================================
Running after your tail with itertools.tee

Example 7 shows that although the idea is beautiful, the implementation is
very inefficient.

To work efficiently, the beginning of the list must not be recomputed over
and over again. This is ensured in most FP languages as a built-in feature.
In Python, we have to explicitly maintain a list of already computed results
and abandon genuine recursion.

The "tee" function does just what we want. It internally keeps a generated
result for as long as it has not been "consumed" from all of the duplicated
iterators, whereupon it is deleted. You can therefore print the Fibonacci
sequence for hours without increasing memory usage, or only very little.

The beauty of it is that recursive "run after your tail" FP algorithms are
quite straightforwardly expressed with this Python idiom.
"""

def Ex8_Fibonacci():
    print "Entering Ex8_Fibonacci"
    def _Ex8_Fibonacci():
        print "Entering _Ex8_Fibonacci"
        yield 1
        yield 1
        fibTail.next() # Skip the first one
        for n in (head + tail for (head, tail) in izip(fibHead, fibTail)):
            yield n
    fibHead, fibTail, fibRes = tee(_Ex8_Fibonacci(), 3)
    return fibRes

print
print sEx8Doc
print
print list(islice(Ex8_Fibonacci(), 5))

Jul 18 '05 #25

P: n/a
On Sunday 13 February 2005 at 19:05, Arthur wrote:
On Sun, 13 Feb 2005 18:48:03 +0100, Francis Girard wrote:
My deepest apologies,

Francis Girard


Sorry if I helped get you into this, Francis.


No, no, don't worry. I really expressed my own opinions and feelings. At the
same time, I certainly understand that these opinions might have upset some
of the readers. I am sorry for this, as I don't think this is the right
discussion group for such opinions. Therefore it is useless to hurt people
with something that is not a positive contribution. That's all. I will try to
refrain in the future. I am just not used to talking to so many people at the
same time.

Regards,

Francis Girard
Jul 18 '05 #26

P: n/a

"Francis Girard" <fr************@free.fr> wrote in message
news:200502131958.27410.fr************@free.fr...
An "iteratable" class is a class supporting the __iter__ method, which should
return an "iterator" instance, that is, an instance of a class supporting the
"next" method.

Not quite right, see below.

An iteratable, strictly speaking, doesn't have to support the "next" method.

An iterable with a next method is, usually, an iterator.

an "iterator" doesn't have to support the "__iter__" method

Yes it does. iter(iterator) is iterator is part of the iterator protocol,
for the very reason you noticed...

(but this breaks the iteration protocol as we will later see).


Iterators are a subgroup of iterables. Being able to say iter(it) without
having to worry about whether 'it' is just an iterable or already an
iterator is one of the nice features of the new iteration design.
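That convenience can be shown in a few lines. Here is a sketch in today's Python 3 spelling (where the iterator method is `__next__` and `next()` is a builtin; the helper name `first` is invented for illustration):

```python
nums = [1, 2, 3]      # an iterable, but not an iterator
it1 = iter(nums)      # a fresh iterator over the list
it2 = iter(it1)       # iter() on an iterator returns the iterator itself

assert it2 is it1               # iter(iterator) is iterator
assert iter(nums) is not nums   # a plain iterable yields a new iterator

# Generic code can therefore call iter() unconditionally:
def first(obj):
    """Return the first item of an iterable or an iterator."""
    return next(iter(obj))

print(first(nums), first(iter(nums)))
```

Either call prints 1: `first` never needs to know which kind of object it was handed.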

Terry J. Reedy


Jul 18 '05 #27

P: n/a
Francis Girard wrote:
"""
================================================================================
Example 8
================================================================================
Running after your tail with itertools.tee
The beauty of it is that recursive running after their tail FP algorithms
are quite straightforwardly expressed with this Python idiom.
"""

def Ex8_Fibonacci():
    print "Entering Ex8_Fibonacci"
    def _Ex8_Fibonacci():
        print "Entering _Ex8_Fibonacci"
        yield 1
        yield 1
        fibTail.next() # Skip the first one
        for n in (head + tail for (head, tail) in izip(fibHead, fibTail)):
            yield n
    fibHead, fibTail, fibRes = tee(_Ex8_Fibonacci(), 3)
    return fibRes

print
print sEx8Doc
print
print list(islice(Ex8_Fibonacci(), 5))

Absolutely: ever since you brought up the Hamming sequence I've been interested
in this approach. However, if iterators could be extended in place, these
solutions would be even more attractive.

Here are some examples for infinite series constructed with an extendable
iterator. This iterator is returned by an iterable class 'Stream', shown below
the examples:

def factorial():
    """
    >>> f = factorial()
    >>> f.tolist(10)
    [1, 1, 2, 6, 24, 120, 720, 5040, 40320, 362880]
    """
    factorial = Stream([1])
    factorial.extend(factorial * it.count(1))
    return factorial

def fib():
    """Example:
    >>> f = fib()
    >>> f.tolist(10)
    [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
    """
    fib = Stream([1,1])
    fib.extend(x+y for x, y in it.izip(fib, fib[1:]))
    return fib

def multimerge(*iterables):
    """Yields the items in iterables in order, without duplicates"""
    cache = {}
    iterators = map(iter,iterables)
    number = len(iterables)
    exhausted = 0
    while 1:
        for it in iterators:
            try:
                cache.setdefault(it.next(),[]).append(it)
            except StopIteration:
                exhausted += 1
                if exhausted == number:
                    raise StopIteration
        val = min(cache)
        iterators = cache.pop(val)
        yield val

def hamming():
    """Example:
    >>> h = hamming()
    >>> list(h[20:40])
    [40, 45, 48, 50, 54, 60, 64, 72, 75, 80, 81, 90, 96, 100, 108, 120, 125, 128, 135, 144]
    >>> h[10000]
    288555831593533440L
    """
    hamming = Stream([1])
    hamming.extend(i for i in multimerge(2 * hamming, 3 * hamming, 5 * hamming))
    return hamming
def compounds():
    """Extension of Hamming series to compounds of primes(2..13)
    Example:
    >>> c = compounds()
    >>> list(c[20:30])
    [24, 25, 26, 27, 28, 30, 32, 33, 35, 36]
    """
    compounds = Stream([1])
    compounds.extend(i for i in multimerge(2 * compounds, 3 * compounds,
        5 * compounds, 7 * compounds, 9 * compounds, 11 * compounds,
        13 * compounds))
    return compounds
# Stream class for the above examples:

import itertools as it
import operator as op

class Stream(object):
    """Provides an independent iterator (using tee) on every iteration request
    Also implements lazy iterator arithmetic"""

    def __init__(self, *iterables, **kw):
        """iterables: tuple of iterables (including iterators). A sequence of
        iterables will be chained
        kw: not used in this base class"""
        self.queue = list(iterables)
        # We may not need this in every case
        self.itertee = it.tee(self._chain(self.queue))[0]

    def extend(self,other):
        """extend(other: iterable) => None
        appends iterable to the end of the Stream instance
        """
        self.queue.append(other)

    def _chain(self, queue):
        while queue:
            for i in self.queue.pop(0):
                self.head = i
                yield i

    # Iterator methods:

    def __iter__(self):
        """Normal iteration over the iterables in self.queue in turn"""
        return self.itertee.__copy__()

    def _binop(self,other,op):
        """See injected methods - __add__, __mul__ etc.."""
        if hasattr(other,"__iter__"):
            return (op(i,j) for i, j in it.izip(self,other))
        else:
            return (op(i,other) for i in self)

    def __getitem__(self,index):
        """__getitem__(index: integer | slice)
        index: integer => element at position index
        index: slice
            if slice.stop is given => Stream(it.islice(iter(self),
                index.start, index.stop, index.step or 1)))
            else: consumes self up to start then => Stream(iter(self))
                Note slice.step is ignored in this case
        """
        if isinstance(index, slice):
            if index.stop:
                return (it.islice(iter(self),
                        index.start or 0, index.stop, index.step or 1))
            else:
                iterator = iter(self)
                for i in range(index.start):
                    iterator.next()
                return iterator
        else:
            return it.islice(iter(self), index, index+1).next()

    def getattr(self,attrname):
        """__getattr__/getattr(attrname: string)
        => Stream(getattr(item,attrname) for item in self)
        Use the getattr variant if the attrname is shadowed by the Stream
        class"""
        return (getattr(item,attrname) for item in self)
    __getattr__ = getattr

    # Representational methods

    def tolist(self, maxlen = 100):
        return list(it.islice(iter(self), maxlen))

    def __repr__(self):
        return "Stream instance at:%x: %s" % (id(self), repr(self.queue))
# Inject special methods for binary operations:
_binopdoc = """%(func)s(other: constant | iterable, op: binary function)
    other: constant => Stream(op.%(op)s(i,other) for i in self))
    other: iterable => Stream(op.%(op)s(i,j) for i, j in it.izip(self,other))
"""

# 'func' is a small function-building helper from earlier in the thread
# (not shown in this excerpt).
[setattr(Stream, attr, func(argspec = "self, other",
                            expr = "self._binop(other, op.%s)" % opfunc,
                            doc = _binopdoc % {"func":attr, "op":opfunc},
                            name = attr)
         )
 for attr, opfunc in {
     "__mul__":"mul",
     "__add__":"add",
     "__sub__":"sub",
     "__div__":"div",
     "__mod__":"mod",
     "__pow__":"pow",
     }.items()
]
# End inject special methods


Jul 18 '05 #28

P: n/a
"Francis Girard" <fr************@free.fr> wrote in message
an "iterator" doesn't have to support the "__iter__" method

Terry Reedy wrote:
Yes it does. iter(iterator) is iterator is part of the iterator protocol
for the very reason you noticed...

But, notwithstanding the docs, it is not essential that

iter(iterator) is iterator
>>> class A(object):
...     def __iter__(self):
...         return AnIterator()
...
>>> class AnIterator(object): # an iterator that copies itself
...     def next(self):
...         return "Something"
...     def __iter__(self):
...         return AnIterator()
...
>>> a = A()
>>> i = iter(a)
>>> i.next()
'Something'
>>> j = iter(i)
>>> j.next()
'Something'
>>> i is j
False


Michael

Jul 18 '05 #29

P: n/a
On Sunday 13 February 2005 at 23:58, Terry Reedy wrote:
Iterators are a subgroup of iterables. Being able to say iter(it) without
having to worry about whether 'it' is just an iterable or already an
iterator is one of the nice features of the new iteration design.

Terry J. Reedy


Hi,

I have difficulty representing an iterator as a subspecies of an iteratable,
as they seem profoundly different to me. But it might just be that my mind is
too strongly influenced by the C++ STL, where there is a clear (I resist
saying "clean") distinction between iteratable and iterator.

One of the results of not distinguishing them is that, at some point in your
programming, you are not sure anymore whether you have an iterator or an
iteratable; and you might very well end up calling "iter()" or "__iter__()"
everywhere.

I am not concerned with the small performance issue involved here (as I very
seldom am) but with clarity. After all, why should you have to call __iter__
on an iterator you just constructed (as in my dummy "example 2")? One might
wonder, "what? isn't this already the iterator?" But this, I agree, might
very well just be a beginner question (as I am a beginner), trying to learn
the new iterator semantics.

I have a strong feeling that the problem arises from the difficulty of
marrying the familiar "for ... in ..." construct with iterators. If you put
an iteratable in the second argument place of the construct (as is
traditionally done), then the syntax construct itself is an implicit
iterator. Now, if you have explicit iterators, then they don't fit well with
the implicit iterator hidden in the syntax.
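That hidden, implicit iterator can be spelled out. Roughly, a for loop behaves like this sketch (Python 3 spelling; `desugared_for` is a name invented here for illustration):

```python
def desugared_for(iterable, body):
    """Rough equivalent of: for item in iterable: body(item)"""
    it = iter(iterable)      # the construct itself calls iter()
    while True:
        try:
            item = next(it)  # the implicit iterator drives the loop
        except StopIteration:
            break
        body(item)

collected = []
desugared_for([10, 20, 30], collected.append)
print(collected)  # [10, 20, 30]
```

Because iter() is a no-op on iterators, the same desugaring works whether an iteratable or an explicit iterator is supplied, which is exactly the compromise discussed here.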

To have iterators act as iteratables might very well have been a compromise
to solve the problem.

I am not at all sure that it is a "nice feature" to consider an iterator to
be at the same level as an iteratable. It makes it a bit more awkward to have
the "mind impulse", so to speak, to build iterators on top of other iterators
to slightly modify the way iteration is done. But I think I'm getting a bit
too severe here, as I think that the compromise choice made by Python is very
acceptable.

Regards,

PS: I am carefully reading Michael Spencer's very interesting reply.

Thank you,

Francis Girard

Jul 18 '05 #30

P: n/a
Francis Girard wrote:
On Sunday 13 February 2005 at 23:58, Terry Reedy wrote:
Iterators are a subgroup of iterables. Being able to say iter(it) without
having to worry about whether 'it' is just an iterable or already an
iterator is one of the nice features of the new iteration design.


I have difficulty representing an iterator as a subspecies of an iteratable
... One of the results of not distinguishing them is that, at some point in
your programming, you are not sure anymore whether you have an iterator or
an iteratable; and you might very well end up calling "iter()" or
"__iter__()" everywhere.


The point is _almost_, but not exactly unlike that.
Because the "for ... in ..." construct calls iter itself, you seldom
need (as a code user) to distinguish between iterators and iterables.
However, there will come a day when you see some code like:

first = True
for blunge in whatever:
    if first:
        first = False
    else:
        print 'and',
    print blunge

And you think, "I can make that clearer," so you write:

source = iter(whatever)
print source.next()
for blunge in source:
    print 'and', blunge

Because of how iterables work, you know you can do this locally
without looking all around to see what "whatever" is.
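A runnable rendering of that rewrite in today's Python 3 spelling (`join_with_and` is an invented name, and it returns the joined string instead of printing piecemeal):

```python
def join_with_and(whatever):
    # Works locally whether 'whatever' is an iterable or an iterator.
    source = iter(whatever)
    parts = [str(next(source))]      # take the first element explicitly
    for blunge in source:            # the loop resumes after it
        parts.append('and ' + str(blunge))
    return ' '.join(parts)

print(join_with_and(['spam', 'eggs', 'ham']))  # spam and eggs and ham
```

The explicit iter() call is what makes the two-step consumption safe: the for loop picks up the same iterator where next() left it.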

--Scott David Daniels
Sc***********@Acm.Org
Jul 18 '05 #31

P: n/a

"Francis Girard" <fr************@free.fr> wrote in message
news:200502142131.53265.fr************@free.fr...

(Note for oldtimer nitpickers: except where relevant, I intentionally
ignore the old and now mostly obsolete pseudo-__getitem__-based iteration
protocol here and in other posts.)

On Sunday 13 February 2005 at 23:58, Terry Reedy wrote:
Iterators are a subgroup of iterables. Being able to say iter(it)
without
having to worry about whether 'it' is just an iterable or already an
iterator is one of the nice features of the new iteration design.
I have difficulty representing an iterator as a subspecies of an iteratable
You are not the only one. That is why I say it in plain English.

You are perhaps thinking of 'iterable' as a collection of things. But in
Python, an 'iterable' is a broader and more abstract concept: anything with
an __iter__ method that returns an iterator.

To make iterators a separate, disjoint species then requires that they not
have an __iter__ method. Some problems:
A. This would mean either
1) We could not iterate with iterators, such as generators, which are
*not* derived from iterables, or, less severely
2) We would, usually, have to 'protect' iter() calls with either
hasattr(it, '__iter__') or try: iter(it)...except: pass with probably no
net average time savings.
B. This would prohibit self-reiterable objects, which require .__iter__ to
(re)set the iteration/cursor variable(s) used by .next().
C. There are compatibility issues not just with classes using the old
iteration protocol but also with classes with .next methods that do *not*
raise StopIteration. The presence of .__iter__ cleanly marks an object as
one following the new iterable/iterator protocol. Another language might
accomplish the same flagging with inheritance from a base object, but that
is not Python.
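A self-reiterable object of the kind point B describes might look like this sketch (hypothetical class, Python 3 spelling with __next__):

```python
class Countdown(object):
    """__iter__ (re)sets the cursor used by __next__, so one
    instance can be iterated over repeatedly from the start."""
    def __init__(self, start):
        self.start = start
        self.current = 0

    def __iter__(self):
        self.current = self.start   # (re)set the iteration cursor
        return self

    def __next__(self):
        if self.current == 0:
            raise StopIteration
        value = self.current
        self.current -= 1
        return value

c = Countdown(3)
print(list(c), list(c))  # [3, 2, 1] [3, 2, 1] -- both passes work
```

Without the reset in __iter__, the second pass would be empty, which is exactly why such objects need the method.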
[snip]...C++ STL where there is a clear (I resist to
say "clean") distinction between iteratable and iterator.
leaves out self-iterating iterables -- collection objects with a .next
method. I am sure that this is a general, standard OO idiom and not a
Python-specific construct. Perhaps, ignoring these, you would prefer the
following nomenclature:
iterob = object with .__iter__
iterable= iterob without .next
iterator = iterob with .next

Does STL allow/have iterators that are *not* tied to an iterable?
One of the results of not distinguishing them is that, at some point in your
programming, you are not sure anymore whether you have an iterator or an
iteratable; and you might very well end up calling "iter()" or
"__iter__()" everywhere.
If you iterate with a while loop just after creating a new iterable or
iterator, then you probably do know which it is and can make the iter()
call only if needed. If you while-iterate with a function argument, then
iter() is a simpler way to be generic than the alternatives in A2 above.
I am not concerned with the small performance issue involved here
Good. I think they are small. The number and time for iterations is far
more important.
(as I very seldom am) but with clarity. After all, why should you have to
call __iter__ on an iterator you just constructed

As I said above, you don't, and most people wouldn't. The function
implementing for loops does because *it*, unlike you, only sees the object
passed and not the code that created the object!
I have a strong feeling that the problem arises from the difficulty to
marry the familiar ""for ... in ..."" construct with iterators. [snip]
What difficulty? For loops accept an iterable and iterate with the derived
iterator.

Would you really prohibit the use of for loops with generators and other
non-iterable-derived iterators? See A1 above.
To have iterators act as iteratables might very well have been a
compromise to solve the problem.

I think it elegant. See below.
I am not sure at all that this is a "nice feature" to consider an iterator
at the same level as an iteratable.
I think you are too stuck in the STL model.
It makes it a bit more awkward to have the "mind impulse", so to speak, to
build iterators on top of other iterators to slightly modify the way
iteration is done.


On the contrary, what could be more elegant than
def itermodifier(it):
    for i in it: # where it is often an iterator
        yield modification-of-i
See the itertools module and docs and the examples of chaining iterators.
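To make the sketch concrete, here is a minimal runnable version (Python 3 spelling; the doubling step stands in for "modification-of-i" and is an arbitrary choice):

```python
from itertools import count, islice

def itermodifier(it):
    """Wrap any iterator or iterable, yielding a modified copy of each item."""
    for i in it:               # 'it' is often itself an iterator
        yield i * 2            # the modification-of-i step

# Modifiers chain naturally, just like the tools in itertools:
chained = itermodifier(itermodifier(count(1)))
print(list(islice(chained, 5)))  # [4, 8, 12, 16, 20]
```

Each layer is itself an iterator, so the next layer can wrap it without caring where the values come from.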

Terry J. Reedy

Jul 18 '05 #32

P: n/a

"Michael Spencer" <ma**@telcopartners.com> wrote in message
news:cu**********@sea.gmane.org...
Terry Reedy wrote:
iter(iterator) is iterator is part of the iterator protocol
But, notwithstanding the docs, it is not essential that
iter(iterator) is iterator
If a program depends on that invariant, then it is essential for that
program. Leaving such programs aside, I interpret this and your example
together as saying three things:

1. "There is more than one possible definition of 'iterator'."

Yes. Python could have defined many things differently. But I think it
important to have a clear definition of iterator (and other things) so one
can reason about program behavior.

2. "It is not essential to not do something wasteful as long as it is
otherwise inconsequential."

Usually true, but I don't see this as clarifying the relationship between
iterables and iterators in Python. (I am assuming that your example is
only meant to illustrate possibilities rather than usefulness.)

3. "You can substitute a copy of an object that is never mutated for the
object itself."

True, as long as you do not look at the id. But in practice, change of
state is essential to the function of nearly all iterators. For mutated
objects, one has to add the proviso that the copy is current and the
substitution total, so that there are no references left to the original.
But again, this has nothing in particular to do with iteration.
>>> class A(object):
...     def __iter__(self):
...         return AnIterator()
...
>>> class AnIterator(object): # an iterator that copies itself


By the narrower definition of iterator that I used, this is not an
iterator. Also, the replacement is only a copy if the instance has not
been mutated. Is your broader definition limited to return of initialized
copies or would it allow other things also?
...     def next(self):
...         return "Something"
...     def __iter__(self):
...         return AnIterator()


The second def statement is equivalent (except for identity) to
__iter__ = A.__iter__

To illustrate the point about mutation with a simple toy example:

a = A()
a.doc = 'My iterator'
b = iter(a)

b is not a copy of a as it is, but only as it was initially.

Terry J. Reedy

Jul 18 '05 #33

P: n/a
Michael Spencer wrote:
But, notwithstanding the docs, it is not essential that
iter(iterator) is iterator
Terry Reedy wrote: iter(iterator) is iterator is part of the iterator protocol

[...]I interpret [your post] as saying three things:

1. "There is more than one possible definition of 'iterator'."
Terry, thanks for responding in depth.

2. "It is not essential to not do something wasteful as long as it is
otherwise inconsequential."
Not that "iter(iterator) is iterator" is somehow wasteful (actually it seems
conservative), but rather that alternative behavior is readily implemented.
You point out, reasonably, that if I do that, then what I get is not an
iterator, because it fails to conform to the protocol.

However, I suggest that there may be cases where "iter(iterator) is not
iterator" is useful behavior. What to call such an object is another matter.

For example, consider:

import itertools as it
def tee2(iterable):
    class itertee(object):
        def __init__(self, iterator):
            self.iterator = iterator
        def __iter__(self):
            return itertee(self.iterator.__copy__())
        def next(self):
            return self.iterator.next()
    return itertee(it.tee(iterable, 1)[0])

This returns an itertee instance which simply wraps the tee iterator returned
by itertools. However, iter() on an itertee instance returns a copy of its
iterator. So this object creates as many independent iterators over iterable
as are required.
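In today's Python 3 spelling the same wrapper might read as below; a sketch that relies on CPython's tee iterators being copyable (copy.copy dispatches to their __copy__):

```python
import copy
import itertools

def tee2(iterable):
    class itertee(object):
        def __init__(self, iterator):
            self.iterator = iterator
        def __iter__(self):
            # Hand out an independent copy of the underlying tee iterator.
            return itertee(copy.copy(self.iterator))
        def __next__(self):
            return next(self.iterator)
    return itertee(itertools.tee(iterable, 1)[0])

t = tee2(range(4))
print(list(iter(t)), list(iter(t)))  # two independent passes over range(4)
```

Each call to iter(t) starts a fresh pass, because only the copies are ever consumed.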

In an earlier post in this thread, I included several examples of generating
infinite series using iterator-copying like this. I implemented the copying as
a method of a containing iterable 'Stream', rather than of the iterators
themselves, partly to respect the 'iterator protocol'.

3. "You can substitute a copy of an object that is never mutated for the
object itself."

This was not my intended point, although I accept that my example was too abstract.

Cheers

Michael

Jul 18 '05 #34

P: n/a
On Tuesday 15 February 2005 at 02:26, Terry Reedy wrote:
"Francis Girard" <fr************@free.fr> wrote in message
news:200502142131.53265.fr************@free.fr...

(Note for oldtimer nitpickers: except where relevant, I intentionally
ignore the old and now mostly obsolete pseudo-__getitem__-based iteration
protocol here and in other posts.)

On Sunday 13 February 2005 at 23:58, Terry Reedy wrote:
Iterators are a subgroup of iterables. Being able to say iter(it)
without
having to worry about whether 'it' is just an iterable or already an
iterator is one of the nice features of the new iteration design.
I have difficulty representing an iterator as a subspecies of an iteratable


You are not the only one. That is why I say it in plain English.

You are perhaps thinking of 'iterable' as a collection of things. But in
Python, an 'iterable' is a broader and more abstract concept: anything with
an __iter__ method that returns an iterator.


Yes, I certainly do define an "iteratable" as something _upon_ which you
iterate (i.e. a container of elements). The iterator is something that serves
to iterate _upon_ something else, i.e. the iteratable. For me, it makes
little sense to iterate _upon_ an iterator. The fact that, in Python, both
iterators and iteratables must support the __iter__ method is only an
implementation detail. Concepts must come first.
To make iterators a separate, disjoint species then requires that they not
have an __iter__ method. Some problems:
A. This would mean either
1) We could not iterate with iterators, such as generators, which are
*not* derived from iterables, or, less severely
Well, generators are a bit special as they are both (conceptually) iterators
and iteratables by their very intimate nature -- since the elements are
_produced_ as needed, i.e. only when you do iterate.
But as for ordinary iterators, I don't see any good conceptual reason why a
generator-iterator should support the "__iter__" method. There might be
other reasons though (for example related to the for ... in ... construct
which I discuss later in this reply).
2) We would, usually, have to 'protect' iter() calls with either
hasattr(it, '__iter__') or try: iter(it)...except: pass with probably no
net average time savings.
Well, I'm not interested in time savings for now. Only want to discuss more
conceptual issues.
B. This would prohibit self-reiterable objects, which require .__iter__ to
(re)set the iteration/cursor variable(s) used by .next().
To sharply distinguish in code what is conceptually different is certainly
very good and safe design in general. But what I am thinking about would not
_prohibit_ it. Nor does the C++ STL prohibit it.
C. There are compatibility issues not just just with classes using the old
iteration protocol but also with classes with .next methods that do *not*
raise StopIteration. The presence of .__iter__ cleanly marks an object as
one following the new iterable/iterator protocol. Another language might
accomplish the same flagging with inheritance from a base object, but that
is not Python.
(That is not C++ templates either. See below.)

Why not "__next__" (or something else) instead of "next" for iterators and,
yes, __iter__ for iteratables ?
[snip]...C++ STL where there is a clear (I resist to
say "clean") distinction between iteratable and iterator.


leaves out self-iterating iterables -- collection objects with a .next
method.


Nope. See the definition of an iterator in C++ STL below. Anything respecting
the standard protocol is an iterator. It might be the container itself. The
point is that with the standard C++ STL protocol, you are not ___obliged___
to define an iterator as _also_ being an iteratable. Both concepts are
clearly separated.
I am sure that this is a general, standard OO idiom and not a
Python-specific construct. Perhaps, ignoring these, you would prefer the
following nomenclature:
iterob = object with .__iter__
iterable= iterob without .next
iterator = iterob with .next

Does STL allow/have iterators that are *not* tied to an iterable?
Yes, of course. A "forward iterator", for example, is _anything_ that
supports the following:

===================
In what follows, we shall adopt the following convention.

X : A type that is a model of Trivial Iterator
T : The value type of X
x, y, z : Object of type X
t : Object of type T

Copy constructor : X(x) ------> X
Copy constructor : X x(y); or X x = y;
Assignment : x = y [1] ------> X&
Swap : swap(x,y) ------> void

Equality : x == y ------> Convertible to bool
Inequality : x != y ------> Convertible to bool

Default constructor : X x or X() ------> X
Dereference : *x ------> Convertible to T
Dereference assignment : *x = t ------> X is mutable
Member access : x->m [2] ------> T is a type for which x.m is defined
======================

Anything that respects this convention is a forward iterator. They might
produce their own content as we iterate upon them if that's what is needed.
They don't have to be a class inheriting from some other standard class.
That's the beauty of C++ templates, forgetting (but not forgiving) its very
ugly and complex syntax.
One of the results of not distinguishing them is that, at some point in your
programming, you are not sure anymore whether you have an iterator or an
iteratable; and you might very well end up calling "iter()" or
"__iter__()" everywhere.
If you iterate with a while loop just after creating a new iterable or
iterator, then you probably do know which it is and can make the iter()
call only if needed.


Yes, true. But then you start factorizing some code, and then, oh well, you
can see the difficulties.
If you while-iterate with a function argument, then
iter() is a simpler way to be generic than the alternatives in A2 above.

You need that genericity chiefly because you want to support both iterators
and iteratables in some argument place. I don't see any good reason for this
except historical ones (see the for ... in ... constructs below). It might be
preferable for a method to accept only iterators if the only thing the method
has to do with the iteratable is to iterate over it, and let the client of
the method call "__iter__" or "iter()" if all he has is an iteratable (i.e. a
container).

I am not concerned with the small performance issue involved here


Good. I think they are small. The number and time for iterations is far
more important.
(as I very seldom am) but with clarity. After all, why should you have to
call __iter__ on an iterator you just constructed
As I said above, you don't, and most people wouldn't. The function
implementing for loops does because *it*, unlike you, only sees the object
passed and not the code that created the object!


As I said above, why should the function implementing for loops accept
anything other than an iterator (not an iteratable), except, maybe, for
historical reasons?
I have a strong feeling that the problem arises from the difficulty to
marry the familiar ""for ... in ..."" construct with iterators. [snip]


What difficulty? For loops accept an iterable and iterate with the derived
iterator.


The second argument place of the "for ... in ..." construct, _before_ the
iteration protocol, had to be a container. At that time, it was the "for ...
in ..." syntax construct that played the conceptual roles of both iteration
and iterator. Now, the "for ... in ..." construct is the standard way to
iterate. Therefore, the second argument place of "for ... in ..." must
also accept an iterator if iterators are to be introduced to Python. The
question must have been, at the time iterators were introduced, "how should
we manage to have these two different things in the same argument place?"
I think the Python solution to this dilemma is very acceptable. It is
certainly the Python way to be very polymorphic. But, at the same time, I can
very well understand that a lot of people have difficulty swallowing the idea
that an iterator should be a subspecies of an iterable.
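The compromise under discussion is easy to see in the interpreter: the same argument place accepts both a container and its iterator, precisely because an iterator's __iter__ returns itself. A small illustrative check (not from the thread):

```python
data = [10, 20, 30]

# The for loop (and comprehensions) call iter() on their operand,
# so a container works here...
from_container = [x for x in data]

# ...and so does an already-derived iterator, in the same place.
it = iter(data)
from_iterator = [x for x in it]

# The iterator qualifies as an iterable only because iter() on it
# returns the very same object, i.e. __iter__ returns self.
is_self = iter(iter(data)) is not None and (lambda j: iter(j) is j)(iter(data))
```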

Would you really prohibit the use of for loops with generators and other
non-iterable-derived iterators? See A1 above.
Of course not. This question reveals the difficulty I just pointed out.
Having iterators act as iterables may very well have been a compromise to
solve the problem.

I think it elegant. See below.
I am not sure at all that it is a "nice feature" to treat an iterator at
the same level as an iterable.


I think you are too stuck in the STL model.


Yes, very true. I just can't help thinking that iterators and iterables are
very different concepts.
It makes it a bit more awkward to have the "mind impulse", so to speak, to
build iterators on top of other iterators to slightly modify the way
iteration is done.
On the contrary, what could be more elegant than

def itermodifier(it):
    for i in it:  # where it is often an iterator
        yield modification_of(i)  # modification_of is a placeholder

See the itertools module and docs and the examples of chaining iterators.


Yes, I certainly do think that generators are very, very, very elegant. I came
back to python a lot because of their beauty, coupled with all the other
advantages Python has to offer : simple syntax, well thought libraries, easy
portability, pragmaticism, etc. (there's a long list).

On the other hand, I certainly do notice that the itertools module
documentation says that all the functions defined there return an iterator
(not an iterable) but accept iterables. There is something awkward about it
for the beginner. You always have to re-think: oh! yes! they refer to the
protocol, not the concepts.
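The accept-iterable / return-iterator split the itertools docs follow can be checked directly; for instance itertools.islice happily takes a plain list but what it hands back is a strict iterator:

```python
import itertools

squares = [0, 1, 4, 9, 16]                # an iterable (a container)
sliced = itertools.islice(squares, 2, 4)  # accepts the iterable...

# ...but the result is an iterator: iter() returns it unchanged.
returns_iterator = iter(sliced) is sliced
values = list(sliced)                     # consumes it; a second list() would be empty
```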

But this is not that bad and I can certainly live with it.

(Thank you if you had the courage to read it all !)

Regards

Francis Girard
Terry J. Reedy


Jul 18 '05 #35


"Michael Spencer" <ma**@telcopartners.com> wrote in message
news:cu**********@sea.gmane.org...
Terry, thanks for responding in depth.
We are both interested in the murky edges at and beyond conventional usage.
Terry wrote
2. "It is not essential to not do something wasteful as long as it is
otherwise inconsequential."
Not that "iter(iterator) is iterator" is somehow wasteful (actually it
seems conservative), but rather that alternative behavior is readily
implemented.
What I was pointing to as wasteful is the application of your alternative
behavior where an object is replaced by a copy and then effectively tossed.
However, I suggest that there may be cases where
"iter(iterator) is not iterator" is useful behavior.
I am quite aware that multiple iterators for the same iterable (actual or
conceptual) can be useful (cross products, for example). But I am dubious
that initialized clones of 'iterators' are *more* useful, especially for
Python, than multiple iterators derived from repeated calling of the
callable that produced the first iterator. It might be different if Python
were a prototype/clone language rather than a typeclass/instance language.
It is also possible that we have slightly different ideas of 'useful' in
the Python context.

In your example, as I pointed out, A.__iter__ and AnIterator.__iter__ have
the same code, so I could not see any advantage to getting a copy through
calling the latter instead of the former. For the disadvantage, see below.
What to call such an object is another matter.


spencerator ;-?

Here are some related reasons why I think it useful if not essential to
restrict the notion of iterator by restricting iterator.__iter__ to
returning self unmodified.

Leaving Python aside, one can think of iterable as something that
represents a collection and that can produce an iterator that produces the
items of the collection one at a time. In this general conception,
iterables and iterators seem distinct (if one ignores self-iterables). In
Python, giving iterators an __iter__ method, while quite useful, erases
(confuses) the (seeming) distinction, but giving them a minimal __iter__
does so minimally, keeping iterators a distinct subcategory of iterable.
Spencerators confuse the distinction more than minimally and define a
vaguer subcategory.

Essential? For something defined as minimal, it is essential that it be
minimal. But your point seems to be that it is not essential that iterator
be so defined. Ok.

(Aside: an iterator can be thought of as representing the sequence of
future .next() returns. From this viewpoint, making iterators a
subcategory of iterable is more than just a convenience.)

Taking Python as it is, a useful subcategory of iterable is 'reiterable'.
This is distinct from iterator strictly defined. Thus we have iterables
divided into iterators, reiterables, and other. I think this is
didactically useful. Spencerators are reiterables.
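The three-way taxonomy (iterator / reiterable / other) can be made concrete; the classifier below is only a sketch of the idea, not a standard function:

```python
def classify(obj):
    # Sketch of the taxonomy: iterators are their own iterators,
    # reiterables produce a fresh iterator on every iter() call,
    # everything else is not iterable at all.
    try:
        it = iter(obj)
    except TypeError:
        return "other"
    return "iterator" if it is obj else "reiterable"

kinds = (classify([1, 2]),            # list: produces a new iterator each time
         classify(iter([1, 2])),      # list iterator: its own iterator
         classify(x for x in [1]),    # generator: its own iterator
         classify(42))                # int: not iterable
```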

Iter(iterator) returning iterator unchanged makes iterator a fixed point of
iter. It ends any chain of objects returned by repeated iter calls.
Spencerators prolong any iter chain, making it infinite instead of finite.
Essential? Repeat the paragraph above with 'a fixed point' substituted for
'minimal'.
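The fixed-point property is easy to verify: starting from a container, the chain of iter() calls terminates at the iterator after a single step.

```python
data = ["a", "b"]
it1 = iter(data)   # container -> iterator
it2 = iter(it1)    # iterator -> the very same iterator
it3 = iter(it2)    # ...and so on, forever

# The iter chain has length one: the iterator is a fixed point of iter.
chain_ends = (it1 is it2) and (it2 is it3)
```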

Terry J. Reedy

Jul 18 '05 #36

On Tue, 2005-02-15 at 19:25, Terry Reedy wrote:
[snip: full quote of the preceding post]


How is a spencerator different from itertools.tee?
Adam DePrince
Jul 18 '05 #37

Terry Reedy wrote:
"Michael Spencer" <ma**@telcopartners.com> wrote in message

We are both interested in the murky edges at and beyond conventional usage.
.... I am quite aware that multiple iterators for the same iterable (actual or
conceptual) can be useful (cross products, for example). But I am dubious
that initialized clones of 'iterators' are *more* useful, especially for
Python, than multiple iterators derived from repeated calling of the
callable that produced the first iterator.
I'm not sure they are. In the one 'real' example I posted on infinite series, I
implemented the approach you advocate here. But I'm keeping copyable iterators
in mind.

Here are some related reasons why I think it useful if not essential to
restrict the notion of iterator by restricting iterator.__iter__ to
returning self unmodified.

Leaving Python aside, one can think of iterable as something that
represents a collection and that can produce an iterator that produces the
items of the collection one at a time. In this general conceptioning,
iterables and iterators seem distinct (if one ignores self-iterables).
The separation is appealing, but blurrier in practice, I believe. Neither
itertools.cycle nor itertools.tee fits cleanly into this model. Neither do the
self-iterables, as you point out.
... giving iterators an __iter__ method, while quite useful, erases
(confuses) the (seeming) distinction, but giving them a minimal __iter__
does so minimally, keeping iterators a distinct subcategory of iterable.
Iterators that could not be presented to other functions for filtering or
whatnot would be pretty limited. Unless every iterator is to be derived from
some special-cased object, how could they not have an __iter__ method?
I accept your point that keeping the functionality of iterator.__iter__ minimal
and predictable limits the confusion between iterators and iterables. But
since that distinction is already blurred in several places, I don't find that
argument alone decisive.
...

Taking Python as it is, a useful subcategory of iterable is 'reiterable'.
This is distinct from iterator strictly defined.
What about itertools.cycle? Not strictly an iterator?
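Protocol-wise, itertools.cycle does behave as a strict iterator (its __iter__ returns itself), even though conceptually it replays its input forever; a quick check:

```python
import itertools

c = itertools.cycle([1, 2])

# Strict iterator by the protocol: iter() returns the object unchanged.
is_strict = iter(c) is c

# Yet the values re-iterate the underlying data indefinitely,
# which is what blurs the iterator/reiterable distinction here.
first_five = [next(c) for _ in range(5)]
```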

Thus we have iterables divided into iterators, reiterables, and other. I think this is
didactically useful. Spencerators are reiterables.
They may be: they are no more and no less than a thought experiment in which
iterator.__iter__ does not return self unmodified.

Iter(iterator) returning iterator unchanged makes iterator a fixed point of
iter. It ends any chain of objects returned by repeated iter calls.
Spencerators prolong any iter chain, making it infinite instead of finite.
Essential? Repeat the paragraph above with 'a fixed point' substituted for
'minimal'.

I don't understand this point except in the loosest sense that deviating from
the iterator protocol makes it harder to reason about the code. Do you mean
something more specific?

I have been thinking about iterator.__iter__ rather like object.__new__. Not
returning a new instance may be surprising and inadvisable in most cases. But
still there are accepted uses for the technique. Do you think these cases are
comparable?

Do you see the iterator protocol as the vanguard of a new set of Python
protocols that are more semantically restrictive than the "mapping, container,
file-like object etc..." interfaces? Defining iterator method semantics
strictly seems like a departure from the existing situation.

Cheers

Michael


Jul 18 '05 #38

Adam DePrince wrote:

How is a spencerator [an iterator that doesn't return itself unmodified on iter]
different than itertools.tee?

Taking your question literally, it changes the behavior of an itertools.tee
object 'tee', so that iter(tee) returns tee.__copy__(), rather than tee itself.

It was created for rhetorical purposes and has no known practical application.

Depending on your point of view it is evidence either for (a) why the iterator
protocol must be strictly adhered to, or (b) that iterators and iterables cannot
be disjoint sets.
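A minimal sketch of such a spencerator, built as described on itertools.tee (the class name and wrapper are illustrative; CPython's tee objects do support copy.copy, which the thread refers to as __copy__):

```python
import copy
import itertools

class Spencerator:
    # An iterator whose __iter__ hands back an independent copy
    # instead of self -- the thought experiment from the thread.
    def __init__(self, iterable):
        (self._tee,) = itertools.tee(iterable, 1)

    def __next__(self):
        return next(self._tee)

    def __iter__(self):
        clone = Spencerator(())              # empty shell...
        clone._tee = copy.copy(self._tee)    # ...sharing position via a tee copy
        return clone

s = Spencerator([1, 2, 3])
next(s)                  # advance the original past 1
c = iter(s)              # a copy starting where s is now, not s itself
not_self = c is not s

# Note: list() itself calls __iter__, so each list() below consumes
# a fresh copy while leaving its operand's own position untouched.
from_copy = list(c)
from_orig = list(s)
```

This makes the "infinite iter chain" point vivid: every iter() call yields a brand-new object, so the chain never reaches a fixed point.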
Michael

Jul 18 '05 #39

This discussion thread is closed

Replies have been disabled for this discussion.