Bytes IT Community

Attack a sacred Python Cow

P: n/a
Hi everyone,

I'm a big Python fan who used to be involved semi-regularly in
comp.lang.python (lots of lurking, occasional posting) but kind of
trailed off a bit. I just wrote a frustration-inspired rant on my
blog, and I thought it was relevant enough as a wider issue to the
Python community to post here for your discussion and consideration.

This is not flamebait. I love Python, and I'm not out to antagonise
the community. I also realise that one of the issues I raise is way
too ingrained to be changed now. I'd just like to share my thinking on
a misstep in Python's guiding principles that has done more harm than
good IMO. So anyway, here's the post.

I've become utterly convinced that at least one criticism leveled at
my favourite overall programming language, Python, is utterly true and
fair. After quite a while away from writing Python code, I started
last night on a whim to knock up some code for a prototype of an idea
I once had. It's going swimmingly; the Python Imaging Library (PIL), which
I'd never used before, seems quick and intuitive, and has all the
features I need for this project. As for Python itself, well, my heart
still belongs to whitespace delimitation. All the basics of Python
coding are there in my mind like I never stopped using them, or like
I've been programming in this language for 10 years.

Except when it comes to Classes. I added some classes to code that had
previously just been functions, and you know what I did - or rather,
forgot to do? Put in the 'self'. In front of some of the variable
accesses, but more noticeably, at the start of *every single method
argument list.* This can no longer be blamed on a hangover from
Java - I've written a ton more code, more recently, in Python than in
Java or any other OO language. What's more, every time I go back to
Python after a break of more than about a week or so, I start making
this 'mistake' again. The perennial justification for this 'feature'
of the language? That old Python favourite, "Explicit is better than
implicit."

I'm sorry, but EXPLICIT IS NOT NECESSARILY BETTER THAN IMPLICIT.
Assembler is explicit FFS. Intuitive, clever, dependable, expected,
well-designed *implicit* behaviour is one of the chief reasons why I
use a high level language. Implicitly garbage collect old objects for
me? Yes, please!

I was once bitten by a Python wart I felt was bad enough to raise on
comp.lang.python and spend some effort advocating change for (I never got
around to doing a PEP; partly laziness, partly being young and inexperienced
enough to be intimidated at the thought. Still am, perhaps.)

The following doesn't work as any sane, reasonable person would
expect:

# Blog code, not tested
class A():
    def __eq__(self, obj):
        return True

a = A()
b = []
assert a == b
assert not (a != b)

The second assertion fails. Why? Because coding __eq__, the most
obvious way to make a class have equality based comparisons, buys you
nothing from the != operator. != isn't (by default) a synonym for the
negation of == (unlike in, say, every other language ever); not only
will Python let you make them mean different things, without
documenting this fact - it actively encourages you to do so.
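For the record, the workaround is a one-liner: define __ne__ in terms of
__eq__ yourself (and, as it happens, since Python 3.0 this delegation is
the default behaviour). A minimal sketch:

```python
# Sketch: derive __ne__ explicitly from __eq__ so != behaves as expected.
# (In Python 3, __ne__ already defaults to the inverse of __eq__.)
class A:
    def __eq__(self, obj):
        return True

    def __ne__(self, obj):
        # Delegate to __eq__ and negate, passing NotImplemented through.
        result = self.__eq__(obj)
        if result is NotImplemented:
            return result
        return not result

a = A()
b = []
assert a == b
assert not (a != b)  # now passes
```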

There were a disturbingly high number of people defending this
(including one quite renowned Pythonista, think it might have been
Effbot). Some had the temerity to fall back on "Explicit is better
than implicit: if you want != to work, you should damn well code
__ne__!"

Why, for heaven's sake, should I have to, when in 99.99% of use cases
(and of those 0.01% instances quoted in the argument at the time only
one struck me as remotely compelling) every programmer is going to
want __ne__ to be the logical negation of __eq__? Why, dear Python,
are you making me write evil Java-style language power reducing
boilerplate to do the thing you should be doing yourself anyway?
What's more, every programmer is going to unconsciously expect it to
work this way, and be as utterly mystified as I was when it fails to
do so. Don't tell me to RTFM and don't tell me to be explicit. I'll
repeat myself - if I wanted to be explicit, I'd be using C and
managing my own memory thank you very much. Better yet, I'd explicitly
and graphically swear - swear in frustration at this entrenched design
philosophy madness that afflicts my favourite language.

I think the real problem with "explicit is better than implicit",
though, is that while you can see the underlying truth it's trying to
get at (which is perhaps better expressed by Ruby's more equivocal,
less dependable, but more useful Principle of Least Surprise), in its
stated form it's actually kind of meaningless, and is used mainly in
defence of warts - no, let's call them what they are: language
design *bugs*.

You see, the problem is, there's no such thing as explicit in
programming. It's not a question of not doing things implicitly; it's a
question of doing the most sensible thing implicitly. For example, this
Python code:

some_obj.some_meth(some_arg1, some_arg2)

is implicitly equivalent to

SomeClass.some_meth(some_obj, some_arg1, some_arg2)

which in turn gives us self as a reference to some_obj, and Python's
OO model merrily pretends it's the same as Java's when in fact it's a
smarter version that just superficially looks the same.
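That equivalence is easy to demonstrate (a quick sketch; the class and
method names are placeholders from the paragraph above, not real API):

```python
# A bound-method call is the same as calling the function on the class
# with the instance passed explicitly as the first argument.
class SomeClass:
    def some_meth(self, some_arg1, some_arg2):
        return (self, some_arg1, some_arg2)

some_obj = SomeClass()

# Both spellings produce identical results.
assert some_obj.some_meth(1, 2) == SomeClass.some_meth(some_obj, 1, 2)
```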

The problem is that the explicit requirement to have self at the start
of every method is something that should be shipped off to the
implicit category. You should have to be explicit, yes - explicit when
you want the *other* behaviour, of self *not* being an argument,
because that's the more unusual, less likely case.

Likewise,

a != b

is implicitly equivalent to something like calling this function (may
not be correct; it's been a while since I was heavily involved in this
issue):

def not_equal(a, b):
    if hasattr(a, "__ne__"): return a.__ne__(b)
    if hasattr(b, "__ne__"): return b.__ne__(a)
    if hasattr(a, "__cmp__"): return not (a.__cmp__(b) == 0)
    if hasattr(b, "__cmp__"): return not (b.__cmp__(a) == 0)
    return not (a is b)

There's absolutely nothing explicit about this. I wasn't arguing for
making behaviour implicit; I was arguing for changing the stupid
implicit behaviour to something more sensible and less surprising.

The sad thing is there are plenty of smart Python programmers who will
justify all kinds of idiocy in the name of their holy crusade against
the implicit.

If there was one change I could make to Python, it would be to get
that damn line out of the Zen.
Jul 24 '08 #1
270 Replies


P: n/a
On Jul 24, 3:41 pm, Jordan <jordanrastr...@gmail.com> wrote:
[...snip...]
P.S. Forgive the typos; it was blogged in extreme haste and then only
quickly proofread and edited before posting here. Hopefully the point
I'm making is not diminished by your reduced respect for me as a result
of my carelessness :-)
Jul 24 '08 #2

P: n/a
On Jul 24, 1:41 pm, Jordan <jordanrastr...@gmail.com> wrote:
Hi everyone,

I'm a big Python fan who used to be involved semi regularly in
comp.lang.python (lots of lurking, occasional posting) but kind of
trailed off a bit. I just wrote a frustration inspired rant on my
blog, and I thought it was relevant enough as a wider issue to the
Python community to post here for your discussion and consideration.
[...snip...]

+1 for most of your opinion. I was also bitten by the __eq__/__ne__
problem this morning. :)
Jul 24 '08 #3

P: n/a
Jordan wrote:
Except when it comes to Classes. I added some classes to code that had
previously just been functions, and you know what I did - or rather,
forgot to do? Put in the 'self'. In front of some of the variable
accesses, but more noticably, at the start of *every single method
argument list.* This cannot be any longer blamed as a hangover from
Java - I've written a ton more code, more recently in Python than in
Java or any other OO language. What's more, every time I go back to
Python after a break of more than about a week or so, I start making
this 'mistake' again. The perennial justification for this 'feature'
of the language? That old Python favourite, "Explicit is better than
implicit."
Do you seriously think that Python is designed by mindless application
of a set of humorous and somewhat self-deprecating observations posted
to a newsgroup a number of years ago?

</F>

Jul 24 '08 #4

P: n/a
Of course not.

I just think "Explicit is better than Implicit" is taken seriously by a
large segment of the Python community as a guiding principle, and overall
its influence does more harm than good.

Clearly, self being in every argument list was a decision arrived at
long before the Zen was ever coined. It's merely an example of what I
feel is a shortcoming in the conventional 'pythonic' approach to
thinking about problems.

The reluctance to admit that the __eq__ behaviour is a poor design
choice is further evidence; it's something (unlike self) that quite
conceivably could be changed, and should be changed, but it's somehow
seen (by certain people) as the way that Python should do things.
Jul 24 '08 #5

P: n/a
Hallöchen!

Bruno Desthuilliers writes:
[...]

How would you handle this case with an implicit 'self' :

class Foo(object):
    pass

def bar(self):
    print self

Foo.bar = bar
Just like this. However, the compiler could add "self" to
non-decorated methods which are defined within "class".
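Bruno's example runs as-is precisely because self is an ordinary first
parameter. A quick sketch (adapted to Python 3 syntax, returning rather
than printing so the behaviour is checkable):

```python
# Because 'self' is just the first parameter, a plain function can be
# attached to a class after the fact and becomes a working method.
class Foo(object):
    pass

def bar(self):
    return self

Foo.bar = bar

f = Foo()
assert f.bar() is f  # bar receives the instance as 'self'
```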

Tschö,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus
Jabber ID: to*************@jabber.rwth-aachen.de
Jul 24 '08 #6

P: n/a
On Jul 24, 7:40 pm, Torsten Bronger <bron...@physik.rwth-aachen.de> wrote:
[...snip...]
Yeah, forget what I said - Torsten's reply is much better :-)
Jul 24 '08 #7

P: n/a
In message
<52**********************************@r35g2000prm.googlegroups.com>, Jordan
wrote:
Except when it comes to Classes. I added some classes to code that had
previously just been functions, and you know what I did - or rather,
forgot to do? Put in the 'self'. In front of some of the variable
accesses, but more noticably, at the start of *every single method
argument list.*
The reason is quite simple. Python is not truly an "object-oriented"
language. It's sufficiently close to fool those accustomed to OO ways of
doing things, but it doesn't force you to do things that way. You still
have the choice. An implicit "self" would take away that choice.
Jul 24 '08 #8

P: n/a
Jordan <jo************@gmail.com>:
# Blog code, not tested
class A():
    def __eq__(self, obj):
        return True

a = A()
b = []
assert a == b
assert not (a != b)

The second assertion fails. Why? Because coding __eq__, the most
obvious way to make a class have equality based comparisons, buys you
nothing from the != operator. != isn't (by default) a synonym for the
negation of == (unlike in, say, every other language ever);
This is just plain wrong for at least C# and C++. C# wants you to
explicitly overload "!=" if you have overloaded "=="; C++ complains
about "!=" not being defined for class A. If you had derived A from
another class in C++, the compiler would happily use the operator from the
base class instead of doing silly aliasing of "!=" to "! ==" ...
The sad thing is there are plenty of smart Python programmers who will
justify all kinds of idiocy in the name of their holy crusade against
the implict.

If there was one change I could make to Python, it would be to get
that damn line out of the Zen.
Fortunately, Python isn't designed according to your ideas, and won't
change, so consider your posting a waste of time. If feeling like bringing
such old "issues" up again next time, spend your time learning another
programming language, as you would obviously not get happy with Python
anyway ...

--
Freedom is always the freedom of dissenters.
(Rosa Luxemburg)
Jul 24 '08 #9

P: n/a
>This is just plain wrong for at least C# and C++. C# wants you to
>explicitly overload "!=", if you have overloaded "==",
While this is as inconvenient as Python, at least it doesn't catch you
unawares. C# 1 (or maybe 0.5), Python 0.
>C++ complains
>about "!=" not being defined for class A.
See above. C++ 1, Python 0.

So in showing my clearly hyperbolic comment was technically incorrect
(something I could have told you myself), you have merely shown that
two languages I find vastly inferior to Python overall are actually
better than it in this case.
>Fortunately, Python isn't designed according to your ideas, and won't
>change, so consider your posting a waste of time. If feeling like bringing
>such old "issues" up again next time, spend your time learning another
>programming language, as you would obviously not get happy with Python
>anyway ...
OK, if that's your response, that's sad. Of course, I try to learn new
languages all the time. Python is still IMO the best. If the attitude
in the community in response to feedback/criticism has gone from
"maybe you've got a point" to "you're a lunatic, we'll never change",
well, only Python will suffer in the long term.

Jul 24 '08 #10

P: n/a
On Jul 24, 8:01 pm, Lawrence D'Oliveiro <l...@geek-central.gen.new_zealand> wrote:
In message
<52404933-ce08-4dc1-a558-935bbbae7...@r35g2000prm.googlegroups.com>, Jordan
wrote:
[...snip...]

The reason is quite simple. Python is not truly an "object-oriented"
language. It's sufficiently close to fool those accustomed to OO ways of
doing things, but it doesn't force you to do things that way. You still
have the choice. An implicit "self" would take away that choice.
You could still explicitly request non-implicit self on a method by
method basis.
Jul 24 '08 #11

P: n/a
On 24 Jul., 11:40, Torsten Bronger <bron...@physik.rwth-aachen.de>
wrote:
[...snip...]

Just like this. However, the compiler could add "self" to
non-decorated methods which are defined within "class".
And $self2, $self3, ... to the object methods of nested classes and
$cls2, $cls3, ... to the classmethods of those classes...?

And when we are at it, here is a nice little exercise for the
proponents of compiler magic.

Write a decorator that takes and returns a method and prints the
object the method is bound to. It's very easy to do it when the object
is passed explicitely:

def print_self(func):
    def call(self, *args, **kwd):
        print self
        return func(self, *args, **kwd)
    return call

Conceptual clarity isn't always an entirely bad thing to have.
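Kay's decorator in use (adapted to Python 3 print syntax; the Greeter
class is a made-up example, not from the thread):

```python
# Decorator that prints the object a method is bound to, then delegates.
def print_self(func):
    def call(self, *args, **kwd):
        print(self)  # works only because the instance is an explicit argument
        return func(self, *args, **kwd)
    return call

# Hypothetical class showing the decorator applied at definition time.
class Greeter:
    @print_self
    def greet(self):
        return "hello"

g = Greeter()
assert g.greet() == "hello"  # also prints the Greeter instance
```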

Jul 24 '08 #12

P: n/a
Hallöchen!

Kay Schluehr writes:
On 24 Jul., 11:40, Torsten Bronger <bron...@physik.rwth-aachen.de>
wrote:
[...snip...]

And $self2, $self3, ... to the object methods of nested classes
and $cls2, $cls3, ... to the classmethods of those classes...?
One could surely find ways to realise this. However, the design
goal should be: Make the frequent case simple, and the rare case
possible.

(Actually, I'm -0 on this anyway because I forget "self" very
seldom, and you would still have ".self" all over the place.)

Tschö,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus
Jabber ID: to*************@jabber.rwth-aachen.de
Jul 24 '08 #13

P: n/a
On Jul 24, 8:21 pm, Jordan <jordanrastr...@gmail.com> wrote:
If the attitude
in the community in response to feedback/criticism has gone from
"maybe you've got a point" to "you're a lunatic, we'll never change",
well, only Python will suffer in the long term.
Personally, I think it has more to do with statements like "there are
plenty of smart Python programmers who will
justify all kinds of idiocy in the name of their holy crusade" than
with your position. You don't begin a discussion by discrediting
anyone who might disagree with you as some kind of religious bigot
while simultaneously holding that you are the only sane voice
speaking.

Jul 24 '08 #14

P: n/a
Jordan <jo************@gmail.com>:
>Fortunately, Python isn't designed according to your ideas, and won't
>change, so consider your posting a waste of time. If feeling like
>bringing such old "issues" up again next time, spend your time learning
>another programming language, as you would obviously not get happy with
>Python anyway ...

OK, if that's your response, that's sad. Of course, I try to learn new
languages all the time. Python is still IMO the best. If the attitude
in the community in response to feedback/criticism has gone from
"maybe you've got a point" to "you're a lunatic, we'll never change",
well, only Python will suffer in the long term.
I don't really mind what you think about my response. Python will suffer
from it as little as it will suffer from your complaints: these things
will not change, whatever any of us says about them. So this discussion is
unlikely to produce any new insight, especially because this has been
discussed over and over again in the past, without any effect on Python.

Let's just drop this, and if you want to complain next time, complain
about something that is really worth complaining about, like, for
instance, old and outdated modules in the standard library, or real
showstoppers in Python (e.g. the GIL).

--
Freedom is always the freedom of dissenters.
(Rosa Luxemburg)
Jul 24 '08 #15

P: n/a
Personally, I think it has more to do with statements like "there are
plenty of smart Python programmers who will
justify all kinds of idiocy in the name of their holy crusade" than
with your position. You don't begin a discussion by discrediting
anyone who might disagree with you as some kind of religious bigot
while simultaneously holding that you are the only sane voice
speaking.
I didn't set out to discredit anyone who might disagree with me; in
fact I didn't in any way try to pre-empt any person who might disagree
with my thesis. I merely stated an observation - I have in the past
seen seemingly intelligent people take silly stands in the name of
"Explicit is better than Implicit" (not just on comp.lang.python, and
not just concerning != or self).

I wish in retrospect I'd had the time, patience and focus to edit the
initial post to make it more measured and less inflammatory, because
it's clear the tone detracts from the argument I'm making, which I feel
still stands.

So if you wish, ignore the specifics of the frustration that inspired
me and consider only the thrust of what I'm saying:

"Explicit is better than Implicit" considered harmful. Discuss.
Jul 24 '08 #16

P: n/a
Jordan <jo************@gmail.comwrites:
I just think "Explicit is better than Implicit" is taken seriously by
a large segment of the Python community as a guiding principle
Indeed it is. However, it has to compete with all the other principles
in the Zen of Python, which have equal status.
and overall its influence does more harm than good.
Thanks for your opinion. I disagree strongly: I think its influence is
nicely balanced by the other important principles that are also
followed.

--
\ “The way to build large Python applications is to componentize |
`\ and loosely-couple the hell out of everything.” —Aahz |
_o__) |
Ben Finney
Jul 24 '08 #17

P: n/a
Jordan <jo************@gmail.comwrites:
If the attitude in the community in response to feedback/criticism
has gone from "maybe you've got a point" to "your a lunatic, we'll
never change", well, only Python will suffer in the long term.
You're not a lunatic.

We, and Python itself, change quite readily.

Neither of those mean your ideas in this instance have merit.

--
\ “Pinky, are you pondering what I'm pondering?” “Well, I think |
`\ so, Brain, but do I really need two tongues?” —_Pinky and The |
_o__) Brain_ |
Ben Finney
Jul 24 '08 #18

P: n/a
I don't really mind what you think about my response. Python will suffer
from it as little as it will suffer from your complaints: these things
will not change, whatever any of us says about them. So this discussion is
unlikely to produce any new insight, especially because this has been
discussed over and over again in the past, without any effect on Python.
You're right, of course. Because Python is in so many ways what I'm
looking for in a language, I transform it in my mind to my own,
personal ideal, close to the real existing language but with what I
consider to be the imperfections removed.

I'm not suggesting getting rid of explicit self, even in "Python
4000." Not because of the advantages it gives, which are some but
don't outweigh the net loss in my ledger. It just wouldn't be
Pythonic. I know it's not a Pythonic thing to want. That's my problem -
because I have a largely Pythonic approach in some areas, it upsets me
when there's a mismatch. So let's say I'm -1 for introducing it into a
language and +0 for keeping it in Python now that it's entrenched.

If a lot of users keep bringing up something like this, well my
attitude used to be the same as yours - "learn to love Python for what
it is." Maybe
Let's just drop this, and if you want to complain next time, complain
about something that is really worth complaining about, like, for
instance, old and outdated modules in the standard library, or real
showstoppers in Python (e.g. the GIL).
It's worth complaining about because I'm not just someone who has
stumbled across Python after years of Java and PHP, hasn't really
grokked it, and has jumped on the net at once to start a flamewar. I'm
someone who loves Python, uses it in preference to other languages,
and has now returned to it after a bit of a break, and it's finally hit
me over the head like a tonne of bricks: "Hey, I can see exactly what
all those internet trolls were talking about. This *is* a really
annoying and silly state of affairs."

I wasn't trying to change explicit self, or even != (which has a much
better case). I was trying to ask the community to reconsider a
premise that the language is built around. Explicit is actually kinda
annoying a lot of the time, viz., Java. This is about social and
philosophical adjustments, not technical ones.

In reality? I'll just keep writing Python (hopefully enough that
explicit self becomes burned into muscle memory), and use other
languages when necessary (no more C than I have to, looking forward to
dabbling in Erlang soon, and one day overcoming the parentheses phobia
enough to really tackle Lisp properly). When I'm old enough and wise
enough, and have the time, energy and inclination, maybe I'll sit down
and put a proper effort into designing and implementing a new language
that best suits my own particular style and needs. Just maybe it'll
be good enough that smart people will rally to defend its design
principles from people attacking them on the internet :-)
Jul 24 '08 #19

P: n/a
>
Please understand that I'm not arguing about this particular design
choice (and FWIW, I'd mostly agree on the point that having a != b
different from not (a == b) is actually a wart). I'm just correcting
your statement about the behaviour of __eq__ / __ne__ not being
documented, which is obviously false.

(snip)
What was the reasoning behind having both __eq__ / __ne__ anyway? To
fit in with the other rich comparisons? I do agree this one seems like
a wart, but not a serious one. I'd say it would make more sense for
the interpreter to provide a warning on classes that define one and
not the other, at least if set to a certain level, similar to -3 for
deprecated features.

(Or does this exist? I think a "wart"-catching level that outputs
potential warts and issues would be a useful addition!)
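The interpreter doesn't offer such a level, but the check itself is easy
to sketch as a class decorator (a hypothetical illustration; the
check_eq_ne name is mine, not any real API):

```python
import warnings

# Sketch of the suggested "wart catcher": warn when a class defines
# __eq__ without a matching __ne__ (the situation that bites in Python 2).
def check_eq_ne(cls):
    if "__eq__" in cls.__dict__ and "__ne__" not in cls.__dict__:
        warnings.warn("%s defines __eq__ but not __ne__" % cls.__name__)
    return cls

@check_eq_ne
class A:
    def __eq__(self, other):
        return True  # triggers the warning at class-definition time
```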
Jul 24 '08 #20

P: n/a
Then why do you write, let me quote:
>
"""
(snip) coding __eq__ (snip) buys you
nothing from the != operator. != isn't (by default) a synonym for the
negation of == (unlike in, say, every other language ever); not only
will Python let you make them mean different things, without
documenting this fact - it actively encourages you to do so.
"""
My words aren't as clear as they should be. I mean that Python lets
*you* do something without documenting - or rather, stating, to use a
better term - that your intention is the non-obvious one. I'm not
saying that Python itself lacks documentation for its own behaviour;
I'm saying it should force you to make your intentions clear and
visible to someone reading your code when you want to do something
non-obvious.
I was not commenting on the actual design choice, just stating that it
is actually documented.
Yes, it is. I apologise for the poor construction of my statement
which led to this confusion.
And you're talking about strawmen??? Come on, you obviously can tell
the difference between a one-line statement and your above strawman
argument, can't you?
I'm talking about strawmen because I was deliberately choosing to
invoke one with rhetorical flourish for the purposes of making my
point forcefully. I wanted people to be clear that I knew perfectly
well what I was doing and that they needn't call me out on it.
Please understand that I'm not arguing about this particular design
choice (and FWIW, I'd mostly agree on the point that having a != b
different from not (a == b) is actually a wart). I'm just correcting
your statement about the behaviour of __eq__ / __ne__ not being
documented, which is obviously false.
Good, at least we've come to a point in this discussion where I can
firmly agree with somebody.
Jul 24 '08 #21

P: n/a
>
My words aren't as clear as they should be. I mean that Python lets
*you* do something without documenting, or rather stating to use a
better term, that your intention is the non-obvious one. I'm not
saying that Python itself lacks documentation for its own behaviour;
I'm saying it should force you to make your intentions clear and
visible to someone reading your code when you want to do something non-
obvious.
I take it you are referring to the need for fewer comments within the
code, as the idea is that the Python code itself is readable? Or are you
saying that when someone does a clever trick it should be documented
better? I'm a little confused as to what you mean by non-documenting? I
always add docstrings to modules I will be using heavily, as well as a
README with them. But that isn't different from programming in other
languages and using comments.

(Do you mean something like JavaDoc?)
Jul 24 '08 #22

P: n/a
Jordan <jo************@gmail.com> writes:
Explicit is actually kinda annoying a lot of the time
Yes. It is also very helpful for those who will later try to
understand, interface with, debug, modify, or otherwise work with the
code (including, in a great many cases, the original author of that
code).

The great boost that EIBTI grants to maintainability trumps, in my
view, the annoyance felt by some at the time of writing the code.

--
 \     “An eye for an eye would make the whole world blind.” —Mahatma Gandhi
  `\
_o__)
Ben Finney
Jul 24 '08 #23

P: n/a
>
You're not a lunatic.

We, and Python itself, change quite readily.

Neither of those mean your ideas in this instance have merit.
You're right, these premises don't lead to this conclusion. Neither do
they lead to its negation, of course.

As it happens, you're wrong on both counts. I do in fact suffer from a
mental illness and have spent a week in a psych ward, so am a lunatic
by some reasonable definitions. Python readily changes in some
regards, but not in others. Of course, a great many things of worth
have this as one of their essential qualities.

Pithy replies are fun.
Thanks for your opinion. I disagree strongly: I think its influence is
nicely balanced by the other important principles that are also
followed.
This isn't just being clever, there's substance here. A clearly stated
opposing position, with a decent if somewhat short justification.

I think you're right insofar as you go - that if someone really sits
down, and thinks clearly about all the Pythonic principles, as a
whole, and in detail, then the net result in shaping their
thinking will be positive more often than not.

Perhaps we're just looking at an instance of a wider problem - smart
people boil good ideas down into short slogans, which are nice and
memorable and somewhat helpful, but can lead to bad consequences when
lots of others start overusing or misunderstanding them.
Jul 24 '08 #24

P: n/a
Just wondered whether the OP's Subject was a
deliberate play on "flog a dead horse" or
merely an ironic one :)

TJG
Jul 24 '08 #25

P: n/a
Hallöchen!

Bruno Desthuilliers writes:
Torsten Bronger a écrit :
>Kay Schluehr writes:
>>On 24 Jul., 11:40, Torsten Bronger <bron...@physik.rwth-aachen.de>
wrote:

[...] Just like this. However, the compiler could add "self"
to non-decorated methods which are defined within "class".

And $self2, $self3, ... to the object methods of nested classes
and $cls2, $cls3, ... to the classmethods of those classes...?

One could surely find ways to realise this. However, the design
goal should be: Make the frequent case simple, and the rare case
possible.

Given the (more and more prominent) use of decorators, metaclasses
and other meta-programming techniques in Python, I'm not sure the
cases where you really need access to Python's object model inners
are that "rare". Not in my code at least.
What does "not rare" mean for you?

Tschö,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus
Jabber ID: to*************@jabber.rwth-aachen.de
Jul 24 '08 #26

P: n/a
Hallöchen!

Bruno Desthuilliers writes:
Torsten Bronger a écrit :
>Bruno Desthuilliers writes:
>>[...]

How would you handle this case with an implicit 'self' :

class Foo(object):
    pass

def bar(self):
    print self

Foo.bar = bar

Just like this. However, the compiler could add "self" to
non-decorated methods which are defined within "class".

What's defined within classes are plain functions. It's actually
the lookup mechanism that wraps them into methods (and manages to
insert the current instance as first argument).
And why does this make the implicit insertion of "self" difficult?
I could easily write a preprocessor which does it after all.
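For the record, the attach-a-function-after-the-fact case being discussed runs fine as stated; here is a minimal self-contained sketch (using `return` rather than Python 2's `print` so it executes under any modern Python):

```python
class Foo(object):
    pass

def bar(self):        # a plain function; 'self' is just its first parameter
    return self

Foo.bar = bar          # attached to the class after its definition
f = Foo()
assert f.bar() is f    # attribute lookup wraps bar into a bound method
```

This is exactly why "self" cannot simply be compiled away: the function is an ordinary function until lookup time.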

Tschö,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus
Jabber ID: to*************@jabber.rwth-aachen.de
Jul 24 '08 #27

P: n/a
On 24 Jul, 12:02, "Sebastian \"lunar\" Wiesner"
<basti.wies...@gmx.net> wrote:
>
Fortunately, Python isn't designed according to your ideas, and won't
change, so consider your posting a waste of time.
This is the kind of petty response that serves only to shut down
discussion that might actually lead to genuine attempts to remedy
issues (or "warts") with Python. Although the tone of the complaint
was badly chosen, it is always worth jumping over the fence and
considering whether things could be made better. Without complaints
being aired, how do you expect any advances around things like the
"old and outdated modules in the standard library, or real
showstoppers in Python (e.g. the GIL)" that you mention elsewhere?
If feeling like bringing such old "issues" up again next time, spend
your time learning another programming language, as you would
obviously not get happy with Python anyway ...
Such a constructive response that is! Instead, I think it is
interesting to consider why methods still require an explicit "self"
parameter - something which has been discussed previously - and
whether there might be a case for omitting it from the signature -
perhaps in a variant of Python - in methods which are defined within
class definitions (as opposed to those assigned to classes later).

Indeed, there's scope for experimentation with Python variants, just
to investigate whether certain features added to CPython can be
considered essential, and which features might be considered of
marginal benefit. I recall that some features (eg. lexical scoping and
closures) were eventually added to Python partly to remedy issues with
lambda definitions, but also because the lack of such features was
cited repeatedly by proponents of other languages. In such a climate,
it can be easier to "meet the challenge" and implement features to
silence the critics rather than to insist that they are of marginal
benefit (although it's interesting to note this in a thread where
improvement suggestions are deemed "a waste of time" - I suppose the
community is now more resistant to suggestions from "unofficial"
sources).

A review of such language "enhancement" decisions would be
interesting, but since one shouldn't expect this from the CPython
implementers, I feel that it is the role of others to do so in their
own experiments. Of course, such experiments are often derided as
"lesser Pythons" or misunderstood, but that's another unfortunate
trait exhibited by parts of the Python community.

Paul
Jul 24 '08 #28

P: n/a
Torsten Bronger <br*****@physik.rwth-aachen.de>:
Hallöchen!

Bruno Desthuilliers writes:
>Torsten Bronger a écrit :
>>Bruno Desthuilliers writes:

[...]

How would you handle this case with an implicit 'self' :

class Foo(object):
    pass

def bar(self):
    print self

Foo.bar = bar

Just like this. However, the compiler could add "self" to
non-decorated methods which are defined within "class".

What's defined within classes are plain functions. It's actually
the lookup mechanism that wraps them into methods (and manages to
insert the current instance as first argument).

And why does this make the implicit insertion of "self" difficult?
I could easily write a preprocessor which does it after all.
Who said that it would be "difficult"? He just corrected your statement
about definitions inside a class, and did not make any assumption about
making "self" implicit.

I'd assume that making self implicit wouldn't be that difficult to
implement. But does the fact that it could easily be done, alone, mean
that it _should_ be done? The explicit "self" was a design decision that
can't really be judged by technical arguments from the implementation
side. It's a discussion about design from a programmer's point of view ...

--
Freedom is always the freedom of dissenters.
(Rosa Luxemburg)
Jul 24 '08 #29

P: n/a
On Jul 25, 1:39 am, Torsten Bronger <bron...@physik.rwth-aachen.de>
wrote:
I could easily write a preprocessor which does it after all.
Have you considered actually doing so? That might resolve the whole
issue, if a tool exists for those who want implicit self. After all,
if -you- have the itch...

Perhaps you could leverage off of EasyExtend?

"EasyExtend (EE) is a preprocessor generator and metaprogramming
framework written in pure Python and integrated with CPython. The main
purpose of EasyExtend is the creation of extension languages i.e.
adding custom syntax and semantics to Python."

http://www.fiber-space.de/EasyExtend/doc/EE.html
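Such a preprocessor can in fact be sketched with the standard `ast` module (a rough illustration, not EasyExtend itself; the class and method names are made up):

```python
import ast
import textwrap

# Hypothetical input: a class whose method omits 'self' from its signature.
source = textwrap.dedent("""
    class C:
        def f(a):
            return a * 2
""")

tree = ast.parse(source)
for node in ast.walk(tree):
    if isinstance(node, ast.ClassDef):
        for item in node.body:
            if isinstance(item, ast.FunctionDef):
                # Prepend an implicit 'self' parameter to each method.
                item.args.args.insert(0, ast.arg(arg="self", annotation=None))

namespace = {}
exec(compile(ast.fix_missing_locations(tree), "<generated>", "exec"), namespace)
obj = namespace["C"]()
assert obj.f(21) == 42  # 'self' was injected; 21 binds to 'a'
```

Note this only touches the signature, as Nikolaus suggests elsewhere in the thread; it deliberately does nothing about attribute access in the body.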
Jul 24 '08 #30

P: n/a
On Jul 24, 11:49 am, "Sebastian \"lunar\" Wiesner"
<basti.wies...@gmx.net> wrote:
Torsten Bronger <bron...@physik.rwth-aachen.de>:
Hallöchen!
Bruno Desthuilliers writes:
Torsten Bronger a écrit :
>Bruno Desthuilliers writes:
>>[...]
>>How would you handle this case with an implicit 'self' :
>>class Foo(object):
    pass
>>def bar(self):
    print self
>>Foo.bar = bar
>Just like this. However, the compiler could add "self" to
non-decorated methods which are defined within "class".
What's defined within classes are plain functions. It's actually
the lookup mechanism that wraps them into methods (and manages to
insert the current instance as first argument).
And why does this make the implicit insertion of "self" difficult?
I could easily write a preprocessor which does it after all.

Who said that it would be "difficult"? He just corrected your statement
about definitions inside a class, and did not make any assumption about
making "self" implicit.

I'd assume that making self implicit wouldn't be that difficult to
implement. But does the fact that it could easily be done, alone, mean
that it _should_ be done? The explicit "self" was a design decision that
can't really be judged by technical arguments from the implementation
side. It's a discussion about design from a programmer's point of view ...

--
Freedom is always the freedom of dissenters.
                                       (Rosa Luxemburg)
I don't think you can infer from 'explicit is better than implicit'
that 'the more explicit the better'. For instance, we don't use:

python.callbyvalue.foo( bar, 1, 2 )
python.callbyref.foo2( bar, x, y )

or further:

foo( byref bar, byval 1, byval 2 )
foo2( byref bar, byref x, byref x )

though some languages do.
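Python's actual calling convention, often described as call by object sharing, can be shown in a few lines (function and variable names are illustrative):

```python
def mutate(seq):
    seq.append(1)   # mutates the caller's list object in place

def rebind(x):
    x = x + 1       # rebinds only the local name; the caller is unaffected
    return x

data = []
mutate(data)
n = 10
rebind(n)
assert data == [1]  # the mutation is visible to the caller
assert n == 10      # the rebinding is not
```

So Python needs no byref/byval annotations at the call site: what matters is whether the object itself is mutated.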

Python doesn't do that much implicitly, like copying, with the
exception of copying strings, which some string functions do, such as
lower, replace, and strip. (Does slicing return a new string?)

What is the most surprisingly implicit behavior in Python? What is
the most explicit?
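To answer the parenthetical question: yes, slicing produces a new string, as do the copying methods, since strings are immutable (a quick check on values only, leaving object identity aside):

```python
s = "Hello"
assert s.lower() == "hello" and s == "Hello"  # lower() returns a new string
assert s.replace("H", "J") == "Jello"         # so does replace()
assert s[1:4] == "ell"                        # and slicing builds a new string
```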
Jul 24 '08 #31

P: n/a
Hallöchen!

Sebastian \"lunar\" Wiesner writes:
Torsten Bronger <br*****@physik.rwth-aachen.de>:
>Bruno Desthuilliers writes:
>>Torsten Bronger a écrit :

Bruno Desthuilliers writes:

[...]

Just like this. However, the compiler could add "self" to
non-decorated methods which are defined within "class".

What's defined within classes are plain functions. It's actually
the lookup mechanism that wraps them into methods (and manages to
insert the current instance as first argument).

And why does this make the implicit insertion of "self"
difficult? I could easily write a preprocessor which does it
after all.

Who said, that it would be "difficult"? He just corrected your
statement about definitions inside a class, and did not make any
assumption about making "self" implicit.
If it is not the implementation, I don't see why the "definition
inside a class" matters at all. It can be realised as a
transformation of the syntax tree and would even be transparent for
the compiling steps after it.

Tschö,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus
Jabber ID: to*************@jabber.rwth-aachen.de
Jul 24 '08 #32

P: n/a
On Jul 24, 11:43 am, Bruno Desthuilliers
<bdesth.quelquech...@free.quelquepart.fr> wrote:
Jordan a écrit :
I don't really mind what you think about my response. Python will suffer
from it as little as it will suffer from your complaints: these things
will not change, whatever any of us says about them. So this discussion is
unlikely to produce any new insight, especially because this has been
discussed over and over again in the past, without any effect on Python.
You're right, of course. Because Python is in so many ways what I'm
looking for in a language, I transform it in my mind to my own,
personal ideal, close to the real existing language but with what I
consider to be the imperfections removed.

I guess you'll find a lot of us guilty here too - but do we really agree
on what we consider to be "imperfections" ?-)

(snip)
I was trying not to change explicit self, or even != (which has a much
better case.) I was trying to ask the community to reconsider a
premise that the language is built around. Explicit is actually kinda
annoying a lot of the time, viz., java. This is about social and
philosophical adjustments, not technical ones.

"explicit-is-etc" - just like the remaining of Python's zen - is a
general philosophy statement, not an absolute rule. Another quote states
that practicality beats purity.

So yes, Python has warts, and one can't get away with dogmatically quoting
Python's zen. Even if I'm sometimes guilty here myself, it's certainly
worth taking time to better address criticism, either by acknowledging
actual warts when someone points them out or by explaining (or
pointing to explanations of) the unusual parts of Python's design.

Now since most of the time, criticisms expressed here fall in the
second category, we're happy to learn you'll now take appropriate action
here and help us keep c.l.py a newbie-friendly place !-)
Something that is pure and explicit is a conflict of priorities with
something that is practical and implicit. As with any rules, there
are going to be times when the priorities in the Zen conflict with one
another, and the Zen is silent on which combination ranks higher.

Some people will hate you for using 'sf' instead of 'self'... but some
hate you for spelling errors too. A temper lost is a flamewar earned.

If you post two equivalent code snippets that both work, we can help
you compare them.
Jul 24 '08 #33

P: n/a


Torsten Bronger wrote:
Hallöchen!
And why does this make the implicit insertion of "self" difficult?
I could easily write a preprocessor which does it after all.
class C():
    def f():
        a = 3

Inserting self into the arg list is trivial. Mindlessly deciding
correctly whether or not to insert 'self.' before 'a' is impossible when
'a' could ambiguously be either an attribute of self or a local variable
of f. Or do you and/or Jordan plan to abolish local variables for methods?

tjr
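Terry's ambiguity is easy to demonstrate the other way round: with explicit `self`, locals and attributes are distinguished by spelling alone, so no guessing is ever needed (a minimal sketch):

```python
class C(object):
    def f(self):
        a = 3          # plain local variable
        self.a = 4     # instance attribute: the 'self.' prefix disambiguates
        return a, self.a

assert C().f() == (3, 4)  # the two names never collide
```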

Jul 25 '08 #34

P: n/a


Jordan wrote:
I wish in retrospect I'd had the time, patience and focus to edit the
initial post to make it more measured and less inflammatory, because
it's clear the tone
I will ignore that.
detracts from the argument I'm making, which I feel still stands.
class C():
    def f():
        a = 3

You made a complaint and an incomplete proposal identical to what others
have proposed. But here is where the idea always sinks. Suppose as you
propose 'self' is added to the arg list. How is a mindless algorithm to
decide whether to change 'a' to 'self.a' to make it an attribute or leave
it alone as a local variable? Or would you abolish local vars from
methods (which would slow them down)?

As near as I can see, any concrete detailed implementable proposal would
be a more major change in Python or its implementation than
'down-with-self'ers usually admit.

If the argument you refer to is instead that 'Explicit is better than
Implicit' is a bit overused and inadequate as a technical response, then
I agree. But excuse me for being unsure ;-).

Terry Jan Reedy

Jul 25 '08 #35

P: n/a
On Jul 24, 11:43 am, Bruno Desthuilliers
<bdesth.quelquech...@free.quelquepart.fr> wrote:
>
"explicit-is-etc" - just like the remaining of Python's zen - is a
general philosophy statement, not an absolute rule. Another quote states
that practicality beats purity.
Very much so. In fact, I'd like you all to take a detour to a recent
bug report [1] where I gained some interesting insight into the Zen.

[1] http://bugs.python.org/issue3364

Jul 25 '08 #36

P: n/a
In message
<82**********************************@q5g2000prf.googlegroups.com>, Jordan
wrote:
On Jul 24, 8:01 pm, Lawrence D'Oliveiro <l...@geek-
central.gen.new_zealand> wrote:
>In message
<52404933-ce08-4dc1-a558-935bbbae7...@r35g2000prm.googlegroups.com>,
Jordan wrote:
Except when it comes to Classes. I added some classes to code that had
previously just been functions, and you know what I did - or rather,
forgot to do? Put in the 'self'. In front of some of the variable
accesses, but more noticeably, at the start of *every single method
argument list.*

The reason is quite simple. Python is not truly an "object-oriented"
language. It's sufficiently close to fool those accustomed to OO ways of
doing things, but it doesn't force you to do things that way. You still
have the choice. An implicit "self" would take away that choice.

You could still explicitly request non-implicit self on a method by
method basis.
That would mean making OO the default. Which Python doesn't do.
Jul 25 '08 #37

P: n/a
On 25 Jul., 03:01, Terry Reedy <tjre...@udel.edu> wrote:
Torsten Bronger wrote:
Hallöchen!
And why does this make the implicit insertion of "self" difficult?
I could easily write a preprocessor which does it after all.

class C():
    def f():
        a = 3

Inserting self into the arg list is trivial. Mindlessly deciding
correctly whether or not to insert 'self.' before 'a' is impossible when
'a' could ambiguously be either an attribute of self or a local variable
of f. Or do you and/or Jordan plan to abolish local variables for methods?

tjr
This isn't the problem Jordan tries to address. It's really just about
`self` in the argument signature of f, not about its omission in the
body. Some problems occur when not `self` but e.g. `this` shall be
used. Here one has to specify more:

class C():
    __self__ = 'this' # use `this` instead of `self`
    def f(a):
        this.a = a

or

class C():
    def f($this, a): # use `this` instead of `self`
        this.a = a

When an $-prefixed parameter is found the automatic insertion of
`self` will be blocked and the $-prefixed parameter name will be used
instead but without the prefix.
Jul 25 '08 #38

P: n/a
On Jul 24, 5:01 am, Lawrence D'Oliveiro <l...@geek-
central.gen.new_zealand> wrote:
In message
<52404933-ce08-4dc1-a558-935bbbae7...@r35g2000prm.googlegroups.com>, Jordan
wrote:
Except when it comes to Classes. I added some classes to code that had
previously just been functions, and you know what I did - or rather,
forgot to do? Put in the 'self'. In front of some of the variable
accesses, but more noticably, at the start of *every single method
argument list.*

The reason is quite simple. Python is not truly an "object-oriented"
language. It's sufficiently close to fool those accustomed to OO ways of
doing things, but it doesn't force you to do things that way. You still
have the choice. An implicit "self" would take away that choice.
By that logic, C++ is not OO. By that logic, Ruby is not OO. By that
logic, I know of only one OO language: Java :)

The fact that a language doesn't force you to do object-oriented
programming doesn't mean that it's not object-oriented. In other
words, your words are nonsense.

Sebastian

Jul 25 '08 #39

P: n/a
Terry Reedy <tj*****@udel.edu> writes:
Torsten Bronger wrote:
>Hallöchen!
And why does this make the implicit insertion of "self" difficult?
I could easily write a preprocessor which does it after all.

class C():
    def f():
        a = 3

Inserting self into the arg list is trivial. Mindlessly deciding
correctly whether or not to insert 'self.' before 'a' is impossible
when 'a' could ambiguously be either an attribute of self or a local
variable of f. Or do you and/or Jordan plan to abolish local
variables for methods?
Why do you think that 'self' should be inserted anywhere except in the
arg list? AFAIU, the idea is to remove the need to write 'self' in the
arg list, not to get rid of it entirely.
Best,

-Nikolaus

--
»It is not worth an intelligent man's time to be in the majority.
By definition, there are already enough people to do that.«
-G. H. Hardy

PGP fingerprint: 5B93 61F8 4EA2 E279 ABF6 02CF A9AD B7F8 AE4E 425C

Jul 25 '08 #40

P: n/a
>
By that logic, C++ is not OO. By that logic, Ruby is not OO. By that
logic, I know of only one OO language: Java :)

The fact that a language doesn't force you to do object-oriented
programming doesn't mean that it's not object-oriented. In other
words, your words are nonsense.
No, what it means is that it might support OO but doesn't have to, it
isn't the only way to code.

Supporting and Being OO are very different.

Jul 25 '08 #41

P: n/a
On Jul 25, 3:38 am, co*********@gmail.com wrote:
By that logic, C++ is not OO. By that logic, Ruby is not OO. By that
logic, I know of only one OO language: Java :)
The fact that a language doesn't force you to do object-oriented
programming doesn't mean that it's not object-oriented. In other
words, your words are nonsense.

No, what it means is that it might support OO but doesn't have to, it
isn't the only way to code.
"Support OO but it doesn't have to"? That sounds like saying that in
some Python implementations you'll be able to use OO, but that you
just might bump into a Python distribution where you would type

class MClass:
    pass

at the interpreter and it would give you a syntax error. Is that what
you mean?
Supporting and Being OO are very different.
I guess at this point it's rather pointless to discuss this because of
the different and deep level of interpretation that we might have on
the words "support" and "be." IMO, the obvious thing to say is that a
language that *supports* OO can also be said to *be* OO. However, it
would seem just ridiculous to me to say the same thing about a
language like, e.g., Perl, where OO is so pathetic. But maybe this is
just a game of words...

I consider Python to *be* OO, anyway.

Sebastian

Jul 25 '08 #42

P: n/a
Lie
On Jul 24, 9:26 pm, Jordan <jordanrastr...@gmail.com> wrote:
In reality? I'll just keep writing Python (hopefully enough so that
explicit self become burned into muscle memory), and use other
languages when necessary (no more C than I have to, looking forward to
dabbling in Erlang soon, and one day overcoming the parentheses phobia
enough to really tackle Lisp properly). When I'm old enough and wise
enough, and have the time, energy and inclination, maybe I'll sit down
and put a proper effort into designing and implementing a new language
that best suits my own particular style and needs.
Just maybe it'll
be good enough that smart people will rally to defend its design
principles from people attacking them on the internet :-)
Forgive me in advance, but that is just a daydream: however good a
language design is, there WILL be other people who disagree with the
corner cases. Python is one example. Even with Guido, who is great at
designing languages, there are many edges in Python that people
disagree with and attack from time to time, but not every one of those
edges got fixed. Why? Because it is impossible to please everyone: if
the language is changed according to your (or others') ideas, there
will be some other people who don't like it (with your examples, it
might be the people who are used to functional programming and people
who want to implement very complex behaviour in __eq__ and __ne__).
What seemed to be the obviously correct behavior for you would be
unexpected for some other people (this is also in python's Zen
although in a slightly twisted kind of way[1]: "There should be one--
and preferably only one --obvious way to do it; Although that way may
not be obvious at first unless you're Dutch.").

[1] Basically, the Zen is saying that whether an idea is the
obviously correct way to do something is measured by Guido's measuring
tape. Basically it's also saying that python is designed according to
what HE thinks is obviously correct.

The Zen you're attacking: "Explicit is better than implicit," is a
generally good advice, although as you mentioned, it doesn't fit every
single cases in the world, which leads us to the other Zen[2]:
"Special cases aren't special enough to break the rules; Although
practicality beats purity."

[2] In our case, "Explicit is better than implicit" is the "rules",
the corner cases where implicit is a generally better choice is the
"special cases". The first verse ("Special cases ... break rules")
implies that everything should be explicit, no exceptions, while the
second verse ("practicality beats purity") means that if something
breaking the rule makes it more practical, then you don't have to
follow the rules[3]. These two statements contradicts each other,
implying an implicit Zen: "Foolish consistency is the hobgoblin's
little minds", it is OK to break the rules sometimes.

[3] Despite saying this, I also have to emphasize that what is
practical or not is measured by Guido's tape.

With this explained, I hope you understand the point I'm making:
"There is no The Perfect Language, that is liked by everyone in the
world." The moral is, if you like a language, try to resist its warts
and know that each wart have its own story. You don't have to like the
warts, but you just need to stand to it.
Jul 25 '08 #43

P: n/a
On Jul 25, 9:49 pm, Lie <Lie.1...@gmail.com> wrote:
These two statements contradicts each other,
implying an implicit Zen: "Foolish consistency is the hobgoblin's
little minds", it is OK to break the rules sometimes.
"A foolish consistency is _the_ hobgoblin of little minds." (Ralph
Waldo Emerson, although the emphasis is mine)

I do like your version, though :)
Jul 25 '08 #44

P: n/a


Kay Schluehr wrote:
On 25 Jul., 03:01, Terry Reedy <tjre...@udel.edu> wrote:
>Inserting self into the arg list is trivial. Mindlessly deciding
correctly whether or not to insert 'self.' before 'a' is impossible when
'a' could ambiguously be either an attribute of self or a local variable
of f. Or do you and/or Jordan plan to abolish local variables for methods?

tjr

This isn't the problem Jordan tries to address. It's really just about
`self` in the argument signature of f, not about its omission in the
body.
That is not at all how I read him, so I will let him respond if he
wishes. The main problem moving a function from module scope to class
scope is prefixing the proper variables. Adding a param name, whether
'self', 's', 'this', or whatever, is trivial and hardly worth the ink.

Jul 25 '08 #45

P: n/a


Nikolaus Rath wrote:
Terry Reedy <tj*****@udel.edu> writes:
>Torsten Bronger wrote:
>>Hallöchen!
And why does this make the implicit insertion of "self" difficult?
I could easily write a preprocessor which does it after all.
class C():
    def f():
        a = 3

Inserting self into the arg list is trivial. Mindlessly deciding
correctly whether or not to insert 'self.' before 'a' is impossible
when 'a' could ambiguously be either an attribute of self or a local
variable of f. Or do you and/or Jordan plan to abolish local
variables for methods?

Why do you think that 'self' should be inserted anywhere except in the
arg list? AFAIU, the idea is to remove the need to write 'self' in the
arg list, not to get rid of it entirely.
Because you must prefix self attributes with 'self.'. If you do not use
any attributes of the instance of the class you are making the function
an instance method of, then it is not really an instance method and need
not, and I would say should not, be masqueraded as one. If the function
is a static method, then it should be labeled as one; no 'self' is
needed and auto-insertion would be a mistake. In brief, I assume
the OP wants 'self' inserted in the body because inserting it only in
the parameter list and never using it in the body is either silly or wrong.

tjr
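A small sketch of the distinction Terry draws (names are made up): a method that touches instance state takes `self`, while one that uses none can be labeled a `staticmethod` and carry no `self` at all:

```python
class Greeter(object):
    def greet(self):                  # touches instance state: needs self
        self.count = getattr(self, "count", 0) + 1
        return "hello #%d" % self.count

    @staticmethod
    def motto():                      # uses no instance state: no self at all
        return "explicit is better"

g = Greeter()
assert g.greet() == "hello #1"
assert Greeter.motto() == "explicit is better"
```

Auto-inserting `self` into `motto` would mislabel it as an instance method, which is exactly the mistake Terry warns about.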

Jul 25 '08 #46

P: n/a
On Jul 24, 6:41 am, Jordan <jordanrastr...@gmail.com> wrote:
Hi everyone,

I'm a big Python fan who used to be involved semi regularly in
comp.lang.python (lots of lurking, occasional posting) but kind of
trailed off a bit. I just wrote a frustration inspired rant on my
blog, and I thought it was relevant enough as a wider issue to the
Python community to post here for your discussion and consideration.

This is not flamebait. I love Python, and I'm not out to antagonise
the community. I also realise that one of the issues I raise is way
too ingrained to be changed now. I'd just like to share my thinking on
a misstep in Python's guiding principles that has done more harm than
good IMO. So anyway, here's the post.

I've become utterly convinced that at least one criticism leveled at
my favourite overall programming language, Python, is utterly true and
fair. After quite a while away from writing Python code, I started
last night on a whim to knock up some code for a prototype of an idea
I once had. It's going swimmingly; the Python Image Library, which I'd
never used before, seems quick, intuitive, and with all the
features I need for this project. As for Python itself, well, my heart
still belongs to whitespace delimitation. All the basics of Python
coding are there in my mind like I never stopped using them, or like
I've been programming in this language for 10 years.

Except when it comes to Classes. I added some classes to code that had
previously just been functions, and you know what I did - or rather,
forgot to do? Put in the 'self'. In front of some of the variable
accesses, but more noticeably, at the start of *every single method
argument list.* This can no longer be blamed as a hangover from
Java - I've written a ton more code, more recently in Python than in
Java or any other OO language. What's more, every time I go back to
Python after a break of more than about a week or so, I start making
this 'mistake' again. The perennial justification for this 'feature'
of the language? That old Python favourite, "Explicit is better than
implicit."

It's damn useful for scoping. You can look in the body of your method
and tell whether you are accessing local variables or instance
variables.

I'm a great fan of self and I'm afraid you're flogging a dead horse on
that one.

Your point about rich comparison (at least the == / != problem) is fair.
This one is fixed in Python 3, I believe.

Michael Foord
--
http://www.ironpythoninaction.com/
Jul 25 '08 #47

P: n/a
On 25 Jul, 22:37, Terry Reedy <tjre...@udel.edu> wrote:
Kay Schluehr wrote:

This isn't the problem Jordan tries to address. It's really just about
`self` in the argument signature of f, not about its omission in the
body.

That is not at all how I read him, so I will let him respond if he
wishes. The main problem moving a function from module scope to class
scope is prefixing the proper variables. Adding a param name, whether
'self', 's', 'this', or whatever, is trivial and hardly worth the ink.
He wrote the following of relevance:

"I added some classes to code that had previously just been functions,
and you know what I did - or rather, forgot to do? Put in the 'self'.
In front of some of the variable accesses, but more noticeably, at the
start of *every single method argument list.*"

And rounding off with this on the subject:

"The problem is that the explicit requirement to have self at the
start
of every method is something that should be shipped off to the
implicit category."
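Terry's refactoring point can be sketched with a hypothetical
before/after (the function and class names are invented for
illustration):

```python
# Before: a module-level function.
def scale(values, factor):
    return [v * factor for v in values]

# After: moved into a class. The real work is deciding which names
# become self.<attr>; adding the 'self' parameter is the trivial part.
class Scaler:
    def __init__(self, factor):
        self.factor = factor

    def scale(self, values):
        return [v * self.factor for v in values]

print(scale([1, 2], 3))        # [3, 6]
print(Scaler(3).scale([1, 2])) # [3, 6]
```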

I guess the desire is to have Java-like behaviour: when defining a
method, which must typically be done in the class definition in Java
(unless they've enhanced that area in the past few years), you never
write a "this" parameter in the method signature, but you are free to
qualify instance attribute accesses with "this". Personally, I liked
the Modula-3-inspired requirement for "self" in Python, but I can see
how a reminder of what it is in every method signature (defined within
a class) might be regarded as overly explicit.

Paul
Jul 25 '08 #48

Fuzzyman wrote:
On Jul 24, 6:41 am, Jordan <jordanrastr...@gmail.com> wrote:
>[original post snipped]

[remainder of Fuzzyman's reply (#47, quoted in full above) snipped]
I hope that the OP will develop some
specific proposal, narrowly focused on
the "self" issue.

It has always seemed redundant on the
argument line of a definition. It does
permit one to use some other word, but is
that of any real value?

One can still use self.a = 'z' or perhaps
.a = 'z'.
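Since the first parameter is an ordinary argument, any name is legal,
though 'self' remains the convention - a small sketch (the class and
names are invented for illustration):

```python
class Config:
    def set_a(s):   # 's' instead of 'self' - legal, if unconventional
        s.a = 'z'

cfg = Config()
cfg.set_a()         # cfg is passed implicitly as 's'
print(cfg.a)        # z
```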

Colin W.
Jul 25 '08 #49

Well this discussion is chugging along merrily now under its own
steam, but as the OP I should probably clarify a few things about my
own views since people continue to respond to them (and are in some
cases misunderstanding me.)

I *like* explicit self for instance variable access. There are
arguments for and against, and my personal opinion is that the
arguments for are stronger. Local variables and instance variables
should be explicitly differentiated somehow, for the sake of
readability. Python's approach works. I slightly prefer Ruby's @,
because I think the brevity is a net win for something so commonplace
(Is it less readable? Maybe. Is "def" less readable than "define"? I
don't know - I think about 10 seconds of coding in Python or Ruby is
enough for you to instantly grok def. Likewise, @. The
argument is more aesthetic IMO - how many perl-style/1337speak pu|\|
(tu@t10n m@r|<$ can you stand?)

I have come to dislike explicit self in method argument lists. Sure,
there are reasons. I don't think they're at all strong enough.

I'm definitely against the != behaviour, and maybe will get around to
actually PEPing it.

The point I was trying to make originally was that applying any mantra
dogmatically, including Explicit is better than implicit, can lead to
bad results. Perhaps having Practicality beats purity is enough of a
reminder of that fact for the Python community :-)
Jul 26 '08 #50

270 Replies
