Bytes IT Community

variable declaration

Hello All!

I'm a novice in Python, and I find one very bad thing (from my point of view) in
the language. There is no keyword or syntax to declare a variable, like 'var' in
Pascal, or special syntax in C. It can cause very ugly errors, like this:

epsilon=0
S=0
while epsilon<10:
    S=S+epsilon
    epselon=epsilon+1
print S

It will print zero, and it is not easy to find such a bug!

Even Visual Basic has an 'Option Explicit' keyword! Maybe Python also has such
a feature and I just don't know about it?

Alexander, za**@bk.ru
Jul 18 '05 #1
83 Replies


EP
> ------------Original Message------------
From: Al*********************@p131.f3.n5025.z2.fidonet.org (Alexander Zatvornitskiy)
Hello All!

I'am novice in python, and I find one very bad thing (from my point of view) in
language. There is no keyword or syntax to declare variable, like 'var' in
Pascal, or special syntax in C. It can cause very ugly errors, like this:

epsilon=0
S=0
while epsilon<10:
    S=S+epsilon
    epselon=epsilon+1
print S

It will print zero, and it is not easy to find such a bug!

Hmmm. I am surely an expert in writing buggy code, but I can not say I make this error in Python. Why is that?

I'm not sure, but a couple things that may help me miss making this mistake in practice may be (somewhat informal in my case) unit testing - I test for correct results for at least a few cases.

It may also help that with Python I can code at a somewhat higher conceptual level, or maybe it is just the syntax that helps avoid these problems:
for epsilon in range (0,10): S=S+epsilon

for epsilon in range (0,10):
    S=S+epselon

Traceback (most recent call last):
  File "<pyshell#6>", line 2, in ?
    S=S+epselon
NameError: name 'epselon' is not defined
It may seem like jumping off a cliff, but the improvement in readability (the variable declarations being visual clutter) makes it much easier for me to see my code, and any typos in it.

It seems it would be simple enough to have one's code, or another script, automatically print out a sorted list of the variables - which would make the error you note obvious. But I haven't needed this, yet at least.
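EP's idea can be sketched in a few lines with the standard ast module (a hypothetical sketch in Python 3 syntax, not something posted in the thread): parse the code, collect every name that appears as an assignment target, and print them sorted, so a near-duplicate pair like epsilon/epselon sits side by side and leaps out:

```python
import ast

source = """
epsilon = 0
S = 0
while epsilon < 10:
    S = S + epsilon
    epselon = epsilon + 1
print(S)
"""

# collect every Name node used in a Store (assignment-target) context
assigned = sorted({
    node.id
    for node in ast.walk(ast.parse(source))
    if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store)
})
print(assigned)   # ['S', 'epselon', 'epsilon'] - the typo sits next to the real name
```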

You might like Python and find the lack of variable declaration checking not a problem. It's worth a shot.

Jul 18 '05 #2

Alexander Zatvornitskiy wrote:
epsilon=0
S=0
while epsilon<10:
    S=S+epsilon
    epselon=epsilon+1
print S

It will print zero, and it is not easy to find such a bug!


pychecker may help you find misspelled variable names. You have to move the
code into a function, though:

$ cat epsilon.py
def loop():
    epsilon=0
    S=0
    while epsilon<10:
        S=S+epsilon
        epselon=epsilon+1
    print S

$ pychecker epsilon.py
Processing epsilon...

Warnings...

epsilon.py:6: Local variable (epselon) not used

Peter

Jul 18 '05 #3

Alexander Zatvornitskiy
<Al*********************@p131.f3.n5025.z2.fidonet.org> wrote:
Hello All!

I'am novice in python, and I find one very bad thing (from my point of view) in
language. There is no keyword or syntax to declare variable, like 'var' in
Since the lack of declarations is such a crucial design choice for
Python, then, given that you're convinced it's a very bad thing, I
suggest you give up Python in favor of other languages that give you
what you crave. The suggestions about using pychecker, IMHO, amount to
"band-aids" -- the right way to deal with minor annoying scratches, but
surely not with seriously bleeding wounds, and your language ("very bad
thing", "very ugly errors") indicates this is a wound-level issue for
you. Therefore, using Python, for you, would mean you'd be fighting the
language and detesting its most fundamental design choice: and why
should you do that? There are zillions of languages -- use another one.
Pascal, or special syntax in C. It can cause very ugly errors,like this:

epsilon=0
S=0
while epsilon<10:
    S=S+epsilon
    epselon=epsilon+1
print S

It will print zero, and it is not easy to find such a bug!
Actually, this while loop never terminates and never prints anything, so
that's gonna be pretty hard to ignore;-). But, assume the code is
slightly changed so that the loop does terminate. In that case...:

It's absolutely trivial to find this bug, if you write even the tiniest
and most trivial kinds of unit tests. If you don't even have enough
unit tests to make it trivial to find this bug, I shudder to think at
the quality of the programs you code. Even just focusing on typos,
think of how many other typos you could have, besides the misspelling of
'epsilon', that unit tests would catch trivially AND would be caught in
no other way whatsoever -- there might be a <= where you meant a <, a
1.0 where you meant 10, a - where you meant a +, etc, etc.

You can't live without unit tests. And once you have unit tests, the
added value of declarations is tiny, and their cost remains.
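The kind of tiny test Alex has in mind might look like this (a hypothetical summate function standing in for the thread's loop, in Python 3 syntax; had its body contained the epselon typo, these hand-checked expectations could not come out right):

```python
import unittest

def summate(n):
    # sum of 0 .. n-1: the thread's loop, written as a testable function
    S = 0
    for epsilon in range(n):
        S = S + epsilon
    return S

class TestSummate(unittest.TestCase):
    # even a couple of hand-checked cases would expose a misspelled name,
    # a <= where a < was meant, a - where a + was meant, and so on
    def test_small_cases(self):
        self.assertEqual(summate(0), 0)
        self.assertEqual(summate(1), 0)
        self.assertEqual(summate(10), 45)

# run the tests programmatically so the sketch stands alone
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestSummate))
```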

A classic reflection on the subject by Robert Martin, a guru of C++ and
other languages requiring declarations, is at:
<http://www.artima.com/weblogs/viewpost.jsp?thread=4639>.

Even Visual Basic have 'Option Explicit' keyword! May be, python also have
such a feature, I just don't know about it?


Python has no declarations whatsoever. If you prefer Visual Basic, I
strongly suggest you use Visual Basic, rather than pining for Visual
Basic features in Python. If and when your programming practices ever
grow to include extensive unit-testing and other aspects of agile
programing, THEN you will be best advised to have a second look at
Python, and in such a case you will probably find Python's strengths,
including the lack of declarations, quite compelling.

Some people claim a language should change the way you think -- a
frequent poster, excellent Python contributor, and friend, even has that
claim in his signature. That may be alright in the groves of academia.
If you program to solve problems, rather than for personal growth, on
the other hand, changing "the way you think" with each programming
language is a rather steep price to pay. As a pragmatist, I prefer a
motto that I've also seen about Python: "it fits your brain". I find
it's true: Python gets out of my way and lets me solve problems much
faster, because it fits my brain, rather than changing the way I think.

If Python doesn't fit YOUR brain, for example because your brain is
ossified around a craving for the declaration of variables, then, unless
you're specifically studying a new language just for personal growth
purposes, I think you might well be better off with a language that
DOES, at least until and unless your brain changes by other means.
Alex
Jul 18 '05 #4

Alex Martelli wrote:
Alexander Zatvornitskiy
<Al*********************@p131.f3.n5025.z2.fidonet.org> wrote:

Hello All!

I'am novice in python, and I find one very bad thing (from my point of
view) in language. There is no keyword or syntax to declare variable, like
'var' in

[...]
There are zillions of languages -- use another one.
[...] If Python doesn't fit YOUR brain, for example because your brain is
ossified around a craving for the declaration of variables, then, unless
you're specifically studying a new language just for personal growth
purposes, I think you might well be better off with a language that
DOES, at least until and unless your brain changes by other means.
Alex


I think we should all remember that Python isn't for everyone, and least
of all for those with little knowledge of Python and preconceptions
about what Python *should* be like.

regards
Steve
--
Steve Holden http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/
Holden Web LLC +1 703 861 4237 +1 800 494 3119
Jul 18 '05 #5

With all due respect, I think "so go away if you don't like it" is
excessive, and "so go away if you don't like it and you obviously don't
like it so definitely go away" is more so. The writer is obviously
neither a native speaker of English nor an accomplished user of Python,
so there are two language issues here. Try expressing your reply in
Russian before deciding that "very ugly" means exactly what you think
it does. I think just saying that "experienced Python users have not
found the lack of declarations to be a major hindrance" would have been
more appropriate.

Also, the assertion that "Python has no declarations whatsoever" is no
longer obviously true. In the 2.4 decorator syntax, a decorator line is
not executable, but rather a modifier to a subsequent symbol binding. I
call it a declaration. I found this disappointing in that it seems to
me a violation of "Special cases aren't special enough to break the
rules" but on further reflection it enables consideration of what a
whole slew of declarative constructs could achieve.

To begin with, now that the design constraint of "no declarations" has
been shown to be less than absolute, why not allow for perl-style ('use
strict') declarations? It actually is very useful in the small-script
space (up to a few hundred lines) where the best Perl codes live.

Let me add that I remain unconvinced that a language cannot combine the
best features of Python with very high performance, which is ultimately
what I want. It seems to me that such a language (possibly Python,
possibly a Python fork, possibly something else) will need substantial
programmer control over references as well as referents.

It seems to me that Python has a weaker case for purity in this regard
now that the dam has been breached with decorator syntax, which is, I
think, a declaration.

Let me conclude with the confession that I'm still on the steep part of
the learning curve (if it ever flattens out at all...). Python has
definitely significantly modified how I think about things, and I
deeply appreciate all the efforts of you veterans. So I say this all
with some trepidation, because I don't want to join Alexander in
inadvertently offending you. And since I presumably missed some intense
flame wars about decorators by only a couple of months, this may be a
real hornet's nest I am poking.

In summary, again with all due respect and gratitude for the
spectacularly excellent product that Python is today, I wonder *why*
this strong aversion to declarative statements, and *whether* decorator
syntax constitutes a violation of it. I'd appreciate any responses or
links.

--
mt

Jul 18 '05 #6

Michael Tobis wrote:
Also, the assertion that "Python has no declarations whatsoever" is no
longer obviously true. In the 2.4 decorator syntax, a decorator line is
not executable


that's a nice theory, but since the decorator line is executed by the
interpreter, it's a little weak.

</F>

Jul 18 '05 #7

> that's a nice theory, but since the decorator line is executed by the
> interpreter, it's a little weak.


Well, uh, who else would process it?

"use strict' and 'my epsilon' in perl are executed by the perl
interpreter as well, but they have a declarative flavor.

A decorator is a modifier to a subsequent binding, and it modifies the
reference and not the referent. So how is it anythng but declarative?
--
mt

Jul 18 '05 #8

Michael Tobis wrote:
that's a nice theory, but since the decorator line is executed by the
interpreter, it's a little weak.


Well, uh, who else would process it?


the compiler.

from __future__ is a declaration. @expression is an executable statement.
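Fredrik's distinction is easy to observe directly (a hypothetical trace decorator, in Python 3 syntax): the decorator expression is evaluated when the enclosing def statement itself executes, not at compile time:

```python
calls = []

def trace(func):
    calls.append(func.__name__)   # side effect: proves this line runs
    return func

@trace                            # evaluated when the def statement executes
def f():
    pass

print(calls)   # ['f'] - trace ran as soon as the def did, with no call to f
```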

</F>

Jul 18 '05 #9

"EP" <EP@zomething.com> said:
------------Original Message------------
From: Al*********************@p131.f3.n5025.z2.fidonet.org (Alexander Zatvornitskiy)


Hello All!

I'am novice in python, and I find one very bad thing (from my point of view) in
language. There is no keyword or syntax to declare variable, like 'var' in
Pascal, or special syntax in C. It can cause very ugly errors, like this:

epsilon=0
S=0
while epsilon<10:
    S=S+epsilon
    epselon=epsilon+1
print S

It will print zero, and it is not easy to find such a bug!

Hmmm. I am surely an expert in writing buggy code, but I can not say I make this error in Python. Why is that?

I'm not sure, but a couple things that may help me miss making this mistake in practice may be (somewhat informal in my case) unit testing - I test for correct results for at least a few cases.

It may also help that with Python I can code at a somewhat higher conceptual level, or maybe it is just the syntax that helps avoid these problems:
for epsilon in range (0,10): S=S+epsilon

for epsilon in range (0,10):
    S=S+epselon

Traceback (most recent call last):
  File "<pyshell#6>", line 2, in ?
    S=S+epselon
NameError: name 'epselon' is not defined
It may seem like jumping off a cliff, but the improvement in readability (the variable declarations being visual clutter) makes it much easier for me to see my code, and any typos in it.

It seems it would be simple enough to have one's code, or another script, automatically print out a sorted list of the variables - which would make the error you note obvious. But I haven't needed this, yet at least.

You might like Python and find the lack of variable declaration checking not a problem. It's worth a shot.

class MyVars(object):
    __slots__ = ['epsilon', 'thud', 'foo']

mv = MyVars()

mv.epselon = 42
Traceback (most recent call last):
  /home/dw/KirbyBase-1.7/<console>
AttributeError: 'MyVars' object has no attribute 'epselon'

mv.epsilon = 42
Jul 18 '05 #10

> A decorator is a modifier to a subsequent binding, and it modifies the
reference and not the referent. So how is it anythng but declarative?


I learned the hard way that it is still simply interpreted - it's a pretty
straightforward syntactic sugaring, as this shows:

foo = classmethod(foo)

becomes:

@classmethod

Now @classmethod has to return a callable that gets passed foo, and the
result is assigned to foo - not more, not less, so it becomes equivalent to
the older type of creating a class method.
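The equivalence Diez describes can be checked with any decorator (this shout decorator is a hypothetical example, not from the thread; Python 3 syntax): the @ line and the manual rebinding produce the same result:

```python
def shout(func):
    # a trivial decorator: wrap func so its result is upper-cased
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):
    return "hello, " + name

# the same thing, spelled the pre-2.4 way:
def greet2(name):
    return "hello, " + name
greet2 = shout(greet2)

print(greet("world"))    # HELLO, WORLD
print(greet2("world"))   # HELLO, WORLD
```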

Is this a declaration? I'd personally say and think "practically, yes", as I
also view

class Bar:
....

as a declaration. But obviously some people like Alex Martelli have
different views on this (and are right), because you can do this in python:

if condition:
    class Foo:
        def bar(self):
            pass
else:
    class Foo:
        def schnarz(self):
            pass

So that makes class statements not as declarative as they are in languages
like java.

So to sum it up (for me at least): things like metaclasses, decorators and
so on make me write code more declarative - if they are a declaration in
the strict sense, I don't bother.

--
Regards,

Diez B. Roggisch
Jul 18 '05 #11

Michael Tobis <mt@3planes.com> wrote:
With all due respect, I think "so go away if you don't like it" is
excessive, and "so go away if you don't like it and you obviously don't
like it so definitely go away" is more so. The writer is obviously
I disagree: I believe that, if the poster really meant what he wrote, he
may well be happier using other languages and all the declarations he
cherishes, so recommending that course of action to him is quite proper
on my part. Nobody forces him to follow my advice, in particular if, as
you suggest, he's mis-expressed himself.
neither a native speaker of English nor an accomplished user of Python,
so there are two language issues here. Try expressing your reply in
Russian before deciding that "very ugly" means exactly what you think
it does.
I don't know any Russian, and I don't see how that's relevant. If the
poster said "it's very ugly" meaning "I'm not fully comfortable with it"
or "it's the best thing I've ever seen in my life", he can perfectly
well correct his misexpression if he cares to -- not my job to try to
read his mind and perform exegesis on his words.
I think just saying that "experienced Python users have not
found the lack of declarations to be a major hindrance" would have been
more appropriate.
I think it would have been true, but weak and insufficient. Not only
experienced Python users have that opinion: lack of declarations didn't
faze me even when I was a total newbie (other things did, and I learned that
most of them were good only gradually; but if I'd blocked on something
as obviously *fundamental* to Python, I'd never have gone deeper, of
course). Plus, I did offer a URL for a classic post by Robert Martin,
who's not even talking about Python yet came to exactly the same
conclusion *while using statically typed languages*, just on the basis
of unit-testing experience.

Also, the assertion that "Python has no declarations whatsoever" is no
longer obviously true. In the 2.4 decorator syntax, a decorator line is
not executable, but rather a modifier to a subsequent symbol binding. I
call it a declaration.
You may call it a strawberry, if you wish, but that doesn't mean it will
taste good with fresh cream. It's nothing more and nothing less than an
arguably weird syntax for a perfectly executable statement:

@<expression>
def functionname <rest of function header and body>

means EXACTLY the same thing as

def functionname <rest of function header and body>
functionname = <expression>(functionname)

Nothing more, nothing less. The splat-syntax isn't a "declaration" in
any sense of the word, just a not-so-short shortcut for an assignment
statement. Proof by disassembly:
>>> def f():
...     @foo
...     def g(): pass
...
>>> dis.dis(f)
  2           0 LOAD_GLOBAL              0 (foo)
              3 LOAD_CONST               1 (<code object g at 0x3964e0,
                                            file "<stdin>", line 2>)
              6 MAKE_FUNCTION            0
              9 CALL_FUNCTION            1
             12 STORE_FAST               0 (g)
             15 LOAD_CONST               0 (None)
             18 RETURN_VALUE
>>> def f():
...     def g(): pass
...     g = foo(g)
...
>>> dis.dis(f)
  2           0 LOAD_CONST               1 (<code object g at 0x389ee0,
                                            file "<stdin>", line 2>)
              3 MAKE_FUNCTION            0
              6 STORE_FAST               0 (g)

  3           9 LOAD_GLOBAL              1 (foo)
             12 LOAD_FAST                0 (g)
             15 CALL_FUNCTION            1
             18 STORE_FAST               0 (g)
             21 LOAD_CONST               0 (None)
             24 RETURN_VALUE

the splat-syntax optimizes away one STORE_FAST of g and the
corresponding LOAD_FAST, by having the LOAD_GLOBAL of foo earlier; nice,
but not earth-shaking, and definitely no "declaration" whatsoever.

Let me add that I remain unconvinced that a language cannot combine the
best features of Python with very high performance, which is ultimately
I'm also unconvinced. Fortunately, so is the EU, so they have approved
very substantial financing for the pypy project, which aims in good part
exactly at probing this issue. If any single individual can be called
the ideator of pypy, I think it's Armin Rigo, well-known for his
excellent psyco specializing-compiler for Python: the key research
thesis behind both projects is that a higher-level, dynamic language
need not have performance inferior to a lower-level one and indeed may
well beat it.
what I want. It seems to me that such a language (possibly Python,
possibly a Python fork, possibly something else) will need substantial
programmer control over references as well as referents.
pypy is dedicated to proving you're wrong. With at least half a dozen
great people now finally having started working full-time on the
project, and due to continue so doing for the next couple of years, I
like our chances.

It seems to me that Python has a weaker case for purity in this regard
now that the dam has been breached with decorator syntax, which is, I
think, a declaration.
I entirely disagree that a minor syntax wheeze to introduce a shortcut
for a particular executable statement ``is a a declaration''.

Let me conclude with the confession that I'm still on the steep part of
the learning curve (if it ever flattens out at all...). Python has
definitely significantly modified how I think about things, and I
deeply appreciate all the efforts of you veterans. So I say this all
with some trepidation, because I don't want to join Alexander in
inadvertently offending you. And since I presumably missed some intense
flame wars about decorators by only a couple of months, this may be a
real hornet's nest I am poking.
Almost nobody really liked the splat-syntax for decorators, except of
course Guido, who's the only one who really counts (the BDFL). But that
was strictly a syntax-sugar issue -- which was exactly the reason making
the flamewars SO ferocious. It's one of Parkinson's Laws: the amount
and energy of discussion on an issue is inversely proportional to the
issue's importance. Fortunately, I was otherwise busy, so didn't enter
the fray at all (I intensely dislike the splat, but I don't care all
that much about such minor syntax sugar one way or another, anyway).

In summary, again with all due respect and gratitude for the
spectacularly excellent product that Python is today, I wonder *why*
this strong aversion to declarative statements, and *whether* decorator
syntax constitutes a violation of it. I'd appreciate any responses or
links.


If "declarative statement" means anything, I guess it means "having to
tell stuff to the compiler to be taken into account during compilation
but irrelevant at runtime". Python does have one such wart, the
'global' statement, and it's just as ugly as one might imagine, but
fortunately quite marginal, so one can almost forget it.

I have nothing against a declarative style _per se_ -- it just doesn't
fit Python's "everything happens at runtime" overall worldview, and that
simple and powerful worldview is a good part of what makes Python tick
SO well. I think there's a space for declarative languages, and it is
mostly ``orthogonal'' to Python. See, for example,
<http://www.strakt.com/docs/ep04_caps.pdf>, specifically the part of the
presentation that's about BLAM -- that's a declarative language (with
embedded Python for ``actions'', e.g., triggers, and computations), and
I think a neat and useful design, too.
Alex
Jul 18 '05 #12

Alex Martelli wrote:
Michael Tobis <mt@3planes.com> wrote:

[...]
Let me add that I remain unconvinced that a language cannot combine the
best features of Python with very high performance, which is ultimately

I'm also unconvinced. Fortunately, so is the EU, so they have approved
very substantial financing for the pypy project, which aims in good part
exactly at probing this issue. If any single individual can be called
the ideator of pypy, I think it's Armin Rigo, well-known for his
excellent psyco specializing-compiler for Python: the key research
thesis behind both projects is that a higher-level, dynamic language
need not have performance inferior to a lower-level one and indeed may
well beat it.

I should be failing in my duty as Chairman if I didn't remind readers at
this point that they can hear Armin Rigo's talk "PyPy and Type
Inference" at PyCon at 5:30 on Wednesday March 23.

http://www.python.org/pycon/dc2005/register.html

While Alex is not necessarily too modest to mention it he might forget
that he is also giving three talks to PyCon. I believe this is the first
year he has been able to attend PyCon, so delegates are definitely in
for a treat this year.

regards
Steve
--
Steve Holden http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/
Holden Web LLC +1 703 861 4237 +1 800 494 3119
Jul 18 '05 #13

"Alexander Zatvornitskiy"
<Al*********************@p131.f3.n5025.z2.fidonet.org> wrote in message
news:MS*****************************@fidonet.org...
Hello All!

I'am novice in python, and I find one very bad thing (from my point of view) in language. There is no keyword or syntax to declare variable, like 'var' in
Pascal, or special syntax in C. It can cause very ugly errors,like this:

epsilon=0
S=0
while epsilon<10:
    S=S+epsilon
    epselon=epsilon+1
print S

It will print zero, and it is not easy to find such a bug!

Even Visual Basic have 'Option Explicit' keyword! May be, python also have such a feature, I just don't know about it?


Exactly so!
Python *does* require that your variables be declared and initialized before
you use them. You did that with epsilon=0 and S=0 at the top. It is
unfortunate, however, that the statement epselon=epsilon+1 also declares a
new variable in the wrong place at the wrong time. Such misspellings are a
*common* error caught instantly in languages that require a more formal
declaration procedure.

Another irksome situation is that while Python does enforce strict type
checking, you can re-declare variables and morph them in the middle of
nowhere.
S = 0            # It's an Integer!
S = S + 'Hello'  # No can do! Strong type checking forbids this.
S = 'GoodBye'    # Whoops - Now it's a string! Unfortunately legal!
This seemingly demolishes all the good reasons one has for wanting strict
type checking.
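Thomas's point, as a runnable sketch (Python 3 syntax): the operation itself is type-checked, but the name is free to be rebound to a value of any type:

```python
S = 0
try:
    S = S + 'Hello'   # mixing int and str is rejected: strong typing
except TypeError as exc:
    print(exc)        # unsupported operand type(s) for +: 'int' and 'str'

S = 'GoodBye'         # but rebinding the *name* to a str is perfectly legal
print(type(S).__name__)
```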

That second example is just bad programming practice and easy to avoid. The
problem you point out in your code, however, hurts! Was that an I, an l or a
1 in the variable name?

Hey! - I like Python a lot.
But nothing's perfect.
Thomas Bartkus
Jul 18 '05 #14


Thomas Bartkus wrote:
Python *does* require that your variables be declared and initialized before
you use them. You did that with epsilon=0 and S=0 at the top. It is
unfortunate, however, that the statement epselon=epsilon+1 also declares a
new variable in the wrong place at the wrong time. Such misspellings are a
*common* error caught instantly in languages that require a more formal
declaration procedure.

I have no interest in arguing this right now, but it does raise a
question for me: How common is it for a local variable to be bound in
more than one place within a function? It seems that it isn't (or
shouldn't be) too common.

Certainly the most common case where this occurs is for temporary
variables and counters and stuff. These typically have short names and
thus are not as likely to be misspelled.

Another common place is for variables that get bound before and inside
a loop. I would guess that's not as common in Python as it is in other
languages, seeing that Python has features like iterators that obviate
the need to do this. (The OP's original example should have been "for
epsilon in range(10)"; epsilon only needed to be bound in one place.)
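The rewrite Carl alludes to, as a sketch in Python 3 syntax - the loop variable is bound by the for statement alone, so there is no second binding site to misspell:

```python
S = 0
for epsilon in range(10):   # epsilon is bound here and nowhere else
    S += epsilon            # += rebinds S; a misspelled name on either
                            # side would raise NameError instead of
                            # silently creating a new variable
print(S)   # 45
```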

I guess this might be why, in practice, I don't seem to encounter the
misspelling-a-rebinding error too often, even though I'm prone to
spelling errors. Perhaps, if someone runs into this error a lot, the
problem is not with Python, but with their tendency to rebind variables
too much? Just a thought.
--
CARL BANKS

Jul 18 '05 #15

In article <1g****************************@yahoo.com>,
Alex Martelli <al*****@yahoo.com> wrote:

Some people claim a language should change the way you think -- a
frequent poster, excellent Python contributor, and friend, even has that
claim in his signature.


Generally speaking, I change my .sig either when I get bored or when
someone references it. So here's the new one.
--
Aahz (aa**@pythoncraft.com) <*> http://www.pythoncraft.com/

"Given that C++ has pointers and typecasts, it's really hard to have a serious
conversation about type safety with a C++ programmer and keep a straight face.
It's kind of like having a guy who juggles chainsaws wearing body armor
arguing with a guy who juggles rubber chickens wearing a T-shirt about who's
in more danger." --Roy Smith, c.l.py, 2004.05.23
Jul 18 '05 #16

This is definitely a wart:

z = 42.3

def f():
    if False:
        global z
    z = -666

f()
print z

Jul 18 '05 #17

Alex Martelli wrote:
Michael Tobis <mt@3planes.com> wrote:
[...]
he can perfectly well correct his misexpression if he cares to -- not my
job to try to read his mind and perform exegesis on his words.
Well, I hate to try to tell you your job, but it doesn't seem to be to
be all that great of a marketing strategy to actively chase people
away... Hey, he might have been a Nutshell customer.
I think it would have been true, but weak and insufficient. Not only
experienced Python users have that opinion: lack of declarations didn't faze me even when I was a total newbie
It did me, and it did many others. Perhaps you are unrepresentative.

It's one thing to say "no can do, sorry", it's another to say "you
don't need this anyway and if you think you do you aren't worthy".

In fact, it was your book I spent the most time thumbing through
looking for the "use strict" equivalent that I was absolutely certain
must exist. Hell, even Fortran eventually gave in to "IMPLICIT NONE".
It's practically the only thing I've ever expected to find in Python
that hasn't vastly exceeded my expectations, and I'm sure Alexander is
not the only person to be put off by it. In fact, I'd recommend a
paragraph early in the Nutshell book saying "there are no declarations,
no use strict, no implicit none, sorry, forget it", and an index
listing under "declarations" pointing to a detailed exegesis of their
nonexistence. It would have saved me some time.

It's true that in some sense an assignment is all the declaration you
need. I think Carl Banks's point (what we think of as assignment as a
carryover from other languages is really rebinding, and in many cases
can be avoided) is also helpful.

But that doesn't make the "epselon" bug go away, and wanting to have a
way to catch it quickly isn't, to my mind, obviously a criminal act.
Also, based on what DogWalker demonstrates, it's really not that alien
to Python and should be feasible.
Also, the assertion that "Python has no declarations whatsoever" is no longer obviously true. In the 2.4 decorator syntax, a decorator line is not executable, but rather a modifier to a subsequent symbol binding. I call it a declaration.


You may call it a strawberry, if you wish, but that doesn't mean it

will taste good with fresh cream. It's nothing more and nothing less than an arguably weird syntax for a perfectly executable statement:
This may well be true in implementation, but cognitively it is a
declaration that modifies the reference and not the referent. I see
that it is a big deal to ask for more of these, but I don't see why.
Let me add that I remain unconvinced that a language cannot combine the
best features of Python with very high performance, which is ultimately

I'm also unconvinced. Fortunately, so is the EU, so they have approved very substantial financing for the pypy project, which aims in good part exactly at probing this issue.
I hope this works out, but it's hard for me to see how pypy will avoid
lots of hashing through dictionaries. I'm willing to help it by
declaring an immutable reference. Here, don't look this up; it always
points to that.

I'm guessing that this will also be considered a bad idea, and maybe
someday I'll understand why. I'm looking for insight, not controversy.
If any single individual can be called
the ideator of pypy, I think it's Armin Rigo, well-known for his
excellent psyco specializing-compiler for Python:
I'm hoping I can make the meeting. Maybe proximity to the core group
will help me approach the sort of enlightenment I seek. Just being in
the vicinity of Ian Bicking's aura on occasion has been most inspiring.
Almost nobody really liked the splat-syntax for decorators, except of
course Guido, who's the only one who really counts (the BDFL). But that was strictly a syntax-sugar issue
Um, sugar isn't exactly what I'd call it. I think it matters a lot
though. Python's being easy on the eyes is not a trivial advantage for
some people, myself included.
If "declarative statement" means anything, I guess it means "having to tell stuff to the compiler to be taken into account during compilation but irrelevant at runtime". Python does have one such wart, the
'global' statement, and it's just as ugly as one might imagine, but
fortunately quite marginal, so one can almost forget it.
I am trying to talk about having expressive power in constraining
references as well as the referents. Python studiously avoids this, but
decorators change that. I am not deep enough into the mojo as yet to
have more than a glimmer of an idea about the distinction you are
making. It's not the one I'm trying to make.

decorators may not be implemented as declarations, but they cognitively
act as declarations, and that's what I care about here.
I have nothing against a declarative style _per se_ -- it just doesn't fit Python's "everything happens at runtime" overall worldview, and that simple and powerful worldview is a good part of what makes Python tick SO well.


I'm glad you have said something something I absolutely agree with. I'm
alarmed at the suggestions here that class and def blocks are
declarative. The fact that they're executable is really a core part of
the beauty of Python.

However, I don't see how an 'import strict' would necessarily violate
this, nor an "import immutableref", which is something I would find
useful in trying to wrestle with NumArray, and which a high-performance
Python could (I think) use to advantage.

Now I may be wrong; in fact I'd bet against me and in favor of you and
Frederik if I had to bet. It's just that I don't see why I'm wrong.
--
mt

Jul 18 '05 #18

P: n/a
Michael Tobis wrote:
This is definitely a wart:

z = 42.3

def f():
    if False:
        global z
    z = -666

f()
print z

no, it's a declaration. from the documentation:

http://docs.python.org/ref/global.html

"The global statement is a declaration which holds for the entire
current code block."

"the global is a directive to the parser. It applies only to code
parsed at the same time as the global statement"

</F>

Jul 18 '05 #19

P: n/a
"Michael Tobis" <mt@3planes.com> writes:
In fact, I'd recommend a paragraph early in the Nutshell book saying
"there are no declarations, no use strict, no implicit none, sorry,
forget it",
It would have to be a pretty long paragraph, if it were to list all
the things that you do NOT find in Python.
and an index listing under "declarations" pointing to a detailed
exegesis of their nonexistence.
Think about this. You are either asking that the book's author
anticipate your personal expectations, and write the book to cater for
YOUR PERSONAL expectations ... or you are asking for a book with an
infinitely long index. The list of things absent from Python is
infinite.
It would have saved me some time.


You don't want a book, you want a personal tutor.
Jul 18 '05 #20

P: n/a
Michael Tobis <mt@3planes.com> wrote:
Alex Martelli wrote:
Michael Tobis <mt@3planes.com> wrote:
he can perfectly
well correct his misexpression if he cares to -- not my job to try to
read his mind and perform exegesis on his words.


Well, I hate to try to tell you your job, but it doesn't seem to me to
be all that great of a marketing strategy to actively chase people
away... Hey, he might have been a Nutshell customer.


I'd far rather sell one fewer copy of the Nutshell, than sell one more
to somebody who is not _happy_ to use Python.

I think it would have been true, but weak and insufficient. Not only
experienced Python users have that opinion: lack of declarations didn't
faze me even when I was a total newbie


It did me, and it did many others. Perhaps you are unrepresentative.


Maybe; I did have good knowledge of a variety of languages, for example.
However, I have seen many people come upon ideas that were new to them
and meet them with interest and curiosity, even if initially doubtful
about the novelty, rather than with fear and loathing. I think that
such an attitude is a better predictor of how happy a person will be
with Python (or other technologies he's initially unfamiliar with) than
"previously accumulated knowledge".

It's one thing to say "no can do, sorry", it's another to say "you
don't need this anyway and if you think you do you aren't worthy".
To say one is sorry about something that, in fact, makes one deliriously
happy, would be the worst sort of hypocrisy, even though it's a socially
common ``white lie''. As for "worthy", that's a completely different
viewpoint, and I would appreciate it if you didn't put words into my
mouth.

Believing that somebody might be unhappy using Python (or any other
given technology), at least at this point in their personal development,
due to their ingrained mindset and strongly held opinions against one of
the cornerstones of the language, has nothing to do with "worth".

In fact, it was your book I spent the most time thumbing through
looking for the "use strict" equivalent that I was absolutely certain
must exist. Hell, even Fortran eventually gave in to "IMPLICIT NONE".
...which has nothing to do with the case, since in Fortran, and from day
one, all names had statically determinable types anyway.
It's practically the only thing I've ever expected to find in Python
that hasn't vastly exceeded my expectations, aand I'm sure Alexander is
not the only person to be put off by it.
By Hugol's Law, surely not. So what? If he sticks to his misguided
expectations, he won't be happy with Python. If he outgrows them, he
probably will. But the best way forwards is to get used to unit-testing
and thereby realize declarations are deadweight, THEN use a language
where they just aren't there.
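Alex's point about unit tests being the real guard can be made concrete. A minimal sketch in modern Python syntax (the function and numbers are invented for illustration, not from the thread): an "epselon"-style typo anywhere inside accumulate() would make the check fail instead of passing silently.

```python
import unittest

def accumulate(limit):
    # straightened-out version of the thread's opening loop
    total = 0
    for epsilon in range(limit):
        total = total + epsilon
    return total

class TestAccumulate(unittest.TestCase):
    def test_sum_below_ten(self):
        # end-to-end check: a rebinding typo inside accumulate()
        # would make this fail (or hang) rather than slip through
        self.assertEqual(accumulate(10), 45)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAccumulate)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

This is the "end-to-end" style of checking the thread keeps returning to: the test pins down the observable result, so it catches typos that no declaration checker would (a + for a -, a < for a <=) as well as the ones a checker would.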
In fact, I'd recommend a
paragraph early in the Nutshell book saying "there are no declarations,
What about a paragraph asking the reader to *READ* the book before
offering advice about it? On p.32, "Python has no declarations"; on
p.39, under Variables, right at the start of the paragraph, "In Python,
there are no declarations. The existence of a variable depends on a
statement that <i>binds</i> the variable, or, in other words, that sets
a name to hold a reference to some object".

Not only are these words pretty early, right towards the start of the
chapter giving an overview on the language, but I think the second
snippet has quite a prominent position. If you're looking for a way to
declare variables, and don't think of looking at the very start of the
section titled "Variables" -- or if the words you find there,
reinforcing the previous assertion to the same effect, still keep you
"thumbing through" the book in search of what I just said isn't there, I
disclaim responsibility.

A book, particularly a quick reference, needs to be considered as a
_cooperative_ effort between writer and reader. I can fairly be tasked
with expressing every important thing about the language, and I think I
do -- not all of the "sea lawyer"-level quibbles, but all of the aspects
most readers truly need. But I cannot fairly be asked to *belabor*
every aspect that might faze some reader or other, to ward against the
reader not being careful and not realizing that, in a work where
conciseness is of the essence, every single word is there for a purpose.

If you need repetition and belaboring, get a book that's intended as
slow-paced tutorial. Demanding tutorial-like repetitiousness of a
*quick reference* is absurd and irresponsible.
no use strict, no implicit none, sorry, forget it", and an index
This is the second time you ask or suggest that I take an apologetic
attitude (or at least mouth platitudes that sound apologetic without
_meaning_ to be apologetic in the least), and I repeat: forget it.
listing under "declarations" pointing to a detailed exegesis of their
nonexistence. It would have saved me some time.
...and would have cost time to careful readers, ones who know that "no
declarations" means (surprise!) *NO* declarations, and are sensible
enough to NOT expect "detailed exegesis" in a *quick reference* work.

No way -- and I'm not apologetic in the least about it, please note.

Having a tightly limited space budget, I carefully allocate it to
materials that I think it will do most good to most readers in the main
target audience: Python programmers. "I'm not trying to teach Python
here", as I say right at the start of that chapter (it's chapter 4, and
I believe it's just the sample chapter O'Reilly chose to place on their
site, so readers of this thread who don't own the book should be able to
have a look and follow this debate more closely) -- it's certainly
_possible_ to learn Python from the book, but only by reading very
closely and carefully, because the book eschews the repetition and
belaboring that a tutorial work would abound in (unless it be very, very
fast paced for a tutorial, which is also a possible stance; I believe
it's the stance taken by the excellent "Dive Into Python").

To you, it's the lack of declarations (because you can't or won't take
assertions about "no declarations" at face value); to another, it's
significant indentation, issues of typing, limits on recursion, argument
passing, the lack of a switch/case statement, -2**2... I mention each
and every one of these issues, of course, but it would be totally
inappropriate to offer for each the "detailed exegesis" you require.

It's true that in some sense an assignment is all the declaration you
need. I think Carl Banks's point (what we think of as assignment as a
carryover from other languages is really rebinding, and in many cases
can be avoided) is also helpful.
It's known as an "assignment statement" in Python, too, and its
semantics may be ones of binding or rebinding. "Assignment statements
are the most common way to bind variables and other references", and the
whole following paragraph about rebinding, Nutshell p. 39.
But that doesn't make the "epselon" bug go away, and wanting to have a
way to catch it quickly isn't, to my mind, obviously a criminal act.
Of course not, and test-driven design is the right way to catch quickly
that bug, and many others (including but not limited to ones caused by
other typos).
Also, based on what DogWalker demonstrates, it's really not that alien
to Python and should be feasible.
I assume you're referring to the abuse of __slots__ that's so
ridiculously popular? As I explain on p. 86, __slots__ is meant
strictly to let you save memory. It does NOT work well to catch typos
in real-life code (as opposed to toy-level examples). Michele Simionato
has a reasonable recipe on ActiveState's Cookbook (that recipe's also
going to be in the Cookbook's 2nd edition, due out in a couple months)
to show how to use __setattr__ instead (much more sensible) if you're so
tremendously "typo-phobic".
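The __setattr__ approach Alex alludes to can be sketched in a few lines. This is an illustration of the idea only, not Michele Simionato's actual Cookbook recipe; the class and attribute names are made up, and the syntax is modern Python.

```python
class NoNewAttrs(object):
    """Refuse to create attributes that were not set up in __init__,
    so an 'epselon'-style typo blows up instead of passing silently."""
    def __init__(self):
        self.epsilon = 0
        self.__dict__["_sealed"] = True   # bypass __setattr__ for the flag

    def __setattr__(self, name, value):
        if getattr(self, "_sealed", False) and not hasattr(self, name):
            raise AttributeError("no attribute %r -- typo?" % name)
        object.__setattr__(self, name, value)

obj = NoNewAttrs()
obj.epsilon = 10        # fine: rebinding an existing attribute
try:
    obj.epselon = 10    # the thread's typo, caught at runtime
except AttributeError as exc:
    print(exc)
```

Note that, per Alex's distinction, this is a pure runtime check on attribute binding, not a declaration: nothing is told to any compiler, and it only guards instance attributes, not bare names in a function.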

But, like the "leading __" idea, weird metaclasses and decorators that
let you omit the explicit 'self', and quite a few others, these nifty
tricks ARE in fact fully alien to Python "as she's spoken" -- they're
quoted, repeated, and varied upon, with ridiculous frequency, totally
out of proportion to their miniscule importance in the normal and
idiomatic practice of the language, because more and more people come to
Python, from a wide variety of other languages, keen to not change their
habits and "keep coding X in Python" for any value of X you might care
to name.

Pythonistas like showing off Python's power, in particular by mimicking
the different ways in which different languages work, and we may well
think that trying to soothe the worries of people coming to Python with
all the wrong expectations and thereby make Python more converts -- I
know I've been guilty of this particular error in the past. But I do
think it's an error when it gets overdone the way it's being overdone
these days; that to be really happy with Python, one should use Python
to do *PYTHON* programming, NOT to halfway-mimic Java, Visual Basic,
Fortran, PHP, Eiffel, Scheme, Dylan, Javascript, Ada, and Haskell, all
rolled into one.

As Steve Holden just posted, this may well mean that Python is not in
fact suitable for EVERYbody -- only people who are willing to give a try
to Python AS SUCH, rather than feeling a deep-seated need to mimic other
languages they're used to (perhaps due to unwillingness to use important
and excellent methodologies and tools, from unit-tests onwards). If so,
then, so be it: yeah, even if it means I sell fewer copies of my books,
let those fewer copies be sold to people who'll APPRECIATE AND ENJOY
both the books and the language, rather than pine for ``variable
declarations'', thumb endlessly looking for detailed exegeses of what
ISN'T there, and so on. I think the total amount of happiness in the
world will be enhanced thereby, and, in the end, that's what matters.

Also, the assertion that "Python has no declarations whatsoever" is no longer obviously true. In the 2.4 decorator syntax, a decorator line is not executable, but rather a modifier to a subsequent symbol binding. I call it a declaration.


You may call it a strawberry, if you wish, but that doesn't mean it will
taste good with fresh cream. It's nothing more and nothing less than an
arguably weird syntax for a perfectly executable statement:


This may well be true in implementation, but cognitively it is a
declaration that modifies the reference and not the referent. I see
that it is a big deal to ask for more of these, but I don't see why.


Because the "cognition" is simply and totally wrong and
counterproductive: if you have the wrong mental model of what "splat
foo" means, you won't be productive in coding and using various new
possibilities for ``foo'' within that syntax.

It's not a matter of implementation, but of SEMANTICS -- what the syntax
form actually DOES, quite independent of HOW it does it, is to call a
HOF once the following def statement is done executing, to ``filter''
the function object to be bound to the name. No more, no less, and
NOTHING to do with ``declarations'', any more than statements such as
def and class are declarations (thinking of them as such is a serious
conceptual error and gravely hampers programming productivity).

I think your total and utter misconception of decorator syntax as being
a "declaration" is the best possible argument against that syntax; I
also think the syntax was probably worth having anyway, despite that.
After all, somebody sufficiently crazed from a withdrawal crisis from
declarations, as you appear to be, may go around calling "declarations"
anything whatsoever, be it splats, def, class, import, assignments, and
so on -- we can't rip all statements out from Python to make it into a
safe padded cell for declaration-junkies where they won't hurt
themselves and others.

I hope one day to forward this exchange to Guido as part of a dossier
against the latest (and neat!) syntax tweak he's considering for Python
3.0 -- having something like:

def f(x: <expression>):
    <body>

be a syntactic shortcut for

def f(x):
    x = <something>(x, <expression>)
    <body>

where the `<something>' might be the ``adapt'' builtin (per PEP 246) or
some variation thereon. The argument boils down to: many people will
mis-conceptualize this nifty shortcut as a DECLARATION and thereby get
hopelessly confused and utterly misuse it, completely failing to
understand or accept that it's just a syntax shortcut to promote an
important idiom, just like the splat-syntax for decorators.

He'll probably go ahead anyway, of course, just as he did for decorators
despite the huge flamewars -- if he wasn't such a stubborn guy, Python
would be unlikely to be so unique;-).
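For the record, the shortcut Alex describes can already be emulated with a plain decorator today, which underlines his point that it would be sugar for runtime code, not a declaration. Everything below is illustrative (modern syntax, invented names), not a real PEP 246 implementation:

```python
def adapting(converter):
    """Run the argument through converter before the body sees it --
    ordinary runtime machinery, nothing told to any compiler."""
    def deco(fn):
        def wrapper(x):
            return fn(converter(x))
        return wrapper
    return deco

@adapting(float)
def double(x):
    return x * 2

print(double("3.5"))   # prints 7.0: the string was adapted first
```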

Let me add that I remain unconvinced that a language cannot combine the best features of Python with very high performance, which is ultimately

I'm also unconvinced. Fortunately, so is the EU, so they have approved
very substantial financing for the pypy project, which aims in good part
exactly at probing this issue.


I hope this works out, but it's hard for me to see how pypy will avoid
lots of hashing through dictionaries. I'm willing to help it by
declaring an immutable reference. Here, don't look this up; it always
points to that.


If such crutches are necessary, I guess that will emerge. The best
compilers for other dynamic languages, such as the stalin compiler for
scheme, can do without such crutches -- I'm sure it's hard for you to
see how, but if you study modern advanced compiler theory (which IS
hard, of course) then you might perhaps stand a chance.
I'm guessing that this will also be considered a bad idea, and maybe
someday I'll understand why. I'm looking for insight, not controversy.
I'm not sure there's much insight to be had: you want some forms of
redundancy, which you can get in other languages; we'd rather avoid such
localized redundancy, as the "end-to-end" checks that testing gives us
makes it so much deadweight -- and we NEED testing anyway, even with the
little redundancy aspects, because no such redundancy will catch many
errors, including typos such as a + where a - was meant, a < where a <=
was meant, and so on.

Introducing such redundancy as "optional" soon makes it practically
mandatory for most people in most situations, for many reasons, such as,
because people who manage programming efforts often like restricting
their employees under the misapprehension that this somehow makes things
better -- the history of all languages which introduced ``optional''
redundancy makes this abundantly clear. So, in practice, defending
redundancy as ``it's just optional'' is nothing but a fig leaf: if it
gets its ugly foot in the door, it WILL be around most people's necks
most of the time forever after.

Python is one of the few widespread languages where one is blissfully
free of practically-mandated redundancy, and can thus practice the
healthy programming principle of "Once, and only once" -- not having to
repeat things twice like some magical formula in a ritual. This lack of
redundancy is a good part of Python's power, in the judgment of many of
us who deliberately chose Python rather than other languages which allow
(and indeed practically mandate) the redundancy.

I, and many others, chose Python over other languages, because (duh!) we
saw Python's differences wrt the others as its _strengths_ -- and ever
since we have had to fight off the well-meaning attempts of ceaseless
waves of newcomers, who are utterly keen to spoil Python by making it
more similar to those other languages we rejected in its favor, i.e.,
sap Python's strengths. Many old-timers already believe that changes to
Python in recent years have not been an overall good thing -- I
disagree, as I believe the changes so far have mostly strengthened
Python's support for its own design principles and goals, but I
understand their viewpoint.

If you're looking for insight on end-to-end checks as being preferable
to localized ones, the best paper I know is one on networking theory,
<http://mit.edu/Saltzer/www/publications/endtoend/endtoend.pdf> -- for
test-driven design, Beck's book, the already-referenced Robert Martin
blog entry <http://www.artima.com/weblogs/viewpost.jsp?thread=4639>, and
innumerable other entries about "unit testing", "agile programming",
"test driven design", and suchlike, which you can google for.

I am trying to talk about having expressive power in constraining
references as well as the referents. Python studiously avoids this, but
decorators change that.
No it doesn't! A decorator puts absolutely NO constraint on any
reference whatsoever -- you're perfectly free to rebind every reference
at will, as usual.

A *descriptor*, or any of several other OO mechanisms, may put whatever
constraints you like on ``compound'' references -- attributes and items.
Most simply, you can define __setitem__, etc, in a class, and then
assignments to an indexing on instances of that class will go through
such special methods which may do, *AT RUNTIME*, whatever checks you
like; and similarly for __setattr__ -- and if you can use metaclasses to
put such constraints on classes just like classes put them on instances.

A descriptor lets you ``constrain'' one specific attribute of instances
by having all bindings of that attribute go through the descriptor's
__set__ method (if it defines one, of course -- that's currently known
as a ``data descriptor'' though the terminology is shifting). Again,
that's entirely a runtime issue -- no ``declaration'' whatsoever.

I am not deep enough into the mojo as yet to
have more than a glimmer of an idea about the distinction you are
making. It's not the one I'm trying to make.
The distinction that seems relevant to me: a *declaration* is about
something you tell the _compiler_, something which has intrinsically
static effects, _preliminary_ to runtime; a *statement* has effects at
runtime _if and when it executes_.

In Python, "global" is a declaration -- a wart -- because it has exactly
these ``preliminary'' effects, independent of runtime and execution.

But, for example, ``class'' definitely isn't: it builds and binds a
separate class object each time it executes, IF it executes. E.g.:

classes = []
for i in range(3):
    class Foo(object): pass
    classes.append(Foo)

classes[2].zippo = 23
classes[0].zippo = 17
print classes[2].zippo

See? No declarations whatsoever -- just a few perfectly ordinary
statements, which do perfectly ordinary runtime operations, such as
making objects, binding names, calling methods, ...

If the first statement in the loop body was changed to:

Foo = type('Foo', (), {})

we'd have exact semantic equivalence. It's NOT a matter of "mere
implementation": this is how the class statement is DEFINED to work;
it's SEMANTICS.
decorators may not be implemented as declarations, but they cognitively
act as declarations, and that's what I care about here.
Once again, with feeling: implementation doesn't really matter. Their
SEMANTICS are defined in those terms -- perfectly ordinary runtime
executable statements. In *NO* *WAY* *WHATSOEVER* do the "act as
declarations", and if your cognition tells you differently, then it's
your cognition that is faulty in this case -- you're badly misreading
the whole situation.

In particular, decorators don't put any "constraints on references",
which you seem to somehow mysteriously equate to ``declarations''. Of
course, you may use a decorator to install a _descriptor_ object (which
may put constraints on compound-references, as above mentioned), just as
you might use an assignment for the same kind of "installing". But that
doesn't make
@foo(23)
def bar( ... :
...
any more of ``cognitively a declaration'' than the exact semantic
equivalent:
def bar( ... :
...
bar = foo(23)(bar)

If this occurs outside a class, there's no connection to compound
references, thus certainly no constraint; if inside a class, and foo(23)
is a callable which returns a data-descriptor object when called with
function object bar as its argument, then this may imply constraints on
access to x.bar where x is an instance of said class.

I have nothing against a declarative style _per se_ -- it just doesn't
fit Python's "everything happens at runtime" overall worldview, and that
simple and powerful worldview is a good part of what makes Python tick
SO well.


I'm glad you have said something something I absolutely agree with. I'm
alarmed at the suggestions here that class and def blocks are
declarative. The fact that they're executable is really a core part of
the beauty of Python.


And the fact that *decorators* are executable is exactly equivalent.

However, I don't see how an 'import strict' would necessarily violate
this, nor an "import immutableref", which is something I would find
useful in trying to wrestle with NumArray, and which a high-performance
Python could (I think) use to advantage.

Now I may be wrong; in fact I'd bet against me and in favor of you and
Frederik if I had to bet. It's just that I don't see why I'm wrong.


What would the effects of such an ``import strict'' be? Stop any
binding or rebinding of barenames except by some arcane incantations?
Then, by inevitably becoming a "theoretically optional, practically
mandatory" part of Python use, anytime a typical pointy-haired boss has
anything to do with it, it would sabotage programmers' productivity, by
forcing them to use just such redundant incantations and thereby violate
"once and only once". Otherwise, I don't see how it would avert the
``epselon'' terror you keep waving at us.

If ``import immutableref'' is meant to make Python into a
single-assignment language (specifically and only for the module into
which it gets imported), it probably would not run as high a risk of
becoming mandatory -- single-assignment languages, and other functional
programming languages based on concepts of data being immutable, have
been around for ages but PHBs are terrified by them anyway (reasonably:
it takes a highly mathematical mind to be effective at functional
programming). I do not understand how this would help you with
numarray, in the least, so perhaps you could help with some examples of
how you would like such a declaration to work. Presumably the language
thus constrained would not have for loops (which do need to rebind the
control variable over and over), and while loops would also be pretty
iffy, so recursion abilities would have to be strengthened considerably,
for starters. More generally, I have my doubts that Python can be
usefully constrained to single-assignment semantics without needing some
compensating additions elsewhere. But maybe single-assignment is not
what you mean by your extremely elliptic mention, so I'll wait for your
examples of how that would help you with numarray in particular.
Alex
Jul 18 '05 #21

P: n/a
On 31 Jan 2005 19:41:27 -0800, "Michael Tobis" <mt@3planes.com> wrote:
You may call it a strawberry, if you wish, but that doesn't mean it will
taste good with fresh cream. It's nothing more and nothing less than an
arguably weird syntax for a perfectly executable statement:


This may well be true in implementation, but cognitively it is a
declaration that modifies the reference and not the referent. I see
that it is a big deal to ask for more of these, but I don't see why.


Thank you for bringing it up and respecting the cognitive factor. My
*experience* of the decorator, disassembly of internals quite aside,
is that it breaks old rules - or, if preferred, breaks new ground - by
impacting code one wouldn't expect it to know about.

It frightens me a bit when the road to Guru seems to move in the
direction of most completely transcending the normal user experience,
rather than in best comprehending it.

Art

Jul 18 '05 #22

P: n/a

"Carl Banks" <in**********@aerojockey.com> wrote in message
news:11*********************@f14g2000cwb.googlegroups.com...
<snip>
How common is it for a local variable to be bound in
more than one place within a function?


How common? It shouldn't happen at all and that was the point.
The original poster's code demonstrates how it can occur inadvertently as a
result of a simple typographical error.

You won't hear me claim that Python is without mitigating virtues. Clearly,
there is much about Python that encourages good design which will in turn
reduce the incidence of such errors. Nevertheless, one has to admit to this
blemish. One also wonders if it is really necessary to endure what looks to
me like an omission. Is there a reason why the interpreter
couldn't/shouldn't require formal declarations?

I, too, wish there were a switch like VBs "Option Explicit" that would
require you to declare "epsilon = 0" and thereafter have the interpreter
refuse assignment to an undeclared "epselon". Sane VB programmers (and yes,
there are a few!) leave it on by default and consider it abomination that
the switch is optional. The original poster's example was a good one. I had
to take a good long stare before I saw it even though the code is short,
sweet, and otherwise correct.

*Is* there a reason why the interpreter couldn't/shouldn't require formal
variable declaration?
It seems to me that lack of same may also be creating hellish barriers to
writing truly effective IDEs for Python.

Thomas Bartkus
Jul 18 '05 #23

P: n/a
Given the behavior, the documentation is gratifyingly correct.

Given that the syntax is legal, though, the behavior is not what one
would intuitively expect, and is therefore unPythonic by (rather
dramatically) violating the principle of least surprise.

It's also, to me, understandable why it's difficult for the language
design to avoid this behavior.

This little discovery of mine sheds some considerable light on the
awkwardness of what you guys will deign to call "declarations". This
being the case, I can understand the resistance to "declarations" in
Python.

I had thought, until the current conversation and this experiment, that
the global statement, er, declaration was just another executable,
especially given all the stress on Python's being purely executable.

I still see "global" and "@" as expressions of the same fundamental
problem, even though decorators are not implemented as declarations.
They both take effect in a non-intuitive sequence and they both affect
the reference rather than the referent.

This points out the problem that Python has in qualifying references
rather than referents.

Since BDFL is contemplating some optional typing, does this also imply
qualifying the references?

Maybe you wizard types can agree that there is a useful abstraction
that I'm talking about here, whether you wish to call it "declarations"
or not, and try to factor out some sort of consistent strategy for
dealing with it, perhaps in P3K. (I will try to be in a position to
help someday, but I have a long way to go.)

Language features that modify references rather than referents appear
to be problematic. Python clearly chafes at these. Yet there are at
least a few compelling reasons to want them.

--
mt

Jul 18 '05 #24

P: n/a
> How common is it for a local variable to be bound in
more than one place within a function?


It's more natural for a beginner to read or write

mystr = ""
for snippet in snippets:
    if ilike(snippet):
        mystr = mystr + snippet

than

mylist = []
for snippet in snippets:
    if ilike(snippet):
        mylist.append(snippet)
mystr = "".join(mylist)

for instance.

While the latter is superior in some ways, I frequently find my fingers
tossing off the former approach.

Of course in this case it's not hard to come up with

mystr = "".join([snippet for snippet in snippets if ilike(snippet)])

but it's also not too hard to imagine cases where the list
comprehension would be too complex or would require too much
refactoring.
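For completeness, here is a runnable version of that comprehension (snippets and the ilike predicate are stand-ins invented for the demo, in modern syntax):

```python
snippets = ["a", "bb", "ccc", "d"]

def ilike(s):
    # stand-in predicate, purely for the demo
    return len(s) > 1

mystr = "".join([snippet for snippet in snippets if ilike(snippet)])
print(mystr)   # prints bbccc
```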

I don't know that it's ever necessary to rebind, but it is, in fact,
common, and perhaps too easy. In numeric Python, avoiding rebinding
turns out to be a nontrivial skill.

mt

Jul 18 '05 #25

P: n/a
Thomas Bartkus wrote:
"Carl Banks" <in**********@aerojockey.com> wrote in message
news:11*********************@f14g2000cwb.googlegroups.com...
<snip>
How common is it for a local variable to be bound in
more than one place within a function?

How common? It shouldn't happen at all and that was the point.


This seems a little excessive to me. Sample use case:

for something in lst:
    if type(something) != type(()):
        something = tuple(something)

regards
Steve
--
Meet the Python developers and your c.l.py favorites March 23-25
Come to PyCon DC 2005 http://www.python.org/pycon/2005/
Steve Holden http://www.holdenweb.com/
Jul 18 '05 #26

P: n/a
"Steve Holden" <st***@holdenweb.com> wrote in message
news:41**************@holdenweb.com...
Thomas Bartkus wrote:
"Carl Banks" <in**********@aerojockey.com> wrote in message
news:11*********************@f14g2000cwb.googlegroups.com...
<snip>
How common is it for a local variable to be bound in
more than one place within a function?

How common? It shouldn't happen at all and that was the point.


This seems a little excessive to me. Sample use case:

for something in lst:
    if type(something) != type(()):
        something = tuple(something)


Hhhmmh!
I presume you are going through the list and want to guarantee that every
item you encounter is a tuple! So if it ain't - you just re-declare
"something" to be a tuple. What was formerly a single string, integer,
whathaveyou is now a tuple *containing* a single string, integer,
whathaveyou.

Do you do it that way because you can? Or because you must?
And
If the former - is it a good idea?
OR did I just miss your code's intent completely?

My first inclination would be to create a new variable (type = tuple) and
accept (or typecast) each "something" into it as required. The notion that
you just morph "something" still seems rather abhorrent. It hadn't occurred
to me that iterating through a list like that means the iterator "something"
might need to constantly morph into a different type according to a list's
possibly eclectic contents.

It might explain why the interpreter is incapable of enforcing a type. It
would forbid iterating through lists containing a mix of different types.
EXCEPT- I must note, that other languages manage to pull off exactly such a
trick with a variant type. When you need to pull off a feat such as this,
you declare a variant type where the rules are relaxed *for that situation
only* and there is no need to toss the baby out with the bathwater.

Thomas Bartkus
Jul 18 '05 #27

P: n/a
On Tue, 01 Feb 2005 09:13:36 -0600, Thomas Bartkus wrote:
*Is* there a reason why the interpreter couldn't/shouldn't require formal
variable declaration?


You mean, other than the reasons already discussed at length in this
thread, not to mention many many others?

Your not *liking* the reasons doesn't make them any less the reasons. They
may not even be good reasons, nevertheless, there the reasons are.

If you're literally asking the question you are asking, re-read this
thread more carefully. If you're *really* asking "Give me a reason *I
like*", I suggest re-reading Alex's discussion on why maybe Python isn't
for everybody.

All I know is that I have created large programs and typos like you seem
mortally terrified of occur on average about once every *ten modules* or
so, and are generally caught even before I write the unit tests. Breaking
the language to avoid what *by construction* is demonstrated not be a real
problem is... well, I believe Alex covered that, too.

Blah blah blah, "what if... what if... what if..." We should concentrate
on *real* problems, ones that exist in real code, not ones that mostly
exist in wild-eyed prose that consists of predictions of pain and death
that conspicuously fail to occur, no matter how many times they are
repeated or we are exhorted to heed them or face our doom.

(The previous paragraph also describes my root problem with Java's strong
typing philosophy; death, doom, and destruction conspicuously fail to
occur in Python programs, so why the hell should I listen to the
doomsayers after I've already proved them false by extensive personal
experience? No amount of prose is going to convince me otherwise, nor
quite a lot of the rest of us.)
Jul 18 '05 #28

P: n/a
Michael Tobis <mt@3planes.com> wrote:
...
I don't know that it's ever necessary to rebind, but it is, in fact,
common, and perhaps too easy. In numeric Python, avoiding rebinding
turns out to be a nontrivial skill.


Well, a for-statement is BASED on rebinding, for example. Maybe you
don't mean to address rebinding per se, but rebinding in ``separate
statements''? The ``separation'' needs to be defined carefully to make
while-loops work, too. Normally, a while-statement's header clause
would be something like:
while <expression>:
where the expression depends on the values bound to some local
variables. For the first evaluation, the variables need to be bound by
earlier statements; for the expression to eventually become false, it's
likely (unless we're talking about mutable objects) that the variables
need to be re-bound in the loop body.

For example, consider:

def f(a, b):
    x = 0
    while x < 100000:
        print x,
        x = a*x + b
    print

would you require the programmer to find out a closed-form expression
for this recurrence relation, in order to avoid having to rebind the
name 'x'? OK, in this particular case it may be OK, if you are willing
to put competence in college-level algebra as a prerequisite for using
Python. But what if the rebinding in the body of the loop was to a more
complicated expression on x? What if it was something like x=g(x)? I'm
afraid we'd end up with weird constructs such as:

def ff(g):
    x = [0]
    while x[-1] < 100000:
        print x[-1],
        x.append(g(x[-1]))
    print

if we had to avoid rebinding completely. Or, you could force the
iteration to be changed into a recursion... but then you'd better be
prepared to remove the current 'recursion limit' AND offer tail-call
optimization possibilities, at the very least.
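For illustration, here is a hedged sketch of what forcing that recursion would look like (with a made-up small bound, and returning a list rather than printing, so it stays easy to check); at the original bound of 100000 it would indeed run straight into the recursion limit, which is the point:

```python
def f_rec(a, b, x=0, bound=100):
    # Each successive x is a fresh parameter binding, never a rebinding.
    if x >= bound:
        return []
    return [x] + f_rec(a, b, a * x + b, bound)

print(f_rec(2, 1))  # [0, 1, 3, 7, 15, 31, 63]
```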

All in all, I fail to see what gains would be expected by making Python
into a single-assignment or single-binding language, even on a module by
module basis, to repay this kind of awkwardness.
Alex
Jul 18 '05 #29

P: n/a
aa**@pythoncraft.com (Aahz) writes:
It's kind of like having a guy who juggles chainsaws wearing body armor
arguing with a guy who juggles rubber chickens wearing a T-shirt about who's
in more danger." --Roy Smith, c.l.py, 2004.05.23


If it's Nethack, the guy in the T-shirt is in more danger. A _lot_
more danger.

Nick

--
# sigmask || 0.2 || 20030107 || public domain || feed this to a python
print reduce(lambda x,y:x+chr(ord(y)-1),' Ojdl!Wbshjti!=obwAcboefstobudi/psh?')
Jul 18 '05 #30

P: n/a
> All in all, I fail to see what gains would be expected by making Python
> into a single-assignment or single-binding language, even on a module by
> module basis, to repay this kind of awkwardness.


Just to be clear, if anyone was suggesting that, it wasn't me.

It would be helpful on occasion in a numarray development context to
have *specific* refererences be bound only once, and I wouldn't be
surprised if the compiler couldn't use that information to good
advantage too.

However, this subthread is about whether rebinding is completely
avoidable. Others including you have come up with better reasons than I
did that it's not.

If rebinding is normal, I think the 'epselon' bug can't be dismissed as
completely avoidable. This is something that I gather you disagree with
on the presumption that everyone who writes Python is sufficently
talented that they can use their skills to avoid getting too far into
this trap.

Since I'm very much a believer in Python as a beginner's language, that
doesn't satisfy me. "Declarations are impractical" would satisfy me,
but so far I'm not completely convinced of that.

mt

Jul 18 '05 #31

P: n/a
"Michael Tobis" <mt@3planes.com> wrote in message
news:11**********************@z14g2000cwz.googlegroups.com...
<snip>
Since I'm very much a believer in Python as a beginner's language, that
doesn't satisfy me. "Declarations are impractical" would satisfy me,
but so far I'm not completely convinced of that.


As has been pointed out, it's not a big deal for a programmer who's been
there, done that. But the original posters example is a beginners trap for
certain.

*If* Python were a "beginners language", then it would be missing one of
it's training wheels.
Thomas Bartkus
Jul 18 '05 #32

P: n/a
Thomas Bartkus wrote:
"Steve Holden" <st***@holdenweb.com> wrote in message
news:41**************@holdenweb.com...
Thomas Bartkus wrote:

"Carl Banks" <in**********@aerojockey.com> wrote in message
news:11*********************@f14g2000cwb.googlegroups.com...
<snip>

How common is it for a local variable to be bound in
more than one place within a function?
How common? It shouldn't happen at all and that was the point.
This seems a little excessive to me. Sample use case:

for something in lst:
    if type(something) != type(()):
        something = tuple(something)

Hhhmmh!
I presume you are going through the list and want to guarantee that every
item you encounter is a tuple! So if it ain't - you just re-declare
"something" to be a tuple. What was formerly a single string, integer,
whathaveyou is now a tuple *containing* a single string, integer,
whathaveyou.

Do you do it that way because you can? Or because you must?
And
If the former - is it a good idea?
OR did I just miss your codes intent completely?

I suspect you missed the intent completely.
My first inclination would be to create a new variable (type = tuple) and
accept (or typecast) each "something" into it as required. The notion that
OK, but if you do that then surely the loop looks like

for something in lst:
    somethingElse = something
    if type(somethingElse) != type(()):
        somethingElse = ...
you just morph "something" still seems rather abhorrent. It hadn't occurred
to me that iterating through a list like that means the iterater "something"
might need to constantly morph into a different type according to a lists
possibly eclectic contents.
Now I suspect I'm missing *your* point.
It might explain why the interpreter is incapable of enforcing a type. It
would forbid iterating through lists containing a mix of different types.
EXCEPT- I must note, that other languages manage to pull off exactly such a
trick with a variant type. When you need to pull off a feat such as this,
you declare a variant type where the rules are relaxed *for that situation
only* and there is no need to toss the baby out with the bathwater.

Well I have to say that the longer I program (and I've been at it nearly
forty years now) the more I am convinced that type declarations don't
actually help. I can see their value in terms of code optimization, but
there is no way that I see them as an error-detection mechanism. "You
have tried to assign a string to an integer variable" just isn't a
mistake I make a lot.

regards
Steve
--
Meet the Python developers and your c.l.py favorites March 23-25
Come to PyCon DC 2005 http://www.python.org/pycon/2005/
Steve Holden http://www.holdenweb.com/
Jul 18 '05 #33

P: n/a
"Thomas Bartkus" wrote
As has been pointed out, it's not a big deal for a programmer who's
been
there, done that. But the original posters example is a beginners trap
for
certain.

*If* Python were a "beginners language", then it would be missing one
of
it's training wheels.

If you put training wheels on your bicycle, it's not going to be any good for moderately serious cycling. The OP was clearly not new to programming, and it was a hypothetical problem.

We're all adults here (even my 12 year old!) - and we have only beginners in my house. This purported wart has never bothered me -- Python is so friendly to develop in. If this sort of code error bites my 12 year old, I'm sure he will be able to find it and feel good about fixing it. It's not the kind of code error that has you shutting down your computer at 4AM, perplexed and frustrated - those feelings are usually attributable to subtle, complex, dastardly language features (unexpected behavoirs). Just my opinion, of course.

Among the great and enlightening posts in this thread, I liked this:

QOTW?
"""We should concentrate on *real* problems, ones that exist in real code, not ones that mostly
exist in wild-eyed prose that consists of predictions of pain and death
that conspicuously fail to occur, no matter how many times they are
repeated or we are exhorted to heed them or face our doom. """

http://groups-beta.google.com/group/...8fef06830cc779
[Go PyPy!]

Eric Pederson
http://www.songzilla.blogspot.com

:::::::::::::::::::::::::::::::::::
domainNot="@something.com"
domainIs=domainNot.replace("s","z")
ePrefix="".join([chr(ord(x)+1) for x in "do"])
mailMeAt=ePrefix+domainIs
:::::::::::::::::::::::::::::::::::

Jul 18 '05 #34

P: n/a
On Mon, 31 Jan 2005 18:49:15 +0100, Alex Martelli <al*****@yahoo.com> wrote:
Michael Tobis <mt@3planes.com> wrote:
With all due respect, I think "so go away if you don't like it" is
excessive, and "so go away if you don't like it and you obviously don't
like it so definitely go away" is more so. The writer is obviously


I disagree: I believe that, if the poster really meant what he wrote, he
may well be happier using other languages and all the declarations he
cherishes, so recommending that course of action to him is quite proper
on my part.


You are wrong, for once.

That poster could have been me a few years back, when I was younger, more
stupid, more arrogant and less experienced. He'll get over it.

Also, what he described /is/ a problem. I still get bitten by it now and
then. It's just that it has even larger /benefits/ which aren't obvious at
first.

In BASIC, /bin/sh and perl without 'use strict', the lack of declarations is
only a negative thing without benefits. If you know those languages, it's
easy to jump to the conclusion that this applies to Python, too.

/Jorgen

--
// Jorgen Grahn <jgrahn@ Ph'nglui mglw'nafh Cthulhu
\X/ algonet.se> R'lyeh wgah'nagl fhtagn!
Jul 18 '05 #35

P: n/a
Hi Peter!

31 Jan 2005 at 09:09, Peter Otten wrote to All:
PO> pychecker may help you find misspelled variable names. You have to
PO> move the code into a function, though:

PO> $ cat epsilon.py
....skipped...
PO> $ pychecker epsilon.py
PO> epsilon.py:6: Local variable (epselon) not used

Well, I can change it a little to pass this check. Just add a "print epselon"
line.

I think as soon as I make such an error, I will write a special checker. It
will take code like this:

def loop():
    #var S,epsilon
    epsilon=0
    S=0
    while epsilon<10:
        S=S+epsilon
        epselon=epsilon+1
    print S

Such a checker will say "error: epselon is not declared!" if I use something
not declared. If everything is ok, it will call pychecker. Simple and tasty,
isn't it?
Of course, it may be difficult to handle fields of classes:
MyClass.epsElon=MyClass.epsilon+1
but it is solvable, I think. What do you think, is it a good idea?
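As a rough illustration of the idea (a toy sketch, not Alexander's actual proposal: it uses the standard `ast` module, and takes the declared names as an explicit set rather than parsing a `#var` comment):

```python
import ast

def undeclared_assignments(source, declared):
    """Report (name, lineno) for every assignment to a name not in `declared`."""
    tree = ast.parse(source)
    problems = []
    for node in ast.walk(tree):
        # Assignment targets show up as Name nodes with a Store context.
        if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
            if node.id not in declared:
                problems.append((node.id, node.lineno))
    return problems

code = """
epsilon = 0
S = 0
while epsilon < 10:
    S = S + epsilon
    epselon = epsilon + 1
"""
print(undeclared_assignments(code, {"S", "epsilon"}))
# [('epselon', 6)]
```

A real checker would also have to cope with `global`, function arguments, imports, and attribute assignments, but the core check is this small.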
Alexander, za**@bk.ru
Jul 18 '05 #36

P: n/a
Hi, Alex!

31 jan 2005 at 13:46, Alex Martelli wrote:

(sorry for the delay, my mail client didn't highlight your answer for me)

AM> Since the lack of declarations is such a crucial design choice for
AM> Python, then, given that you're convinced it's a very bad thing, I
AM> suggest you give up Python in favor of other languages that give you
AM> what you crave.
Well, I like Python. But, like every language I know, it has some bad sides
which I don't like. One of them in Python is the lack of variable declarations;
another (this problem is common with C/C++) is:
===
print 1/2
0
===
(I understand why it is so, but I don't like it anyway. Such behaviour can also
cause some hard-to-find bugs)

AM> issue for you. Therefore, using Python, for you, would mean you'd be
AM> fighting the language and detesting its most fundamental design
AM> choice: and why should you do that? There are zillions of languages
AM> -- use another one.
Thank you for advice:)

Pascal, or special syntax in C. It can cause very ugly errors, like this:
epsilon=0
S=0
while epsilon<10:
    S=S+epsilon
    epselon=epsilon+1
print S
It will print zero, and it is not easy to find such a bug!

AM> Actually, this while loop never terminates and never prints anything,
Oh, I don't find it:)
AM> so that's gonna be pretty hard to ignore;-).
AM> But, assume the code is
AM> slightly changed so that the loop does terminate. In that case...:

AM> It's absolutely trivial to find this bug, if you write even the
AM> tiniest and most trivial kinds of unit tests. If you don't even have
AM> enough unit tests to make it trivial to find this bug, I shudder to
AM> think at the quality of the programs you code.

Thank you for advice again; I already use different tests in my work and I
found them useful. But! I want to use Python for prototyping. I want to write
my algorithms in it, just to see that they do roughly what they must do. Next,
I want to play with them to understand their properties and limitations.
If such a program sometimes falls over, or is not very fast, or sometimes shows
wrong results, it's not a big problem. So, I use Python as a tool for prototyping.

After I debug the algorithm and understand it, I can rewrite it in C++ (if I
need), carefully, paying attention to speed, side effects, memory requirements,
and so on. With full testing, of course.

Hence, from a "language for prototyping" I need the following features:

1. I want to think about the algorithm (!!!), and the language must help me do
it. It must take care of boring things like memory management, garbage collection,
strict type inference, my typos. It must provide easy-to-use packages for many
of my low-level needs. And so on.

2. goto 1:)

Python is really very good for such demands. Except one: it forces me to type
variable names carefully:) In other words, it diverts my attention from the
algorithm to typing.

AM> Even just focusing on
AM> typos,
AM> think of how many other typos you could have, besides the misspelling
AM> of 'epsilon', that unit tests would catch trivially AND would be
AM> caught in no other way whatsoever -- there might be a <= where you
AM> meant a <, a 1.0 where you meant 10, a - where you meant a +, etc,
AM> etc.
AM> You can't live without unit tests. And once you have unit tests, the
AM> added value of declarations is tiny, and their cost remains.

Fine! Let interpreter never show us errors like division by zero, syntax
errors, and so on. If file not found, library don't need to say it. Just skip
it!!! Because every, even simple, test will find such bugs. Once you have unit
tests, the added value of <anything> is tiny, and their cost remains.

:)

Or, maybe, we will ask the interpreter to find and prevent as many errors as it
can?

And, one more question: do you think code like this:

var S=0
var eps

for eps in xrange(10):
    S=S+ups

is very bad? Please explain your answer:)
AM> Python has no declarations whatsoever. If you prefer Visual Basic, I
AM> strongly suggest you use Visual Basic, rather than pining for Visual
AM> Basic features in Python. If and when your programming practices ever
AM> grow to include extensive unit-testing and other aspects of agile
AM> programing, THEN you will be best advised to have a second look at
AM> Python, and in such a case you will probably find Python's strengths,
AM> including the lack of declarations, quite compelling.

Uh! And you! And you!... And you must never even come close to any languages
with variable declaration! Even to Visual Basic! :)

AM> brain". I find it's true: Python gets out of my way and let me solve
AM> problems much faster, because it fits my brain, rather than changing
AM> the way I think.

I'm agree with you.

AM> If Python doesn't fit YOUR brain, for example because your brain is
AM> ossified around a craving for the declaration of variables, then,
AM> unless you're specifically studying a new language just for personal
AM> growth purposes, I think you might well be better off with a language
AM> that DOES, at least until and unless your brain changes by other
AM> means.

Thank you for explanation of your opinion.

Alexander, za**@bk.ru
Jul 18 '05 #37

P: n/a
Alexander Zatvornitskiy
<Al*********************@p131.f3.n5025.z2.fidonet.org> wrote:
Hi, Alex!

31 jan 2005 at 13:46, Alex Martelli wrote:

(sorry for the delay,my mail client don't highlight me your answer)

AM> Since the lack of declarations is such a crucial design choice for
AM> Python, then, given that you're convinced it's a very bad thing, I
AM> suggest you give up Python in favor of other languages that give you
AM> what you crave.
Well, I like Python. But, as every language I know, it have some bad sides
which I don't like. One of them in Python is lack of variable declarations,
another (this problem is common with C/C++) is:
===
print 1/2
0
===
(I understand why it is so, but I don't like it anyway. Such behaviour
also can cause some hard-to-find-bugs)
You're conflating a fundamental, crucial language design choice, with a
rather accidental detail that's already acknowledged to be suboptimal
and is already being fixed (taking years to get fixed, of course,
because Python is always very careful to keep backwards compatibility).

Run Python with -Qnew to get the division behavior you probably want, or
-Qwarn to just get a warning for each use of integer division so those
hard to find bugs become trivially easy to find. Or import from the
future, etc, etc.
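A quick demonstration of the two division behaviors (the future import is the opt-in under Python 2 and the default in later Pythons):

```python
from __future__ import division  # opt-in under Python 2; already the default in Python 3

print(1 / 2)    # true division  -> 0.5
print(1 // 2)   # floor division -> 0
```

With `-Qwarn`, each plain `/` between integers additionally emits a DeprecationWarning, so the "hard to find" cases announce themselves.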

The fact that in Python there are ONLY statements, NO declarations, is a
completely different LEVEL of issue -- a totally deliberate design
choice taken in full awareness of all of its implications. I do not see
how you could be happy using Python if you think it went wrong in such
absolutely crucial design choices.

AM> issue for you. Therefore, using Python, for you, would mean you'd be
AM> fighting the language and detesting its most fundamental design
AM> choice: and why should you do that? There are zillions of languages
AM> -- use another one.
Thank you for advice:)
You're welcome.
>> Pascal, or special syntax in C. It can cause very ugly errors, like this:
>> epsilon=0
>> S=0
>> while epsilon<10:
>>     S=S+epsilon
>>     epselon=epsilon+1
>> print S
>> It will print zero, and it is not easy to find such a bug!

AM> Actually, this while loop never terminates and never prints anything,
Oh, I don't find it:)


Hit control-C (or control-Break or whatever other key combination
interrupts a program on your machine) when the program is just hanging
there forever doing nothing, and Python will offer a traceback showing
exactly where the program was stuck.

In any case, you assertion that "it will print zero" is false. You
either made it without any checking, or chose to deliberately lie (in a
rather stupid way, because it's such an easy lie to recognize as such).

Fine! Let interpreter never show us errors like division by zero, syntax
errors, and so on. If file not found, library don't need to say it. Just skip
it!!! Because every, even simple, test will find such bugs. Once you have unit
tests, the added value of <anything> is tiny, and their cost remains.
Another false assertion, and a particularly ill-considered one in ALL
respects. Presence and absence of files, for example, is an
environmental issue, notoriously hard to verify precisely with unit
tests. Therefore, asserting that "every, even simple, test will find"
bugs connected with program behavior when a file is missing shows either
that you're totally ignorant about unit tests (and yet so arrogant to
not let your ignorance stop you from making false unqualified
assertions), or shamelessly lying.

Moreover, there IS no substantial cost connected with having the library
raise an exception as the way to point out that a file is missing, for
example. It's a vastly superior approach to the old idea of "returning
error codes" and forcing the programmer to check for those at every
step. If the alternative you propose is not to offer ANY indication of
whether a file is missing or present, then the cost of THAT alternative
would most obviously be grievous -- essentially making it impossible to
write correct programs, or forcing huge redundancy if the check for file
presence must always be performed before attempting I/O.
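A sketch of the exception-based approach (the filename is made up):

```python
def read_or_report(path):
    # The library raises an exception for a missing file: the caller
    # cannot silently proceed on garbage, and need not pre-check existence.
    try:
        with open(path) as f:
            return f.read()
    except IOError as e:  # FileNotFoundError is a subclass of this in Python 3
        return "could not read %s: %s" % (path, e.strerror)

print(read_or_report("no_such_file.txt"))
```

The error-code alternative would require every caller to remember to test a return value after every I/O call; forgetting once means computing with garbage.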

In brief: you're not just wrong, you're so totally, incredibly, utterly
and irredeemably wrong that it's not even funny.

And, one more question: do you think code like this:

var S=0
var eps

for eps in xrange(10):
    S=S+ups

is very bad? Please explain your answer:)
Yes, the many wasted pixels in those idiotic 'var ' prefixes are a total
and utter waste of programmer time. Mandated redundancy, the very
opposite of the spirit of Python.
Uh! And you! And you!... And you must never even come close to any languages
with variable declaration! Even to Visual Basic! :)


Wrong again. I've made a good living for years as a C++ guru, I still
cover the role of C++ MVP for the Brainbench company, I'm (obviously)
totally fluent in C (otherwise I could hardly contribute to the
development of Python's C-coded infrastructure, now could I?), and as it
happens I have a decent command (a bit rusty for lack of recent use) of
dozens of other languages, including several Basic dialects and Visual
Basic in particular.

It should take you about 20 seconds with Google to find this out about
me, you know? OK, 30 seconds if you're on a slow dialup modem line.

So, I guess you just *LIKE* being utterly and monumentally wrong, since
it would be so easy to avoid at least some of the bloopers you instead
prefer to keep making.

I *CHOOSE* Python, exactly because I have vast programming experience in
such a huge variety of languages, across all kinds of application areas,
methodologies, and sizes and levels of programming teams. It's not
``perfect'', of course, being a human artifact, but it does implement
its main design ideas consistently and brilliantly, and gets the
inevitable compromises just about right.
Alex
Jul 18 '05 #38

P: n/a
In article <MS*****************************@fidonet.org>,
Al*********************@p131.f3.n5025.z2.fidonet.org (Alexander
Zatvornitskiy) wrote:
And, one more question: do you think code like this:

var S=0
var eps

for eps in xrange(10):
    S=S+ups

is very bad? Please explain your answer:)


Let me answer that by way of counter-example.

Yesterday I was writing a little perl script. I always use "use strict" in
perl, which forces me to declare my variables. Unfortunately, my code was
giving me the wrong answer, even though the interpreter wasn't giving me
any error messages.

After a while of head-scratching, it turned out that I had written "$sum{x}
+= $y" instead of "$sum{$x} += $y". The need to declare variables didn't
find the problem. I *still* needed to test my work. Given that I needed
to write tests anyway, the crutch of having to declare my variables really
didn't do me any good.
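The same slip carries over to Python directly, where variable declarations would not help either (a made-up example):

```python
sums = {}
x, y = "total", 3
sums["x"] = sums.get("x", 0) + y  # meant sums[x]; typed the literal key "x"
print(sums)  # {'x': 3} -- the value silently lands in the wrong bucket
```

No declaration checker objects, because every name here is "declared"; only a test on the resulting values catches it.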
Jul 18 '05 #39

P: n/a
Alexander Zatvornitskiy wrote:
Hi Peter!

31 January 2005 at 09:09, Peter Otten wrote to All:
PO> pychecker may help you find misspelled variable names. You have to
PO> move the code into a function, though:

PO> $ cat epsilon.py
...skipped...
PO> $ pychecker epsilon.py
PO> epsilon.py:6: Local variable (epselon) not used

Well, I can change it a little to pass this check. Just add "print
epselon" line.
You are now on a slippery slope. I'd rather think of ways to write my code
in a way for it to succeed or at least fail in an obvious way. I don't
consider a scenario likely where you both misspell a name nearly as often
as you write it correctly, and do that in a situation where the program
enters an infinite loop instead of just complaining with an exception,
which is by far the most likely reaction to a misspelt name.
I think if as soon as I will make such error, I will write special
checker. It will take code like this:

def loop():
    #var S,epsilon
    epsilon=0
    S=0
    while epsilon<10:
        S=S+epsilon
        epselon=epsilon+1
    print S
Code not written is always errorfree, and in that spirit I'd rather strive
to write the function more concisely as

def loop2():
    s = 0
    for delta in range(10):
        s += delta
    print s

This illustrates another problem with your approach: would you have to
declare globals/builtins like range(), too?
Such checker will say "error:epselon is not declared!" if I will use
something not declared. If everything is ok, it will call pychecker.
Simple and tasty, isn't it?
That your program compiles in a somewhat stricter environment doesn't mean
that it works correctly.
Of course, it may be difficult to handle fields of classes:
MyClass.epsElon=MyClass.epsilon+1
MyClass.epsilon += 1

reduces the risk of a spelling error by 50 percent. I doubt that variable
declarations reduce the likelihood of erroneous infinite loops by even 5
percent.
but it is solvable, I think. What do you think, is it a good idea?


I suggested pychecker more as a psychological bridge while you gain trust in
the Python way of ensuring reliable programs, i. e. writing small and
readable functions/classes that do one thing well and can easily be tested.
Administrative overhead -- as well as excessive comments -- only serve to
bury what is actually going on.

I guess that means no, not a good idea.

On the other hand, taking all names used in a function and looking for
similar ones, e. g. by calculating the Levenshtein distance, might be
worthwhile...
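A minimal sketch of that idea (assuming the set of names used in a function has already been collected):

```python
def levenshtein(a, b):
    # Classic dynamic-programming edit distance between two strings.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

names = ["epsilon", "epselon", "S", "total"]
suspects = [(x, y) for x in names for y in names
            if x < y and levenshtein(x, y) == 1]
print(suspects)  # [('epselon', 'epsilon')]
```

Flagging pairs of names at edit distance 1 within one function would catch exactly the epsilon/epselon class of typo without any declarations at all.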

Peter
Jul 18 '05 #40

P: n/a
Hi, Alex!

05 feb 2005 at 12:52, Alex Martelli wrote:
declarations, another (this problem is common with C/C++) is:
===
print 1/2
0
===
(I understand why it is so, but I don't like it anyway. Such behaviour also
can cause some hard-to-find bugs)
AM> You're conflating a fundamental, crucial language design choice, with
AM> a rather accidental detail
It's not my problem:) I just wrote about two things that I don't like.
AM> Run Python with -Qnew to get the division behavior you probably want,
AM> or -Qwarn to just get a warning for each use of integer division so
AM> those hard to find bugs become trivially easy to find.
Thank you. It will help me.

AM> The fact that in Python there are ONLY statements, NO declarations,
===
def qq():
    global z
    z=5
===
What is "global"? A statement? Ok, I feel the lack of a "var" statement:)

AM> is a completely different LEVEL of issue -- a totally deliberate
AM> design choice taken in full awareness of all of its implications. I
AM> do not see how you could be happy using Python if you think it went
AM> wrong in such absolutely crucial design choices.

Ok, I understand your position.
>> errors, like this:
>> epsilon=0
>> S=0
>> while epsilon<10:
>>     S=S+epsilon
>>     epselon=epsilon+1
>> print S
>> It will print zero, and it is not easy to find such a bug!

AM> Actually, this while loop never terminates and never prints
AM> anything,
Oh, I don't find it:)

AM> Hit control-C (or control-Break or whatever other key combination
AM> interrupts a program on your machine) when the program is just
AM> hanging there forever doing nothing, and Python will offer a
AM> traceback showing exactly where the program was stuck.
AM> In any case, you assertion that "it will print zero" is false. You
AM> either made it without any checking, or chose to deliberately lie (in
AM> a rather stupid way, because it's such an easy lie to recognize as
AM> such).
Sorry, while saying "I don't find it" I meant "I didn't take it into account at
the time I wrote the original message. Now I find it, and it makes me smile."
As you understand, I'm not very good in English.
Fine! Let interpreter never show us errors like division by zero,
syntax errors, and so on. If file not found, library don't need to
say it. Just skip it!!! Because every, even simple, test will find
such bugs. Once you have unit tests, the added value of <anything>
is tiny, and their cost remains.

AM> Another false assertion, and a particularly ill-considered one in ALL
AM> respects. Presence and absence of files, for example, is an
AM> environmental issue, notoriously hard to verify precisely with unit
AM> tests. Therefore, asserting that "every, even simple, test will find"
AM> bugs connected with program behavior when a file is missing shows
AM> either that you're totally ignorant about unit tests (and yet so
AM> arrogant to not let your ignorance stop you from making false
AM> unqualified assertions), or shamelessly lying.

Here, you take one detail and bravely fight with it. Just try to understand the
meaning of my sentence as a whole. It will help:)

AM> Moreover, there IS no substantial cost connected with having the
AM> library raise an exception as the way to point out that a file is
AM> missing, for example. It's a vastly superior approach to the old idea
AM> of "returning error codes" and forcing the programmer to check for
AM> those at every step. If the alternative you propose is not to offer
AM> ANY indication of whether a file is missing or present, then the cost
AM> of THAT alternative would most obviously be grievous -- essentially
AM> making it impossible to write correct programs, or forcing huge
AM> redundancy if the check for file presence must always be performed
AM> before attempting I/O.
AM> In brief: you're not just wrong, you're so totally, incredibly,
AM> utterly and irredeemably wrong that it's not even funny.
Hey, take it easy! Relax, reread that piece of text. It was written with a smile
on my lips. Here it is for your convenience:
========
AM> And once you have unit tests, the added value of declarations is
AM> tiny, and their cost remains.

Fine! Let interpreter never show us errors like division by zero, syntax
errors, and so on. If file not found, library don't need to say it. Just skip
it!!! Because every, even simple, test will find such bugs. Once you have unit
tests, the added value of <anything> is tiny, and their cost remains.
========

Again, skip small details and take a look at the problem "in general". Here is,
again, the main idea:
========
Or, maybe, we will ask interpreter to find and prevent as many errors as he
can?
========

You wrote about the "substantial cost" of var declarations. Yes, you are right. But
think about the cost of the lack of var declarations. Compare the time a programmer
will waste searching for the cause of a bug introduced by such a typo, plus the time
the programmer will waste trying to remember the exact variable name.

Compare it with the time spent looking at the message:
===
Traceback (most recent call last):
File "<pyshell#16>", line 5, in -toplevel-
epselon
NameError: name 'epselon' is not defined, in strict mode
===
and fixing it, plus the time to type three letters (or fewer).

And, one more question: do you think code like this:

var S=0
var eps
for eps in xrange(10):
    S=S+ups

is very bad? Please explain your answer :)

AM> Yes, the many wasted pixels in those idiotic 'var ' prefixes are a
AM> total and utter waste of programmer time.

Hmmm, that code is not so pretty. Let's change it a little bit:

var S,eps
S=0
for eps in xrange(10):
    S=S+ups

I think it looks fine.

AM> Mandated redundancy, the very opposite of the spirit of Python.

Uh! And you! And you!... And you must never even come close to any
language with variable declarations! Even to Visual Basic! :)

AM> Wrong again. I've made a good living for years as a C++ guru, I
AM> still cover the role of C++ MVP for the Brainbench company, I'm
AM> (obviously) totally fluent in C (otherwise I could hardly contribute
AM> to the development of Python's C-coded infrastructure, now could I?),
AM> and as it happens I have a decent command (a bit rusty for lack of
AM> recent use) of dozens of other languages, including several Basic
AM> dialects and Visual Basic in particular.
AM> It should take you about 20 seconds with Google to find this out
AM> about me, you know? OK, 30 seconds if you're on a slow dialup modem
AM> line.
OK, you are a really cool guy. Also, I appreciate your contribution to Python's
C infrastructure. I am not as cool as you, but Python is not my first language
either :)

Alexander, za**@bk.ru
---url: alex-zatv.narod.ru
Jul 18 '05 #41

Alexander Zatvornitskiy
<Al*********************@p131.f3.n5025.z2.fidonet.org> wrote:
...
AM> The fact that in Python there are ONLY statements, NO declarations,
===
def qq():
global z
z=5
===
What is "global"? A statement? OK, I feel the lack of a "var" statement :)
'global' is an ugly wart, to all intents and purposes working "as if" it
was a declaration. If I had to vote about the one worst formal defect
of Python, it would surely be 'global'.

Fortunately, it's reasonably easy to avoid the ugliness, by avoiding
rebinding (within functions) global variables, which tends to be easy.
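Concretely, the distinction is between rebinding a module-level name (which needs the 'global' wart) and merely mutating a module-level object (which doesn't) -- a minimal sketch:

```python
counter = 0     # module-level name
registry = []   # module-level mutable object

def bump():
    # Rebinding the module-level NAME requires the 'global' declaration.
    global counter
    counter += 1

def record(item):
    # Merely mutating a module-level OBJECT needs no 'global' at all.
    registry.append(item)

bump()
record("x")
```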

What you keep demanding is a way to inject far worse ugliness, and in a
context in which the typical behavior of pointy-haired bosses will be to
make it unavoidable for many of the people who work with Python. I am
strongly convinced that, if you had your wish, the total amount of
happiness in the world would be quite substantially diminished, and I
will do anything I can to stop it from happening.

>> Fine! Let the interpreter never show us errors like division by zero,
>> syntax errors, and so on. If a file is not found, the library doesn't need
>> to say so. Just skip it!!! Because every, even simple, test will find
>> such bugs. Once you have unit tests, the added value of <anything>
>> is tiny, and their cost remains.

AM> Another false assertion, and a particularly ill-considered one in ALL
AM> respects. Presence and absence of files, for example, is an

... Here, you take one detail and bravely fight with it. Just try to understand the
meaning of my sentence as a whole. It will help :)
I tear ONE of your examples to pieces in gory detail, because it's not
worth doing the same over and over again to every single piece of crap
you filled that sentence with -- very similar detailed arguments show
how utterly inane the whole assertion is.

There IS no meaning in your (several) above-quoted sentences that it
can help anybody to "try to understand": it's simply an insanely bad
attempt at totally invalid parallels.
AM> In brief: you're not just wrong, you're so totally, incredibly,
AM> utterly and irredeemably wrong that it's not even funny.
Hey, take it easy! Relax, reread that piece of text. It was written with a smile
on my lips. Here it is for your convenience:
Do yourself a favor: don't even _try_ to be funny in a language you have
so much trouble with. Your communication in English is badly enough
impaired even without such lame attempts at humor: don't make bad things
even worse -- the result is NOT funny, anyway, just totally garbled.

I'm not a native English speaker, either, so I keep a careful watch on
this sort of thing, even though my English would appear to be a bit
better than yours.

Again, skip small details and take a look on the problem "in general". Here is,

There IS no ``problem "in general"'': Python does a pretty good job of
diagnosing as many errors as can be diagnosed ***without demanding
absurdities such as redundancy on the programmer's part***. Period.
again, the main idea:
========
Or, maybe, we will ask interpreter to find and prevent as many errors as he
can?
To show how absurd that would be: why not force every line of the
program to be written twice, then -- this would help diagnose typos,
because the interpreter could immediately mark as errors any case in
which the two copies aren't equal. Heck, why stop at TWICE -- even MORE
errors will be caught if every line has to be written TEN times. Or a
million. Why not? *AS MANY ERRORS AS [it] CAN* is an *ABSURD*
objective, if you don't qualify it with *WHILE AVOIDING ANY ENFORCED
REDUNDANCY* introduced solely for that purpose.

As soon as you see that such redundancy is a horror to avoid, you will
see that Python's design is essentially correct as it is.

You wrote about the "substantial cost" of var declarations. Yes, you are
right. But think about the cost of the lack of var declarations. Compare the time
a programmer will waste searching for the cause of a bug introduced by such
a typo, plus the time the programmer will waste trying to remember the exact
variable name.
I've been programming essentially full-time in Python for about three
years, plus a few more years before then when I used Python as much as I
could, even though my salary was mostly earned with C++, Visual Basic,
Java, perl, and so on. My REAL LIFE EXPERIENCE programming in Python
tells me that the time I've "wasted on search" etc. due to the lack of
variable declaration is ***FUNDAMENTALLY NOTHING AT ALL***. Other
hundreds of thousands of man-hours of similar Python programming
experience on the part of hundreds of other programmers essentially
confirm these findings.

Your, what, TENS?, of man-hours spent programming in Python tell you
otherwise. Fine, then *USE ANOTHER LANGUAGE* and be happy, and let US
be just as happy by continuing to use Python -- almost all languages do
things the way you want, so ***leave alone*** the few happy oases such
as Python and Ruby where programmers can happily avoid the idiotic
redundancy of variable declarations, and not even PHBs can impose
otherwise.

Compare it with the time spent looking at the message:
===
Traceback (most recent call last):
File "<pyshell#16>", line 5, in -toplevel-
epselon
NameError: name 'epselon' is not defined, in strict mode
===
and fixing it, plus time on typing three letters (or less).


Like experience shows in all cases of such idiotic redundancies, the
real waste of time comes in the BAZILLION cases where your program WOULD
be just fine -- except you missed one redundancy, so you have to go and
put it in to make the gods of redundancy happy again. That happens with
VASTLY higher frequency than the cases where the enforced redundancy
saves you a comparable amount of time by catching some error earlier.

Plus, the FALSE confidence coming from redundancy works against you by
kidding you into believing that a BAZILLION other typos can't still be
lurking in your code, just because you've eliminated one TINY subset of
such typos (typos in names of variables that happen to leave the mangled
names syntactically valid BUT different from any other variable) -- and
*ONLY* that tiny subset of such typos which happened on the left of a
plain '=' (since all others, happening on the RIGHT of an '=' or on the
left of an _augmented_ '=', were already caught), and ONLY regarding
barenames (such typos on any but the rightmost component of compound
names were already caught intrinsically, and catching those on the
rightmost component is trivially easier than introducing a {YECCCCHH}
'vars' as you so stubbornly insist)...
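Spelled out as runnable code, those cases look like this (a sketch with made-up function names; only the barename typo on the left of a plain '=' slips through):

```python
def rhs_typo():
    # A typo on the RIGHT of '=' is already caught at run time.
    total = 0
    return total + totl           # NameError: 'totl' is not defined

def augmented_typo():
    # A typo on the left of an _augmented_ '=' is caught too.
    totl += 1                     # UnboundLocalError

def lhs_typo():
    # The one uncaught case: a barename typo on the left of a plain '='
    # silently creates a brand-new local.
    total = 0
    totl = total + 1
    return total                  # still 0 -- no error raised

caught = []
try:
    rhs_typo()
except NameError:
    caught.append("rhs")
try:
    augmented_typo()
except UnboundLocalError:
    caught.append("aug")
result = lhs_typo()
```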

Basically you're focusing on maybe ONE IN A MILLION of the errors you
could make and want to pervert the whole foundation of Python, and
seriously hamper the productivity of hundreds of thousands of Python
programmers in every normal case!, to save maybe two minutes in such a
one-in-a-million case.

I consider this one of the worst ideas to have been proposed on this
newsgroup over the years, which _IS_ saying something. Oh, you're not
the only one, for sure -- there must have been a dozen before you, at
least. Fortunately, even though almost each and every one of them has
wasted more of everybody's time with such ideas, than even their
scare-tactics claim of ``wastes of time'' due to lack of declarations
could account for, Python is still intact. A few of the stubborn lovers
of declarations tried doing without, and, to their astonishment, found
out that everybody else, with years of Python experience, was right, and
they, without ANY such experience, were wrong (just incredible, eh?!);
others have gone away to find their bliss in Perl, PHP, or whatever --
good riddance, and don't let the door slam behind you as you go, please.
Alex
Jul 18 '05 #42

Alexander Zatvornitskiy wrote:
You wrote about the "substantial cost" of var declarations. Yes, you are right. But
think about the cost of the lack of var declarations. Compare the time a programmer
will waste searching for the cause of a bug introduced by such a typo, plus the time
the programmer will waste trying to remember the exact variable name.


This is a problem better solved through a decent editor with code completion
than through redundant variable declarations, which waste *far* more programmer
time than typos do.

The *only* time the typo is a potential problem is if a variable name gets
rebound to something different. This is because, in Python, the name binding
operation ('=') *is* the declaration of the variable.

Rebinding a name often raises the question, "why are you using the same name for
two different things in the one function?" It's not like using a different name
for the second thing will cost much in terms of program size (unless you have
some freakishly long functions) and it surely does little for readability.
(Granted, iteration can be an exception, but even then the name generally only
gets bound twice at most - once before the loop, and once in the loop body)

Consider all of the following cases which are detected while still preserving
the convenience of 'name binding is declaration':
Py> def f():
.... x = y
....
Py> f()
Traceback (most recent call last):
File "<stdin>", line 1, in ?
File "<stdin>", line 2, in f
NameError: global name 'y' is not defined
Py> def f():
.... x += 1
....
Py> f()
Traceback (most recent call last):
File "<stdin>", line 1, in ?
File "<stdin>", line 2, in f
UnboundLocalError: local variable 'x' referenced before assignment
Py> def f():
.... oops = 1
.... print ooops
....
Py> f()
Traceback (most recent call last):
File "<stdin>", line 1, in ?
File "<stdin>", line 3, in f
NameError: global name 'ooops' is not defined
Py> def f():
.... class C: pass
.... c = C()
.... print c.x
....
Py> f()
Traceback (most recent call last):
File "<stdin>", line 1, in ?
File "<stdin>", line 4, in f
AttributeError: C instance has no attribute 'x'

Now, if someone were to suggest a *rebinding* operator, that worked just like
regular name binding, but expected the name to be already bound, that would be
an entirely different kettle of fish - you keep all the benefits of the current
system, but gain the typo-checking that the rest of the augmented assignment
operators benefit from.

Hell, '@' just acquired rebinding semantics through its use in function
decorator syntax, so how does it look?:

S=0
for eps in xrange(10):
    S @= S + ups

Meh. At symbols are still ugly. A full stop might work better, since the
semantics aren't very different from a reader's point of view:

S=0
for eps in xrange(10):
    S .= S + ups

Anyway, the exact syntax isn't as important as the concept :)
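Until such an operator exists, the concept can be emulated for module-level names with a small helper (purely a sketch; 'rebind' is a made-up name, and it only handles globals):

```python
import sys

def rebind(name, value):
    # Emulate a 'rebinding assignment': insist that the module-level
    # name is already bound before binding it to the new value.
    ns = sys._getframe(1).f_globals
    if name not in ns:
        raise NameError("cannot rebind %r: not previously bound" % name)
    ns[name] = value

S = 0
rebind("S", S + 1)        # fine: 'S' already exists
try:
    rebind("ups", 3)      # the 'ups' typo from the example is caught
except NameError:
    typo_caught = True
```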

Cheers,
Nick.

--
Nick Coghlan | nc******@email.com | Brisbane, Australia
---------------------------------------------------------------
http://boredomandlaziness.skystorm.net
Jul 18 '05 #43

Alex Martelli wrote:
'global' is an ugly wart, to all intents and purposes working "as if" it
was a declaration. If I had to vote about the one worst formal defect
of Python, it would surely be 'global'.

Fortunately, it's reasonably easy to avoid the ugliness, by avoiding
rebinding (within functions) global variables, which tends to be easy.
Hear, hear! And if I could write "gbls = namespace(globals())" in order to deal
with those rare cases where I *do* need to rebind globals, all uses of the
keyword could be handled nicely, while entirely eliminating the need for the
keyword itself.
*ONLY* that tiny subset of such typos which happened on the left of a
plain '=' (since all others, happening on the RIGHT of an '=' or on the
left of an _augmented_ '=', were already caught), and ONLY regarding
barenames (such typos on any but the rightmost component of compound
names were already caught intrinsically, and catching those on the
rightmost component is trivially easier than introducing a {YECCCCHH}
'vars' as you so stubbornly insist)...


Would you be as violently opposed to a 'rebinding' augmented assignment operator?

Since bare assignment statements essentially serve the purpose of variable
declarations, I sometimes *would* like a way to say 'bind this existing name to
something different'. A rebinding operation would provide a way to make that
intention explicit, without cluttering the language with useless declarations.

In addition to detecting typos in local variable names, it would *also* address
the problem of detecting typos in the right-most name in a compound name (e.g.
making a habit of always using the rebinding operator when modifying member
variables outside of __init__ would make it easier to avoid inadvertently
creating a new instance variable instead of modifying an existing one)

With a rebinding operator available, the only typos left to slip through the net
are those which match an existing visible name and those where the programmer
has explicitly requested an unconditional name binding by using '=' and then
made a typo on the left hand side.

Cheers,
Nick.
Did I mention the possible incidental benefit of reducing the whinging about the
lack of variable declarations?

--
Nick Coghlan | nc******@email.com | Brisbane, Australia
---------------------------------------------------------------
http://boredomandlaziness.skystorm.net
Jul 18 '05 #44

Nick Coghlan <nc******@iinet.net.au> wrote:
Alex Martelli wrote:
'global' is an ugly wart, to all intents and purposes working "as if" it
was a declaration. If I had to vote about the one worst formal defect
of Python, it would surely be 'global'.

Fortunately, it's reasonably easy to avoid the ugliness, by avoiding
rebinding (within functions) global variables, which tends to be easy.
Hear, hear! And if I could write "gbls = namespace(globals())" in order to
deal with those rare cases where I *do* need to rebind globals, all uses
of the keyword could be handled nicely, while entirely eliminating the
need for the keyword itself.


I entirely agree with you on this.

Would you be as violently opposed to a 'rebinding' augmented assignment operator?

Not at all. The only downside I can see, and it seems a minor one to
me, is having two "obvious ways" to re-bind a name, since '=' would keep
working for the purpose. But making it explicit that one IS rebinding a
name rather than binding it anew could sometimes make certain code more
readable, I think; quite apart from possible advantages in catching
typos (which may be OK, but appear minor to me), the pure gain in
readability might make it worth it. Call my stance a +0.
the problem of detecting typos in the right-most name in a compound name (e.g.

It's not clear to me what semantics, exactly, x.y := z would be defined
to have (assuming := is the syntax sugar for ``rebinding''). Perhaps,
by analogy with every other augmented operator, it should be equivalent
to:

_temp = x.y
x.y = type(_temp).__irebind__(_temp, z)

This makes it crystal-clear what happens when type(x).y is a descriptor
with/without __set__ and __get__, when type(x) defines __getattribute__
or __setattr__, etc, etc. Giving type object an __irebind__ which just
returns the second operand would complete the semantics. Of course,
doing it this way would open the issue of types overriding __irebind__,
and it's not clear to me that this is intended, or desirable. So, maybe
some other semantics should be the definition. But since "rebinding" is
not a primitive, the semantics do need to be somehow defined in terms of
elementary operations of getting and setting (of attributes, items, &c).
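Desugared by hand, those semantics (and the overriding issue they raise) look like this -- purely a sketch, since no '__irebind__' hook actually exists:

```python
def rebind(obj, name, value):
    # Hand-desugared 'obj.name := value', per the two-line expansion above.
    _temp = getattr(obj, name)
    setattr(obj, name, type(_temp).__irebind__(_temp, value))

class Plain(object):
    # Default behaviour: __irebind__ simply returns the new operand.
    def __irebind__(self, new):
        return new

class Sticky(Plain):
    # A type overriding __irebind__ to refuse replacement -- the sort of
    # "attractive nuisance" cleverness such an override would invite.
    def __irebind__(self, new):
        return self

class Holder(object):
    pass

h = Holder()
a, b, s = Plain(), Plain(), Sticky()
h.x = a
h.y = s
rebind(h, "x", b)   # default: plain replacement
rebind(h, "y", b)   # override: the old object stays put
```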
Did I mention the possible incidental benefit of reducing the whinging
about the lack of variable declarations?


It might, with some luck;-). Probably worth a PEP, although I suspect
it will not be considered before 3.0.
Alex
Jul 18 '05 #45

P: n/a
On Sat, 5 Feb 2005 17:00:15 +0100, al*****@yahoo.com (Alex Martelli)
wrote:

I consider this one of the worst ideas to have been proposed on this
newsgroup over the years, which _IS_ saying something.


I would disagree, but only to the extent that nothing that is only a
request for an option toggle should qualify for this award. For
anyone not interested in its effects, it's business as usual.

They can even reward themselves with the knowledge that among the time
and typing they did not waste was in asking for the toggle to be on.

It is also true that implementation might be a diversion of efforts
from the implementation of features solving more generally recognized
problems. There *are* a lot of bad things to say about the idea.

But I think it worth mentioning that the OP is requesting only a
toggle.

Why is it worth mentioning?

Because maybe one needs to save the heaviest rhetoric for when this is
not the case.

There was a simple answer to those who hate decorators - don't use
them. I won't.

And the most controversial suggestions about optional static typing
all preserve their sanity by remaining suggestions about something
defined as optional.

Generally speaking, it might then be said to be good community policy
to hold one's fire at least a bit if a feature request, or accepted
PEP, will not impact one's own code writing beyond the extent one
might choose it to.

Though I do think there *should* be a worst feature request contest
at PyCon,

What is with this white space business, anyway?

;)

Art
Jul 18 '05 #46

Arthur <aj******@optonline.com> wrote:
On Sat, 5 Feb 2005 17:00:15 +0100, al*****@yahoo.com (Alex Martelli)
wrote:

I consider this one of the worst ideas to have been proposed on this
newsgroup over the years, which _IS_ saying something.


I would disagree, but only to the extent that nothing that is only a
request for an option toggle should qualify for this award. For
anyone not interested in its effects, it's business as usual.


You must have lead a charmed life, I think, unusually and blissfully
free from pointy-haired bosses (PHBs). In the sublunar world that most
of us inhabit, ``optional'' idiocies of this kind soon become absolutely
mandatory -- thanks to micromanagement by PHBs.

For the last few years I've been working as a consultant -- mostly
(thanks be!) for a wonderful Swedish customer whose managers are in fact
great techies, but otherwise for a fair sample of typical development
shops. Such "fair samples" have weaned me from my own mostly-charmed
blissful life, confirming that the amount of utter stupidity in this
world is REALLY high, and far too much of it is in management
positions.

Now, I've recently had a great offer to work, doing mostly Python, for
another incredibly great firm, and, visa issues permitting, I'll gladly
leave the life of consultants' joys and sorrows behind me again -- I've
spent most of my life working as a full-time employee for a few great
firms, and the last few years have confirmed to me that this fits my
character and personality far better than being a consultant does (just
like the years between my two marriages have confirmed to me that I'm
better suited to be a husband, rather than a roving single... I guess
there's a correlation there!-). So, I'm not speaking for selfish
reasons: at my soon-to-be employer, techies rule, and optional idiocies
won't matter much. I _am_ speaking on behalf of maybe half of the
million or so Python programmers in the world, who are NOT so lucky as
to be working in environments free from the blight of micromanagement.
Alex
Jul 18 '05 #47

On Sat, 5 Feb 2005 20:02:44 +0100, al*****@yahoo.com (Alex Martelli)
wrote:
Arthur <aj******@optonline.com> wrote:
On Sat, 5 Feb 2005 17:00:15 +0100, al*****@yahoo.com (Alex Martelli)
wrote:
>
>I consider this one of the worst ideas to have been proposed on this
>newsgroup over the years, which _IS_ saying something.


I would disagree, but only to the extent that nothing that is only a
request for an option toggle should qualify for this award. For
anyone not interested in its effects, it's business as usual.


You must have lead a charmed life, I think, unusually and blissfully
free from pointy-haired bosses (PHBs). In the sublunar world that most
of us inhabit, ``optional'' idiocies of this kind soon become absolutely
mandatory -- thanks to micromanagement by PHBs.


It seems to me that you would be more accurate (and calmer) if you
generalized from your own experience, rather than in direct
contradiction to it.

That the firms that advocate the use of Python are the ones least
likely to be dominated by PHBs who DKTAFTE (who don't know their ass
from their elbows).

Unless I am misinterpreting you.

Do the STUPID firms use Python as well?

Why?

Clearly there are stupider choices.

I prefer to use VB when doing certain kinds of stupid things. It's
the right tool in those cases.

Really.

Perhaps the answer here has more to do with backing off efforts to
make Python ubiquitous.

Why shouldn't the community stay a bit elitist?

Or else maybe that's where you started on this thread, in fact - and
in your own way.

If so, I agree - within reason.

One doesn't toggle between VB and Python, perhaps is the point. They
are of different species.

Art

Jul 18 '05 #48

Alex Martelli wrote:
It's not clear to me what semantics, exactly, x.y := z would be defined
to have (assuming := is the syntax sugar for ``rebinding''). Perhaps,
by analogy with every other augmented operator, it should be equivalent
to:

_temp = x.y
x.y = type(_temp).__irebind__(_temp, z)

This makes it crystal-clear what happens when type(x).y is a descriptor
with/without __set__ and __get__, when type(x) defines __getattribute__
or __setattr__, etc, etc. Giving type object an __irebind__ which just
returns the second operand would complete the semantics. Of course,
doing it this way would open the issue of types overriding __irebind__,
and it's not clear to me that this is intended, or desirable. So, maybe
some other semantics should be the definition. But since "rebinding" is
not a primitive, the semantics do need to be somehow defined in terms of
elementary operations of getting and setting (of attributes, items, &c).


I was thinking of something simpler:

x.y
x.y = z

That is, before the assignment attempt, x.y has to resolve to *something*, but
the interpreter isn't particularly fussy about what that something is.
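In helper-function form, that is just "look it up, then set it" (a sketch; the 'rebind_attr' name is made up):

```python
def rebind_attr(obj, name, value):
    # 'obj.name := value': the attribute must already resolve to
    # *something*; what that something is doesn't matter.
    getattr(obj, name)          # AttributeError if 'name' is unbound
    setattr(obj, name, value)

class C(object):
    def __init__(self):
        self.x = 1

c = C()
rebind_attr(c, "x", 2)          # fine: c.x already exists
try:
    rebind_attr(c, "y", 3)      # typo caught: no attribute 'y'
except AttributeError:
    typo_caught = True
```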

Cheers,
Nick.

--
Nick Coghlan | nc******@email.com | Brisbane, Australia
---------------------------------------------------------------
http://boredomandlaziness.skystorm.net
Jul 18 '05 #49

Nick Coghlan <nc******@iinet.net.au> wrote:
...
_temp = x.y
x.y = type(_temp).__irebind__(_temp, z)
... I was thinking of something simpler:

x.y
x.y = z

That is, before the assignment attempt, x.y has to resolve to *something*, but
the interpreter isn't particularly fussy about what that something is.


OK, I guess this makes sense. I just feel a tad apprehensive at
thinking that the semantics differ so drastically from that of every
other augmented assignment, I guess. But probably it's preferable to
NOT let a type override what this one augmented assignment means; that
looks like an "attractive nuisance" tempting people to be too clever.

Still, if you write a PEP, I would mention the possible alternative and
why it's being rejected in favor of this simpler one.
Alex
Jul 18 '05 #50
