
__name__ becoming read-write?



Did I hallucinate something about __name__ becoming read-write?

Not in alpha2.

Can't find the reference to this I thought I read - that it was
concluded to be necessary in connection with PEP318.

Better get my facts straight first....

But if true that would seem to solve the main objection to:

the_horrible_name_I_need_to_call = transform(__f)

And would mean that a byproduct of the PEP318 implementation would go
50% toward obviating the need for a PEP318 implementation. At least
by one measure.
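A minimal sketch of the idiom under discussion (all names here are illustrative, not from any real library): transform a function after its def, then use the newly writable __name__ to give the wrapper the public name.

```python
def transform(func):
    # A trivial stand-in for whatever transformation PEP 318 would apply.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    # Possible once function __name__ becomes writable: the wrapper
    # takes over the wrapped function's name for tracebacks etc.
    wrapper.__name__ = func.__name__
    return wrapper

def __f():
    return 42

the_horrible_name_I_need_to_call = transform(__f)
```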

But the whole thing might be a hallucination.

Art
Jul 18 '05 #1
19 Replies


Arthur wrote:
Did I hallucinate something about __name__ becoming read-write?


_Becoming_ read-write? When was it read-only?

Python 1.5.2 (#1, Jul 14 2004, 20:34:28) [GCC 3.2.3] on sunos5
Copyright 1991-1995 Stichting Mathematisch Centrum, Amsterdam
>>> class foo: pass
...
>>> foo.__name__ = 'bar'
>>> foo.__module__ = 'baz'
>>> foo
<class baz.bar at c5ee0>

--
Hallvard
Jul 18 '05 #2

Arthur <aj******@optonline.com> writes:
Did I hallucinate something about __name__ becoming read-write?
For functions? No, you didn't hallucinate that.
Not in alpha2.
Indeed. The changes were only checked in a couple of weeks ago.
They'll be in alpha3.
Can't find the reference to this I thought I read - that it was
concluded to be necessary in connection with PEP318.
It's something that's been nagging at me for ages, but it was a PEP
318-related comment that finally prodded me into writing the patch.
Better get my facts straight first....


Glad to be of service <wink>!

Cheers,
mwh

--
I've reinvented the idea of variables and types as in a
programming language, something I do on every project.
-- Greg Ward, September 1998
Jul 18 '05 #3

On 23 Aug 2004 16:50:22 +0200, Hallvard B Furuseth
<h.**********@usit.uio.no> wrote:
Arthur wrote:
Did I hallucinate something about __name__ becoming read-write?


_Becoming_ read-write? When was it read-only?

Python 1.5.2 (#1, Jul 14 2004, 20:34:28) [GCC 3.2.3] on sunos5
Copyright 1991-1995 Stichting Mathematisch Centrum, Amsterdam
>>> class foo: pass
...
>>> foo.__name__ = 'bar'
>>> foo.__module__ = 'baz'
>>> foo
<class baz.bar at c5ee0>


Python 2.3.4 (#53, May 25 2004, 21:17:02) [MSC v.1200 32 bit (Intel)]
on win32
>>> def foo(): pass
...
>>> foo.__name__ = 'bar'
Traceback (most recent call last):
  File "<pyshell#3>", line 1, in -toplevel-
    foo.__name__ = 'bar'
TypeError: readonly attribute

I hope I'm not being stupid.

And what's with running 1.5.2 ;)

Art
Jul 18 '05 #4

Hallvard B Furuseth <h.**********@usit.uio.no> writes:
Arthur wrote:
Did I hallucinate something about __name__ becoming read-write?


_Becoming_ read-write? When was it read-only?


When it was a function :-) Or a new style class in 2.2.

Cheers,
mwh

--
I never disputed the Perl hacking skill of the Slashdot creators.
My objections are to the editors' taste, the site's ugly visual
design, and the Slashdot community's raging stupidity.
-- http://www.cs.washington.edu/homes/k...shdot.html#faq
Jul 18 '05 #5

On Mon, 23 Aug 2004 14:49:41 GMT, Michael Hudson <mw*@python.net>
wrote:
Arthur <aj******@optonline.com> writes:
Did I hallucinate something about __name__ becoming read-write?


For functions? No, you didn't hallucinate that.
Not in alpha2.


Indeed. The changes were only checked in a couple of weeks ago.
They'll be in alpha3.


Thanks for confirming I am not hallucinating. This time, anyway.

My argument - obviously, I think - is that this as a stand-alone
change does enough to ease the pain of the current syntax, and is in
proportion to the problem.

The fact that some of us think we are way, way out of proportion with
the direction now headed has been previously established.

Art
Jul 18 '05 #6

On Mon, 23 Aug 2004 14:07:42 GMT, Arthur <aj******@optonline.com> wrote:


Did I hallucinate something about __name__ becoming read-write?

Better get my facts straight first....

But if true that would seem to solve the main objection to:

the_horrible_name_I_need_to_call = transform(__f)

And would mean that a byproduct of the PEP318 implementation would go
50% toward obviating the need for a PEP318 implementation. At least
by one measure.


No, it is now read-write, thanks to mwh. I think, though, that you're
misunderstanding the difference between setting a local variable in
the function, called '__name__', and setting the actual __name__ of
the function object.
>>> def foo(): pass
...
>>> foo.__name__ = 'bar'
>>> foo.__name__
'bar'
>>> def foo():
...     __name__ = 'bar'
...
>>> foo.__name__
'foo'
>>> foo.func_code.co_names

('__name__',)
Jul 18 '05 #7

On Tue, 24 Aug 2004 01:40:29 +1000, Anthony Baxter
<an***********@gmail.com> wrote:
On Mon, 23 Aug 2004 14:07:42 GMT, Arthur <aj******@optonline.com> wrote:


Did I hallucinate something about __name__ becoming read-write?

Better get my facts straight first....

But if true that would seem to solve the main objection to:

the_horrible_name_I_need_to_call = transform(__f)

And would mean that a byproduct of the PEP318 implementation would go
50% toward obviating the need for a PEP318 implementation. At least
by one measure.


No, it is now read-write, thanks to mwh. I think, though, that you're
misunderstanding the difference between setting a local variable in
the function, called '__name__', and setting the actual __name__ of
the function object.


You are right. I did know this to be *possibly* true, but had no
implementation to test.

Though __doc__ (and I am sure other stuff) exhibits the same
behavior, so there is no real excuse to be surprised.

This sabotages my approach, but only to the extent that we would still
need to wait until after __f's def to know the name we are imposing on it.

Of course I am curious as to why, and what would be involved, and
wrong, with merging the local variable and the actual name for these
special syntax items. It would seem to have merit on its own terms.

For example I had noticed that to use string substitution on a function
doc I needed to assign to __doc__ outside the function.
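What Arthur describes looks roughly like this (a sketch with made-up names; VERSION is a hypothetical value fed into the substitution):

```python
VERSION = "1.0"  # hypothetical value used in the substitution

def frobulate(metawhatsit):
    return metawhatsit

# The substituted docstring has to be assigned after the def; inside
# the body it would only be an ordinary expression or local variable.
frobulate.__doc__ = "Frobulate a meta-whatsit (version %s)." % VERSION
```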

Unless I was hallucinating.

Though I would understand if you are not in tutorial mood, or mode.

Art

>>> def foo(): pass
...
>>> foo.__name__ = 'bar'
>>> foo.__name__
'bar'
>>> def foo():
...     __name__ = 'bar'
...
>>> foo.__name__
'foo'
>>> foo.func_code.co_names

('__name__',)


Jul 18 '05 #8

On Mon, 23 Aug 2004 15:58:26 GMT, Arthur <aj******@optonline.com> wrote:
Of course I am curious as to why, and what would be involved, and
wrong, with merging the local variable and the actual name for these
special syntax items. It would seem to have merit on its own terms.

For example I had noticed that to use string substitution on a function doc I
needed to assign to __doc__ outside the function.


How would you envisage this working? Look at the following code:

def foo(arg):
    __doc__ = "bingle!"
    if arg < 0:
        __doc__ = "bangle!"
    if arg > 0:
        __doc__ = "bongle!"

Now, _before_ this code is run, what's foo.__doc__ supposed to be set
to? Remember, at this point, the code has not been run. The local
__doc__ has no value at this point.
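The point can be checked directly; in a simplified sketch of Anthony's example, the local assignments never touch the function object:

```python
def foo(arg):
    __doc__ = "bingle!"      # an ordinary local assignment, nothing more
    if arg < 0:
        __doc__ = "bangle!"
    return __doc__

# Before (and after) foo is called, the function object's docstring is
# untouched: the locals above exist only while a call is running.
assert foo.__doc__ is None
```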

Special casing __doc__ (or __name__) so that assignments to a local
like that inside a function assign magically to the function object is
bad magic. It leads to confusion and poor coding. In general, inside a
function, you don't have access to the function object itself[1]

Anthony

[1] I except using sys._getframe(), or raising an exception and
traversing up through the traceback object, as they're hacks.
Jul 18 '05 #9

On Tue, 24 Aug 2004 02:28:09 +1000, Anthony Baxter
<an***********@gmail.com> wrote:
On Mon, 23 Aug 2004 15:58:26 GMT, Arthur <aj******@optonline.com> wrote:
Of course I am curious as to why, and what would be involved, and
wrong, with merging the local variable and the actual name for these
special syntax items. It would seem to have merit on its own terms.

For example I had noticed that to use string substitution on a function doc I
needed to assign to __doc__ outside the function.


How would you envisage this working? Look at the following code:

def foo(arg):
    __doc__ = "bingle!"
    if arg < 0:
        __doc__ = "bangle!"
    if arg > 0:
        __doc__ = "bongle!"

Now, _before_ this code is run, what's foo.__doc__ supposed to be set
to? Remember, at this point, the code has not been run. The local
__doc__ has no value at this point.

Special casing __doc__ (or __name__) so that assignments to a local
like that inside a function assign magically to the function object is
bad magic. It leads to confusion and poor coding. In general, inside a
function, you don't have access to the function object itself[1]

I see the point.

But.. there is always a but.

I'm thinking now a special syntax item in the __form__ at the top of
the function that would:

1) put one on notice that the function is to be transformed (as in
"see below").

2) allow a name to be assigned to it, which will become the
transform's __name__.

Something in the general direction, I think, of where Paul Morrow's
instincts were going.

I like it.

And have no idea whether it is feasible.

Art
Jul 18 '05 #10

On Mon, 23 Aug 2004 17:00:39 GMT, Arthur <aj******@optonline.com> wrote:
I see the point.

But.. there is always a but.

I'm thinking now a special syntax item in the __form__ at the top of
the function that would:

1) put one on notice that the function is to be transformed (as in
"see below").

2) allow a name to be assigned to it, which will become the
transform's __name__.


This is a bad idea - code inside the function should be executed when
the function is executed. docstrings are a special-case (because
they're not actually code), but, to be honest, this whole discussion
has made me deeply uncomfortable with where the docstrings sit at the
moment. Once 2.4 is out, I know I'll probably start using something
like:

def doc(str):
    def endoculate(func, str=str):
        func.__doc__ = str
        return func
    return endoculate

@doc('''This function frobulates the meta-whatsit''')
def frobulate(metawhatsit):
.....

(insert final decorator syntax as required)
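For readers on pre-2.4 interpreters, the same decorator can be applied by hand after the def; a runnable sketch of Anthony's example (with the inner function returned, and `docstring` used instead of shadowing the builtin name `str`):

```python
def doc(docstring):
    def endoculate(func):
        func.__doc__ = docstring
        return func
    return endoculate

def frobulate(metawhatsit):
    return metawhatsit

# Pre-2.4 spelling of the @doc(...) form: apply the decorator by hand.
frobulate = doc('''This function frobulates the meta-whatsit''')(frobulate)
```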

Paul Morrow's idea of special magic __foo__ inside the function is
deeply magical, and not likely to make any new users running across it
any happier. "So wait, this stuff that _looks_ like code in a
function, isn't actually? what the hell?" The new syntax[1] for
decorators is at least very obvious that something _new_ is going on.
This has been one of the things I've been using as an internal filter
for various syntax proposals for decorators. It should _not_ look like
some existing valid python that "just happens" to have a new effect.
I'd also prefer that the syntax _not_ be valid python in pre-2.4, in
case code accidentally gets run under an old interpreter. The decorator
usage I have planned will be very very ugly if the decorators _don't_
get applied to the function, for whatever reason.

[1] whether @syntax, or block-before-def
Jul 18 '05 #11

Anthony Baxter wrote:
On Mon, 23 Aug 2004 17:00:39 GMT, Arthur <aj******@optonline.com> wrote:
I see the point.

But.. there is always a but.

I'm thinking now a special syntax item in the __form__ at the top of
the function that would:

1) put one on notice that the function is to be transformed (as in
"see below").

2) allow a name to be assigned to it, which will become the
transform's __name__.

This is a bad idea - code inside the function should be executed when
the function is executed. docstrings are a special-case (because
they're not actually code), but, to be honest, this whole discussion
has made me deeply uncomfortable with where the docstrings sit at the
moment. Once 2.4 is out, I know I'll probably start using something
like:

def doc(str):
    def endoculate(func, str=str):
        func.__doc__ = str
        return func
    return endoculate

@doc('''This function frobulates the meta-whatsit''')
def frobulate(metawhatsit):
.....

(insert final decorator syntax as required)


No Anthony. Please don't ever write doc strings like that. Please...
I don't care what kind of technical problems that solves, that's not
good looking code (IMO).
Paul Morrow's idea of special magic __foo__ inside the function is
deeply magical, and not likely to make any new users running across it
any happier. "So wait, this stuff that _looks_ like code in a
function, isn't actually? what the hell?" The new syntax[1] for
decorators is at least very obvious that something _new_ is going on.
This has been one of the things I've been using as an internal filter
for various syntax proposals for decorators. It should _not_ look like
some existing valid python that "just happens" to have a new effect.
I'd also prefer that the syntax _not_ be valid python in pre-2.4, in
case code accidentally gets run under an old interpreter. The decorator
usage I have planned will be very very ugly if the decorators _don't_
get applied to the function, for whatever reason.

[1] whether @syntax, or block-before-def


New users just need to learn that __anythingThatLooksLikeThis__ is
probably magical; that care should be taken anytime they mess with
variables so obviously and particularly 'decorated' (sorry :-)). Of
course experienced pythonistas already know that these things are
special and that they (almost) never have the 'local variable' semantics
they appear to have. So it's an education problem; making new users
aware of the consequences of manipulating magical attributes.

And yes, I've been suggesting that we do more with that syntax. That we
create magical attributes (or a single magical attribute) that change
the behavior of functions, just as __metaclass__ changes the behavior of
classes[*]. Hey, how about a __features__ attribute?

def foo():
    __features__ = synchronized, memoized
* When I say "just as", I don't mean to indicate that the mechanics of
the transformation are the same.
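The effect Paul sketches with a magic __features__ local is what stacked decorators provide directly; a minimal sketch of one such feature (memoized and square here are illustrative names, not part of any proposal):

```python
def memoized(func):
    cache = {}
    def wrapper(*args):
        # Compute once per distinct argument tuple, then reuse the result.
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    wrapper.__name__ = func.__name__  # relies on writable __name__
    return wrapper

@memoized
def square(x):
    return x * x
```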

Paul

Jul 18 '05 #12

On Tue, 24 Aug 2004 03:17:42 +1000, Anthony Baxter
<an***********@gmail.com> wrote:
On Mon, 23 Aug 2004 17:00:39 GMT, Arthur <aj******@optonline.com> wrote:
I see the point.

But.. there is always a but.

I'm thinking now a special syntax item in the __form__ at the top of
the function that would:

1) put one on notice that the function is to be transformed (as in
"see below").

2) allow a name to be assigned to it, which will become the
transform's __name__.
This is a bad idea - code inside the function should be executed when
the function is executed. docstrings are a special-case (because
they're not actually code),


Well - accepted that you identify this issue.

I would be curious if you could see the sense of the approach I am
suggesting outside of this issue.

def doc(str):
    def endoculate(func, str=str):
        func.__doc__ = str
        return func
    return endoculate

def frobulate(metawhatsit):
I am old enough - I think - to have been frobulating metawhatsits
since before you were born. But have not been, as it happens. So I
recognize that my solution might be a bit frobulated - technically.

A little fortipation is perhaps all it needs, though.
Paul Morrow's idea of special magic __foo__ inside the function is
deeply magical, and not likely to make any new users running across it
any happier.
Well I still see - by far - the best fallback position as A1 (the
alpha 2 implementation) - precisely because it implies black magic.
At least to me.

The last thing I want to see is sweeter, more embedded syntax for
this stuff.

So I am totally off the "how do we improve the syntax" bus.
"So wait, this stuff that _looks_ like code in a
function, isn't actually? what the hell?" The new syntax[1] for
decorators is at least very obvious that something _new_ is going on.


I am thinking (and I think Paul is thinking) that we can say the same
thing, succinctly, in a manner that has precedent in the language

Mine's a one-liner that does not totally relieve the magician's burden
to be - somewhere down the line - a bit expressive

Art
Jul 18 '05 #13

On Mon, 23 Aug 2004 23:17:25 GMT, Arthur <aj******@optonline.com> wrote:
"So wait, this stuff that _looks_ like code in a
function, isn't actually? what the hell?" The new syntax[1] for
decorators is at least very obvious that something _new_ is going on.


I am thinking (and I think Paul is thinking) that we can say the same
thing, succinctly, in a manner that has precedent in the language

Mine's a one-liner that does not totally relieve the magician's burden
to be - somewhere down the line - a bit expressive


I'm not _quite_ sure what it is you're asking for here, but it _seems_
like you want to have two different blocks inside a def - the first is
the meta-block, which contains docstrings and other magic attributes,
while the second is the actual code body? Is this correct? If this is
the case, I think it's _possible_ that in the future we might see
something like this, for typing purposes. Maybe. I'm not sure. But if
we do, I can't see any point _at_ _all_ to signifying magic things
with arbitrary words surrounded with __under__ meaning 'under is a
decorator'. While Python does use __foo__, the list of values for foo
that are meaningful is well described and documented.

As to the specific detail about whether you can assign to a function's
__name__ from inside the function - this will break a lot of tools
that attempt to handle python code. How is an editor to find the
method 'frobulate' in a .py file? Right now, it can be done with a
pretty simple regexp. Even if one of the syntaxes for decorators that
insert the decorators on the def line goes in, you can still do it
with a more complex regexp. But something like this?

class Frobozz:
    def frobulate():
        meta:
            __name__ = defrobulate

An IDE or editor that tries to find defrobulate is going to go insane.

One final clarification - altering a method's __name__ does not change
its name in the class's __dict__.
>>> class A:
...     def foo(self): pass
...     foo.__name__ = 'bar'
...
>>> A().bar()
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
AttributeError: A instance has no attribute 'bar'
>>> A().foo()
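Put another way: __name__ and the class's __dict__ key are independent, so making the method callable under the new name requires rebinding it in the class as well. A minimal sketch:

```python
class A:
    def foo(self):
        return 'called'

A.foo.__name__ = 'bar'   # changes only the function object's own name
A.bar = A.foo            # ...whereas attribute lookup goes through the
                         # class __dict__, so rebind under the new name
```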

Jul 18 '05 #14

On Tue, 24 Aug 2004 14:27:15 +1000, Anthony Baxter
<an***********@gmail.com> wrote:
On Mon, 23 Aug 2004 23:17:25 GMT, Arthur <aj******@optonline.com> wrote:
> "So wait, this stuff that _looks_ like code in a
>function, isn't actually? what the hell?" The new syntax[1] for
>decorators is at least very obvious that something _new_ is going on.


I am thinking (and I think Paul is thinking) that we can say the same
thing, succinctly, in a manner that has precedent in the language

Mine's a one-liner that does not totally relieve the magician's burden
to be - somewhere down the line - a bit expressive


I'm not _quite_ sure what it is you're asking for here, but it _seems_
like you want to have two different blocks inside a def - the first is
the meta-block, which contains docstrings and other magic attributes,
while the second is the actual code body? Is this correct? If this is
the case, I think it's _possible_ that in the future we might see
something like this, for typing purposes. Maybe. I'm not sure. But if
we do, I can't see any point _at_ _all_ to signifying magic things
with arbitrary words surrounded with __under__ meaning 'under is a
decorator'. While Python does use __foo__, the list of values for foo
that are meaningful is well described and documented.


All I think I am looking for is proportionality. The solution should
be proportional to the problem. The current syntax is expressive in
the way that, I thought, was always considered to be fundamental to
the concept of Pythonic.

What is happening is no more and no less than what is obvious from the
code to someone who may have never studied the Python documentation,
but who is simply generally literate.

The problems identified with the current syntax are:

1) the placement of the transformative code under the function -
leading to a purely hypothetical problem - I am not aware of any
reports of actual issues arising - of a reader missing important
information related to the function.

2) burdensome amounts of typing when dealing with long function names.

If Python needs to apologize for being itself in the limited
circumstances that give rise to these conditions, perhaps it can be
done with a bit more - I don't know - dignity. Which would involve
recognizing the issues as minor annoyances, and providing some limited
relief.

I am though concluding that, if anything, my own suggestion goes
further than it should. No __under__ is necessary. We have #comment
for the developer to communicate anything he needs to communicate about
the function from within the function.

And in those cases where the developer's editor's cut and paste
facility is on the fritz and they are dealing with 70-letter function
names, you have given them a writeable __name__ attribute as another
weapon to solve their issue.

No it won't solve everything. But then there isn't that much to be
solved.

The solution of fixing these limited annoyances by redefining good
Python coding style to include and encourage magic declarations
that can only be meaningful to studied Pythoneers is totally out of
left field, IMO.

The shifting sands have and will have their own costs. IMO, here those
costs are substantial. The hundreds of arguments that have gone on
here on python-list, with Python explaining itself and defending
itself, will have to be restated. Because the Zen no longer holds as
nearly as true, and there becomes little left to distinguish Python as
a programming approach.

For those who think that Python's future is - perhaps - *the* CLR
language, and that this kind of redefinition of the language is a
necessary step in the direction that will make that kind of thing
possible, I would say - not in my shop.

Art

Jul 18 '05 #15

P: n/a
On Tue, 24 Aug 2004 13:09:56 GMT, Arthur <aj******@optonline.com> wrote:
All I think I am looking for is proportionality. The solution should
be proportional to the problem. The current syntax is expressive in
the way that, I thought, was always considered to be fundamental to
the concept of Pythonic.
See, I think decorators _are_ proportional to the problem. I think one
thing is that decorators are a nice language feature that will allow
for a large number of new approaches - things that wouldn't
necessarily have been considered before now.
1) the placement of the transformative code under the function -
leading to a purely hypothetical problem - I am not aware of any
reports of acutal issues arising - of a reader missing important
information related to the function.
Aside from anything else, it's ugly and hard to read code - you have
to flick to the bottom of the function to see what transforms might,
or might not, have been done.
2) burdensome amounts of typing when dealing with long function names.
It's not just a matter of typing, it's a matter of elegance. The
current syntax was always a placeholder until we figured out what, if
anything, needed to be added.
If Python needs to apologize for being itself in the limited
circumstances that give rise to these conditions, perhaps it can be
done with a bit more - I don't know - dignity. Which would involve
recognizing the issues as minor annoyances, and providing some limited
relief.
Again, I think you're misunderstanding the point of decorators.
They're a more generally useful feature, and I think people will be
far more likely to use them once they're a little more prominent. Not
everyone, sure, but those who can use it will find them incredibly
useful. The classic "synchronized()" decorator is one obvious example.
I think I've mentioned descriptors before in a similar context - 90+%
of Python programmers don't (knowingly) use them, but those who need
them find them incredibly powerful. And they can then write libraries
that use these, and other people can then use those libraries.
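The "synchronized()" decorator Anthony names can be sketched in a few lines (a minimal modern-Python illustration, not the exact code anyone posted): one lock per decorated function, so concurrent calls are serialized.

```python
import threading

def synchronized(func):
    lock = threading.Lock()
    def wrapper(*args, **kwargs):
        # Only one thread at a time may run the wrapped function.
        with lock:
            return func(*args, **kwargs)
    wrapper.__name__ = func.__name__  # relies on writable __name__
    return wrapper

@synchronized
def bump(counter):
    counter[0] += 1
    return counter[0]
```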
And in those cases where the developer's editor's cut and paste
facility is on the fritz and they are dealing with 70-letter function
names, you have given them a writeable __name__ attribute as another
weapon to solve their issue.


A couple of people have latched onto the writable __name__ thing, without
understanding some very basic problems with it - by itself, it's not
particularly useful. If you modify __name__, it does _nothing_ apart
from alter the name of the function that appears in tracebacks. It's
only when combined with decorators that it becomes useful, imho.
Jul 18 '05 #16

Arthur wrote:
"So wait, this stuff that _looks_ like code in a
function, isn't actually? what the hell?" The new syntax[1] for
decorators is at least very obvious that something _new_ is going on.

I am thinking (and I think Paul is thinking) that we can say the same
thing, succinctly, in a manner that has precedent in the language


Absolutely. That's a good way to think about it.

Jul 18 '05 #17

On Tue, 24 Aug 2004 23:41:57 +1000, Anthony Baxter
<an***********@gmail.com> wrote:
On Tue, 24 Aug 2004 13:09:56 GMT, Arthur <aj******@optonline.com> wrote:
All I think I am looking for is proportionality. The solution should
be proportional to the problem. The current syntax is expressive in
the way that, I thought, was always considered to be fundamental to
the concept of Pythonic.


See, I think decorators _are_ proportional to the problem. I think one
thing is that decorators are a nice language feature that will allow
for a large number of new approaches - things that wouldn't
necessarily have been considered before now.


On the other hand, its just sugar - and nothing but.

But yes, it will encourage new approaches. Decorator libraries.
Modularization. Code reuse.

I think I see some of the good.

But all at a cost. I would be comforted to hear you say something
about the costs you perceive. If you present it as all just a win,
it becomes too easy to challenge your assessment. So easy, that even
someone like myself can pull it off, at least to an extent - and at
least in my own judgement.

I reserve the right to be wrong in my overall assessment. But I have
to doubt that I am wrong in stressing that none of this new power and
possibility comes for free.

Art
Jul 18 '05 #18

On Tue, 24 Aug 2004 14:00:35 GMT, Arthur <aj******@optonline.com> wrote:
But all at a cost. I would be comforted to hear you say something
about the costs you perceive. If you present it as all just a win,
it becomes too easy to challenge your assessment. So easy, that even
someone like myself can pull it off, at least to an extent - and at
least in my own judgement.


_Any_ new feature has a cost - whether it is in the additional
training needed, the potential for truly horrendous hacks, the
backwards-incompatibility, or whatever else.

The additional training issue:
One of my internal measures for evaluating new decorator syntax
options is that it be *obvious* that "this is something new". The
various ideas of "doing something wacky with a list" or "a magic
'decorate()' function" fail this test, for me. They're not obviously
doing something new.
Following on from that, the new feature should be explainable in
the context of existing knowledge. The before-def decorator syntax is
easily explainable in this context. It might be that new python
programmers might not realise that a function or a method is just an
object, like any other, and can be treated just like any other object
- passed as an argument to a function, or whatever. If they've come
from a particularly limited language before Python, there might be a
bit of mental dissonance before they realise this, but they will
realise it eventually. And this is a good thing to realise - once you
"get" that everything's an object, you're on the path to understanding
Python well (I'd add as a footnote the understanding of what a
"variable name" actually means usually follows soon after this).

The potential for truly horrendous hacks:
This is a given, and is the case for any new feature, particularly
one like decorators (or metaclasses, or descriptors, or ...). The
potential for nasty and clever hacks isn't a new thing - take, for
instance the "Don Beaudry hook" introduced for metaclasses way back in
Python 1.5. Jim Fulton took this small hole and drove a large truck
called ExtensionClass through it, leading to the brain-warping of
Acquisition. Or the current Zope3 Interfaces package, which uses a
truly nasty sys._getframe() hack. But in both those cases, end users
of the features don't have to care about the underlying nastiness,
they can just use the new features.
Will people write unpleasant decorator functions? Sure. But the
same people probably write hideously complex __del__ methods on every
object, and don't understand why cyclic GC then doesn't work for them,
or why their destructors aren't being called reliably. Hell, look at
the recent:
class Argh(random.choice([dict, list])):
    ....
thing posted by someone (mwh? I forget)

The backward incompatibility:
I regard this as a non-issue. For starters, pretty much *every*
major release of Python has had some new feature that wasn't backwards
compatible. For example, generator expressions were also added to 2.4.
They will cause a SyntaxError in Python 2.3 and earlier. And I would
guess that genexprs will be used far far more often than decorators in
most codebases.
If backwards compatibility *is* an issue, note that there's a
couple of solutions here. For starters, there's the maintenance
releases of the previous release. I'm being a complete
pain-in-the-arse with stopping any new features creeping into those
releases, for *damn* good reason. I'm trying to make sure that people
can get the benefits of bug fixes, without having their entire
codebase break. And once 2.4 final ships, I'll be cutting what I plan
to be the last of the 2.3 series, and moving on to 2.4.1. If you
really, really want to keep using 2.3, and this is a problem for you
(because you want the continuing stream of bugfixes), I'm happy to
help bring someone up to speed so that they can take on 2.3.6 and
onwards. (As should be obvious, I'm using "you" in the general sense
here).
Jul 18 '05 #19

On Wed, 25 Aug 2004 01:26:30 +1000, Anthony Baxter
<an***********@gmail.com> wrote:
On Tue, 24 Aug 2004 14:00:35 GMT, Arthur <aj******@optonline.com> wrote:
But all at a cost. I would be comforted to hear you say something
about the costs you perceive. If you present it as all just a win,
it becomes too easy to challenge your assessment. So easy, that even
someone like myself can pull it off, at least to an extent - and at
least in my own judgement.
_Any_ new feature has a cost - whether it is in the additional
training needed, the potential for truly horrendous hacks, the
backwards-incompatibility, or whatever else.


I appreciate you taking the time to respond.

The additional training issue:
One of my internal measures for evaluating new decorator syntax
options is that it be *obvious* that "this is something new". The
various ideas of "doing something wacky with a list" or "a magic
'decorate()' function" fail this test, for me. They're not obviously
doing something new.
Or else, "this is something deep", and "we are violating the normal
rules of expressiveness, and know it".

One argument presented with some vehemence against the pre-alpha2
syntax and in support of the new approach is that the old represented
unacceptable "action at a distance".

But isn't that inherent in a transformation? A trivial sense of
action at a distance might be solved with the new syntax, replaced by
a more profound sense of action at a distance. I know that *my*
saying these kinds of things carries an implicit assertion that a
non-technical impression is valid. Or worth hearing, anyway.

The best precedent some of us find for all this is __metaclass__.

It spells "deep", it spells "on guard, rolled my own", it spells
"action at a distance" (where and what 'M' - the metaclass - is, is
not something revealed in the declaration itself)

Again, that is a non-technical assessment. Python as language, in the
more general sense of language.

But none of this is where we are very far off from one another,
anyway.

From the wiki list as it stands, I'm an A1 guy, after all.
Following on from that, the new feature should be explainable in
the context of existing knowledge.
For someone like myself, __something__ would do much of the explaining
in and of itself. There would be something visceral I understood
before I understood anything else.

But I will let Paul continue to try to carry this torch. It doesn't
look like he is getting too far, too fast.

I always need to consider the fact that there are technical issues I
don't understand that override the intuitive, and language, issues
I believe that I do. Certainly that seems to be your position on
__something__ inside the function def. It sounds to my ears
unnecessarily, almost arbitrarily, purist. But in the end this is not
something I am in a position to argue.
The before-def decorator syntax is


Without going on, I would say that I don't think *you* understand some
of the depth of the problem here. There is actually not yet a way to
talk about the mechanism which you are calling decorator syntax
because the lack of consensus runs that deep. Some
feel it is a misnomer on technical grounds and others consider it a
misnomer on the basis of being unexpressive and uninformative. To
discuss "decorator syntax" is a concession. To meet on an even playing
field, we can only discuss The Mechanism.

Art

Jul 18 '05 #20
