
Docorator Disected


I was having some difficulty figuring out just what was going on with
decorators. So after a considerable amount of experimenting I was
able to take one apart in a way. It required me to take a closer look
at function defs and calls, which is something I tend to take for
granted.

I'm not sure this is 100%, or if there are other ways to view it, but
it seems to make sense when viewed this way.

Is there a way to do this same thing in a more direct way? Like
taking values off the function stack directly. How much of it gets
optimized out by the compiler?
#
# Follow the numbers starting with zero.
#

# (0) Read defined functions into memory

def decorator(d_arg):                  # (7) Get 'Goodbye' off stack

    def get_function(function):        # (8) Get func object off stack

        def wrapper(f_arg):            # (9) Get 'Hello' off stack

            new_arg = f_arg+'-'+d_arg
            result = function(new_arg) # (10) Put new_arg on stack
                                       # (11) Call func object

            return result              # (14) Return result to wrapper

        return wrapper                 # (15) Return result to get_function

    return get_function                # (16) Return result to caller of func

@decorator('Goodbye')                  # (5) Put 'Goodbye' on stack
                                       # (6) Do decorator
def func(s):                           # (12) Get new_arg off stack

    return s                           # (13) Return s to result

# (1) Done Reading definitions
print func('Hello')                    # (2) Put 'Hello' on stack
                                       # (3) Put func object on stack
                                       # (4) Do @decorator
                                       # (17) print 'Hello-Goodbye'

# Hello-Goodbye
Jul 18 '05 #1
Ron_Adam wrote:

# (0) Read defined functions into memory

def decorator(d_arg):                  # (7) Get 'Goodbye' off stack

    def get_function(function):        # (8) Get func object off stack

        def wrapper(f_arg):            # (9) Get 'Hello' off stack

            new_arg = f_arg+'-'+d_arg
            result = function(new_arg) # (10) Put new_arg on stack
                                       # (11) Call func object

            return result              # (14) Return result to wrapper

        return wrapper                 # (15) Return result to get_function

    return get_function                # (16) Return result to caller of func

@decorator('Goodbye')                  # (5) Put 'Goodbye' on stack
                                       # (6) Do decorator
def func(s):                           # (12) Get new_arg off stack

    return s                           # (13) Return s to result

# (1) Done Reading definitions
print func('Hello')                    # (2) Put 'Hello' on stack
                                       # (3) Put func object on stack
                                       # (4) Do @decorator
                                       # (17) print 'Hello-Goodbye'

# Hello-Goodbye


Is it possible that you mistakenly believe your @decorator() is being
executed at the line "func('Hello')"?

Please add a print statement to your code:

def decorator(d_arg):
    def get_function(function):
        print 'decorator invoked'
        def wrapper(f_arg):
            new_arg = f_arg+'-'+d_arg
            result = function(new_arg)
            return result
        return wrapper
    return get_function

When you run the program, you will see that the message "decorator
invoked" is printed out at the moment when you finish defining:

@decorator('Goodbye')
def func(s):
    return s

That is, decorator is invoked before you run the line "func('Hello')".

The decorator feature is a metaprogramming feature, not a programming
feature. By metaprogramming I mean you are taking a function/code
object and trying to do something with it (e.g., wrap it). By the
time you finish defining the function "func(s)", the decorator
"get_function()" was already invoked and will never be invoked again.

It's better to view functions as individual objects, and to think about
who holds references to these objects. If no one holds a reference to an
object, it will be garbage collected and will be gone. After you define
the function "func()" and before you execute "func('Hello')", this is
the situation:

decorator() <--- held by the module
get_function() <--- temporary object, garbage collected
wrapper() <--- held by the module, under the name "func"
func() <--- held by wrapper(), under the name "function"

'Goodbye' <--- string object, held by the wrapper function object,
under the name d_arg

Objects can be rebound to different names. In your code you have
rebound the original wrapper() and func() function objects to different
names.
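
A minimal sketch of that rebinding, using the names from the code above
(the attribute check at the end is only for illustration):

def decorator(d_arg):
    def get_function(function):
        def wrapper(f_arg):
            return function(f_arg + '-' + d_arg)
        return wrapper
    return get_function

@decorator('Goodbye')
def func(s):
    return s

# The module-level name "func" is now bound to the wrapper object;
# the original func is only reachable through wrapper's closure.
print func.func_name    # prints: wrapper
print func('Hello')     # prints: Hello-Goodbye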

I think the confusing part is that, for function name binding, Python
does not use the = operator, but instead relies on the "def" keyword.
Maybe this is something to be considered for Python 3K. Anonymous
function or codeblock objects are good to have, when you are doing
metaprogramming.

Jul 18 '05 #2

On 2 Apr 2005 07:22:39 -0800, "El Pitonero" <pi******@gmail.com>
wrote:
Is it possible that you mistakenly believe your @decorator() is being
executed at the line "func('Hello')"?

Please add a print statement to your code:

def decorator(d_arg):
    def get_function(function):
        print 'decorator invoked'
        def wrapper(f_arg):
            new_arg = f_arg+'-'+d_arg
            result = function(new_arg)
            return result
        return wrapper
    return get_function


Thanks, you are correct. I'll post a revised dissection with print
statements documenting the flow in a few minutes. I'm still a bit
fuzzy on how the arguments are stored and passed.

Regards,
Ron_Adam

Jul 18 '05 #3

Ron_Adam wrote:
def decorator(d_arg):                  # (7) Get 'Goodbye' off stack

    def get_function(function):        # (8) Get func object off stack

        def wrapper(f_arg):            # (9) Get 'Hello' off stack

            new_arg = f_arg+'-'+d_arg
            result = function(new_arg) # (10) Put new_arg on stack
                                       # (11) Call func object

            return result              # (14) Return result to wrapper

        return wrapper                 # (15) Return result to get_function

    return get_function                # (16) Return result to caller of func

@decorator('Goodbye')                  # (5) Put 'Goodbye' on stack
                                       # (6) Do decorator
def func(s):                           # (12) Get new_arg off stack

    return s                           # (13) Return s to result


There is actually nothing mysterious about decorators. It is nothing
more than ordinary function composition, executed when the decorated
function is defined. In the case of your definition, the composition
rule is:

decorator("Goodbye")(func)(s) = get_function(func)(s) = wrapper(s),
where wrapper stores "Goodbye" in the local d_arg.
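
Unrolled into separate statements, the same composition reads as follows
(a small sketch, assuming decorator is defined as in the quoted code and
func is the plain, undecorated function):

def func(s):
    return s

g = decorator('Goodbye')    # a function of Args, returning get_function
w = g(func)                 # Func -> Func: returns wrapper
print w('Hello')            # prints: Hello-Goodbye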

Or a bit more formally we state the composition principle:

Args x Func -> Func, where decorator() is a function of Args, that
returns a function Func -> Func. As Guido had shown recently in his
Artima blog, Func need not be an instance of an ordinary function but
can be a function-object like his MultiMethod :

http://www.artima.com/weblogs/viewpo...?thread=101605

It is also possible to extend this view by "chaining" decorators.

decorator : Args(2) x (Args(1) x Func -> Func) -> Func.

To understand decorator chains it is very helpful to accept the
functional view instead of arguing in a procedural picture, i.e. pushing
and popping arguments onto and from the stack.

Someone asked once for a solution to the following problem, which is
similar in character to Guido's multimethod but somewhat more general.

def mul(m1,m2):
    def default(m1,m2):
        return "default",1+m1*m2
    def mul_dec(m1,m2):
        return "mul_dec",Decimal(str(m1))*Decimal(str(m2))
    def mul_float(m1,m2):
        return "mul_float",m1*m2
    return (default,mul_dec,mul_float)

The function mul defines the inner functions default, mul_dec and
mul_float. What we want is unified access to these functions by means of
mul. Guido's solution would decompose mul into three different versions
of mul:

@multimethod(int,float)
def mul(m1,m2):
    return m1*m2

@multimethod(float,float)
def mul(m1,m2):
    return m1*m2

@multimethod(Decimal,Decimal)
def mul(m1,m2):
    return m1*m2

but it is hard to tell what should be done if no argument tuple
matches.

An attempt like:

@multimethod(object,object)
def mul(m1,m2):
    return 1+m1*m2

would be useless, because there is no concrete match of argument types
onto (object,object).

So I introduced an "external switch" over argument tuples, using a
decorator chain:

@case(None,"default")
@case((float,float),'mul_float')
@case((int,float),'mul_float')
@case((Decimal,Decimal),'mul_dec')

def mul(m1,m2):
def default(m1,m2):
return "default",1+m1*m2
def mul_dec(m1,m2):
return "mul_dec",Decimal(str(m1))*Decimal(str(m2))
def mul_float(m1,m2):
return "mul_float",m1*m2
return (default,mul_dec,mul_float)

Can you imagine how "case" works internally?
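
One way "case" could plausibly work internally -- this is only a rough
sketch of the idea, not necessarily the actual implementation: each
@case records a (types, name) rule, and the outermost wrapper dispatches
by calling the undecorated mul once to fetch its inner functions, then
picking one by name.

def case(types, name):
    def deco(f):
        if hasattr(f, '_cases'):
            # f is already a dispatcher built by an inner @case:
            # just register one more rule on it
            f._cases.append((types, name))
            return f
        table = [(types, name)]
        def dispatcher(*args):
            # call the undecorated function once to get its inner functions
            inner = dict((g.func_name, g) for g in f(*args))
            argtypes = tuple(type(a) for a in args)
            for t, n in table:
                if t == argtypes:
                    return inner[n](*args)
            for t, n in table:          # no exact match: use the None entry
                if t is None:
                    return inner[n](*args)
            raise TypeError("no case matches %r" % (argtypes,))
        dispatcher._cases = table
        return dispatcher
    return deco

With this sketch, mul(2.5, 3.0) would select mul_float and mul(2, 2)
would fall through to the "default" entry.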

Regards,
Kay

Jul 18 '05 #4

> statements documenting the flow in a few minutes. I'm still a bit
fuzzy on how the arguments are stored and passed.


The arguments are part of the outer scope of the function returned, and thus
they are kept around. That's standard Python, too:

def foo():
    a = 10
    def bar():
        return a*a
    return bar

print foo()()
No decorator-specific magic here - just references kept to outer frames
which form the scope for the inner function.
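
To make those kept references visible, here is a tiny variation: each
call to foo produces a separate inner function with its own captured
value.

def foo(a):
    def bar():
        return a*a
    return bar

b10 = foo(10)
b3 = foo(3)
print b10(), b3()   # prints: 100 9 -- each bar keeps its own 'a'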

--
Regards,

Diez B. Roggisch
Jul 18 '05 #5

On Sat, 02 Apr 2005 19:59:30 +0200, "Diez B. Roggisch"
<de*********@web.de> wrote:
statements documenting the flow in a few minutes. I'm still a bit
fuzzy on how the arguments are stored and passed.


The arguments are part of the outer scope of the function returned, and thus
they are kept around. That's standard Python, too:

def foo():
    a = 10
    def bar():
        return a*a
    return bar

print foo()()
No decorator-specific magic here - just references kept to outer frames
which form the scope for the inner function.


I followed that part. The part that I'm having problems with is that the
first nested function gets the argument for the function name without
a previous reference to the argument name in the outer frames. So, a
function call to it is being made with the function name as the
argument, and that isn't visible, so it looks as if it's magic.

Ok, Since I was using the wrong model the first time, probably due to
not sleeping well and mixing past language experience in improperly,
we will try again.

In the below model, the @decorator, (object or the interpreter
executing the @decorator statement?), calls nested functions in the
function of the same name until it reaches the inner loop which is
then attached to the function name. Is this correct now?

Cheers,
Ron
### Decorator Dissection V.2 ###

print "\n(0) Start reading decorator defs"
def decorator(d_arg):
print "(3) decorator: gets '"+d_arg+"'"

def get_function(function):
print "(6) get_function: gets 'func' object"

def wrapper(f_arg):
print "(10) wrapper: gets '"+f_arg+"'"
new_arg = f_arg+'-'+d_arg

print "(11) wrapper: calls func('"+new_arg+"')"
result = function(new_arg)

print "(13) wrapper: returns '"+result+"'"
return result

print "(7) get_function: returns 'wrapper' object"
return wrapper

w = get_function
print "(4) decorator: return 'get_function' object"
print '(5) @decorator: calls get_function(func)'
# Need to print this here, done at *(5)
return w

print "(1) Done reading decorator defs\n"
print "(2) @decorator: calls decorator('goodbye')"
# *(5) @decorator: call get_funtion(func)
@decorator('Goodbye')
def func(s):
print '(12) func returns:', s
return s
print "(8) @decorator: func = wrapper\n"
print "(9) Call func('Hello') which is now wrapper object:"
result = func('Hello')
print "(14) result gets '"+result+"'\n"

print result
#---output---

(0) Start reading decorator defs
(1) Done reading decorator defs

(2) @decorator: calls decorator('Goodbye')
(3) decorator: gets 'Goodbye'
(4) decorator: return 'get_function' object
(5) @decorator: calls get_function(func)
(6) get_function: gets 'func' object
(7) get_function: returns 'wrapper' object
(8) @decorator: func = wrapper

(9) Call func('Hello') which is now wrapper object:
(10) wrapper: gets 'Hello'
(11) wrapper: calls func('Hello-Goodbye')
(12) func returns: Hello-Goodbye
(13) wrapper: returns 'Hello-Goodbye'
(14) result gets 'Hello-Goodbye'

Hello-Goodbye
Jul 18 '05 #6

> I followed that part. The part that I'm having problems with is the
first nested function get's the argument for the function name without
a previous reference to the argument name in the outer frames. So, a
function call to it is being made with the function name as the
argument, and that isn't visable so it looks as if it's magic.


No, it's not - but I stepped into that trap before - and thought it was magic :)

The trick is to know that

- a decorator is a callable
- gets passed a callable
- has to return a callable

So the simplest decorator imaginable is:

def identity(f):
    return f

And the decorator _syntax_ is just a Python expression that has to be
_evaluated_ to yield a decorator. So

@identity
def foo(self):
    pass

the @identity is just the expression evaluated - to the function reference
to identity, which is callable and follows the decorator protocol - and the
_result_ of that evaluation is called with the callable in question.

So if you want to have _parametrized_ decorators, that expression is
_evaluated_ and has to yield a decorator. Like this:

def arg_decorator(arg):
    def real_decorator(f):
        return f
    return real_decorator

So, this works

@arg_decorator('fooobar')
def foo(self):
    pass

@arg_decorator('fooobar') is evaluated to real_decorator (which has a scope
containing arg), and _that_ gets called with foo.
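
Spelled out without the @ syntax, those two steps look like this
(same names as above):

deco = arg_decorator('fooobar')  # step 1: the expression after '@' is evaluated
def foo(self):
    pass
foo = deco(foo)                  # step 2: the result is called with foo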

HTH - bit me the first time too :)
--
Regards,

Diez B. Roggisch
Jul 18 '05 #7

On 2 Apr 2005 08:39:35 -0800, "Kay Schluehr" <ka**********@gmx.net>
wrote:

There is actually nothing mysterious about decorators.
I've heard this quite a few times now, but it *is* quite mysterious if
you are not already familiar with how they work. Or instead of
mysterious, you could say complex, as they can be used in quite
complex ways.

What is missing most with them is some really good documentation. I
got the basic idea and syntax of decorators down right away, but ran
into problems implementing them because, the structure of the
functions being used for the decorators wasn't clear.
It is nothing
more than ordinary function composition, executed when the decorated
function is defined. In case of Your definition it, the composition
rules are:

decorator("Goodbye")(func)(s) = get_function(func)(s) = wrapper(s),
where wrapper stores "Goodbye" in the local d_arg.
It worked as a model, but I mixed in concepts from C stacks and
function calls, which apparently isn't correct. I posted another
model, it should be a bit closer. (with the Subject line spelled
correctly, continue this thread there. ;)
Or a bit more formally we state the composition principle:

Args x Func -> Func, where decorator() is a function of Args, that
returns a function Func -> Func. As Guido had shown recently in his
Artima blog, Func need not be an instance of an ordinary function but
can be a function-object like his MultiMethod :

http://www.artima.com/weblogs/viewpo...?thread=101605
I read this this morning; it was very interesting.
It is also possible to extend this view by "chaining" decorators.

decorator : Args(2) x (Args(1) x Func - > Func ) -> Func.

To understand decorator chains it is very helpfull to accept the
functional view instead of arguing in a procedural picture i.e. pushing
and popping arguments onto and from the stack.
Understanding chains is next on my list. :)
Someone asked once for a solution of the following problem that is
similar in character to Guidos multimethod but some more general.

def mul(m1,m2):
def default(m1,m2):
return "default",1+m1*m2
def mul_dec(m1,m2):
return "mul_dec",Decimal(str(m1))*Decimal(str(m2))
def mul_float(m1,m2):
return "mul_float",m1*m2
return (default,mul_dec,mul_float)

The function mul defines the inner functions default, mul_float and
mul_dec. What we want is a unified access to this functions by means of
mul. Guidos solution would decompose mul in three different versions of
mul:
This is similar to C++'s polymorphism, which I played with nearly 10
years ago. I generally found it useful only in small doses even then.
I seem to think now that C++'s version of it was implemented at
compile time, with each function call being matched up with the
correct function by the argument types. Whereas Guido's version is
dynamic and handles the situation at run time. I may not be correct
in this, it's been a while.
@multimethod(int,float)
def mul(m1,m2):
return m1*m2

@multimethod(float,float)
def mul(m1,m2):
return m1*m2
@multimethod(Decimal,Decimal)
def mul(m1,m2):
return m1*m2

but it is hard to tell, what should be done if no argument tuple
matches.
It could then invoke the adapt() function to determine if a possible
single way to continue is available. But with that you could run into
some very subtle bugs. Or just annoying Windows-like behavior, such
as a word processor auto-correcting a word when you don't want it to.
An attempt like:

@multimethod(object,object)
def mul(m1,m2):
return 1+m1*m2

would be useless, because there is no concrete match of argument types
onto (object,object).

So I introduced an "external switch" over argument tuples, using a
decorator chain:

@case(None,"default")
@case((float,float),'mul_float')
@case((int,float),'mul_float')
@case((Decimal,Decimal),'mul_dec')

def mul(m1,m2):
def default(m1,m2):
return "default",1+m1*m2
def mul_dec(m1,m2):
return "mul_dec",Decimal(str(m1))*Decimal(str(m2))
def mul_float(m1,m2):
return "mul_float",m1*m2
return (default,mul_dec,mul_float)

Can You imagine how "case" works internally?

Regards,
Kay


Sure, that should be fairly straightforward. Although I can imagine
several ways of implementing it at the moment. I think after I play
with decorator chains, one way will probably stand out as being
cleaner than the others.

Cheers,
Ron

Jul 18 '05 #8

Ron_Adam wrote:
On 2 Apr 2005 08:39:35 -0800, "Kay Schluehr" <ka**********@gmx.net>
wrote:
There is actually nothing mysterious about decorators.


I've heard this quite a few times now, but *is* quite mysterious if
you are not already familiar with how they work. Or instead of
mysterious, you could say complex, as they can be used in quite
complex ways.


If the syntax were like:

decorator = function(d_arg) {
    return function(f) {
        return function(f_arg) {
            new_arg = f_arg+'-'+d_arg;
            return f(new_arg);
        }
    }
}

func = decorator('Goodbye') function(s) {
    return s;
}

Would you think it would be more easily understandable? Here,
"function()" is a metafunction (or function factory) whose role is to
manufacture a function given a parameter spec and a code body. And in
the expression

func = decorator('Goodbye')(function(s){return s;})

one pair of outer parentheses has been omitted. Sure, it's not as
readable as Python's "def", but with today's syntax highlighters, the
special word "function" can be highlighted easily.

If the decorator does not have parameters, one has:

func = decorator function(s) {
    ....
}

or in the general case:

func = deco1 deco2 deco3 function(s) {
    ....
}

Jul 18 '05 #9

On Sat, 02 Apr 2005 21:04:57 +0200, "Diez B. Roggisch"
<de*********@web.de> wrote:
I followed that part. The part that I'm having problems with is the
first nested function get's the argument for the function name without
a previous reference to the argument name in the outer frames. So, a
function call to it is being made with the function name as the
argument, and that isn't visable so it looks as if it's magic.
No, its not - but I stepped into that trap before - and thought its magic :)


It's magic until we understand it. ;)

I get the feeling that those who have gotten to know decorators find
them easy, and those who haven't find them nearly impossible to
understand. Which means there is a fairly large first few steps to
get over, then it gets easy. There *are* some underlying processes
at work, which is also the reason that makes them attractive. Less
typing/declaring/organizing/etc... but that is also what causes the
difficulty in understanding and using them at first.
The trick is to know that

- a decorator is a callable
- get passed a callable
- has to return a callable

So this is the simplest decorator imaginable is:

def identity(f):
return f

And the decorator _syntax_ is just a python expression that has to be
_evaluated_ to a yield decorator. So

@identity
def foo(self):
pass
This much I understand.
the @identity is just the expression evaluated - to the function reference
to identity, which is callable and follows the decorator protocol - and the
_result_ of that evaluation is called with the callable in question.
This tells me what it is, and what it does, but not how it works. How
is the ***expression evaluated***, what is the ***decorator
protocol***.

Those are the parts I'm trying to understand at this point. I know
this is the equivalent of looking behind the curtains to reveal the
little man who is the wizard. But I can't resist. :)
So if you want to have _parametrized_ decorators, that expression is
_evaluated_ and has to yield a decorator. Like this:

There's that word again... **evaluated**. How?
def arg_decorator(arg):
    def real_decorator(f):
        return f
    return real_decorator

So, this works

@arg_decorator('fooobar')
def foo(self):
    pass

@arg_decorator('fooobar') is evaluated to real_decorator (which a scope
containing arg), and _that_ gets called with foo.

So if I'm following you right?

When the interpreter gets to the line @arg_decorator('fooobar')

it does the following?

foo = arg_decorator('fooobar')(foo)() #?

(experiment with idle a bit...)

Ok I got it. :)

I wasn't aware that the form:

result = function(args)(args)

Was a legal python statement.

So python has a built in mechanism for passing multiple argument sets
to nested defined functions! (click) Which means this is a decorator
without the decorator syntax.

def arg_decorator(arg1):
    def real_decorator(function):
        def wrapper(arg2):
            return function(arg2)
        return wrapper
    return real_decorator

def foo(arg2):
    pass

foo = arg_decorator('fooobar')(foo)(arg2)

The apparent magic is the silent passing of the second two arguments.

So this isn't a decorator question any more. Each argument gets
passed to the next inner defined function, via... a stack(?) ;)

Somehow I think I've completed a circle. LOL

Cheers,
Ron

HTH - bit me the first time too :)


Jul 18 '05 #10

On Sat, 02 Apr 2005 18:39:41 GMT, Ron_Adam <ra****@tampabay.rr.com>
wrote:
def foo():
a = 10
def bar():
return a*a
return bar

print foo()() <--------------- *Here*
No decorator-specific magic here - just references kept to outer frames
which form the scope for the inner function.


Thanks Kay, I wasn't aware of Python's ability to pass arguments to
nested functions in this way. I missed it the first time.

Cheers,
Ron
Jul 18 '05 #11

On Sat, 02 Apr 2005 14:29:08 GMT, Ron_Adam <ra****@tampabay.rr.com> wrote:

I was having some difficulty figuring out just what was going on with
decorators. So after a considerable amount of experimenting I was
able to take one apart in a way. It required me to take a closer look
at function def's and call's, which is something I tend to take for
granted.
I think it might help you to start out with very plain decorators rather than
decorators as factory functions that return decorator functions that wrap the
decorated function in a wrapper function. E.g., (this could obviously be
parameterized as a single decorator factory, but I wanted to show the simplest level
of decorator functionality)
>>> def decoa(f):
...     f.decstr = getattr(f, 'decstr', '') + 'a'
...     return f
...
>>> def decob(f):
...     f.decstr = getattr(f, 'decstr', '') + 'b'
...     return f
...
>>> def decoc(f):
...     f.decstr = getattr(f, 'decstr', '') + 'c'
...     return f
...
>>> @decoa
... @decoc
... @decob
... @decob
... def foo(): pass
...
>>> foo.decstr
'bbca'

I.e.,

@decoa
@decoc
def foo(): pass

is almost[1] exactly equal to (note calling order c then a)

def foo(): pass
foo = decoc(foo)
foo = decoa(foo)

[1] One difference is that foo = deco(foo) is a RE-binding of foo,
and the binding wouldn't happen at all if the @deco version
raised an exception in deco. E.g.,
>>> def deco(f): raise NotImplementedError
...

foo not yet defined:

>>> foo
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
NameError: name 'foo' is not defined

Try the bad decorator:

>>> @deco
... def foo(): pass
...
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "<stdin>", line 1, in deco
NotImplementedError

No go, and foo still undefined:

>>> foo
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
NameError: name 'foo' is not defined

But the other way around:

bar undefined to start:

>>> bar
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
NameError: name 'bar' is not defined

Define it successfully:

>>> def bar(): pass
...
>>> bar
<function bar at 0x02EE8B54>

Try to decorate the old-fashioned way:

>>> bar = deco(bar)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "<stdin>", line 1, in deco
NotImplementedError
>>> bar
<function bar at 0x02EE8B54>

Still there, defined as before (well, strictly speaking, not
necessarily as before: with bar already defined, deco could
have messed with the existing bar and THEN raised the exception).
Which would also happen with @deco, just that the new binding to bar
wouldn't happen.
>>> def decobomb(f):
...     f.bomb = 'bombed'
...     raise Exception, 'Did it bomb the existing function?'
...
>>> def foo(): pass
...
>>> vars(foo).keys()
[]
>>> @decobomb
... def foo(): pass
...
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "<stdin>", line 3, in decobomb
Exception: Did it bomb the existing function?
>>> vars(foo).keys()
[]

Nope, seems that was a new foo that got bombed and then not
bound to replace the old foo.

>>> foo.bomb
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
AttributeError: 'function' object has no attribute 'bomb'

I'm not sure this is 100%, or if there are other ways to view it, but
it seems to make sense when viewed this way.

I like annotated code walk-throughs. But as others have pointed out,
it's still a bit buggy ;-)

Regards,
Bengt Richter
Jul 18 '05 #12

On Sat, 02 Apr 2005 21:04:57 +0200, "Diez B. Roggisch" <de*********@web.de> wrote:
I followed that part. The part that I'm having problems with is the
first nested function get's the argument for the function name without
a previous reference to the argument name in the outer frames. So, a
function call to it is being made with the function name as the
argument, and that isn't visable so it looks as if it's magic.
No, its not - but I stepped into that trap before - and thought its magic :)

The trick is to know that

- a decorator is a callable

Strictly speaking, UIAM "deco" in @deco is a non-general
very-limited-syntax expression that should evaluate to a callable.

- get passed a callable
- has to return a callable


That last is a "should" ;-) Actually, you can abuse the def just
to provide a binding name whose associated function is otherwise ignored
(other than syntax check and compilation ), e.g.,
>>> def maverick(f): return 'maverick'
...
>>> @maverick
... def foo(): pass
...
>>> foo
'maverick'

Or
>>> @str
... def foo(): pass
...
>>> foo
'<function foo at 0x02EE8D14>'

Note the quotes ;-)

>>> type(foo)
<type 'str'>

Just being picky about absolute statements ;-)

Regards,
Bengt Richter
Jul 18 '05 #13

Hello Ron,
You have many good explanations already, but I thought that this
__might__ help others.
Like you I was confused by the decorator syntax, till I realized it was
shorthand for ...

def identity(f):
    return f

def foo():
    pass

# this is the 'old way'
foo = identity(foo)

It just rebinds foo to the return value of the decorator function.
With the new syntax it becomes.

def identity(f):
    return f

@identity
def foo(self):
    pass
This is the same as above, but now the function is
passed and rebound behind the scenes.
Also note that decorators don't have to be nested functions; it really
depends on what you are trying to achieve.

hth,
M.E.Farmer

Jul 18 '05 #14

On Sat, 02 Apr 2005 21:28:36 GMT, bo**@oz.net (Bengt Richter) wrote:
I think it might help you to start out with very plain decorators rather than
decorators as factory functions that return decorator functions that wrap the
decorated function in a wrapper function. E.g., (this could obviously be
parameterized as a single decorator factory, but I wanted to show the simplest level
of decorator functionality)
<clipped interesting examples>

Thanks for the examples of stacked decorators! :-)

I think I pretty much got it now, I had never needed to pass arguments
to nested "defined" functions before and none of the documentation I
have, ever mentioned that alternative.

So I didn't know I could do this:
def foo(a1):
    def fee(a2):
        return a1+a2
    return fee

fum = foo(2)(6) <------ !!!

# fum is 8
The interesting thing about this is the 'return fee' statement gets
the (6) apparently appended to it. So it becomes 'return fee(6).

That subtle action is confusing if you don't already know about it,
which I didn't.

In this example.
def foo(a1):
    def fee(a2):
        return a1+a2
    return fee

fum = foo(2)
There is no second set of arguments to append to 'return fee', so the
name fum is pointed to object fee instead and fee is not evaluated.

This second subtle action is also confusing if you aren't aware of
it, since the two look the same when you examine the def statements.
So there is no reason to think they would not act the same, both
returning a function object.

Now, add in the @decorator syntax to the mix. Which hides the extra
argument sets that are passed to the nested defined functions and the
obscuration is complete. There then is no visual indication of where
the function calls get their arguments from, and this is what I
believe caused me to have so much trouble with this.

Another inconsistency, although not a bad one, is that nested
'defined' functions share scope, but nested function calls do not.

Now what this means is it will be very difficult for some people to
put it all together. I would have gotten it sooner or later, but I'm
really happy to have help from comp.lang.python on this one. :)

I like annotated code walk-throughs. But as others have pointed out,
it's still a bit buggy ;-)


It helped a lot, but notice that it took me several tries. That's a
strong indicator that decorators are more implicit than explicit and
that goes against the "Explicit is better than Implicit" guideline
that python tries to follow.

Maybe there are ways to make decorators -and- nested function calls a
bit more explicit?

I think having indicators on the return statements that are meant to
return a value vs an object would help readability and take some of the
mystery out as far as the uninitiated are concerned.

def foo(a1):
    def fee(a2):
        def fiddle(a3):
            pass
        return a3

    return fee      # Always return a function object.
                    # Error, if argument is passed to it.
    # and

    return fee(a2)  # always require an argument,
                    # error if none is passed to it.

Or some other way if this breaks something. But it will make it more
apparent what nested function should do. And give clearer feed back
when trying to use or write decorators.

I'm not sure what might make @decorator more explicit. Maybe allowing
all the function to be specified as an option. Maybe it is already(?)

@decorator(a1)(foo)
def foo():
    pass
So we will have:

def foo(a1):
    def fee(a2):
        def fiddle(a3):
            pass
        return a3
    return fee      # Object always returned here or
                    # or error if argument is received.

@decorator(a1)(fum) # Last argument optional.
def fum(a3):
    return a3

These I think are small changes that might be acceptable.

A little more aggressive alterations would be: Requiring the
'function' argument may have a use when using stacked decorators. Then
it could be inserted into a sequence?

@deco3(postcalc)
@deco2(fum)
@deco1(precalc)
def fum(pointxyz):
    return translatepoint(pointxyz)

.... and that reversed order... (yuck!), is it really necessary?
Readability is important, and it is a big reason people don't jump
ship for some other language. Why the exceptions here?

Ok, don't mean to gripe. :-) I'm sure there's been plenty of that in
past discussions.

Cheers,
Ron

Jul 18 '05 #15

Ron_Adam wrote:

So I didn't know I could do this:

def foo(a1):
def fee(a2):
return a1+a2
return fee

fum = foo(2)(6) <------ !!!


Ah, so you did not know functions are objects just like numbers,
strings or dictionaries. I think you may have been influenced by other
languages where there is a concept of static declaration of functions.

The last line can be better visualized as:

fum = (foo(2)) (6)

where foo(2) is a callable.
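
Breaking that expression into two statements makes the sequence explicit:

partial = foo(2)    # foo(2) returns the inner function fee, with a1 bound to 2
fum = partial(6)    # calling the returned function with 6
print fum           # prints: 8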

-----------

Since functions are objects, they can be assigned (rebound) to other
names, passed as parameters to other functions, returned as values
inside other functions, etc. E.g.:

def g(x):
    return x+3

h = g       # <-- have you done this before? assignment of function

print h(1)  # prints 4

def f(p):
    return p    # <-- function as return value

p = f(h)    # <-- passing a function object

print p(5)  # prints 8

Python's use of "def" keyword instead of the "=" assignment operator
makes it less clear that functions are indeed objects. As I said
before, this is something to think about for Python 3K (the future
version of Python.)

------------

Function modifiers exist in other languages. Java particularly is
loaded with them.

public static synchronized double random() {
    ....
}

So your new syntax:

@decorator(a1)(foo)
def foo():
    pass

is a bit out of line with other languages.

Jul 18 '05 #16

On 2 Apr 2005 20:02:47 -0800, "El Pitonero" <pi******@gmail.com>
wrote:
Ron_Adam wrote:

So I didn't know I could do this:

def foo(a1):
def fee(a2):
return a1+a2
return fee

fum = foo(2)(6) <------ !!!
Ah, so you did not know functions are objects just like numbers,
strings or dictionaries. I think you may have been influenced by other
languages where there is a concept of static declaration of functions.


No, I did not know that you could pass multiple sets of arguments to
nested defined functions in that manner. I just haven't run across it
in the two years I've been playing around with Python. I haven't had a
reason to try it either. But maybe now that I'm aware of it, I'll
find more uses for it.
The last line can be better visualized as:

fum = (foo(2)) (6)

where foo(2) is a callable.

-----------

Since a function is an object, they can be assigned (rebound) to other
names, pass as parameters to other functions, returned as a value
inside another function, etc. E.g.:

def g(x):
return x+3

h = g # <-- have you done this before? assignment of function
Sure, I have no problem with that. Been doing it for quite a while. :)
print h(1) # prints 4

def f(p):
return p # <-- function as return value

p = f(h) # <-- passing a function object

print p(5) # prints 8

Python's use of "def" keyword instead of the "=" assignment operator
makes it less clear that functions are indeed objects. As I said
before, this is something to think about for Python 3K (the future
version of Python.)
I've always equated 'def' as if it were 'make', or in Python its just
a variation of 'class' for a subset of objects of type 'function'.
------------

Function modifiers exist in other languages. Java particularly is
loaded with them.

public static synchronized double random() {
...
}

So your new syntax:

@decorator(a1)(foo)
def foo():
pass

is a bit out of the line with other languages.


So? Why would it need to be the same as other languages? I like
Python because it's not the same. :)

The above syntax suggestion just matches the already existing
behavior.

Thanks for helping BTW, I think I have it down pretty good now.

Cheers,
Ron
Jul 18 '05 #17

Ron_Adam wrote:
I wasn't aware that the form:

result = function(args)(args)

Was a legal python statement.

So python has a built in mechanism for passing multiple argument sets
to nested defined functions! (click) Which means this is a decorator
without the decorator syntax.
No. There is no mechanism for passing multiple argument sets to
nested functions. Instead, functions are objects, which can be
assigned to variables, passed as arguments to other functions,
and returned:
>>> len
<built-in function len>
>>> other = len
>>> other([1,2,3])
3
>>> other
<built-in function len>

Here, len is *an object* that gets *assigned to* the variable other.
The grammatical construct

<something>(<list of something>)

is evaluated as follows:

1. evaluate <something>. Evaluation always returns an object,
be it 1+2, other, or f(x).
2. evaluate <list of something>, from left to right.
3. call the object returned in step 1, with the arguments
computed in step 2.

Because the something before the parentheses can be any expression,
you can also write things like
>>> items = []
>>> items.append(len)
>>> items[0](range(10))
10

Functions as parameters and return values are not much different:

>>> def identity(x):
...     return x
...
>>> identity(len)(range(10))
10

Now, a nested function is locally defined, but then the function
is returned
>>> def getlenfun():
...     def lenfun(o):
...         return 2*len(o)
...     return lenfun
...
>>> getlenfun()(range(10))
20

Here, first getlenfun() is evaluated, returning a function lenfun.
Then, range(10) is evaluated, returning a list. Then, lenfun is
invoked with this list.

So far, this has nothing to do with decorators.
So this isn't a decorator question any more. Each argument gets
passed to the next inner defined function, via... a stack(?) ;)


No, functions are objects. Notice that in step 1, the object returned
doesn't have to be a function - other things are callable, too, like
types, classes, and objects implementing __call__.

Regards,
Martin
Jul 18 '05 #18

Ron_Adam wrote:
Ah, so you did not know functions are objects just like numbers,
strings or dictionaries. I think you may have been influenced by other
languages where there is a concept of static declaration of functions.

No, I did not know that you could pass multiple sets of arguments to
nested defined functions in that manner.


Please read the statements carefully, and try to understand the mental
model behind them. He did not say that you can pass around multiple
sets of arguments. He said that functions (not function calls, but
the functions themselves) are objects just like numbers. There is
a way of "truly" understanding this notion, and I would encourage
you to try doing so.

Regards,
Martin
Jul 18 '05 #19

On Sun, 03 Apr 2005 05:09:07 GMT, Ron_Adam <ra****@tampabay.rr.com> wrote:
On 2 Apr 2005 20:02:47 -0800, "El Pitonero" <pi******@gmail.com>
wrote:
Ron_Adam wrote:

So I didn't know I could do this:

def foo(a1):
def fee(a2):
return a1+a2
return fee

fum = foo(2)(6) <------ !!!
Ah, so you did not know functions are objects just like numbers,
strings or dictionaries. I think you may have been influenced by other
languages where there is a concept of static declaration of functions.


No, I did not know that you could pass multiple sets of arguments to

That phraseology doesn't sound to me like your concept space is quite isomorphic
with reality yet, sorry ;-) It sounds like you are thinking of "multiple sets of arguments"
as an aggregate that is passed as such, and that isn't happening, as I believe El Pitonero
is trying to indicate with his parenthesized visualization below.

What is happening is that an expression "foo(2)(6)" is being evaluated left to right.
First foo as a name evaluates to whatever it is bound to, which is the foo function.
Then () is the calling operator, which says evaluate the list inside the parens left to right
and call the thing you had so far, which was foo here. The arg list was just 2, so foo is called
with 2, and foo returns something, with which we will do the next operation if there is one.

If the next operation was "." (i.e., attribute getting) the next thing following would have had
to be an attribute name, e.g. like func_name. foo(2).func_name would evaluate to the same as fee.func_name
for the fee returned by foo(2). But we are not doing .func_name, we are doing (6) as the next operation
in the left-to-right evaluation of the expression. And whatever we have at the foo(2) stage, the (6) means
we should take it and call it with 6 as an argument.

So if you are seeing (2)(6) as something to pass, as opposed to a sequence of operations, I think there's
a misconception involved. Perhaps I am taking your words askew ;-)
nested defined functions in that manner. Just haven't ran acrossed it
in the two years I've been playing around with python. I haven't had a
reason to try it either. But maybe now that I'm aware of it, I'll
find more uses for it.
The last line can be better visualized as:

fum = (foo(2)) (6)

where foo(2) is a callable.
That's clear to me, anyway ;-)

The code shows it too:
>>> import dis, compiler
>>> dis.dis(compiler.compile('foo(2)','','eval'))
  1           0 LOAD_NAME                0 (foo)
              3 LOAD_CONST               1 (2)
              6 CALL_FUNCTION            1
              9 RETURN_VALUE

The (6) just calls whatever the result of the preceding was
>>> dis.dis(compiler.compile('foo(2)(6)','','eval'))
  1           0 LOAD_NAME                0 (foo)
              3 LOAD_CONST               1 (2)
              6 CALL_FUNCTION            1
              9 LOAD_CONST               2 (6)
             12 CALL_FUNCTION            1
             15 RETURN_VALUE

HTH

Regards,
Bengt Richter
Jul 18 '05 #20

Martin v. Löwis wrote:
Ron_Adam wrote:

No, I did not know that you could pass multiple sets of arguments to nested defined functions in that manner.
Please read the statements carefully, and try to understand the

mental model behind them. He did not say that you can pass around multiple
sets of arguments. He said that functions (not function calls, but
the functions themselves) are objects just like numbers. There is
a way of "truly" understanding this notion, and I would encourage
you to try doing so.


I have the same feeling as Martin and Bengt. That is, Ron you are still
not getting the correct picture. The fact that you have three-level
nested definition of functions is almost incidental: that's not the
important part (despite the nested scope variables.) The important part
is that you have to understand functions are objects.

Perhaps this will make you think a bit more:

x = 1

if x == 1:
    def f(): return 'Hello'
else:
    def f(): return 'Bye'

for x in range(3):
    def f(x=x):
        return x

Do you realize that I have introduced 5 function objects in the above
code? Do you realize that function objects could be created *anywhere*
you can write a Python statement? Whether it's inside another function,
or inside a if...else... statement, or inside a loop, doesn't matter.
Whereever you can write a Python statement, you can create a function
there. I don't know what your previous programming language is, but you
have to stop treating functions as "declarations". The "def" is an
executable statement.

Another example:

def f():
    return f

g = f()()()()()()()()()()()

is perfectly valid.

Jul 18 '05 #21

On Sun, 03 Apr 2005 08:37:02 +0200, "Martin v. Löwis"
<ma****@v.loewis.de> wrote:
Ron_Adam wrote:
Ah, so you did not know functions are objects just like numbers,
strings or dictionaries. I think you may have been influenced by other
languages where there is a concept of static declaration of functions.

No, I did not know that you could pass multiple sets of arguments to
nested defined functions in that manner.


Please read the statements carefully, and try to understand the mental
model behind them. He did not say that you can pass around multiple
sets of arguments. He said that functions (not function calls, but
the functions themselves) are objects just like numbers. There is
a way of "truly" understanding this notion, and I would encourage
you to try doing so.


Hello Martin,

It is interesting how sometimes what we already know, and a new
situation presented in an indirect way, can lead us to viewing an
isolated situation in a biased way.

That's pretty much the situation I've experienced here with this one
point. I already knew that functions are objects, and objects can be
passed around. My mind just wasn't clicking on this particular set of
conditions for some reason, probably because I was looking too closely
at the problem.

(Starting off as a tech, with knowledge of how microchips work, can
sometimes be an obstacle when programming in high-level languages.)

I'm sure I'm not the only one who's had difficulties with this. But
I'm somewhat disappointed in myself for not grasping the concept as it
is, in this particular context, a bit sooner.

Cheers,
Ron

Regards,
Martin


Jul 18 '05 #22

On Sun, 03 Apr 2005 07:53:07 GMT, bo**@oz.net (Bengt Richter) wrote:

No, I did not know that you could pass multiple sets of arguments to
That phraseology doesn't sound to me like your concept space is quite isomorphic
with reality yet, sorry ;-)
You'll be happy to know, my conceptual conceptions are conclusively
isomorphic this morning. :-)
It sounds like you are thinking of "multiple sets of arguments"
as an aggregate that is passed as such, and that isn't happening, as I believe El Pitonero
is trying to indicate with his parenthesized visualization below.
Well there are multiple sets of arguments, and there are multiple
functions involved. It's just a matter of how they get matched up.
Depending on what level you look at it, it could be both ways. But the
correct way to view it is in the context of the language itself, and
not the underlying byte code, C++, or assembly code.
What is happening is that an expression "foo(2)(6)" is being evaluated left to right.
First foo as a name evaluates to whatever it is bound to, which is the foo function.
Then () is the calling operator, which says evaluate the list inside the parens left to right
and call the thing you had so far, which was foo here. The arg list was just 2, so foo is called
with 2, and foo returns something, with which we will do the next operation if there is one.
Like this of course:

def foo(x):
    def fee(y):
        return y*x
    return fee

statement: z = foo(2)(6)
becomes: z = fee(6)
becomes: z = 12

The position of the 'def fee' inside of 'def foo' isn't relevant, it's
only needed there so it can have access to foo's name space. It could
be at the top or bottom of the function it is in, and it wouldn't make
a difference.

This would be the same without the nesting:

def foo(xx):
    global x
    x = xx
    return fee

def fee(y):
    global x
    return y*x

z = foo(2)(6)

So if you are seeing (2)(6) as something to pass, as opposed to a sequence of operations, I think there's
a misconception involved. Perhaps I am taking your words askew ;-)
It's not entirely a misconception. Let's see where this goes...
>>> dis.dis(compiler.compile('foo(2)(6)','','eval'))
  1           0 LOAD_NAME                0 (foo)
              3 LOAD_CONST               1 (2)
              6 CALL_FUNCTION            1
              9 LOAD_CONST               2 (6)
             12 CALL_FUNCTION            1
             15 RETURN_VALUE


In this example, you have byte code that was compiled from source
code, and then an interpreter running the byte code; which in itself,
is a program written in another language to execute the byte code,
C++; which gets translated into yet another language, assembly; which
at one time would have corresponded to specific hardwired registers
and circuits,(I could go further...ie... translators... PNP...
holes...), but with modern processors, it may yet get translated still
further.

While all of this isn't relevant, it's knowledge in my mind, and
affects my view of programming sometimes.

Now take a look at the following descriptions of the above byte codes
from http://docs.python.org/lib/bytecodes.html
LOAD_NAME namei
Pushes the value associated with "co_names[namei]" onto the stack.

LOAD_CONST consti
Pushes "co_consts[consti]" onto the stack.

CALL_FUNCTION argc
Calls a function. The low byte of argc indicates the number of
positional parameters, the high byte the number of keyword parameters.
On the stack, the opcode finds the keyword parameters first. For each
keyword argument, the value is on top of the key. Below the keyword
parameters, the positional parameters are on the stack, with the
right-most parameter on top. Below the parameters, the function object
to call is on the stack.

RETURN_VALUE
Returns with TOS to the caller of the function.

*TOS = Top Of Stack.

The calling routine, puts (passes) the second set of arguments onto
the stack before calling the function returned on the stack by the
previous call.

Which is exactly how I viewed it when I referred to coming full circle
and the second sets of arguments being passed with a "stack(?)".

Or it could be said equally that the functions (objects) are passed with
the stack. So both views are correct depending on the viewpoint that
is chosen.

Cheers,
Ron

HTH

Regards,
Bengt Richter


Jul 18 '05 #23

On 3 Apr 2005 00:11:22 -0800, "El Pitonero" <pi******@gmail.com>
wrote:
Martin v. Löwis wrote:
Perhaps this will make you think a bit more:
Now my problem is convincing the group I do know it. LOL

Another example:

def f():
    return f

g = f()()()()()()()()()()()

is perfectly valid.


Good example! Yes, I realize it. As I said before I just haven't come
across this particular variation before using decorators so it wasn't
clear to me at first, it is now. :)

Read my reply to Bengt Richter.

Thanks, this has been a very interesting discussion.

Ron

Jul 18 '05 #24

On Sun, 03 Apr 2005 08:32:09 +0200, "Martin v. Löwis"
<ma****@v.loewis.de> wrote:
Ron_Adam wrote:
I wasn't aware that the form:

result = function(args)(args)

Was a legal python statement.

So python has a built in mechanism for passing multiple argument sets
to nested defined functions! (click) Which means this is a decorator
without the decorator syntax.
No. There is no mechanism for passing multiple argument sets to
nested functions. Instead, functions are objects, which can be
assigned to variables, passed as arguments to other functions,
and returned:


Yes there is, it's the stack python uses to interpret the byte code.
But it's the same mechanism that is used for passing arguments to
sequential function calls (objects) also. The only difference is the
next function (object) is returned on the stack in the nested case.
Then the next argument is put onto the stack (passed) before
the next function is called.

How you view this depends on the frame of reference you use, I was
using a different frame of reference, which I wasn't sure was correct
at the time, but turns out is also valid. So both view points are
valid.

In any case, I now have a complete picture of how it works. Inside,
and out. Which was my goal. :)
So this isn't a decorator question any more. Each argument gets
passed to the next inner defined function, via... a stack(?) ;)


No, functions are objects. Notice that in step 1, the object returned
doesn't have to be a function - other things are callable, too, like
types, classes, and objects implementing __call__.


They are objects; which are data structures; containing program code &
data; which reside in memory; and get executed by, in this case, a
byte code interpreter. The interpreter executes the byte code in a
sequential manner, using a *stack* to call functions (objects), along
with their arguments.

For the record, I never had any trouble understanding the concept of
objects. I think I first started programming OOP in the mid '90's with
c++.

It was the sequence of events in the objects of the nested def
functions that I was trying to understand along with where the objects
get their arguments, which isn't obvious because of the levels of
indirect calling.

Thanks for the help Martin, it's always appreciated. :)

Cheers,
Ron

Regards,
Martin


Jul 18 '05 #25

Ron_Adam wrote:
This would be the same without the nesting:

def foo(xx):
global x
x = xx
return fee

def fee(y):
global x
return y*x

z = foo(2)(6)
Actually, it wouldn't.

>>> def foo(xx):
...     global x
...     x = xx
...     return fee
...
>>> def fee(y):
...     global x
...     return y*x
...
>>> z = foo(2)
>>> x = 8
>>> z(6)
48

So the global variable can be changed between the time foo returns
and the time fee is invoked. This is not the same in the nested function
case: the value of x would be bound at the time foo is called. It can
be modified inside foo, but freezes once foo returns.
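
A short sketch of that difference, using the nested version from earlier
in the thread:

def foo(x):
    def fee(y):
        return y*x
    return fee

z = foo(2)
x = 8           # rebinding a module-level x has no effect on the closure
print z(6)      # prints: 12, not 48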
So if you are seeing (2)(6) as something to pass, as opposed to a sequence of operations, I think there's
a misconception involved. Perhaps I am taking your words askew ;-)

It's not entirely a misconception. Lets see where this goes...

dis.dis(compiler.compile('foo(2)(6)','','eval '))


1 0 LOAD_NAME 0 (foo)
3 LOAD_CONST 1 (2)
6 CALL_FUNCTION 1
9 LOAD_CONST 2 (6)
12 CALL_FUNCTION 1
15 RETURN_VALUE


Hmm. If you think that this proves that (2)(6) is being *passed*, you
still might have a misconception. What this really does is:

 0. Put foo on the stack. Stack is [value of foo]
 3. Put 2 on the stack -> [value of foo, 2]
 6. Call a function with one arg, invoking foo(2).
    Put the result of this call back on the stack ->
    [result of foo(2)]
 9. Put 6 on the stack -> [result of foo(2), 6]
12. Call it, computing (result of foo(2))(6).
    Put the result on the stack ->
    [result of (result of foo(2))(6)]
15. Return top-of-stack, yielding foo(2)(6)

So at no point in time, (2)(6) actually exists. Instead,
when the 6 is being put onto the stack, the 2 is already gone.
It computes it one by one, instead of passing multiple sets
of arguments.
While all of this isn't relevant, it's knowledge in my mind, and
effects my view of programming sometimes.
There is nothing wrong with that. However, you really should try
to see what the interpreter actually does, instead of speculation
(of course, asking in a newsgroup is fine).
The calling routine, puts (passes) the second set of arguments onto
the stack before calling the function returned on the stack by the
previous call.
Sure - you need the arguments to a function before being able to
call the function. So there is always a set of arguments on the
stack, which internally indeed gets converted into a tuple right
before calling the function. However, at no point in time, there
are *two* sets of arguments.
Or it could be said equally the functions (objects) are passed with
the stack. So both view are correct depending on the view point that
is chosen.


Maybe I don't understand your view, when you said

# No, I did not know that you could pass multiple sets of arguments to
# nested defined functions in that manner.

However, the way I understood it, it seemed incorrect - there are
no multiple sets of arguments being passed, at least not simultaneously.
It is, of course, possible to pass multiple sets of arguments
sequentially to multiple functions, eg.

a = len(x)
b = len(y)

Regards,
Martin
Jul 18 '05 #26

On Sun, 03 Apr 2005 23:59:51 +0200, "Martin v. Löwis"
<ma****@v.loewis.de> wrote:
Ron_Adam wrote:
This would be the same without the nesting:

def foo(xx):
global x
x = xx
return fee

def fee(y):
global x
return y*x

z = foo(2)(6)


Actually, it wouldn't.


Ok, yes, besides the globals, but I figured that part is obvious so I
didn't feel I needed to mention it. The function call works the same
even though they are not nested functions.

It's not entirely a misconception. Lets see where this goes...

>>dis.dis(compiler.compile('foo(2)(6)','','eva l'))

1 0 LOAD_NAME 0 (foo)
3 LOAD_CONST 1 (2)
6 CALL_FUNCTION 1
9 LOAD_CONST 2 (6)
12 CALL_FUNCTION 1
15 RETURN_VALUE
Hmm. If you think that this proves that (2)(6) is being *passed*, you
still might have a misconception. What this really does is:


I didn't say they were passed at the same time by the stack. It just
shows my reference to *stacks* was correct, and that there is an
underlying mechanism for calling functions and passing arguments and
functions that uses the stack. I however was not yet aware (yesterday
afternoon) of just how the stack worked in this case. This was very
much a figure-it-out-as-you-go exercise.

Yesterday, I had made the incorrect judgement that since the functions
are all nested inside a defined function, I should treat them as
a group instead of individual functions. But that wasn't the correct
way of viewing it. They are in a group in that they share name space,
so I figured, (incorrectly), that they shared an argument list somehow,
and those were passed to the group. The passing of the function and
its arguments silently was a big reason for me jumping to this
conclusion.

So my reference to:
The interesting thing about this is the 'return fee' statement gets
the (6) apparently appended to it. So it becomes 'return fee(6).
Which is not correct, as the order of events is wrong and they do not
share a common argument list.

The correct order is:

return fee
fee(6)

with the fee(6) being evaluated after the return statement is
executed.

Another contributing factor is two days of really poor sleep. Which
probably is a bigger factor than I would like to admit. I really feel
I should have gotten it much sooner. But I did get-it, a little bit
at a time, and had a lot of terrific help along the way. :-)

<clip>
Or it could be said equally the functions (objects) are passed with
the stack. So both view are correct depending on the view point that
is chosen.


Maybe I don't understand your view, when you said

# No, I did not know that you could pass multiple sets of arguments to
# nested defined functions in that manner.


My views have changed as I added the missing pieces to the puzzle
yesterday.

At first I didn't see how they were passed at all, in a group or
otherwise. There wasn't any one-to-one way to match the arguments up
visually like there are in a normal function call.

My next thought was they are passed as a group, to the group of
defined functions that shared the same name space. (Everyone seems to
think I'm stuck on this one.)

My next view, yesterday afternoon, was that they were passed on a stack
somehow, one at a time. This last one is not necessarily incorrect from
a byte code viewpoint, but it's not the best way to view the problem.

Today I believe I have the correct view, as I've said this morning. I
could be wrong yet again. I hope not, though, or I might have to give up
programming. :/

It's interesting that I have had several others tell me they had
trouble with this too.

So it is my opinion that decorators are a little too implicit. I
think there should be a way to make them easier to use while achieving
the same objective and use.
Thanks again for the reply, :)

Cheers,
Ron
Jul 18 '05 #27

P: n/a
On Mon, 04 Apr 2005 02:13:29 GMT, Ron_Adam <ra****@tampabay.rr.com> wrote:
On Sun, 03 Apr 2005 23:59:51 +0200, "Martin v. Löwis"
<ma****@v.loewis.de> wrote:
Ron_Adam wrote:
This would be the same without the nesting:

def foo(xx):
    global x
    x = xx
    return fee

def fee(y):
    global x
    return y*x

z = foo(2)(6)
Actually, it wouldn't.


Ok, yes, besides the globals, but I figured that part is obvious so I
didn't feel I needed to mention it. The function call works the same
even though they are not nested functions.


I am afraid that is wrong. But be happy, this may be the key to what ISTM
is missing in your concept of python functions ;-)

What you don't seem to grok is what the def statement really does.
When you compile a def statement, you get code, but it's not principally the code
that runs when you call the function that is being defined.

When you compile a def statement, you get code that MAKES the function being defined,
from precompiled pieces (one of which is a code object representing the defined function)
and optionally code for evaluating default argument expressions. This becomes part of the
code generated by compiling the def statement. (Default argument value expressions are
the simplest examples).

When you work interactively as above, the interactive loop both compiles and executes
chunks as you go, so both your def foo and def fee would be compiled AND executed.

If you put def fee inside the body of foo, you get fee def-code inside the body of foo,
but it doesn't get executed, because foo hasn't been called yet, though the foo def
has been compiled and executed interactively (and thus the name foo is bound to
the finished foo function, ready to be called or otherwise used).

So, no, it does not "work the same." In fact, it is a misconception to talk about
a nested fee as if it existed ready to call in the same way as foo. It doesn't
exist that way until the fee def is EXECUTED, producing the callable fee.

It's something like the distinction between relocatable object files in C
representing a function and machine code representing the function in a dll.
The execution of def is like a little make operation that links the function
pieces (and in the case of Python may dynamically generate some of the pieces
with arbitrarily complex code, including decorators etc).

def is hard to grok at first, if you're coming from languages that don't
do it Python's way.
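
A tiny sketch of that point first (using the same plain foo/fee pair as the
dis example below): each call to foo executes the nested def again and builds
a brand new fee object.

def foo():
    def fee():
        return 'nested fee result'
    return fee

f1 = foo()
f2 = foo()
print f1 is f2        # False - each call to foo made a separate fee object
print f1(), f2()      # both return 'nested fee result'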

We can use dis to see the above clearly:

Let's see what the difference in code is for a simple function
with the body

fee = 'fee string'
return fee

and the body

def fee():
    return 'nested fee result'
return fee

It's going to be returning whatever fee is either way, so what we need
to compare is how fee gets bound to something:
>>> import dis
>>> def foo():
...     fee = 'fee string'
...     return fee
...
>>> dis.dis(foo)
  2           0 LOAD_CONST               1 ('fee string')
              3 STORE_FAST               0 (fee)

  3           6 LOAD_FAST                0 (fee)
              9 RETURN_VALUE

Now we'll look for what replaces
  2           0 LOAD_CONST               1 ('fee string')
              3 STORE_FAST               0 (fee)
when we do a simple nested function and return that:

>>> def foo():
...     def fee():
...         return 'nested fee result'
...     return fee
...
>>> dis.dis(foo)
  2           0 LOAD_CONST               1 (<code object fee at 02EF21E0, file "<stdin>", line 2>)
              3 MAKE_FUNCTION            0
              6 STORE_FAST               0 (fee)

  4           9 LOAD_FAST                0 (fee)
             12 RETURN_VALUE

Note the MAKE_FUNCTION. That is dynamically going to take its argument(s) -- in this case
what the single LOAD_CONST put on the stack -- and leave the now ready-to-call function
on the stack (as a reference, not the whole thing of course). Now we have the function
reference instead of a reference to the string 'fee string' in the first example on the
stack, and the next step is STORE_FAST to bind the local name fee to whatever was on the stack.
In both cases, what happens next is to return (by reference) whatever fee is bound to.

So fee doesn't exist as such until the three bytecodes (LOAD_CONST, MAKE_FUNCTION, STORE_FAST)
in this example have been executed. To show that MAKE_FUNCTION is in general doing more
than moving a constant, give fee an argument with a default value expression. It could be
arbitrarily complicated, but we'll keep it simple:
>>> def foo():
...     def fee(x, y=2*globalfun()):
...         return x, y
...     return fee
...
>>> dis.dis(foo)
  2           0 LOAD_CONST               1 (2)
              3 LOAD_GLOBAL              0 (globalfun)
              6 CALL_FUNCTION            0
              9 BINARY_MULTIPLY
             10 LOAD_CONST               2 (<code object fee at 02EF21A0, file "<stdin>", line 2>)
             13 MAKE_FUNCTION            1
             16 STORE_FAST               0 (fee)

  4          19 LOAD_FAST                0 (fee)
             22 RETURN_VALUE

Compare to what came before MAKE_FUNCTION in the previous example. The def fee ... compiled into
code to compute the argument default value expression right there. That is not part of the fee
code, that is part of the fee-making code.

If the nested function makes use of its nested environment, there is extra work in setting up
the closure variables, so MAKE_FUNCTION is replaced by MAKE_CLOSURE, but either way the creation
of fee doesn't happen until the code compiled from the def source is executed, and if the def
code is nested, it doesn't get executed until its enclosing function is called.
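
A quick way to see that, if you want (just a small sketch; the exact opcodes
and numbering vary between Python versions): give the nested function a free
variable from the enclosing scope and dis the outer function.

import dis

def foo():
    n = 3
    def fee(x):
        return x * n       # uses the enclosing n, so fee needs a closure
    return fee

dis.dis(foo)               # look for LOAD_CLOSURE / MAKE_CLOSURE in place of MAKE_FUNCTION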

Notice that def foo (compiled AND executed interactively above) didn't complain about globalfun,
which is just a name I picked. That's because foo hasn't been called yet, and the def-code to
generate fee has not been executed yet. But when we do:
>>> foo()
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "<stdin>", line 2, in foo
NameError: global name 'globalfun' is not defined

That's not from fee, that's from trying to create a fee default argument
as part of creating fee dynamically.

Now supplying globalfun, we can make it succeed:
>>> import time
>>> def globalfun(): return '[%s]'%time.ctime()
...
>>> foo()
<function fee at 0x02EF802C>
>>> foo()(111, 222)
(111, 222)
>>> foo()(333)
(333, '[Mon Apr 04 22:31:23 2005][Mon Apr 04 22:31:23 2005]')
>>> foo()(333)
(333, '[Mon Apr 04 22:31:37 2005][Mon Apr 04 22:31:37 2005]')

Note that the times changed, because we execute foo() and therefore def fee
multiple times. But if we capture the fee output as f1 and f2, the times
are captured in fee's default arg according to when def fee was executed,
and then that belongs to fee and won't change when fee is called.
>>> foo()(111, 222)
(111, 222)

That generated a new fee with a new default time, but didn't use the default.

>>> foo()(333)
(333, '[Mon Apr 04 22:32:48 2005][Mon Apr 04 22:32:48 2005]')

That used the default. Now we will make instances of fee and
bind them to f1 and f2, pausing a little between so the times
should be different.

>>> f1 = foo()
(paused 8 seconds apparently)
>>> f2 = foo()

Now calling f1 or f2 is calling different fees with different
(but constant, because we are not calling foo again) defaults:

>>> f1(1)
(1, '[Mon Apr 04 22:32:55 2005][Mon Apr 04 22:32:55 2005]')
>>> f1(1)
(1, '[Mon Apr 04 22:32:55 2005][Mon Apr 04 22:32:55 2005]')

And different, but repeated:

>>> f2(1)
(1, '[Mon Apr 04 22:33:03 2005][Mon Apr 04 22:33:03 2005]')
>>> f2(1)
(1, '[Mon Apr 04 22:33:03 2005][Mon Apr 04 22:33:03 2005]')

Now get foo to make another fee and call it without even storing it:

>>> foo()(1)
(1, '[Mon Apr 04 22:33:40 2005][Mon Apr 04 22:33:40 2005]')
[...] Today I believe I have the correct view, as I've said this morning. I
could be wrong yet again. I hope not, though, or I might have to give up
programming. :/

Don't give up. It would be boring if it were all instantly clear.
The view is better after an enjoyable hike, and some of the flowers
along the way may turn out prettier than whatever the vista at the
top may be ;-)

For this part of the trail, just grok that def is executable,
not just the thing def's execution produces ;-)
It's interesting that I have had several others tell me they had
trouble with this too.

So it is my opinion that decorators are a little too implicit. I
think there should be a way to make them easier to use while achieving
the same objective and use.

Maybe the above will help make functions and decorators a little easier
to understand.

HTH

Regards,
Bengt Richter
Jul 18 '05 #28

P: n/a
On Tue, 05 Apr 2005 06:52:58 GMT, bo**@oz.net (Bengt Richter) wrote:
Ok, yes, besides the globals, but I figured that part is obvious so I
didn't feel I needed to mention it. The function call works the same
even though they are not nested functions.
I am afraid that is wrong. But be happy, this may be the key to what ISTM
is missing in your concept of python functions ;-)


The expression in the form of "function(args)(args)" is the same
pattern in two "different" cases, which was all that I was trying to
say. Not that the exact process of the two different cases was the
same.
So, no, it does not "work the same." In fact, it is a misconception to talk about
a nested fee as if it existed ready to call in the same way as foo. It doesn't
exist that way until the fee def is EXECUTED, producing the callable fee.
Ok, I'm going to have to be more careful in how I phrase things, I
think; I tend to over-generalize a bit. I said they were "the same",
but meant similar - a mistake in wording, but not in my understanding.

But this is a good point. In my example the calling expression does
not yet know who the next tuple of arguments will go to until foo
returns it. That part is the same, but as you point out, in a nested
scope foo defines fee and then returns it. And in the non-nested example
fee is already defined before foo is called, and they must use
globals to communicate because they do not share the same name space.
They differ because fee is temporary in the nested version, only
existing until the expression foo(arg)(arg) is evaluated. It never
gets assigned a name in foo's parent name space. Do I have that
correct?

We can use dis to see the above clearly:
Love the byte code walk through, Thanks. Is there a resource that
goes in depth on python byte code and the compiler? I haven't been
able to find much on it on google.
>>> import time
>>> def globalfun(): return '[%s]'%time.ctime()
...
>>> foo()
<function fee at 0x02EF802C>
>>> foo()(111, 222)
(111, 222)
>>> foo()(333)
(333, '[Mon Apr 04 22:31:23 2005][Mon Apr 04 22:31:23 2005]')
>>> foo()(333)
(333, '[Mon Apr 04 22:31:37 2005][Mon Apr 04 22:31:37 2005]')


I like your idea of using time stamps to trace code! :)

[...]
Today I believe I have the correct view, as I've said this morning. I
could be wrong yet again. I hope not, though, or I might have to give up
programming. :/

Don't give up. It would be boring if it were all instantly clear.
The view is better after an enjoyable hike, and some of the flowers
along the way may turn out prettier than whatever the vista at the
top may be ;-)


I won't give up, at most I would take a break, but I love programming
too much to give it up. ;-)

Maybe the above will help make functions and decorators a little easier
to understand.


I understand functions; sometimes it's difficult to describe just what
it is I don't understand yet, and sometimes I fool myself by jumping
to an invalid conclusion a little too quickly. But I do this for
enjoyment and learning, so I'm not constrained by the need to not make
mistakes (those are just part of learning, in my opinion), as I would be
if my job depended on it. However, it's a little frustrating when my
inability to write well gets in the way of expressing myself
accurately.

But a few questions remain...

When a @decorator statement is found, how does the compiler handle it?

Let me see if I can figure this out...using dis. :)
>>> from dis import dis
>>> def deco1(d1):
...     return d1
...
>>> def func1(f1):
...     @deco1
...     def func2(f2):
...         return f2
...     return func2(f1)
...
>>> func1(2)
2
>>> dis(deco1)
  1           0 LOAD_FAST                0 (d1)
              3 RETURN_VALUE
>>> dis(func1)
  2           0 LOAD_GLOBAL              0 (deco1)
              3 LOAD_CONST               1 (<code object func2 at 00B45CA0, file "<pyshell#11>", line 2>)
              6 MAKE_FUNCTION            0
              9 CALL_FUNCTION            1
             12 STORE_FAST               1 (func2)

  5          15 LOAD_FAST                1 (func2)
             18 LOAD_FAST                0 (f1)
             21 CALL_FUNCTION            1
             24 RETURN_VALUE

I'm not sure how to interpret this... Line 5 and below is the return
expression. The part above it is the part I'm not sure about.

Is the first CALL_FUNCTION calling deco1 with the defined function's
reference as its argument? Then storing the result
of deco1 with the name func2?

If so the precompiler/parser is replacing the @deco1 with a call to
the deco1 function like this.

deco1( (def func2(f2):return f2) )

But this causes an illegal syntax error on the def statement. So you
can't do it directly. Or is there yet another way to view this? :)
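
Just as a sketch (same names as above): since a def statement is not an
expression, the closest legal spelling of what the @deco1 line does seems
to be to define func2 normally and then rebind it:

def deco1(d1):
    return d1

def func1(f1):
    def func2(f2):
        return f2
    func2 = deco1(func2)    # the rebinding step the @deco1 line stands for
    return func2(f1)

print func1(2)              # -> 2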

Cheers,
Ron
Jul 18 '05 #29

P: n/a
On Tue, 05 Apr 2005 18:59:32 GMT, Ron_Adam <ra****@tampabay.rr.com> wrote:
On Tue, 05 Apr 2005 06:52:58 GMT, bo**@oz.net (Bengt Richter) wrote:
Ok, yes, besides the globals, but I figured that part is obvious so I
didn't feel I needed to mention it. The function call works the same
even though they are not nested functions.
I am afraid that is wrong. But be happy, this may be the key to what ISTM
is missing in your concept of python functions ;-)


The expression in the form of "function(args)(args)" is the same
pattern in two "different" cases, which was all that I was trying to
say. Not that the exact process of the two different cases was the
same.
So, no, it does not "work the same." In fact, it is a misconception to talk about
a nested fee as if it existed ready to call in the same way as foo. It doesn't
exist that way until the fee def is EXECUTED, producing the callable fee.


Ok, I'm going to have to be more careful in how I phrase things, I
think; I tend to over-generalize a bit. I said they were "the same",
but meant similar - a mistake in wording, but not in my understanding.

But this is a good point. In my example the calling expression does
not yet know who the next tuple of arguments will go to until foo
returns it. That part is the same, but as you point out, in a nested
scope foo defines fee and then returns it. And in the non-nested example
fee is already defined before foo is called, and they must use
globals to communicate because they do not share the same name space.
They differ because fee is temporary in the nested version, only
existing until the expression foo(arg)(arg) is evaluated. It never
gets assigned a name in foo's parent name space. Do I have that
correct?

I think so.
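
A small sketch of that last point, if it helps (using the nested foo/fee
pattern, with a closure instead of the global): the nested fee only ever
gets a name inside foo, never in the enclosing module namespace.

def foo(xx):
    def fee(y):
        return y * xx
    return fee

print foo(2)(6)        # 12 - the returned fee object is used and then discarded
print 'fee' in dir()   # False - no module-level name 'fee' was ever bound
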
We can use dis to see the above clearly:
Love the byte code walk through, Thanks. Is there a resource that
goes in depth on python byte code and the compiler? I haven't been
able to find much on it on google.

I don't know of anything other than the compiler and cpython sources.
The byte codes are not all the same from version to version, since
added language features may require new byte code operations, at least
for efficiency, and (as our postings here show ;-) explaining code
clearly is as much of a job as writing it. Fortunately, python code
is pretty readable, and there is a lot of interesting reading in

Python-2.4xxx\Lib\compiler\ and Python-2.4xxx\Lib\compiler\

on your disk if you download the source installation. Lots of
other goodies as well like demo code etc.
>>> import time
>>> def globalfun(): return '[%s]'%time.ctime()
...
>>> foo()
<function fee at 0x02EF802C>
>>> foo()(111, 222)
(111, 222)
>>> foo()(333)
(333, '[Mon Apr 04 22:31:23 2005][Mon Apr 04 22:31:23 2005]')
>>> foo()(333)
(333, '[Mon Apr 04 22:31:37 2005][Mon Apr 04 22:31:37 2005]')


I like your idea of using time stamps to trace code! :)


Well, I wanted to emphasize the time dimension in creation and existence
of things, so I thought to tag them.

[...] I understand functions; sometimes it's difficult to describe just what
it is I don't understand yet, and sometimes I fool myself by jumping
to an invalid conclusion a little too quickly. But I do this for
enjoyment and learning, so I'm not constrained by the need to not make
mistakes (those are just part of learning, in my opinion), as I would be
if my job depended on it. However, it's a little frustrating when my
inability to write well gets in the way of expressing myself
accurately.

I feel very much the same ;-)

But a few questions remain...

When a @decorator statement is found, how does the compiler handle it?

Let me see if I can figure this out...using dis. :)
>>> from dis import dis
>>> def deco1(d1):
...     return d1
...
>>> def func1(f1):
...     @deco1
...     def func2(f2):
...         return f2
...     return func2(f1)

That last line is a bit strange, which leads to the strangeness
you ask about below.

>>> func1(2)
2
>>> dis(deco1)
  1           0 LOAD_FAST                0 (d1)
              3 RETURN_VALUE
>>> dis(func1)
  2           0 LOAD_GLOBAL              0 (deco1)
              3 LOAD_CONST               1 (<code object func2 at 00B45CA0, file "<pyshell#11>", line 2>)
              6 MAKE_FUNCTION            0
              9 CALL_FUNCTION            1
             12 STORE_FAST               1 (func2)

  5          15 LOAD_FAST                1 (func2)
             18 LOAD_FAST                0 (f1)
             21 CALL_FUNCTION            1
             24 RETURN_VALUE

I'm not sure how to interpret this... Line 5 and below is the return
expression. The part above it is the part I'm not sure about.
Well, it's showing the result of your python code, but the return func2(f1)
is obscuring normal decorator functionality. I suggest going from the simplest
and introducing complications stepwise. You are defining a func1 that could
be used as a decorator function, and it decorates its function argument by
explicitly calling an internally defined decorator function func2, whose internal
definition involved decorating func2 using deco1. We can walk through it.
It's legal usage, it's just not the simplest example or clearest design ;-)

Is the first CALL_FUNCTION calling deco1 with the defined function's
reference as its argument? Then storing the result
of deco1 with the name func2?
Yes, if I understand you correctly.
I'll use list notation to represent the stack, so [] is empty and
[bottom, middle, top] is a stack with references to three things.
So _after_ each line above, we get

0 -> [deco1] This is a reference to the callable function deco1
3 -> [deco1, <code object func2>] The top is what's necessary to make the func2 function
6 -> [deco1, func2] MAKE_FUNCTION pops its arg and pushes it result (func2)
9 -> [func2*] CALL_FUNCTION 1 (one arg) called deco1(func2) and stacked the decorated func2
12 -> [] func2 reference popped from stack and stored in local func2
15 -> [func2*] local func2 (I use * to indicate it's the decorated func2) pushed on stack
18 -> [func2*, f1] stack the outer f1 argument. This is for your weird func2(f1) return value expression
21 -> [f1*] CALL_FUNCTION calls the decorated func2 with f1 as argument and stacks the modified f1
24 -> [] f1* (modified f1) is popped for return value

If so the precompiler/parser is replacing the @deco1 with a call to
the deco1 function like this.

deco1( (def func2(f2):return f2) )

But this causes an illegal syntax error on the def statement. So you
can't do it directly. Or is there yet another way to view this? :)

With simple examples? ;-)

One of the original examples of decorating was to replace the
staticmethod and classmethod function calls that had to be done
after the method defs. So the simplest view goes back to

@deco
def foo(): pass

being the equivalent (except if deco raises an exception) of

def foo(): pass
foo = deco(foo)

The plain name @deco was then allowed to become a simple xxx.yyy(zzz,...) expression
returning a callable that would serve like the decorator function a bare name normally
referred to. And then the mechanism could be cascaded. I suggest looking at the
code for the simplest cascade of normal decorators. E.g., (putting the example in
the body of a function makes dis.dis easy ;-)
>>> def deco_outer(f): return f
...
>>> def deco_inner(f): return f
...
>>> def example():
...     @deco_outer
...     @deco_inner
...     def foo(): pass
...
>>> import dis
>>> dis.dis(example)

  2     0 LOAD_GLOBAL   0 (deco_outer)  -> [deco_outer]
        3 LOAD_GLOBAL   1 (deco_inner)  -> [deco_outer, deco_inner]
        6 LOAD_CONST    1 (<code object foo at 02EE4FA0, file "<stdin>", line 2>) -> [deco_outer, deco_inner, <code obj>]
        9 MAKE_FUNCTION 0               -> [deco_outer, deco_inner, foo]
       12 CALL_FUNCTION 1               -> [deco_outer, foo*]   foo* == deco_inner(foo)
       15 CALL_FUNCTION 1               -> [foo**]              foo** == deco_outer(foo*)
       18 STORE_FAST    0 (foo)         -> []                   foo = deco_outer(deco_inner(foo))
       21 LOAD_CONST    0 (None)        # ignore ;-)
       24 RETURN_VALUE

I didn't return foo from example, so it generated code at 21 to return a None,
but you can ignore that, since the decoration ends with the binding of the function
name (foo here) done at 18 with STORE_FAST.
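
If it helps, here is a small runnable sketch (with made-up wrappers that
actually do something, unlike the identity decorators above) showing the
order in which the cascade is applied and then runs:

def deco_outer(f):
    def wrapped(s):
        return 'outer(' + f(s) + ')'
    return wrapped

def deco_inner(f):
    def wrapped(s):
        return 'inner(' + f(s) + ')'
    return wrapped

@deco_outer
@deco_inner
def foo(s):
    return s

print foo('x')    # -> outer(inner(x)) : deco_inner is applied first, deco_outer wraps the result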

Regards,
Bengt Richter
Jul 18 '05 #30

P: n/a
On Wed, 06 Apr 2005 00:23:01 GMT, bo**@oz.net (Bengt Richter) wrote:
I don't know of anything other than the compiler and cpython sources.
The byte codes are not all the same from version to version, since
added language features may require new byte code operations, at least
for efficiency, and (as our postings here show ;-) explaining code
clearly is as much of a job as writing it. Fortunately, python code
is pretty readable, and there is a lot of interesting reading in

Python-2.4xxx\Lib\compiler\ and Python-2.4xxx\Lib\compiler\

on your disk if you download the source installation. Lots of
other goodies as well like demo code etc.
Thanks, I've already downloaded the source as well as CVS, although I
don't have a recent version of VisualC++. I tried the Express version
8.0 since it's free, but it fails on the library with link errors.
:-/ Not that I expected it to work, since nothing I could find said
it would. Probably easier to load up Linux. But I don't need a
compiler to read the source.


One of the original examples of decorating was to replace the
staticmethod and classmethod function calls that had to be done
after the method defs. So the simplest view goes back to

@deco
def foo(): pass

being the equivalent (except if deco raises and exception) of

def foo(): pass
foo = deco(foo)

The plain name @deco was then allowed to become a simple xxx.yyy(zzz,...) expression
returning a callable that would serve like the decorator function a bare name normally
referred to. And then the mechanism could be cascaded. I suggest looking at the
code for the simplest cascade of normal decorators. E.g., (putting the example in
the body of a function makes dis.dis easy ;-)


So the @decorator functionality was a very small incremental change to
the pre-compiler. That also explains the nesting behavior of stacked
decorators. A small change with worthwhile functionality. :-)

Looks like the @decorator is a pseudo-function limited to a single
callable as its body.

(experimenting with a different syntax)

@deco(a):
def function(x): pass

function = deco(a)(function)(x)

Stacked, it would be:

@deco1(a):
@deco2(b):
def function(x):
    return x+1

function = deco2(b)(function)(x)
function = deco1(a)(function)(x)

Each subsequent stacked _@_ statement redefines "function" when it
exits.

If I have this correct, this would be the equivalent longhand of two
stacked _@_ expressions.
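
For comparison, here's a sketch of how the existing @ syntax (the ordinary
form, not the experimental colon form above) would expand, with made-up
deco1/deco2 factories just for illustration:

def deco1(a):                       # hypothetical decorator factory
    def deco(f):
        def wrapped(x):
            return ('deco1', a, f(x))
        return wrapped
    return deco

def deco2(b):                       # hypothetical decorator factory
    def deco(f):
        def wrapped(x):
            return ('deco2', b, f(x))
        return wrapped
    return deco

a, b = 'A', 'B'

@deco1(a)
@deco2(b)
def function(x):
    return x + 1

# the ordinary longhand equivalent of the two stacked lines above:
def function2(x):
    return x + 1
function2 = deco2(b)(function2)
function2 = deco1(a)(function2)

print function(1)     # ('deco1', 'A', ('deco2', 'B', 2))
print function2(1)    # same result
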
Cheers,
Ron

Jul 18 '05 #31
