Bytes IT Community

PEP: Specialization Syntax

Hi everyone, I would like to know what you think of this PEP. Any
comment is welcome (even about English mistakes).

PEP: XXX
Title: Specialization Syntax
Version: $Revision: 1.10 $
Last-Modified: $Date: 2003/09/22 04:51:49 $
Author: Nicolas Fleury <nidoizo at gmail.com>
Status: Draft
Type: Standards Track
Content-Type: text/plain
Created: 24-Jul-2005
Python-Version: 2.5
Post-History:
Abstract

This PEP proposes a syntax in Python to do what is called in
this document "specialization". It contains more than one
proposal:
- Extend the square bracket syntax to allow a full call syntax,
  using a __specialize__ method, similar to the __call__
  method.
- Extend the function definition syntax to allow specialization
  of functions.
- Add parameterized types.
Motivation

In his controversial blog entry "Adding Optional Typing to
Python -- Part II" [1], Guido van Rossum introduced the idea
of "parameterized types" in Python. The proposition was to
use [square brackets] rather than <pointy ones> to allow
prototyping with the __getitem__ method. However, the
__getitem__ method is not flexible enough. It doesn't support
keyword arguments, and using multiple and default arguments
can be a pain, since the only argument received would be a
tuple. Calling can also be error-prone if a tuple can be
allowed as a first argument. This PEP proposes to enhance the
square bracket syntax to allow a full call syntax, as with
parentheses.
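A minimal illustration of the limitation described above, using today's
Python (the class name here is hypothetical, purely for demonstration):

```python
class Spec(object):
    # With current semantics, whatever appears between the brackets
    # arrives as one single argument to __getitem__.
    def __getitem__(self, args):
        return args

s = Spec()
print(s[1])     # a single argument arrives as itself: 1
print(s[1, 2])  # two "arguments" arrive packed into one tuple: (1, 2)
# s[a=1] is a SyntaxError today: brackets accept no keyword arguments.
```

This tuple-packing is why a first argument that is itself a tuple is
ambiguous, and why keyword and default arguments are awkward.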

Note that Guido dropped the idea of parameterized types, for
now, in a subsequent blog entry [2]. This PEP introduces
parameterized types only as a last step, and focuses more on
having a syntax to prototype them. This PEP can also serve
as a place to discuss the feature of specialization independently.

The term "specialization" is used in this document because
"parameterized functions" would sound like an already available
feature. As Guido pointed out [1], "generic" is not a good
term either. "Specialization" is in this case borrowed from C++.
The term alone is not perfect, since it refers to the action of
passing arguments to a "parameterized type" (or function) to
make it usable, and a term must still be found to describe the
"unspecialized" type or function.

Another motivation for this PEP is the frequent usage in Python
of functions to create functions. This pattern is often used
to create callback functions and decorators, to name only these.
However, the fact that both the creation and the use rely on
parentheses can be confusing. Also, a programmer ends up naming
two functions, when only the creating one is called by name and
the created one is doing the job. Some programmers end up
naming the creating function with the name they would want to
give to the created function, confusing the code that uses it
even more. To fix this situation, this PEP proposes a syntax
for function specialization.
__specialize__ Special Member Function.

The first element of this proposal is the addition of the
__specialize__ special member function. The __specialize__
function can have the same signatures as __call__. When it is
defined, the definition of __getitem__ has no effect, and
__specialize__ will be called instead.

The language grammar is extended to allow keyword arguments
and no arguments. For example:

class MyObject(object):
    def __specialize__(self, a=4, b=6, *args, **kwargs):
        pass

obj = MyObject()
obj[b=7, a=8, c=10]
obj[]

Note that when __specialize__ is defined, __setitem__,
__getslice__ and __setslice__ are still used as before.
The specializer Decorator

To explain the syntaxes proposed in this PEP, the following
decorator is used:

class Specializer:
    def __init__(self, callableObj):
        self.callableObj
        self.__name__ = callableObj.__name__
    def __specialize__(self, *args, **kwargs):
        self.callableObj(*args, **kwargs)

def specializer(callableObj):
    return Specializer(callableObj)

It takes a callable and makes it callable with square brackets
instead.
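As posted, the decorator cannot run: __init__ drops its argument (the
"= callableObj" is missing, as pointed out later in the thread), and
__specialize__ is not a real hook in today's Python. A corrected,
runnable approximation, substituting __getitem__ for the proposed
__specialize__ (so only a single positional argument is supported,
which is exactly the limitation the PEP wants to lift), might look like:

```python
class Specializer(object):
    """Wrap a callable so it is invoked with square brackets."""
    def __init__(self, callableObj):
        self.callableObj = callableObj  # the assignment missing above
        self.__name__ = callableObj.__name__

    def __getitem__(self, arg):
        # Stand-in for the proposed __specialize__ hook.
        return self.callableObj(arg)

def specializer(callableObj):
    return Specializer(callableObj)

@specializer
def double(x):
    return x * 2

print(double[21])  # 42
```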
Function Specialization

A common pattern in Python is to use a function to create
another function:

def makeGetMemberFunc(memberName):
    def getMember(object):
        return getattr(object, memberName)
    return getMember

foo(makeGetMemberFunc('xyz'))

The second element of this proposal is to add a syntax so
that the previous example can be replaced by:

def getMember[memberName](object):
    return getattr(object, memberName)

foo(getMember['xyz'])

which is equivalent to:

@specializer
def getMember(memberName):
    def getMember(object):
        return getattr(object, memberName)
    return getMember

foo(getMember['xyz'])
Class Specialization

The last element of this proposal is to add a syntax to pass
arguments to class creation:

class List[ElementType=object](list):
    ...

This would be equivalent to:

@specializer
def List(ElementType=object):
    class List(list):
        ...
    return List

Note that the additional syntax is inserted before the
inheritance list, differing from what was initially proposed [1].
The reason is that the inheritance list may need the
specialization arguments, and it is more intuitive to use an
argument after its introduction:

class MyList[ElementType=object](List[ElementType]):
    ...
Backward Compatibility

All three proposals are backward compatible.
Open Issues

Instead of adding a __specialize__ method, the __getitem__
method could be changed to allow additional signatures:

def __getitem__(self, *args, **kwargs): ...

Should operators other than square brackets be used for
specialization?
References

[1] Adding Optional Static Typing to Python -- Part II,
Guido van Rossum
http://www.artima.com/weblogs/viewpost.jsp?thread=86641

[2] Optional Static Typing -- Stop the Flames!, Guido van Rossum
http://www.artima.com/weblogs/viewpost.jsp?thread=87182
Copyright

This document has been placed in the public domain.

Aug 7 '05 #1
19 Replies


Nicolas Fleury wrote:
Hi everyone, I would to know what do you think of this PEP. Any comment
welcomed (even about English mistakes).


-1. I don't see the point of this PEP. Apparently, you want to define
parametrized types - but for what purpose? I.e. what are the specific
use cases for the proposed syntax, and why do you need to change the
language to support these use cases? I very much doubt that there are
no acceptable alternatives for each case.

IOW, whatever it is that you could do with this PEP, it seems you could
do this today easily.

Regards,
Martin
Aug 7 '05 #2

Martin v. Löwis wrote:
-1. I don't see the point of this PEP. Apparently, you want to define
parametrized types - but for what purpose? I.e. what are the specific
use cases for the proposed syntax, and why do you need to change the
language to support these use cases? I very much doubt that there are
no acceptable alternatives for each case.
Well, I'm using the alternatives. For example, where I work we have
built a small framework to create binary data to be loaded-in-place by
C++ code (it might be presented at next GDC (Game Developer
Conference)). It uses metaclasses and descriptors to allow things like:

class MyObject(LoadInPlaceObject):
    size = Member(Int32)
    data = Member(makeArrayType(makePtrType(MyObject2, nullable=True)))
    ...

I know, it's not really Python, but still, defining functions like
makeArrayType and makePtrType is a pain. It is necessary to maintain a
dictionary of types (to avoid redundancy), and simple things like:

def makeType(someArgument):
    class MyObject:
        someArgument = someArgument
    return MyObject

are not allowed. So it ends up with something like:

__arrayTypes = {}
def makeArrayType(arg1, arg2=someDefault):
    if (arg1, arg2) in __arrayTypes:
        return __arrayTypes[arg1, arg2]
    renamed_arg1 = arg1
    renamed_arg2 = arg2
    class Array:
        arg1 = renamed_arg1
        arg2 = renamed_arg2
        ...
    __arrayTypes[arg1, arg2] = Array
    return Array

Does it qualify as an "acceptable alternative", when it could have been:

class Array[arg1, arg2=someDefault]:
    ...

I probably should have put this example in the PEP.
IOW, whatever it is that you could do with this PEP, it seems you could
do this today easily.


The PEP's validity is also very much influenced by whether optional
static typing is planned to be added to the language. I realize
defending this PEP is much harder without static typing, and my use
cases imply typing of some sort anyway.

Regards,
Nicolas

Aug 7 '05 #3

On Sun, 07 Aug 2005 16:22:11 -0400, Nicolas Fleury <ni*****@yahoo.com> wrote:
Hi everyone, I would to know what do you think of this PEP. Any comment
welcomed (even about English mistakes).

PEP: XXX
Title: Specialization Syntax
Version: $Revision: 1.10 $
Last-Modified: $Date: 2003/09/22 04:51:49 $
Author: Nicolas Fleury <nidoizo at gmail.com>
Status: Draft
Type: Standards Track
Content-Type: text/plain
Created: 24-Jul-2005
Python-Version: 2.5
Post-History:
Abstract

This PEP proposes a syntax in Python to do what is called in
this document "specialization". It contains more than one
proposal:
- Extend square brackets syntax to allow a full call syntax,
using a __specialize__ method, similarly to the __call__
method.
- Extend function definition syntax to allow specialization
of functions.
- Parameterized types.
Motivation

In his controversial blog entry "Adding Optional Typing to
Python -- Part II" [1], Guido Van Rossum introduced the idea
of "parameterized types" in Python. The proposition was to
use [square brackets] rather than <pointy ones> to allow
prototyping with __getitem__ method. However, the __getitem__
method is not flexible enough. It doesn't support keyword
arguments and using multiple and default arguments can be pain,
since the only argument received would be a tuple. Calling
can also be error-prone if a tuple can be allowed as a first
argument. This PEP proposes to enhance square brackets syntax
to allow full-call syntax as with parenthesis.

Note that Guido dropped the idea, for now, of parameterized
types in a following blog entry [2]. This PEP introduces
parameterized types only as a last step, and focus more on
having a syntax to prototype them. This PEP can also serve
as a place to discuss to feature of specialization independently.

The term "specialization" is used in that document because
"parameterized functions" would sound like an already available
feature. As Guido pointed out [1], "generic" is neither a good
term. Specialization is a term in that case borrowed from C++.
The term alone is not perfect, since it refers to the action of
passing arguments to a "parameterized type" (or function) to
make it usable and a term must still be found to describe the
"unspecialized" type or function.

Another motivation to this PEP is the frequent usage in Python
of functions to create functions. This pattern is often used
to create callback functions and decorators, to only name these.
However, the fact that both the creation and the use is using
parenthesis can be confusing. Also, a programmer ends up naming
two functions, when only the creating one is called by name and
the created one is doing the job. Some programmers ends up
naming the creating function with the name they would want to
give to the created function, confusing even more the code using
it. To fix this situation, this PEP proposes a syntax for
function specialization.
__specialize__ Special Member Function.

By "Member Function" do you mean anything different from "method"?

The first element of this proposal is the addition of the
__specialize__ special member function. The __specialize__
function can have the same signatures as __call__.

Any function can have any legal signature, so I'm not sure what you are saying.

When defined, the definition of __getitem__ has no effect, and
__specialize__ will be called instead.

What about subclassing and overriding __getitem__ ?

The language grammar is extended to allow keyword arguments
and no arguments. For example:

class MyObject(object):
    def __specialize__(self, a=4, b=6, *args, **kwargs):
        pass

obj = MyObject()
obj[b=7, a=8, c=10]
obj[]

Here you can currently write

__getitem__ = __specialize__

although you have to remember that obj[:] and related slicing expressions
become legal and that obj[] does not, without a language syntax change.

Note that when __specialize__ is defined, __setitem__,
__getslice__ and __setslice__ are still used as before.
The specializer Decorator

To explain the syntaxes proposed in this PEP, the following
decorator is used:

class Specializer:
    def __init__(self, callableObj):
        self.callableObj        ^^?? = callableObj ?
        self.__name__ = callableObj.__name__
    def __specialize__(self, *args, **kwargs):
        self.callableObj(*args, **kwargs)

def specializer(callableObj):
    return Specializer(callableObj)

It takes a callable and make it callable with square brackets
instead.

Well, it wraps it, but the thing itself is still called as before from the wrapper,
so "it" itself is not "callable" with square brackets ;-)


Function Specialization

A common pattern in Python is to use a function to create
another function:

def makeGetMemberFunc(memberName):
    def getMember(object):
        return getattr(object, memberName)
    return getMember

foo(makeGetMemberFunc('xyz'))
Either closures like the above or bound methods work for this,
so you just want more concise spelling?

The second element of this proposal is to add a syntax so
that the previous example can be replaced by:

def getMember[memberName](object):
    return getattr(object, memberName)

foo(getMember['xyz'])

which is equivalent to:

@specializer
def getMember(memberName):
    def getMember(object):
        return getattr(object, memberName)
    return getMember

foo(getMember['xyz'])
Have you looked at currying? E.g.,
http://aspn.activestate.com/ASPN/Coo...n/Recipe/52549

Also, I made a byte-hacking decorator that is able to inject local presets
into a function itself (hence without wrapping overhead when used) or to curry
in another variation of the decorator. E.g.,
>>> from ut.presets import presets, curry
>>> @curry(memberName='xyz')
... def getMember(obj, memberName):
...     return getattr(obj, memberName)
...
>>> o = type('',(),{})()
>>> o.xyz = 'This is object o attribute xyz'
>>> getMember(o)
'This is object o attribute xyz'
>>> import dis
>>> dis.dis(getMember)
  1           0 LOAD_CONST               1 ('xyz')
              3 STORE_FAST               1 (memberName)

  3           6 LOAD_GLOBAL              0 (getattr)
              9 LOAD_FAST                0 (obj)
             12 LOAD_FAST                1 (memberName)
             15 CALL_FUNCTION            2
             18 RETURN_VALUE
>>> getMember.func_code.co_argcount
1

Or the presets decorator can make the preset available without
having been a part of the original signature at all:
>>> @presets(attname='xyz')
... def foo(obj): return getattr(obj, attname)
...
>>> foo(o)
'This is object o attribute xyz'
>>> dis.dis(foo)
  1           0 LOAD_CONST               1 ('xyz')
              3 STORE_FAST               1 (attname)

  3           6 LOAD_GLOBAL              0 (getattr)
              9 LOAD_FAST                0 (obj)
             12 LOAD_FAST                1 (attname)
             15 CALL_FUNCTION            2
             18 RETURN_VALUE

As mentioned, these are byte code hacks. So they are pretty
fragile, version-portability-wise.


Class Specialization

The last element of this proposal is to add a syntax to pass
arguments to class creation:

class List[ElementType=object](list):
    ...

This would be the equivalent to:

@specializer
def List(ElementType=object):
    class List(list):
        ...
    return List

Note that the additional syntax is inserted before the
inheritance, different than what was initially proposed [1].
The reason is that inheritance can need the specialization
arguments, and it is more intuitive to use an argument
after its introduction:

class MyList[ElementType=object](List[ElementType]):
    ...

Before I'd want to extend class syntax this way, I think I'd want to
explore some other aspects of class syntax as well, with more (and
more persuasive) use cases in view. Also more thought to what is done
when and whether the issue is to supply information into existing control
contexts or to modify control flow as well, to extend possibilities for
customized processing.

Backward Compatibility

The three propositions are backward compatible.
Open Issues

Instead of adding a __specialize__ method, the __getitem__
method could be changed to allow additional signatures:

def __getitem__(self, *args, **kwargs): ...

When you say "the" __getitem__ method, what do you mean? AFAIK the
method itself is an unrestricted function. It just happens that
binding it as a class attribute __getitem__ makes it get called
from code with square bracket access spellings. I think that's where
your changes to allow "additional signatures" would have to go. I.e.,
in generation of code from the "calling" syntax. To illustrate:

>>> class C(object):
...     def __getitem__(self, *args, **kwargs):
...         return self, args, kwargs
...
>>> c = C()
>>> c[1]
(<__main__.C object at 0x02EF498C>, (1,), {})
>>> c[1,2]
(<__main__.C object at 0x02EF498C>, ((1, 2),), {})
>>> c[:]
(<__main__.C object at 0x02EF498C>, (slice(None, None, None),), {})
>>> c[kw='key word arg']
  File "<stdin>", line 1
    c[kw='key word arg']
        ^
SyntaxError: invalid syntax

But here the problem is not in the __getitem__ method:

>>> c.__getitem__(kw='key word arg')
(<__main__.C object at 0x02EF498C>, (), {'kw': 'key word arg'})

It's just that square bracket expression trailer syntax does not
allow the same arg list syntax as parenthesis calling trailer syntax.

Should other operators that square brackets be used for
specialization?

Didn't quite parse that ;-) You mean list comprehensions? Or ??

References

[1] Adding Optional Static Typing to Python -- Part II,
Guido van Rossum
http://www.artima.com/weblogs/viewpost.jsp?thread=86641

[2] Optional Static Typing -- Stop the Flames!, Guido van Rossum
http://www.artima.com/weblogs/viewpost.jsp?thread=87182
Copyright

This document has been placed in the public domain.


Regards,
Bengt Richter
Aug 8 '05 #4

On Sun, 07 Aug 2005 17:20:25 -0400, Nicolas Fleury <ni*****@yahoo.com> wrote:
Martin v. Löwis wrote:
-1. I don't see the point of this PEP. Apparently, you want to define
parametrized types - but for what purpose? I.e. what are the specific
use cases for the proposed syntax, and why do you need to change the
language to support these use cases? I very much doubt that there are
no acceptable alternatives for each case.
Well, I'm using the alternatives. For example, where I work we have
built a small framework to create binary data to be loaded-in-place by
C++ code (it might be presented at next GDC (Game Developer
Conference)). It uses metaclasses and descriptors to allow things like:

class MyObject(LoadInPlaceObject):
    size = Member(Int32)
    data = Member(makeArrayType(makePtrType(MyObject2, nullable=True)))
    ...

I know, it's not really Python, but still, defining functions like
makeArrayType and makePtrType is a pain. It is necessary to maintain a
dictionary of types (to avoid redundancy) and simple things like:

def makeType(someArgument):
    class MyObject:
        someArgument = someArgument
    return MyObject

are not allowed. So it ends up with something like:

I don't understand why you wouldn't give the function arg a different name
in the first place instead of via a temporary intermediary binding, e.g.,

def makeType(someArgument_alias):
    class MyObject:
        someArgument = someArgument_alias
    return MyObject

__arrayTypes = {}
def makeArrayType(arg1, arg2=someDefault):
    if (arg1, arg2) in __arrayTypes:
        return __arrayTypes[arg1, arg2]
    renamed_arg1 = arg1
    renamed_arg2 = arg2
    class Array:
        arg1 = renamed_arg1
        arg2 = renamed_arg2
        ...
    __arrayTypes[arg1, arg2] = Array
    return Array
Or (untested, using new style class):

def makeArrayType(arg1, arg2=someDefault):
    try:
        return __arrayTypes[arg1, arg2]
    except KeyError:
        __arrayTypes[arg1, arg2] = Array = type('Array', (), {'arg1': arg1, 'arg2': arg2})
        return Array

(just re-spelling functionality, not understanding what your real use case is ;-)
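This re-spelling can be made self-contained. A minimal sketch of the
memoized type factory under discussion (the cache name and the arg2
default are assumptions for the sake of a runnable example):

```python
_arrayTypes = {}  # cache: one class object per distinct (arg1, arg2) pair

def makeArrayType(arg1, arg2=None):
    try:
        return _arrayTypes[arg1, arg2]
    except KeyError:
        # type(name, bases, dict) builds the class without a class statement.
        Array = type('Array', (), {'arg1': arg1, 'arg2': arg2})
        _arrayTypes[arg1, arg2] = Array
        return Array

A = makeArrayType(int, 3)
assert A is makeArrayType(int, 3)  # same parameters, same class object
assert A.arg1 is int
```

The try/except spelling avoids a double dictionary lookup on the hit
path, compared with the `if (arg1, arg2) in ...` version above.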
Does it qualify as an "acceptable alternative"? when it could have been:

class Array[arg1, arg2=someDefault]:
    ...

I probably should have put this example in the PEP.
IOW, whatever it is that you could do with this PEP, it seems you could
do this today easily.
I agree with this, until I see some really persuasive use cases.

The PEP validity is also very much influenced if optional static typing
is planned to be added to the language. I realize defending this PEP is
much harder without static typing and my use cases imply typing of some
sort anyway.

I'll have to catch up with that. Have been very bogged down for a long while.

Regards,
Bengt Richter
Aug 8 '05 #5

Bengt Richter wrote:
__specialize__ Special Member Function.
By "Member Function" do you mean anything different from "method"?


No, I should have written method. C++ habit.
The first element of this proposal is the addition of the
__specialize__ special member function. The __specialize__
function can have the same signatures as __call__. When


Any function can have any legal signature, so I'm not sure what you are saying.


You're right, I should focus on the syntax change, to call __getitem__
(or __specialize__) automatically.
defined, the definition of __getitem__ has no effect, and
__specialize__ will be called instead.


What about subclassing and overriding __getitem__ ?


I have no problem with that. I even suggest it at the end of the PEP.

But don't you think the name "__getitem__" is not appropriate then?
here you can currently write
__getitem__ = __specialize__
although you have to remember that obj[:] and related slicing expressions
become legal and that obj[] does not, without a language syntax change.

Yes, the PEP is about that syntax change.
class Specializer:
    def __init__(self, callableObj):
        self.callableObj


^^?? = callableObj ?


Yes, "= callableObj" is missing.
A common pattern in Python is to use a function to create
another function:

def makeGetMemberFunc(memberName):
    def getMember(object):
        return getattr(object, memberName)
    return getMember

foo(makeGetMemberFunc('xyz'))

Either closures like the above or bound methods work for this,
so you just want more concise spelling?


In the case of functions, yes. For functions, I guess the syntax is
much more useful if static typing is added, or planned to be added, to
the language. However, there are still use cases where it is useful.
Have you looked at currying? E.g.,
http://aspn.activestate.com/ASPN/Coo...n/Recipe/52549


And partial will be in Python 2.5 (PEP 309). Yes, I've looked at it, but
in my use cases the created function corresponds to a specific function
signature, so, for example, you always only want to specify the
"memberName" argument. Currying is nice, but since any argument can be
supplied, the code is less self-documenting. But I do use these in
day-to-day programming.
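For comparison, the functools.partial spelling alluded to here (PEP 309,
in the stdlib since Python 2.5), applied to the PEP's getMember example
(the class and attribute value below are made up for illustration):

```python
from functools import partial  # PEP 309 partial function application

def getMember(obj, memberName):
    return getattr(obj, memberName)

# Specialize on memberName only, like the proposed getMember['xyz']:
getXyz = partial(getMember, memberName='xyz')

class Obj(object):
    xyz = 'value'

print(getXyz(Obj()))  # 'value'
```

As Nicolas notes, partial accepts any subset of arguments, so nothing in
the code documents that memberName is the intended specialization axis.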
class MyList[ElementType=object](List[ElementType]):
    ...


Before I'd want to extend class syntax this way, I think I'd want to
explore some other aspects of class syntax as well, with more (and
more persuasive) use cases in view. Also more thought to what is done
when and whether the issue is to supply information into existing control
contexts or to modify control flow as well, to extend possibilities for
customized processing.


What do you think of the example I posted in a reply to Martin v. Löwis?
Instead of adding a __specialize__ method, the __getitem__


When you say "the" __getitem__ method, what do you mean? AFAIK the
method itself is an unrestricted function. It just happens that
binding it as a class attribute __getitem__ makes it get called
from code with square bracket access spellings. I think that's where
your changes to allow "additional signatures" would have to go. I.e.,
in generation of code from the "calling" syntax. To illustrate:
>>> class C(object):
...     def __getitem__(self, *args, **kwargs):
...         return self, args, kwargs
...
>>> c = C()
>>> c[1]
(<__main__.C object at 0x02EF498C>, (1,), {})
>>> c[1,2]
(<__main__.C object at 0x02EF498C>, ((1, 2),), {})
>>> c[:]
(<__main__.C object at 0x02EF498C>, (slice(None, None, None),), {})
>>> c[kw='key word arg']
  File "<stdin>", line 1
    c[kw='key word arg']
        ^
SyntaxError: invalid syntax

But here the problem is not in the __getitem__ method:

>>> c.__getitem__(kw='key word arg')
(<__main__.C object at 0x02EF498C>, (), {'kw': 'key word arg'})

It's just that square bracket expression trailer syntax does not
allow the same arg list syntax as parenthesis calling trailer syntax.


I totally agree, and that's what I mean. The formulation of the PEP is
wrong; I should almost not talk about __getitem__, since as you said it
can have any signature. The PEP is about extending the [] syntax to
call the __getitem__ function automatically with more complex signatures.
Should other operators that square brackets be used for
specialization?


Didn't quite parse that ;-) You mean list comprehensions? Or ??


I mean should angle brackets <> like in C++, or another operator, be
used instead?

Regards and thx for your feedback,
Nicolas
Aug 8 '05 #6

Bengt Richter wrote:
I don't understand why you wouldn't give the function arg a different name
in the first place instead of via a temporary intermediary binding, e.g.,

def makeType(someArgument_alias):
    class MyObject:
        someArgument = someArgument_alias
    return MyObject


Because that would affect documentation and keyword arguments. Both the
constructed class *and* the function are exposed to the user, so it
needs to be coherent.
__arrayTypes = {}
def makeArrayType(arg1, arg2=someDefault):
    if (arg1, arg2) in __arrayTypes:
        return __arrayTypes[arg1, arg2]
    renamed_arg1 = arg1
    renamed_arg2 = arg2
    class Array:
        arg1 = renamed_arg1
        arg2 = renamed_arg2
        ...
    __arrayTypes[arg1, arg2] = Array
    return Array


Or (untested, using new style class):

def makeArrayType(arg1, arg2=someDefault):
    try:
        return __arrayTypes[arg1, arg2]
    except KeyError:
        __arrayTypes[arg1, arg2] = Array = type('Array', (), {'arg1': arg1, 'arg2': arg2})
        return Array

(just re-spelling functionality, not understanding what your real use case is ;-)


Well, of course, but I didn't write the rest of the class definition ;)

So I need a place to put the class definition. In the end, it looks
very much like the mess in the example. Maybe I'm missing a solution
using decorators. I've defined a few classes like that, and I can live
with them, but if there were a syntax to do it more easily, I would use
it right away.

I guess it can also be useful for people interfacing with COM or typed
contexts.

Regards,
Nicolas
Aug 8 '05 #7

Nicolas Fleury wrote:
It is necessary to maintain a
dictionary of types (to avoid redundancy) and simple things like:

def makeType(someArgument):
    class MyObject:
        someArgument = someArgument
    return MyObject

are not allowed.


>>> def makeClass(cls_name, **kw):
...     return type(cls_name, (), kw)
...
>>> MyObject = makeClass("MyObject", a=8)
>>> MyObject
<class '__main__.MyObject'>
>>> MyObject.a
8

Regards,
Kay

Aug 8 '05 #8

Kay Schluehr wrote:
def makeClass(cls_name, **kw):
    return type(cls_name, (), kw)
MyObject = makeClass("MyObject", a=8)
MyObject


As said to Bengt, a place is needed to write the class definition.
There's no need for metaclass in that case:

def makeType(a, b, c=someDefault):
    arguments = locals()
    class MyObject:
        pass # Complete definition here
    MyObject.__dict__.update(arguments)
    return MyObject
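A note for readers trying this pattern on newer Pythons: a class
__dict__ is a read-only mapping proxy there, so __dict__.update fails.
A portable variant of the same idea (setattr in place of update, with
an assumed default for c) might look like:

```python
def makeType(a, b, c=0):
    arguments = locals()  # snapshot of the factory's own arguments
    class MyObject(object):
        pass  # complete class definition here
    for name, value in arguments.items():
        # setattr works even where the class __dict__ is a read-only proxy
        setattr(MyObject, name, value)
    return MyObject

T = makeType(1, 2)
print(T.a, T.b, T.c)  # 1 2 0
```

Note that locals() is called before the class statement, so only the
factory's arguments (a, b, c) are copied onto the class.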

Regards,
Nicolas
Aug 8 '05 #9


Nicolas Fleury wrote:
Kay Schluehr wrote:
def makeClass(cls_name, **kw):
    return type(cls_name, (), kw)
MyObject = makeClass("MyObject", a=8)
MyObject


As said to Bengt, a place is needed to write the class definition.
There's no need for metaclass in that case:

def makeType(a, b, c=someDefault):
    arguments = locals()
    class MyObject:
        pass # Complete definition here
    MyObject.__dict__.update(arguments)
    return MyObject

Regards,
Nicolas


I have to admit that I don't actually understand what you want. The
problems you try to solve seem trivial to me, but it's probably my fault
and I'm misreading something. You might be correct that your PEP may be
interesting only if "optional static typing" is introduced in Py3K;
then we would suddenly have an immediate need for dealing with
generic types, so that the syntax could be reused for deferred functions
(the concept of "specialization" is usually coupled with some kind of
partial evaluation, which doesn't take place anywhere in your proposal).
But I'm not sure if this makes sense at all.

Kay

Aug 8 '05 #10

On 8 Aug 2005 02:26:40 -0700, Kay Schluehr <ka**********@gmx.net> wrote:

I have to admit that i don't actually understand what you want?


Me neither. I don't see the point of this.

--
Email: zen19725 at zen dot co dot uk
Aug 8 '05 #11

Kay Schluehr wrote:
I have to admit that i don't actually understand what you want? The
problems you try to solve seem trivial to me but it's probably my fault
and i'm misreading something. You might be correct that your PEP may be
interesting only if "optional static typing" will be introduced to Py3K
and then we will suddenly have an immediate need for dealing with
generic types so that the syntax can be reused for deferred functions (
the concept of "specialization" is usually coupled with some kind of
partial evaluation which doesn't take place somewhere in your proposal
). But i'm not sure if this makes sense at all.


Well, the partial evaluation is done when using [].

def getMember[memberName](obj):
    return getattr(obj, memberName)

x = getMember["xyz"] # specialization
y = x(obj)           # use

I realize the term "specialization" can be confusing, since people might
think of what is called in C++ "explicit specialization" and "partial
specialization", while these concepts are not present in the PEP.

The point is that I'm already using static-like typing in frameworks
interacting with other languages with generic types. So I would already
benefit from such a capability, and yes, there are workarounds. I'm
clearly in a minority with such a need, but the first point of the PEP
is to extend the [] syntax, so that it is possible to prototype generic
types using the [] operator.

Regards,
Nicolas

Aug 8 '05 #12

On Sun, 07 Aug 2005 21:41:33 -0400, Nicolas Fleury <ni******@yahoo.com_removethe_> wrote:
Bengt Richter wrote: [...]
But here the problem is not in the __getitem__ method:
>>> c.__getitem__(kw='key word arg')

(<__main__.C object at 0x02EF498C>, (), {'kw': 'key word arg'})

It's just that square bracket expression trailer syntax does not
allow the same arg list syntax as parenthesis calling trailer syntax.


I totally agree and that's what I mean. The formulation of the PEP is
wrong, I should almost not talk about __getitem__ since as you said it
can have any signature. The PEP is about extending [] syntax to call
automtically __getitem__ function with more complex signatures.
Should other operators that square brackets be used for
specialization?


Didn't quite parse that ;-) You mean list comprehensions? Or ??


I mean should angle brackets <> like in C++, or another operator, be
used instead?


I am getting the feeling that your PEP is about a means to do something C++-like
in python, not necessarily to enhance python ;-) IOW, it seems like you
want the [<new syntax>] to do something like C++ <type_spec> in templates?

(BTW, I have nothing against giving python new capabilities (quite the reverse),
but not by grafting limbs from other animals ;-)

Maybe you want hidden name-mangling of function defs according to arg types
and corresponding dispatching of calls? I am afraid I am still not clear
on the fundamental motivation for all this ;-)

Regards and thx for your feedback,

You're welcome.

Regards,
Bengt Richter
Aug 8 '05 #13

Bengt Richter wrote:
On Sun, 07 Aug 2005 21:41:33 -0400, Nicolas Fleury <ni******@yahoo.com_removethe_> wrote:
I mean should angle brackets <> like in C++, or another operator, be
used instead?
I am getting the feeling that your PEP is about a means to do something C++-like
in python, not necessarily to enhance python ;-) IOW, it seems like you
want the [<new syntax>] to do something like C++ <type_spec> in templates?


Yes, exactly. Actually Guido also mentioned pointy brackets:
http://www.artima.com/weblogs/viewpost.jsp?thread=86641
(BTW, I have nothing against giving python new capabilities (quite the reverse),
but not by grafting limbs from other animals ;-)
If I look at a very recent blog entry of Guido, it seems the idea is
still in the air:
http://www.artima.com/weblogs/viewpost.jsp?thread=92662
Maybe you want hidden name-mangling of function defs according to arg types
and corresponding dispatching of calls? I am afraid I am still not clear
on the fundamental motivation for all this ;-)


I wrote the PEP to see if I was the only one who would benefit from
generic types *before* having optional static typing in the language.

It seems I'm the only one ;)

According to blog entry 86641, Guido himself is prototyping with
__getitem__. However, I cannot do the same, because the framework I use
is much more complete and keyword arguments are a must.

Regards,
Nicolas
Aug 8 '05 #14

On Mon, 08 Aug 2005 16:18:50 -0400, Nicolas Fleury <ni******@yahoo.com_removethe_> wrote:
[...]
According to blog entry 86641, Guido himself is prototyping with
__getitem__. However, I cannot do the same, because the framework I use
is much more complete and keyword arguments are a must.


Here is a decorator object to set up function call dispatch according to type.
It only uses positional arguments, but could be fleshed out, I think.
Not tested beyond what you see ;-)

----< typedispatcher.py >-------------------------------------------------
# typedispatcher.py
"""
Provides a decorator to dispatch function
calls according to arg types and signature.

Example usage:

    foodisp = TypeDispatcher()  # instance dedicated to foo variants

    @foodisp(a=int, b=str)
    def foo(a, b):
        assert type(a) is int and type(b) is str
        return (a,b)

    @foodisp(a=str, b=str)
    def foo(a, b):
        assert type(a) is str and type(b) is str
        return (a,b)
"""
class TypeDispatcher(object):
    def __init__(self):
        self.dispdict = {}
    def __call__(self, **kwtypes):
        self.kwtemp = kwtypes
        return self.dodeco
    def dodeco(self, f):
        if not hasattr(self, 'name'):
            self.name = f.func_name
        if f.func_name != self.name:
            raise ValueError('This TypeDispatcher instance decorates only functions named %r' % self.name)
        sig = tuple((self.kwtemp[argname] for argname in f.func_code.co_varnames[:f.func_code.co_argcount]))
        assert len(set([f.func_name]+list(f.func_name for f in self.dispdict.values())))
        self.dispdict[sig] = f
        return self.docall
    def docall(self, *args):
        sig = tuple(map(type, args))
        try: f = self.dispdict[sig]
        except KeyError:
            raise TypeError('no function %r with signature %r' % (self.name, sig))
        return f(*args)

def test():
    try:
        foodisp = TypeDispatcher()
        @foodisp(a=int, b=str)
        def foo(a, b):
            assert type(a) is int and type(b) is str
            return 'foo(int, str):', (a,b)
        @foodisp(a=str, b=str)
        def foo(a, b):
            assert type(a) is str and type(b) is str
            return 'foo(str, str):', (a,b)
        @foodisp()
        def foo():
            return 'foo()', ()
        print foo(123, 'hello')
        print foo('hi','there')
        print foo()
        print foo(456, 789)
    except Exception, e:
        print 'Exception %s: %s' % (e.__class__.__name__, e)
    try:
        @foodisp()
        def bar(): pass
    except Exception, e:
        print 'Exception %s: %s' % (e.__class__.__name__, e)

if __name__ == '__main__':
    test()
--------------------------------------------------------------------------

Result:

[17:12] C:\pywk\ut>py24 typedispatcher.py
('foo(int, str):', (123, 'hello'))
('foo(str, str):', ('hi', 'there'))
('foo()', ())
Exception TypeError: no function 'foo' with signature (<type 'int'>, <type 'int'>)
Exception ValueError: This TypeDispatcher instance decorates only functions named 'foo'
Regards,
Bengt Richter
Aug 9 '05 #15

On Tue, 09 Aug 2005 00:14:25 GMT, bo**@oz.net (Bengt Richter) wrote:
[...]
Here is a decorator object to set up function call dispatch according to type.
It only uses positional arguments, but could be fleshed out, I think.
Not tested beyond what you see ;-)

----< typedispatcher.py >-------------------------------------------------
# typedispatcher.py
[...]
assert len(set([f.func_name]+list(f.func_name for f in self.dispdict.values())))

[...]
Oops, that was a leftover hack that was supposed to check that all names were the same,
and was missing ==1 at the right. Replaced by using self.name. Sorry. There's probably
more, but the overall idea should be clear. Using **kw is also a leftover of starting
down the trail of handling more signature variants, but I was too lazy.

Regards,
Bengt Richter
Aug 9 '05 #16

Bengt Richter wrote:
On Mon, 08 Aug 2005 16:18:50 -0400, Nicolas Fleury <ni******@yahoo.com_removethe_> wrote:
I wrote the PEP to see if I was the only one who would benefit from
generic types *before* having optional static typing in the language.

It seems I'm the only one ;)

According to blog entry 86641, Guido himself is prototyping with
__getitem__. However, I cannot do the same, because the framework I use
is much more complete and keyword arguments are a must.

Here is a decorator object to set up function call dispatch according to type.
It only uses positional arguments, but could be fleshed out, I think.
Not tested beyond what you see ;-)


That's nice. Guido also posted this multimethods solution:
http://www.artima.com/weblogs/viewpo...?thread=101605

The only thing I was saying is that I can use generic types in Python
right now (and I do) by using (), but I can't with what will probably
be the syntax in the future, i.e. using [].
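
For the record, a minimal sketch of what "using ()" looks like (the Array factory and its parameters are hypothetical); keyword arguments work because this is ordinary call syntax:

```python
_cache = {}

def Array(item_type, size=None):
    """Return a parameterized Array class, cached so identical
    arguments yield the same class object."""
    key = (item_type, size)
    if key not in _cache:
        _cache[key] = type('Array', (object,),
                           {'item_type': item_type, 'size': size})
    return _cache[key]

IntArray = Array(int, size=8)
assert IntArray.item_type is int and IntArray.size == 8
assert Array(int, size=8) is IntArray   # same specialization, cached
# The equivalent Array[int, size=8] is what the PEP would make legal.
```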

Regards,
Nicolas
Aug 9 '05 #17

On Mon, 08 Aug 2005 21:24:15 -0400, Nicolas Fleury <ni******@yahoo.com_removethe_> wrote:
[...]
That's nice. Guido also posted this multimethods solution:
http://www.artima.com/weblogs/viewpo...?thread=101605

When I first read that, I thought you meant he had posted the very same thing.
Anyway, maybe mine is different enough to be a little interesting ;-)
The only thing I was saying is that I can use generic types in Python
right now (and I do) by using (), but I can't with what will probably
be the syntax in the future, i.e. using [].

Ok ;-) <gesture action="magic handwave"/> Maybe sometime in the future
it will be possible to modify the language grammar and define a few classes
and regenerate a whole new python interpreter that interprets new syntax.

Regards,
Bengt Richter
Aug 9 '05 #18

Nicolas Fleury wrote:
Well, I'm using the alternatives.
Perhaps not to the full power.
__arrayTypes = {}

def makeArrayType(arg1, arg2=someDefault):
    if (arg1, arg2) in __arrayTypes:
        return __arrayTypes[arg1, arg2]
    renamed_arg1 = arg1
    renamed_arg2 = arg2
    class Array:
        arg1 = renamed_arg1
        arg2 = renamed_arg2
        ...
    __arrayTypes[arg1, arg2] = Array
    return Array

Does it qualify as an "acceptable alternative", when it could have been:

class Array[arg1, arg2=someDefault]:


So you don't want to write the makeArrayType function, right?

How about this:

# declaration
class Array(object):
    __typeargs__ = ['arg1', ('arg2', someDefault)]
    ...

# use
t = specialize(Array, arg1=Int32)

where specialize is defined as

def specialize(ptype, **args):
    result = type(ptype.__name__, (ptype,), dict(args))
    for t in result.__typeargs__:
        if isinstance(t, str):
            if not hasattr(result, t):
                raise TypeError("missing parameter " + t)
        else:
            name, val = t
            if not hasattr(result, name):
                setattr(result, name, val)
    return result

Regards,
Martin
Aug 9 '05 #19

Martin v. Löwis wrote:
Nicolas Fleury wrote:
Well, I'm using the alternatives.
Perhaps not to the full power.


Not perhaps, surely;) Who does anyway;)
So you don't want to write the makeArrayType function, right?

[...]


That exact solution would not work for me, since it would replace the
class metaclass, right? However, you have a good point. Such a
function could be done by using the class metaclass instead of type,
passing the base classes and dictionary (basically copying the class),
and using a dictionary keyed on the argument values to avoid
redundancies (I can't see if there's something else).
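
Something like the following sketch, building on Martin's specialize (untested against any real framework): use type(ptype) so the original metaclass is preserved, and cache on the argument values:

```python
_spec_cache = {}

def specialize(ptype, **params):
    """Subclass ptype with the given parameters bound as class
    attributes, preserving ptype's own metaclass, and cache the
    result so identical arguments yield the same class."""
    key = (ptype,) + tuple(sorted(params.items()))
    if key in _spec_cache:
        return _spec_cache[key]
    meta = type(ptype)                     # ptype's metaclass, not plain type
    result = meta(ptype.__name__, (ptype,), dict(params))
    for t in result.__typeargs__:
        if isinstance(t, str):
            if not hasattr(result, t):     # required parameter, no default
                raise TypeError('missing parameter ' + t)
        else:
            name, default = t
            if not hasattr(result, name):  # fill in the default
                setattr(result, name, default)
    _spec_cache[key] = result
    return result

class Array(object):
    __typeargs__ = ['arg1', ('arg2', 10)]

A = specialize(Array, arg1=int)
assert A.arg1 is int and A.arg2 == 10
assert specialize(Array, arg1=int) is A    # cached
```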

Thx and regards,
Nicolas
Aug 9 '05 #20

This discussion thread is closed

Replies have been disabled for this discussion.