Bytes IT Community

Python and generic programming


Hi,

I wonder, does Python support the generic programming paradigm, and to what extent
(I guess it doesn't do so in full)? And (or) does it move in that direction?
Will we ever see

concept WellAlright:
...

constructs in Python?
Isn't it the right time to reserve "concept" as a keyword? ;-)
Sincerely yours, Roman Suzi
--
rn*@onego.ru =\= My AI powered by GNU/Linux RedHat 7.3
Jul 18 '05 #1
16 Replies


Roman Suzi <rn*@onego.ru> wrote:
I wonder, does Python support generic programming paradigm, and to what extent


It appears to me to support generics at least as well as C++ does. I
just write normal code (no typechecks) and voila, the equivalent of C++
templates (at least). What do you think is missing?
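
For instance (function and variable names here are purely illustrative), a single untyped Python function already serves as the moral equivalent of a C++ template, "instantiated" per call rather than per type:

```python
def largest_two(items):
    """Return the two largest elements of any sortable, iterable collection."""
    ordered = sorted(items)
    return ordered[-2], ordered[-1]

# The same "template" works unchanged for ints and for strings:
print(largest_two([3, 1, 4, 1, 5]))   # -> (4, 5)
print(largest_two("generic"))         # -> ('n', 'r')
```

Any type supporting iteration and comparison works; there is nothing to declare.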
Alex
Jul 18 '05 #2

On Fri, 22 Oct 2004 00:25:28 +0200, Alex Martelli wrote:
Roman Suzi <rn*@onego.ru> wrote:
I wonder, does Python support generic programming paradigm, and to what extent


It appears to me to support generics at least as well as C++ does. I
just write normal code (no typechecks) and voila, the equivalent of C++
templates (at least). What do you think is missing?


The handcuffs.

Jul 18 '05 #3



Alex Martelli wrote:

Roman Suzi <rn*@onego.ru> wrote:
I wonder, does Python support generic programming paradigm, and to what extent


It appears to me to support generics at least as well as C++ does. I
just write normal code (no typechecks) and voila, the equivalent of C++
templates (at least). What do you think is missing?

Alex


How about specialization? I'm relatively new to Python. I ask for
information, not to be argumentative.

If you have, for example, several data-types for matrices, how do you
write code that will automatically find the right routine for quickly
multiplying a vector by a diagonal matrix represented as a vector, and
automatically call the right code for multiplying by a sparse matrix
represented by some sparse coding, etc?
Jul 18 '05 #4


"Jive Dadson" <jd*****@yahoo.com> wrote in message
news:41***************@yahoo.com...
How about specialization? I'm relatively new to Python. I ask for
information, not to be argumentative.

If you have, for example, several data-types for matrices, how do you
write code that will automatically find the right routine for quickly
multiplying a vector by a diagonal matrix represented as a vector, and
automatically call the right code for multiplying by a sparse matrix
represented by some sparse coding, etc?


Specialization goes in special methods -- the ones named __xxx__ -- of
which there are now 50-100. For each pair of classes, at least one of the
two must know how to do xxx on the other. There is no way to get around
the n-squared problem, but Python pushes it into individual operations, so
you only need one function that uses the operation. So planning is
required for a system of user-defined classes such as multiple matrix
implementations.

For more, see the ref manual and the section on user classes. Also check
'special methods' in the index.
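
In practice the __mul__ approach might look like this; the class names are illustrative, not from any real library:

```python
class Vector:
    def __init__(self, data):
        self.data = list(data)

class DiagonalMatrix:
    """A diagonal matrix stored as the list of its diagonal entries."""
    def __init__(self, diag):
        self.diag = list(diag)

    def __mul__(self, other):
        # The special method dispatches on the operand's type at runtime.
        if isinstance(other, Vector):
            return Vector(d * x for d, x in zip(self.diag, other.data))
        return NotImplemented   # lets Python try other.__rmul__ instead

v = DiagonalMatrix([1, 2, 3]) * Vector([10, 20, 30])
print(v.data)   # -> [10, 40, 90]
```

Each new matrix representation adds its own cases inside __mul__, which is the planning Terry mentions.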

Terry J. Reedy

Jul 18 '05 #5

Jive Dadson wrote:
If you have, for example, several data-types for matrices, how do you
write code that will automatically find the right routine for quickly
multiplying a vector by a diagonal matrix represented as a vector, and
automatically call the right code for multiplying by a sparse matrix
represented by some sparse coding, etc?


I haven't been following this thread, but Phillip Eby's recent
implementation of generic functions should be mentioned if it hasn't
been already:

http://dirtsimple.org/2004/11/generi...ve-landed.html

It might look something like this:

@dispatch.on('m1', 'm2')
def matrix_mul(m1, m2):
    "Multiply two matrices"

@matrix_mul.when(VectorMatrix, SparseMatrix)
def matrix_mul_sparse(m1, m2):
    "Special implementation"
--
Ian Bicking / ia**@colorstudy.com / http://blog.ianbicking.org
Jul 18 '05 #6

>>>>> "Terry" == Terry Reedy <tj*****@udel.edu> writes:

Terry> Specialization goes in special methods -- the ones named
Terry> __xxx__ -- of which there are now 50-100. For each pair of
Terry> classes, at least one of the two must know how to do xxx on
Terry> the other. There is no way to get around the n square
Terry> problem, but Python pushes it into individual operations so
Terry> you only need one function that uses the operation. So
Terry> planning is required for a system of user-defined classed
Terry> such as multiple matrix implementations.

No, you don't quite understand what the OP is asking for. C++
templates are functions or classes that are parameterized by some
arguments; the parameters can be either integer constants or type
names. So by writing a single template function you create (or
actually, you ask the compiler to create on the fly) many functions.
Upon using the parameterized function or class, the compiler will
choose/create the function or class for you. Specialization is a
mechanism that allows you to create a function that overrides the
implementation for some specific parameters. E.g., you can create a
"factorial" function as a template function:

template <int n>
int factorial() { return n * factorial<n-1>(); }

so that when asked, the compiler generates factorial<5>() as a
function which returns 5*factorial<4>(); factorial<4>() as a function
which returns 4*factorial<3>(), and so on. The terminal condition is
defined using a specialization:

template <>
int factorial<0>() { return 1; }

When the compiler wants to generate factorial<0>(), the more
specialized form is used, which stops the recursion. In this way you
create a compile-time computed factorial function. During compilation
the optimizer will turn it into a constant, so no computation is done
during run-time: factorial<10>() is in every conceivable way an
integer constant 3628800.

Another example is in the standard library: it defines a vector, which
is a resizable array for specific type (e.g., ints or strings)
normally. So a vector<int> will keep an int*, pointing to an array of
ints, used to hold the integers; and vector<string> will keep a
string*, pointing an array of strings. But if the specific type is
"bool", a bit-vector is created instead (i.e., an int* is used
instead, each int holds 32 values).

I don't see Python supporting generic programming in the way C++ does, as
it does so little during "compile time" (which is just to
byte-compile the program). What I miss most is a mechanism that
allows programmers to define dispatching of functions based on types,
in a way that does not require the programmer to specify how to do the
dispatching (the compiler will figure it out by itself). I think
there's a PEP for that, though.

Regards,
Isaac.
Jul 18 '05 #7

> No, you don't quite understand what the OP is asking for. C++
templates are function or classes that is parameterized by some
arguments, the parameters can either be integer constants or type
names. So by writing a single template function you create (or
actually, you ask the compiler to create on-the-fly) many functions.
Upon using the parameterized function or class, the compiler will
choose/create the function or class for you. Specialization is a
mechanism that allows you to create a function that overrides the
implementation for some specific parameters. E.g., you can create a
"factorial" function as a template function:

template <int n>
int factorial() { return n * factorial<n-1>(); }

so that when asked, the compiler generates factorial<5>() as a
function which returns 5*factorial<4>(); factorial<4>() as a function
which returns 4*factorial<3>(), and so on. The terminal condition is
defined using a specialization:

template <>
int factorial<0>() { return 1; }

When the compiler wants to generate factorial<0>(), the more
specialized form is used, which stops the recursion. In this way you
create a compile-time computed factorial function. During compilation
the optimizer will turn it into a constant, so no computation is done
during run-time: factorial<10>() is in every conceivable way an
integer constant 3628800.


You should mention that C++ lacks the ability to properly solve
type equations, so that your positive example backfires in cases
where two or more type arguments are needed. An example would be a generic
vector class that takes the scalar type as an argument as well as the
dimension. Now, for example, the multiplying code looks like this:

template <typename T, int N>
void mult(vector<T, N> a, vector<T, N> b) {
    a[N] = a[N] * b[N];
    mult<T, N-1>(a, b);
}

Now to stop code generation at N=0, you need to write

template <>
void mult(vector<double, 0> a, vector<double, 0> b) {
    a[0] = a[0] * b[0];
}
The above examples are right out of my head, so I can't guarantee they work.

The important thing to observe is that one needs to instantiate the
recursion abortion for the scalar class as well as the dimension 0 - it
won't work like this:

template <T, 0>
void mult(vector<T, 0> a, vector<T, 0> b) {
    a[0] = a[0] * b[0];
}

So forgetting to instantiate the required scalar specialization - for int, for example -
will lead to a compiler error, as the compiler tries to generate templates until it
hits its built-in instantiation depth limit.

In functional languages, these things work better, as their pattern-matching
has a notion of "more specialized" patterns over lesser specialized ones.

--
Regards,

Diez B. Roggisch
Jul 18 '05 #8


"Isaac To" <ik****@netscape.net> wrote in message
news:87************@sinken.local.csis.hku.hk...
"Terry" == Terry Reedy <tj*****@udel.edu> writes:
Terry> Specialization goes in special methods -- the ones named
Terry> __xxx__ -- of which there are now 50-100.
No, you don't quite understand what the OP is asking for.


No, YOU are the one who does not understand, because you apparently did not
do me the courtesy of reading my entire post, in which I quoted the
question I answered. Here again is the OP's question about *type*
specialization *in Python*:
If you have, for example, several data-types for matrices, how do you
write code that will automatically find the right routine for quickly
multiplying a vector by a diagonal matrix represented as a vector, and
automatically call the right code for multiplying by a sparse matrix
represented by some sparse coding, etc?


That has NOTHING to do with writing base cases in recursive C++ template
metaprogramming (which, ironically, uses the compiler as an interpreter).

Although there may be other answers now and in the future, the answer I
gave is the straightforward standard answer for Python today.

Terry J. Reedy

Jul 18 '05 #9

I take it the ampersand is a neologism in 2.4?

Jive

jdadson at yahoo dott com

"Ian Bicking" <ia**@colorstudy.com> wrote in message
news:ma**************************************@python.org...
Jive Dadson wrote:
If you have, for example, several data-types for matrices, how do you
write code that will automatically find the right routine for quickly
multiplying a vector by a diagonal matrix represented as a vector, and
automatically call the right code for multiplying by a sparse matrix
represented by some sparse coding, etc?


I haven't been following this thread, but Phillip Eby's recent
implementation of generic functions should be mentioned if it hasn't
been already:

http://dirtsimple.org/2004/11/generi...ve-landed.html

It might look something like this:

@dispatch.on('m1', 'm2')
def matrix_mul(m1, m2):
    "Multiply two matrices"

@matrix_mul.when(VectorMatrix, SparseMatrix)
def matrix_mul_sparse(m1, m2):
    "Special implementation"
--
Ian Bicking / ia**@colorstudy.com / http://blog.ianbicking.org

Jul 18 '05 #10

I hate to break it to you, Terry, but Isaac did understand the question. The
example he gave was perhaps a little fanciful, but it was on the money.
That is exactly the kind of thing I was asking about.

More typical examples of specialization may be found in the Standard
Template Library, and in the Boost Graph Library.

When you define a function in Python, you are (in effect) defining many
functions, just as Isaac says. What the byte codes eventually do depends on
type information that is acted on at runtime. C++ has a mechanism
(templates) for acting on the type information similarly, but at compile
time. However, a modern C++ compiler can do something beyond that. It can
recognize special cases (which the programmer must code for) and use a
different functional form (template) for the special case than for the
generic case. Both STL and BGL use the mechanism extensively.

jdadson at yahoo dott com

"Terry Reedy" <tj*****@udel.edu> wrote in message
news:ma**************************************@python.org...

"Isaac To" <ik****@netscape.net> wrote in message
news:87************@sinken.local.csis.hku.hk...
"Terry" == Terry Reedy <tj*****@udel.edu> writes:
Terry> Specialization goes in special methods -- the ones named
Terry> __xxx__ -- of which there are now 50-100.
No, you don't quite understand what the OP is asking for.
No, YOU are the one who does not understand, because you apparently did

not do me the courtesy of reading my entire post, in which I quoted the
question I answered. Here again is the OP's question about *type*
specialization *in Python*:
If you have, for example, several data-types for matrices, how do you
write code that will automatically find the right routine for quickly
multiplying a vector by a diagonal matrix represented as a vector, and
automatically call the right code for multiplying by a sparse matrix
represented by some sparse coding, etc?


That has NOTHING to do with writing base cases in recursive C++ template
metaprogramming (which, ironically, uses the compiler as an interpreter).

Although there may be other answers now and in the future, the answer I
gave is the straightforward standard answer for Python today.

Terry J. Reedy

Jul 18 '05 #11

>>>>> "Terry" == Terry Reedy <tj*****@udel.edu> writes:
No, you don't quite understand what the OP is asking for.


Terry> you apparently did not do me the courtesy of reading my
Terry> entire post, in which I quoted the question I answered.

I admit that I didn't read your mail very carefully when I wrote my
response. I did so immediately afterwards, and my post didn't make
sense even to myself. So I cancelled that post a minute after making it,
although apparently my cancel didn't go as far as my response.

Terry> The has NOTHING to do with writing base cases in recursive
Terry> C++ template metaprogramming (which, ironically, uses the
Terry> compiler as an interpreter).

I see no problem using the compiler as an interpreter. Indeed, I
think C++ should make it more explicit and allow one to do things at
compile time as easily as one can at run-time (e.g., why support only
integers, and why does one have to use recursion instead of
loops?). But this is a Python group, not a C++ group, so such things
are off-topic here.

Terry> Although there may be other answers now and in the future,
Terry> the answer I gave is the straightforward standard answer
Terry> for Python today.

On the other hand, I didn't agree with this. Your answer basically
said "define a member function called __mul__ for each class you
have", which I don't think is what the OP asked for. First, nothing
in the original post said he wants the function to be used with
operator syntax, so __mul__ or other __xxx__ member functions should
be considered off-topic. Second, the OP asked for "specialization",
and, as an example, asked for a way to cause the system "automatically
finds the right routine for [one type], and automatically call the
right code for [another type]". Your answer involves defining a
method for each type one creates, which (1) is not automatic, and (2)
requires one to modify the class, which is not what is expected by a
C++ programmer who is used to template functions and specialization.

Template specialization is a dispatch mechanism that is external to a
class. You probably mean "such mechanism is missing in Python". I'm
not quite sure that is correct, though.
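
A dispatch mechanism external to the classes, in the spirit Isaac describes, can be sketched in a few lines of Python (all names here are illustrative):

```python
# Registry mapping a pair of argument types to an implementation,
# kept entirely outside the classes being dispatched on.
_mul_impls = {}

def register_mul(t1, t2, impl):
    _mul_impls[(t1, t2)] = impl

def matrix_mul(a, b):
    try:
        impl = _mul_impls[(type(a), type(b))]
    except KeyError:
        raise TypeError("no multiplication registered for %r, %r"
                        % (type(a), type(b)))
    return impl(a, b)

class Diagonal(list): pass
class Dense(list): pass

# Neither class is modified; the specialization is registered externally.
register_mul(Diagonal, Dense,
             lambda d, v: Dense(x * y for x, y in zip(d, v)))

print(matrix_mul(Diagonal([1, 2]), Dense([3, 4])))   # -> [3, 8]
```

Unlike __mul__, adding a new pairing needs no change to either class, only a new registration.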

Regards,
Isaac.
Jul 18 '05 #12

"Jive" <so*****@microsoft.com> wrote in message news:<su**********************@news.easynews.com>. ..
When you define a function in Python, you are (in effect) defining many
functions, just as Issac says. What the byte codes eventually do depends on
type information that is acted on at runtime. C++ has a mechanism
(templates) for acting on the type information similarly, but at compile
time. However, a modern C++ compiler can do something beyond that. It can
recognize special cases (which the programmer must code for) and use a
different functional form (template) for the special case than for the
generic case. Both STL and BGL use the mechanism extensively.

This doesn't look too hard. Untested, cause I don't have Python 2.4
yet:

def generic(genfunc):
    speclist = []
    def wrapf(arg):
        for (specfunc, typ) in speclist:
            if isinstance(arg, typ):
                return specfunc(arg)
        return genfunc(arg)
    wrapf.speclist = speclist
    wrapf.func_name = genfunc.func_name  # isn't this ok in 2.4?
    return wrapf

def specializes(genf, typ):
    def spd(specf):
        genf.speclist.append((specf, typ))
        return specf
    return spd

# so my European friends don't get too upset
specialises = specializes

Then:

@generic
def some_function(arg):
    print "generic"

@specializes(some_function, int)
def some_int_specific_function(arg):
    print "int-specific"

Whence calling some_function('abc') prints "generic", and calling
some_function(123) prints "int-specific".

Obviously this example could use lots of improvement. It only works
for functions of one argument, and it could probably do better than a
linear search through the type-dispatch list. But it does show how it
can be done.

A couple things: I personally have a distaste for specializations,
because of the easy possibility that what is supposed to be an
optimized, but equivalent, version of the generic function does
something different (intentionally or not). IMO, that's much too stiff
a price for a premature optimization. They should rarely be used
outside the final stages of development. Or if you WANT the
specialization to behave differently, you ought to be using regular
old polymorphism.

Having said that, if you're going to use specializations, then the way
I did it above, by explicitly defining functions as generic and
specialized, is a much better way to do it than in C++ IMO. At least
then you get advance warning of any surprises. "Explicit is better
than implicit", 'n at. By not supporting implicit specializations, I
think Python's doing a good job.
--
CARL BANKS
Jul 18 '05 #13


[Isaac]
[...] my post didn't make
sense even to myself. So I cancel that post a minute after making it,
although apparently my cancel posting didn't go as far as my response.


The comp.lang.python newsgroup is mirrored by the py*********@python.org
mailing list, so canceling a message often won't work for the mailing list
subscribers. Your original message will already have been sent out by
email by the time your cancel message gets through.

--
Richie Hindle
ri****@entrian.com

Jul 18 '05 #14


"Richie Hindle" <ri****@entrian.com> wrote in message
news:g5********************************@4ax.com...

[Isaac]
[...] my post didn't make
sense even to myself. So I cancel that post a minute after making it,
although apparently my cancel posting didn't go as far as my response.


The comp.lang.python newsgroup is mirrored by the py*********@python.org
mailing list, so canceling a message often won't work for the mailing list
subscribers. Your original message will already have been sent out by
email by the time your cancel message gets through.


and python-list, along with 1000s of other technical mailing lists, is
mirrored as a news.gmane.org newsgroup -- gmane.comp.python.

I cancel about half the messages I start to write -- by hitting delete
instead of send ;-0.

Terry J. Reedy

Jul 18 '05 #15

Richie Hindle <ri****@entrian.com> wrote in message news:<ma**************************************@python.org>...
[Isaac]
[...] my post didn't make
sense even to myself. So I cancel that post a minute after making it,
although apparently my cancel posting didn't go as far as my response.


The comp.lang.python newsgroup is mirrored by the py*********@python.org
mailing list, so canceling a message often won't work for the mailing list
subscribers. Your original message will already have been sent out by
email by the time your cancel message gets through.


Also, Google ignores cancel requests. If the article makes it to
Google's news server, which it likely will in a matter of seconds,
you're out of luck. Probably the only thing cancel's good for these
days is _immediately_ recalling an incomplete message that you
accidentally clicked post on.

--
CARL BANKS
Jul 18 '05 #16

Ian Bicking <ia**@colorstudy.com> wrote in message news:<ma**************************************@python.org>...
Jive Dadson wrote:
If you have, for example, several data-types for matrices, how do you
write code that will automatically find the right routine for quickly
multiplying a vector by a diagonal matrix represented as a vector, and
automatically call the right code for multiplying by a sparse matrix
represented by some sparse coding, etc?


I haven't been following this thread, but Phillip Eby's recent
implementation of generic functions should be mentioned if it hasn't
been already:

http://dirtsimple.org/2004/11/generi...ve-landed.html

It might look something like this:

@dispatch.on('m1', 'm2')
def matrix_mul(m1, m2):
    "Multiply two matrices"

@matrix_mul.when(VectorMatrix, SparseMatrix)
def matrix_mul_sparse(m1, m2):
    "Special implementation"


Actually, it would be more like:

@dispatch.generic()
def matrix_mul(m1, m2):
    """Multiply two matrices"""

@matrix_mul.when("m1 in VectorMatrix and m2 in SparseMatrix")
def matrix_mul_sparse(m1, m2):
    """Special implementation"""

The 'dispatch.on()' decorator is for single-dispatch functions only.
If you want to dispatch on multiple arguments or on logical
conditions, you use 'dispatch.generic()'. The implementations for
single and multiple dispatch are completely separate, and use
different algorithms.
Jul 18 '05 #17
