Bytes IT Community

automatic static types (metaclass), attribute-order extraction =python instead of XML-likes - and some RequestsForEnhancements

hello again.

i'm now into using python instead of other language(s) for
describing structures of data - names, structure,
type-checks, conversions, value-validations, metadata etc. And i have
things to offer, and things to request.
And a lot of ideas, but who needs them....
here's an example (from test_struct.py):
[this is still in progress. it would be used for direct generating of
(ordered) byte structures as well. And user interface forms...]
from statictype2 import StaticTyper, StaticType

class Record( StaticTyper):
    def _validator_ID( v):
        if v<=0: raise ValueError, 'value must be positive int, not %r' % v
        return v
    def _validator_Date( v):
        if isinstance( v, tuple) and len(v)==2:
            v = '.'.join([ str(x) for x in v] )
        elif not isinstance( v, str):
            raise TypeError, 'expect str or tuple( str,str), not %r of type %s' % (v,type(v))
        return v

    ID   = StaticType( int, auto_convert=True, validator=_validator_ID )
    Date = StaticType( None, validator= _validator_Date)
    Count= StaticType( int)
    Key  = StaticType( str, default_value= 'empty/key/used')

class Input_Record( Record):
    Checksum = Tchecksum    #md5-checksum property, defined in test_struct.py below

class Output_Head( StaticTyper):
    timestamp = StaticType( str, default_value= 'none')

class Output_Record( Output_Head, Record):
    pass

....

r = Input_Record()
r.ID = '357'
r.Date = (10,'03')
r.Count = 2

....


here's output of `python2.3 test_struct.py`:
<module '__main__' from 'test_struct.py'>:
- order:
['Tchecksum', 'r', 'a', 'v']
<class '__main__.Record'>:
- order:
['ID', 'Date', 'Count', 'Key']
- flattened order:
ID = <property object at 0x401795a4>
Date = <property object at 0x4017993c>
Count = <property object at 0x40179554>
Key = <property object at 0x401799b4>
<class '__main__.Input_Record'>:
- order:
['Checksum']
- flattened order:
ID = <property object at 0x401795a4>
Date = <property object at 0x4017993c>
Count = <property object at 0x40179554>
Key = <property object at 0x401799b4>
Checksum = <property object at 0x40179694>
<class '__main__.Output_Head'>:
- order:
['timestamp']
- flattened order:
timestamp = <property object at 0x40179be4>
<class '__main__.Output_Record'>:
- order:
[]
- flattened order:
timestamp = <property object at 0x40179be4>
ID = <property object at 0x401795a4>
Date = <property object at 0x4017993c>
Count = <property object at 0x40179554>
Key = <property object at 0x401799b4>

['ID', 'Date', 'Count', 'Key', 'Checksum']
<__main__.Input_Record object at 0x40433bec>
ID 357
Date 10.03
Count 2
Key empty/key/used
Checksum 52bf113125e445e072387820ccd0a983

==================

statictype1 uses const BaseTypes as templates for actual classes.
statictype2 is more powerful, as the declarations go directly in the
actual class. you can run each of them (except statictype_base); they
have more tests and examples in them.

each set/get of these fields on the instances is captured via
properties.
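
for illustration, the core property-capture trick looks roughly like
this, stripped of everything else (TypedAttr/Point are made-up names
for this sketch, not part of the modules posted below):

```python
class TypedAttr(object):
    # minimal stand-in for StaticType: exact type check, done via a property
    def make_property(self, attr, typ):
        store = '_' + attr  # where the real value lives on the instance

        def a_get(me):
            return getattr(me, store)

        def a_set(me, v):
            if type(v) is not typ:
                raise TypeError('attribute %r expects %s, got %r' % (attr, typ, v))
            setattr(me, store, v)

        return property(a_get, a_set)


class Point(object):
    x = TypedAttr().make_property('x', int)
    y = TypedAttr().make_property('y', int)


p = Point()
p.x = 3             # routed through a_set, type-checked
assert p.x == 3
try:
    p.y = '3'       # wrong type - rejected by the property setter
except TypeError:
    pass
else:
    assert False, 'expected TypeError'
```

the real StaticTyper metaclass just does this replacement for every
StaticType found in the class dict, so the class author never writes
the properties by hand.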

StaticType supports exact type checks and automatic conversion, lazy
setting of default_value, and a validator that can add value checks
and/or replace the type/auto_convert logic if needed. Probably more can
be added when needed (e.g. things about representation / formatting,
naming, UI etc).

there's also a method .test_validator() to test your validators
_before_ using them, as getting them wrong can be nasty.

------
you don't need anything around assignment_order.py unless you need the
order of the fields to follow the order they are written in the python
file (which is the best place IMO. WYSIWYG - What You See Is What You
Get.)

my previous post about extracting the order of assignments in
module/class namespaces didn't get much response, well, i wrote the
damned thing.

This assignment_order.py hack (parsing the source again to get the
order) would not be needed IF any of the following were available:
 - the order of variable assignments in a class namespace was given to
the __metaclass__, either as a separate argument, or by making the dict
argument not a dict-mapping but an iterable of (key,value) tuples - it
is so easy to turn that into a dictionary: just dict(iterable)
 - at very low level, the internal frame object's f_locals could return
an iterable of (key,value) tuples instead of a plain dict.

The order is there in the execution frame; it's just not reachable from
python.

-------
these can go well as examples:
the static_types could go with Demo/newmetaclass/
the assignment_order could go with Demo/parser/

==============================================================================

now, about the requests. i want to see python more symmetrical and
consistent. this way it could be advertised/marketed more easily to
places with serious portability AND maintainability thinking, over
other pure-procedural languages. Is anyone interested in that?

- i want the documentation to be fully interlinked (in the current
hierarchy) and to have an alternative hierarchy - from a
usage point of view - if i say list, ALL about lists should be there,
and not separated in 3+ different places, so that either a lot of
bookmarks have to be kept, or 10+ clicks to get the next piece of info.
Same goes for most data/execution things. it can be done as a simple
links-only layer - i don't mind. Probably the data model and execution
model are best to follow as template.

- the order of variable assignments in a class namespace to be
offered/passed to the metaclass, either as a separate argument, or by
making the (now dict) namespace argument a properly ordered iterable
(moving the dict creation from C into the python metaclass - well, if
any. type(name,bases,dictiterable) can still do it in C). Order can be
safely ignored if not needed.
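
to make the request concrete, here is a sketch of what an order-aware
metaclass could look like if the class-body namespace were
controllable. (this uses a `__prepare__`-style hook in the spirit of
the request; it is NOT valid python2.3 - only much later pythons grew
such a hook - so treat it purely as a sketch of the proposal:)

```python
class OrderedMeta(type):
    # hand the class body its namespace object ourselves, so we can
    # observe insertion order instead of receiving an unordered dict
    @classmethod
    def __prepare__(meta, name, bases):
        return {}  # any insertion-ordered mapping would do here

    def __new__(meta, name, bases, ns):
        cls = super().__new__(meta, name, bases, dict(ns))
        # record the declaration order, skipping dunder entries
        cls._order = [k for k in ns if not k.startswith('__')]
        return cls


class Rec(metaclass=OrderedMeta):
    ID = int
    Date = str
    Count = int


assert Rec._order == ['ID', 'Date', 'Count']
```

with something like this available, the whole source-reparsing hack in
assignment_order.py below would be unnecessary.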

- i want somehow to turn the { MY order of dict values here } syntax
into an as-is-ordered thing! Could even be with some pragma switch that
would create my dictOrder (or some builtin one) instead of the internal
hashed mapping! This is equivalent to the above, but for plain value
dicts (and not the attribute namespaces, represented as dicts). Function
keyword args can be another thing to keep the order of - if required. (a
sort of execution-namespace control pragma)

- the notion of iterators/generators, and the notion of interfaces, are
so powerful but so poorly advertised - and used. Now there is itertools -
PERFECT! but i have to import it while weaker things like map() are
still sitting in plain view - hence - from a lamer's point of view -
easier to use.

- what about try: finally: over generators? PEP288. gee, 50 lines of
iterator class instead of 5 with yield, just because i _have_ to close
a file...
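
for comparison, this is the 5-line shape the request is about: a
generator that owns a file and still guarantees cleanup. (try/finally
around yield, plus a .close() method on generators, only became legal
in later pythons - so this is a sketch of the wish, not python2.3
code:)

```python
import os
import tempfile


def lines_of(path):
    # generator owning a resource; the try/finally the request asks for
    f = open(path)
    try:
        for line in f:
            yield line.rstrip('\n')
    finally:
        f.close()  # runs on exhaustion, on .close(), or on GC of the generator


# tiny demonstration on a throwaway file
fd, path = tempfile.mkstemp()
os.write(fd, b'a\nb\n')
os.close(fd)

g = lines_of(path)
assert next(g) == 'a'
g.close()          # abandon early: triggers the finally, file gets closed
os.remove(path)    # safe - the generator released its handle
```

without this, the same cleanup guarantee indeed needs a full iterator
class with explicit state and a close() protocol.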

- all builtin functions that do not add value to the language as
language should move into separate module (e.g. map, zip, sum etc...
vs callable, isinstance, builtin type-constructors) and not be so
over-emphasized.

- all interfaces (not types!) - e.g. containers (__get/set/delitem) -
should be named 'interfaces' - now this is vaguely called 'customization'.

- how do i check if x is iterable?
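
the only portable answer i know of is to ask iter() and catch the
refusal - a sketch:

```python
def is_iterable(x):
    # duck-typed check: anything iter() accepts is iterable
    # (covers __iter__ as well as the old sequence/__getitem__ protocol)
    try:
        iter(x)
    except TypeError:
        return False
    return True


assert is_iterable([1, 2])
assert is_iterable('abc')     # strings iterate per character
assert is_iterable({1: 'a'})  # dicts iterate over keys
assert not is_iterable(42)
```

which works, but is hardly discoverable - an explicit, named interface
check would say the same thing much more clearly.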

- (i see this goes in python2.4, PEP289) The perfect idea of in-place
generators - which is now used for filling lists and maybe dicts in
future - should be allowed everywhere an iterable can be. e.g. i want
    my_method_with_iterator( (k, 2*k, keys[k]) for k in [a,b,c] )
to work without the need of an intermediate list/dict/whatever.
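
i.e. something like this, where the function happily consumes any
iterable of tuples, fed directly from a generator expression
(checksum_pairs is a made-up example function for this sketch):

```python
def checksum_pairs(pairs):
    # consumes any iterable of (key, value) tuples - no list needed
    total = 0
    for k, v in pairs:
        total += k * v
    return total


keys = {1: 10, 2: 20, 3: 30}

# PEP 289 generator expression fed straight to the function:
# no intermediate list is ever materialised
result = checksum_pairs((k, keys[k]) for k in [1, 2, 3])
assert result == 1 * 10 + 2 * 20 + 3 * 30  # == 140
```

the consuming side needs no change at all - a plain for loop already
speaks the iterator protocol.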

- IMO most funcs now needing a list or dict should be happy with a
proper iterable. (Eventually funcs needing only an iterable over single
values may have *args instead, and for key-value tuples **kargs
instead. but this isn't that important.)

- i would like to be able to control somehow what constructors are
available, for example to eval() - so i can patch them (operator *) -
so things like
    eval( '[ 1,2,3 ] * 500000000')
or
    eval( ' "abcdef" * 500000000')
could not bomb the machine, and give a syntax error or else.
which means safer, if not safe, expressions. (again a sort of
execution-namespace control)

- i want the list head/tail notation:
    a, b, *c = somelisthere
apart from all else, it helps future maintenance and readability of
the program, and this IS important if python is to be used for serious
things in serious places. ah, i might like a symmetrical thing for
dicts but you will not like it:
    'a':var1, 'b':var2, **rest = dictionaryhere

- uh there were more, but i keep forgetting...

if i can help do something about these or others, or you want ideas of
how/when/why something can be implemented for python, call ...
there's plenty of experience in a lot of languages and languages in
general, as well as human-machine related things, even psychology. And
i DO want to support and make better the good thing that python is.

just in case this newsserver of mine dies (and it does, often), do CC
to my mail.

[btw, anyone interested in very thin framework for UI-via-html-forms?
all is in python, even the form description, html etc.
it's 75%-done, with most things showable.
]

ciao
svilenD

=================

#$Id: statictype_base.py,v 1.1 2004/03/18 17:21:51 sdobrev Exp $
#s.dobrev 2k4

class PropContainer: pass

def _get__props( me):
    try: p = me.__props
    except AttributeError:
        p = me.__props = PropContainer() #XXX with slots?
    return p

#Made along Demo/newmetaclass/Eiffel.py

class StaticTyper_factory_base( type):
    """ make yourBaseType( StaticTyper ) class with a
        class attribute = type for each required instance attribute-name.
        All Instance's classes must subclass one or more of them.
    """
    def __new__( meta, name, bases, adict):
        #print 'meta', meta, name, bases, adict
        #inplace conversion
        meta.convert( name, bases, adict)
        adict[ '_props'] = property( _get__props)
        adict.setdefault( '_debug_props_get_set', 0)
        return type.__new__( meta, name, bases, adict)

    #def convert( klas, name, bases, adict):
    #    ...
    #convert = classmethod( convert)

#StaticTyper = StaticTyper_factory( 'StaticTyper', (), {}) #empty

# vim:ts=4:sw=4:expandtab

#$Id: statictype2.py,v 1.2 2004/03/18 18:07:01 sdobrev Exp $
#s.dobrev 2k4

from statictype_base import StaticTyper_factory_base

class _NONE: pass

class StaticType:
    """StaticType(
        typ,                  #can be None if validator present and does all needed checks;
                              # else exact type check is done before validator()
        auto_convert =False,  #if type present and not exact as of value, type(value) is attempted
        validator =None,      #functor( value); replaces typ AND auto_convert -
                              # raise TypeError for wrong type(value), ValueError if wrong range,
                              # and/or autoconvert etc; return value
        default_value =_NONE, #if anything (not _NONE), lazy (!) set - at first get.
    )
    """
    #XXX if callable(default_value), value = default_value(me) ??

    TYPEERROR = 'set %stype %s with value %r of type %s'
    def __init__( klas, typ, auto_convert =False, validator =None, default_value =_NONE, ):
        klas.default_value = default_value
        klas.typ = typ
        klas.auto_convert = auto_convert #kept only for __str__
        if not typ:
            assert callable( validator)
        else:
            if auto_convert:
                def typecheck( v):
                    if type(v) is not typ:
                        v = typ(v) #could raise something
                    return v
            else:
                TYPEERROR = klas.TYPEERROR
                def typecheck( v):
                    if type(v) is not typ:
                        raise TypeError, TYPEERROR % ('', typ, v, type(v) )
                    return v

            #prepend type-check to validator
            if validator:
                _validator = validator #namespace binding!
                def validator( v):
                    return _validator( typecheck( v))
            else:
                validator = typecheck
        klas.validator = validator

        if default_value is not _NONE and validator:
            validator( default_value ) #test default_value

        # XXX what if validator / type does not match ?

    def _make_property( klas, attr):
        base_type = klas.typ

        #get
        if klas.default_value is _NONE:
            def a_get( me):
                if me._debug_props_get_set: print 'getattr', attr, base_type
                return getattr( me._props, attr)
        else:
            def a_get( me):
                if me._debug_props_get_set: print 'getattr', attr, base_type
                try: v = getattr( me._props, attr)
                except AttributeError:
                    v = klas.default_value
                    setattr( me._props, attr, v)
                return v

        #set
        validator = klas.validator
        if not validator:
            TYPEERROR = klas.TYPEERROR
            def a_set( me, v):
                if me._debug_props_get_set: print 'setattr', attr,v, base_type
                if type(v) is not base_type:
                    raise TypeError, TYPEERROR % ( 'attribute "%s" of' % attr, base_type, v, type(v) )
                return setattr( me._props, attr, v)
        else:
            def a_set( me, v):
                if me._debug_props_get_set: print 'setattr', attr,v, base_type
                try: v = validator( v)
                except (TypeError,ValueError), e:
                    e.args = ( 'set attribute "'+attr+'": ' + e.args[0], )
                    raise
                return setattr( me._props, attr, v)
        return property( a_get, a_set)

    def test_validator( klas, *value_in_out ):
        if klas.validator:
            for v,vout in value_in_out:
                print 'test .validator', klas, ': value %s, expect %s;' % (v, vout)
                try:
                    isexc = issubclass( vout, Exception)
                except TypeError: isexc = False

                if not isexc:
                    r = klas.validator( v)
                    if vout != r:
                        print 'failed for %r: expect %r, got %r' % (v, vout, r)
                    #else: print 'ok'
                else:
                    try:
                        r = klas.validator( v)
                    except vout:
                        #print 'ok'
                        pass
                    except Exception,r:
                        print 'failed for %r: expect %s, got %s' % (v, vout, r)
                        raise
                    else:
                        print 'failed for %r: expect %s, got %r' % (v, vout, r)

    def __str__( me):
        return '%s(%s%s%s%s)' % ( me.__class__.__name__,
            me.typ,
            me.auto_convert and ', auto_convert' or '',
            me.default_value is not _NONE and ', default=%s' % me.default_value or '',
            me.validator and ',validator' or ''
            )

class StaticTyper_factory( StaticTyper_factory_base):
    """ subclass StaticTyper and define
        class attribute = StaticType(...) for each required instance attribute-name.
    """
    def convert( klas, name, bases, adict):
        for a,t in adict.iteritems():
            if isinstance( t, StaticType):
                if not a.startswith('__'):
                    adict[a] = t._make_property( a)
    convert = classmethod( convert)

StaticTyper = StaticTyper_factory( 'StaticTyper', (), {}) #empty

#########

if __name__=='__main__':

    class boza( StaticTyper):
        def byte_Validator( v):
            #print 'validate', v
            if v<0 or v>255:
                raise ValueError, 'value %d must be within 0..255' % v
            return v
        x = StaticType( int)
        y = StaticType( int, validator= byte_Validator, auto_convert =True, default_value =44 )
        y.test_validator(
            (1,1), (0,0), (255,255),  #in range
            ('1',1),                  #in range, auto_convert
            ('0x34', ValueError),     #in range, auto_convert, wrong format #use int(..,0) to autoguess
            (300, ValueError),        #out range
            ('400', ValueError),      #out range, auto_convert
            ( list(), TypeError),     #wrong type
        )
        pass

    a = boza()

    print '----'
    try: print a.x
    except:
        import traceback
        traceback.print_exc(1)

    print '----'
    a.x = 2 #ok
    print 'x ', a.x

    print '----'
    try: a.x = '2'
    except:
        import traceback
        traceback.print_exc(1)

    print '----'
    print 'y ', a.y

    print '----'
    try: a.y = 500
    except:
        import traceback
        traceback.print_exc(1)

    print '----'
    a.y = '250'
    print 'y ', a.y

# vim:ts=4:sw=4:expandtab

#$Id: statictype1.py,v 1.1 2004/03/18 17:21:51 sdobrev Exp $
#s.dobrev 2k4

from statictype_base import StaticTyper_factory_base

class StaticTyper_factory( StaticTyper_factory_base):
    """ inherit yourBaseType from StaticTyper with
        class attribute = type for each required instance attribute-name.
        your-classes for actual instances must subclass one or more of these yourBaseTypes.
        Supports exact type check only - no default_value, auto_conv, validators etc.
    """
    def convert( klas, name, bases, adict):
        for base in bases:
            if isinstance( base, StaticTyper_factory):
                for a,t in base.__dict__.iteritems():
                    if isinstance( t, type):
                        if not a.startswith('__'):
                            adict[a] = _make_property( a,t)
    convert = classmethod( convert)

StaticTyper = StaticTyper_factory( 'StaticTyper', (), {}) #empty

#XXX may have become a class, so a_get, a_set are override-able
def _make_property( a, t):
    #!!!! local bind to current value(s);
    # else 'a' is bound at outer namespace, and references the last value there.
    # Well, when not in a separate function.
    attr = a
    base_type = t

    def a_get( me):
        if me._debug_props_get_set: print 'getattr', attr, base_type
        return getattr( me._props, attr)

    def a_set( me, v):
        if me._debug_props_get_set: print 'setattr', attr,v, base_type
        if type(v) is not base_type:
            raise TypeError, 'set attribute "%s" of type %s with value %r of type %s' % (attr, base_type, v, type(v) )
        return setattr( me._props, attr, v)

    return property( a_get, a_set)

#########

if __name__=='__main__':

    class BozaType( StaticTyper):
        x = int
        y = float

    class boza( BozaType):
        #x = 1
        #y = 'a'
        #z = '3'
        pass

    a = boza()

    print '----'
    try: print a.x
    except:
        import traceback
        traceback.print_exc(1)

    print '----'
    a.x = 2 #ok
    print 'x ', a.x

    print '----'
    try: a.x = '2'
    except:
        import traceback
        traceback.print_exc(1)

    print '----'
    try: a.y = 500
    except:
        import traceback
        traceback.print_exc(1)

# vim:ts=4:sw=4:expandtab

#$Id: assignment_order.py,v 1.9 2004/03/18 18:07:01 sdobrev Exp $
#s.dobrev 2k4
"""
Obtain order of assignments in python source text namespaces.
python loses them at end of exec, as all namespaces
are/must be plain dicts (set via PyDict_SetItem - there is only
a temporary array of local vars in exec' frame in C).
Handles assignment, class def, function def, import.
Warning: conditional definitions etc runtime stuff is ignored -
this goes through _all_ the source.
use as:
    a = ASTVisitor4assign()
    klastree = a.parse( src_string1 )
    ###now use klastree (or a.klas which is same)
    klastree.pprint()

    a.parse( src_string2 )  #as if src = src_string1 + src_string2
    a.klas.pprint()
    ###for clean reuse, either do a.__init__() or make another ASTVisitor4assign

###for whole module:
    import xxx  #assume there's class X1 inside
    klastree = ASTVisitor4assign().parseModuleSource( xxx )
    klastree.set_order_in_namespace( xxx)  #,flatten=True to pre-calculate flattened order
    print xxx.__order
    print xxx.X1.__order

### put this at end of a module for self-auto-ordering (e.g. at import):
    try:
        __order
    except NameError:
        from util.assignment_order import ASTVisitor4assign
        ASTVisitor4assign().parseFile_of_module( __file__).set_order_in_namespace( globals(), flatten =True)
        #print '==================='
        #print __order

###get _all_ attribs of some class, ordered:
    for key,value in get_class_attributes_flatten_ordered( xxx.X1):
        print ' ', key,value
"""

# see compiler.visitor
# uses compiler.ast as parser.ast is too grammar specific.

class ASTVisitor4assign:
    """create an instance with optional call_on_duplicates functor argument,
    then run .parse*( src).
    a.parse( src1); a.parse( src2)  ===  a.parse( src1 + src2 )
    """
    class Klas:
        """ root instance of this is returned by ASTVisitor4assign.parse*().
        use set_order_in_namespace() to setattr a '__order' list into each class of the
        source class hierarchy, assuming it exists as class object"""

        def __init__( me, name ):
            me.name = name
            me.vars = []
            me._vars = {} #fast search
            me.klasi = []
            me.all = []

        def _add_var( me, name, call_on_duplicates =None, *args):
            if name in me._vars:
                if call_on_duplicates:
                    call_on_duplicates( name, *args)
            else:
                me.vars.append( name)
                me._vars[ name] = 1
                me.all.append( name)

        def _add_klas( me, klas ):
            #no check for duplicates
            me.klasi.append( klas)
            me.all.append( klas)

        _add_func = _add_var
        _add_import = _add_var

        def pprint( me, pfx ='', ):
            if me.all:
                print pfx, 'klas', me.name or '<>' ,':'
                pfx = pfx+' '
                if 1:
                    for var in me.vars:
                        print pfx, var
                    for klas in me.klasi:
                        klas.pprint( pfx)
                else:
                    for a in me.all:
                        if isinstance( a, me.__class__):
                            a.pprint( pfx)
                        else:
                            print pfx, a

        def set_order_in_namespace( me, namespace, order_name ='__order', flatten =False, **kargs_flatten_class__order ):
            """ assign a '__order' list into each class of the hierarchy, assuming it exists as class object.
            namespace is the relevant (for me) class, module or dict (e.g. globals()
            which is same as __main__.__dict__).
            flatten=True will pre-calculate the flattened-hierarchy attribute list (__order_flat).
            """
            order = me.vars
            if type( namespace) is dict:
                _setattr = dict.__setitem__
                _getattr = dict.__getitem__
            else:
                _setattr = setattr
                _getattr = getattr
            _setattr( namespace, order_name, order)
            for k in me.klasi:
                k_namespace = _getattr( namespace, k.name)
                k.set_order_in_namespace( k_namespace, order_name=order_name)
                if flatten:
                    flatten_class__order( k_namespace, order_name=order_name, **kargs_flatten_class__order)

        def pprint_order_in_namespace( me, namespace, order_name ='__order', pfx='', **kargs_flatten_class__order ):
            """ namespace is the relevant (for me) class, module or dict
            (e.g. globals() which is __main__.__dict__)
            """
            if type( namespace) is dict:
                _getattr = dict.__getitem__
                print pfx,'<dict>:'
            else:
                _getattr = getattr
                print pfx,str(namespace)+':'
            pfx = pfx+' '

            print pfx, '- order:'
            print pfx,' ', _getattr( namespace, order_name)
            import types
            if type( namespace) is dict or isinstance( namespace, types.ModuleType): pass
            else:
                print pfx, '- flattened order:'
                for k,v in get_class_attributes_flatten_ordered( namespace, **kargs_flatten_class__order):
                    print pfx,' ', k,'\t=',v

            for klas in me.klasi:
                k_namespace = _getattr( namespace, klas.name)
                klas.pprint_order_in_namespace( k_namespace, order_name=order_name, pfx=pfx, **kargs_flatten_class__order)

    def __init__( me, call_on_duplicates =None, do_function =False, do_import =False):
        me.klas = me.Klas( '') #None #root
        me.call_on_duplicates = call_on_duplicates
        me.do_function = do_function
        me.do_import = do_import

    def visitAssName( me, node):
        #print node
        me.klas._add_var( node.name, me.call_on_duplicates, node, me )

    def visitClass( me, node):
        #print 'klas', node.name
        klas = me.klas
        name = node.name

        newklas = me.Klas( name)
        klas._add_klas( newklas) #what if same klas repeated.. bad luck

        me.klas = newklas       #push
        me.visit( node.code)    #call back the caller - indirect recursion
        me.klas = klas          #pop

    #all things inside funcs are not of interest - jump over
    def visitFunction( me, node):
        #print 'func', node.name, 'ignored'
        if me.do_function:
            me.klas._add_func( node.name, me.call_on_duplicates, node, me )
        return

    def visitImport( me, node):
        #print 'import', node.names, 'ignored'
        if me.do_import:
            for mod_name, as_name in node.names:
                if as_name is None:
                    as_name = mod_name.split( '.',1)[0]
                me.klas._add_import( as_name, me.call_on_duplicates, node, me )
        return

    def visitFrom( me, node):
        #print 'from', node.modname, 'import', node.names, 'ignored'
        return me.visitImport( node)

    def parse( me, src):
        import compiler
        ast = compiler.parse( src +'\n')
        compiler.walk( ast, me )
        return me.klas

    def parseFile( me, path):
        import compiler
        ast = compiler.parseFile( path)
        compiler.walk( ast, me )
        return me.klas

    def parseModuleSource( me, module):
        """ Warning: module's .__file__ attribute may be pathname of
        precompiled .pyc, .pyo, shared library .so/.dll/.. for C extensions,
        or not available at all if module is statically linked into interpreter.
        These are all checked but Something may escape through...
        """
        try:
            src_filename = module.__file__
        except AttributeError: #statically linked builtin
            raise TypeError, 'cannot access sourcefile for '+ str(module)
        return me.parseFile_of_module( src_filename)

    def parseFile_of_module( me, src_filename):
        """ Warning: the filename may be pathname of
        precompiled .pyc, .pyo, shared library .so/.dll/.. for C extensions.
        These are all checked but Something may escape through...
        """
        #can use inspect.getsourcefile( config), but this is a bit better
        fl = src_filename.lower() #to lower() or not??
        import imp
        for suffix, mode, kind in imp.get_suffixes():
            if fl[-len(suffix):] == suffix:
                if kind == imp.PY_COMPILED:
                    src_filename = src_filename[:-len(suffix)] + '.py' #guess? lower/upper-case??
                elif kind != imp.PY_SOURCE:
                    # not text file
                    raise TypeError, 'cannot access sourcefile for '+ str(src_filename)
                break
        return me.parseFile( src_filename) #hope for accessible, syntactically correct, source file


def flatten_class__order( klas, order_name ='__order', flat_name ='__order_flat',
        ignore_missing_order =False, _used =None ):
    """collect and set order of _all_ attributes of (hierarchical) class as class.flat_name """
    #ignore cached __order_flat when computing
    if _used is None: #root
        try:
            #return getattr( klas, flat_name) #this looks through into bases!
            return klas.__dict__[ flat_name ]
        except (AttributeError, KeyError): pass
        _used = {},{} #global for root
    _used_klas,_used_vars = _used

    o = []
    #print klas
    try:
        #if klas.__class__ is type: return o #klas is a type-object
        if issubclass( klas, type): return o #klas is a type-object
    except AttributeError: pass #non-object/old-style classes have no __class__

    #recursion, deep then wide...
    for base in klas.__bases__:
        if base not in _used_klas: #base duplicates ignored
            _used_klas[ base] = 1
            o += flatten_class__order( base, order_name, flat_name, ignore_missing_order, _used )

    try:
        order = getattr( klas, order_name)
    except AttributeError:
        if not ignore_missing_order: raise
    else:
        for a in order: #do not ignore missing __order
            if a not in _used_vars: #vars duplicates ignored
                o.append( a)
                _used_vars[ a] = 1
    setattr( klas, flat_name, o )
    return o

def get_class_attributes_flatten_ordered( klas, inst =None, **kargs_flatten_class__order):
    """ordered yield attr,value for _all_ attributes of (hierarchical) class/instance """
    if inst is None: inst = klas
    for a in flatten_class__order( klas, **kargs_flatten_class__order ):
        yield a, getattr( inst, a)

def test( module, src_filename =None, flatten =True, print_tree =False):
    a = ASTVisitor4assign()
    if not src_filename:
        klastree = a.parseModuleSource( module) #try guess the source file
    else:
        assert module
        klastree = a.parseFile_of_module( src_filename) #hope for accessible, syntactically correct, source file

    if print_tree:
        print module, 'tree:'
        klastree.pprint()

    klastree.set_order_in_namespace( module, flatten =flatten)
    klastree.pprint_order_in_namespace( module)
    return klastree


if __name__=='__main__':

    s = """
p = int
class Record:
    Timestamp = str
    ddd, ber, aaa = 2,3,4
    class SubRec:
        a,c = 5,6
        b = 12
    d = SubRec
    ddd = 43 #repeat
    import os
    import compiler.ast
    import compiler.ast as xxxx
    from compiler.ast import Node
    class ASubRec( SubRec, SubRec):
        z = 15
        b = 15
        def fun(x):
            z = 4
            return z
"""

    help = """usage:
    no args      - print and test some internal source text
    name         - if endswith .py, loads file as module 'autoname'
                   else import name as module
    -module name - force import name as module
"""

    import sys
    name = None
    is_module = False
    for a in sys.argv[1:]:
        if a == '-module':
            is_module = True
        elif a.startswith( '-'):
            print help
            sys.exit(-1)
        else:
            name = a
            if not name.lower().endswith( '.py'):
                is_module = True

    a = ASTVisitor4assign( do_function= True, do_import =True)
    if name:
        if is_module: #module name
            print 'source - module:', name
            module = __import__( name )
            klastree = a.parseModuleSource( module)
        else:
            print 'source - file (as module "autoname"):', name
            klastree = a.parseFile( name) #source_file_name
            import imp
            f = open( name)
            try:
                module = imp.load_module( 'autoname', f, name, ('.py', 'r', imp.PY_SOURCE) )
            finally:
                f.close()
    else:
        print 'source - text:', s
        print '-----'
        klastree = a.parse( s)
        #klastree.pprint()

    if not name:
        exec s
        import __main__
        module = __main__

    klastree.set_order_in_namespace( module) #,flatten=True to pre-calc flattened

    klastree.pprint_order_in_namespace( module)
    if not name:
        f = flatten_class__order( Record.ASubRec)
        print Record.ASubRec,f
        assert f == [ 'a','c','b', 'z','fun']

# vim:ts=4:sw=4:expandtab

#$Id: test_struct.py,v 1.1 2004/03/18 18:07:01 sdobrev Exp $
#s.dobrev 2k4

from statictype2 import StaticTyper, StaticType

import md5
def get_checksum( me):
    s = md5.new()
    for a in me.__order_flat:
        s.update( str(a) )
    return s.hexdigest()
Tchecksum = property( get_checksum)

class Record( StaticTyper):
    def _validator_ID( v):
        if v<=0: raise ValueError, 'value must be positive int, not %r' % v
        return v
    def _validator_Date( v):
        if isinstance( v, tuple) and len(v)==2:
            v = '.'.join([ str(x) for x in v] )
        elif not isinstance( v, str):
            raise TypeError, 'expect str or tuple( str,str), not %r of type %s' % (v,type(v))
        return v

    ID   = StaticType( int, auto_convert=True, validator=_validator_ID )
    Date = StaticType( None, validator= _validator_Date)
    Count= StaticType( int)
    Key  = StaticType( str, default_value= 'empty/key/used')

class Input_Record( Record):
    Checksum = Tchecksum

class Output_Head( StaticTyper):
    timestamp = StaticType( str, default_value= 'none')

class Output_Record( Output_Head, Record):
    pass

if 10: #auto-calc order
    try: __order
    except NameError:
        from assignment_order import ASTVisitor4assign
        ASTVisitor4assign().parseFile_of_module( __file__).set_order_in_namespace( globals(), flatten=True, ignore_missing_order=True)

if __name__ == '__main__':
    from assignment_order import test, get_class_attributes_flatten_ordered
    import __main__
    test( __main__)

    r = Input_Record()
    r.ID = '357'
    r.Date = (10,'03')
    r.Count = 2
    print r.__order_flat
    print r
    for a,v in get_class_attributes_flatten_ordered( r.__class__, r):
        print a,'\t',v

# vim:ts=4:sw=4:expandtab

Jul 18 '05 #1


>>> import this
The Zen of Python, by Tim Peters

Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren't special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than *right* now.
If the implementation is hard to explain, it's a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea -- let's do more of those!
**end quote**

[snip attached file information and comments]

> now, about the requests. i want to see python more symmetrical and
> consistent. this way it could be advertised/marketed easier to places
> with serious portability AND maintainability thinking, over other
> pure-procedural languages. Is anyone interested in that?

I'm all for Python being used in more places (as are most people here),
but I am not for changing the language to suit someone's arbitrary
opinion on what it should be. You'll likely find others who are also
against arbitrary changes that seem to suit an individual's sense of
'what should happen'. Read some of the "suggested changes" to syntax
and behavior that others have offered here, and you'll notice a strong
tendency for the suggestions to be generally disliked (if not loathed).

Guido and others in python-dev seem to be doing a pretty good job of
keeping things together. While Guido has some regrets (in terms of
Python's behavior), he seems much happier than regretful, and I doubt
many (if any) of your "suggestions" will go far, because I don't believe
that /any/ of your suggestions are Pythonic.

> - i want the documentation to be fully interlinked (in current
> hierarchy) and to have another alternative hierarchy - from usage-point
> of view - if i say list, ALL about lists should be there, and not
> separated in 3+ different places, so either a lot bookmarks are to be
> kept, or 10+ clicks to get the next piece of info. Same goes for most
> data/execution things. it can be done as simple links-only-layer - i
> dont mind. Probably data-model and execution model are best to follow as
> template.

I find the documentation index
(http://www.python.org/doc/current/lib/genindex.html) quite useful, as
well as google.
> - order of variable assignments in class namespace to be offered/passed
> to the metaclass, as separate argument or the (now dict) namespace
> argument to be properly ordered iterable (moving the dict creation from
> C into python metaclass - well, if any. type(name,bases,dictiterable)
> can still do it in C). Order can be safely ignored if not needed.

I understand that for some reason this is important to you. However,
previous to your initial post about it, I've never heard of anyone
having a need for the assignment order of class variables. This
suggests that very few people have a need for it, further shown by the
fact that no one replied to your post saying, "hey, I've wanted to do
this too, tell me if you figure it out".

I did reply to your post, if only because I thought you wanted
/instance/ variable assignment order, which /has/ been brought up in
other contexts.
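For what it's worth, there is a classic user-level workaround for the class-body ordering request. The sketch below is hypothetical (names modeled loosely on the statictype2 example at the top of the thread, and shown in modern Python 3 metaclass syntax): stamp each descriptor with a creation counter, then sort by it when the metaclass builds the class.

```python
import itertools

class StaticType(object):
    # Hypothetical descriptor stub; only the creation counter matters here.
    _counter = itertools.count()
    def __init__(self, typ=None):
        self.typ = typ
        self._order = next(self._counter)

class OrderingMeta(type):
    # At class-build time, sort the StaticType attributes by creation order.
    def __new__(mcls, name, bases, ns):
        fields = sorted((v._order, k) for k, v in ns.items()
                        if isinstance(v, StaticType))
        ns['_field_order'] = [k for _, k in fields]
        return super().__new__(mcls, name, bases, ns)

class Record(metaclass=OrderingMeta):
    ID = StaticType(int)
    Date = StaticType()
    Count = StaticType(int)

print(Record._field_order)  # ['ID', 'Date', 'Count']
```

The counter trick works because the class body executes top to bottom, so each StaticType() call happens in declaration order even though the namespace dict forgets it.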

- I want somehow to turn the { MY order of dict values here } syntax
into an as-is-ordered thing! Could be even with some pragma switch that
would create my dictOrder (or some builtin one) instead of the internal
hashed mapping! This is equivalent to the above, but for plain value
dicts (not the attribute namespaces represented as dicts). Function
keyword args could be another thing to keep the order of - if required
(a sort of execution-namespace control pragma).
Ordered mappings are probably not currently supported in Python because:
"Special cases aren't special enough to break the rules."

Dictionaries are great. Dictionaries are fast. Let's not mess them up
by slowing them down.
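An ordered mapping doesn't have to slow dict itself down; it can live entirely at the user level. A minimal sketch (the class name is hypothetical) that remembers insertion order alongside a plain dict:

```python
class OrderedMapping(dict):
    # Minimal sketch: a dict that also remembers key insertion order.
    def __init__(self):
        dict.__init__(self)
        self._keys = []
    def __setitem__(self, key, value):
        if key not in self:
            self._keys.append(key)
        dict.__setitem__(self, key, value)
    def __delitem__(self, key):
        dict.__delitem__(self, key)
        self._keys.remove(key)
    def __iter__(self):
        return iter(self._keys)

om = OrderedMapping()
om['zebra'] = 1
om['apple'] = 2
print(list(om))  # ['zebra', 'apple'] - insertion order, not hash order
```

(Much later, the stdlib grew collections.OrderedDict, and CPython's plain dict came to preserve insertion order anyway; at the time of this thread neither existed.)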

- The notions of iterators/generators and of interfaces are so
powerful, but so poorly advertised - and used. Now there is itertools -
PERFECT! But I have to import it, while wasteful things like map() still
sit in plain view - hence, from a lamer's point of view, easier to
reach for.
Perhaps you didn't mean "lamer", such a term is quite rude. If you
/did/ mean "lamer", then that would make you a "troll".

There are 12 functions in itertools. Adding 1 built-in function takes
the word of Guido. Adding 12 built-in functions would likely take the
word of most of the core developers of Python, and god. The itertools
module is the proper location, as determined by Guido and others in
python-dev.

As for map and other builtins being in plain view, even if they are not
iterators/generators, they are still useful.

- What about try: finally: over generators? PEP 288. Gee, 50 lines of
an iterator class instead of 5 with yield, just because I _have_ to
close a file...
If it takes you 50 lines to close a file after yielding it, you're doing
something wrong.

Checking the source code that you attached, you use yield once, which
doesn't seem like it is iterating over file handles or file names.

Post the code in a reply, and I'm sure someone will show you how to
clean it up.
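For the record, the restriction PEP 288 complains about was later lifted: PEP 342 (Python 2.5) made yield legal inside try/finally, so the short generator version works directly. A sketch:

```python
import os
import tempfile

def read_lines(path):
    # yield inside try/finally - legal since Python 2.5 (PEP 342).
    f = open(path)
    try:
        for line in f:
            yield line.rstrip('\n')
    finally:
        f.close()  # runs even if the caller abandons iteration early

# Demo with a throwaway temp file:
fd, path = tempfile.mkstemp()
os.write(fd, b'alpha\nbeta\n')
os.close(fd)
lines = list(read_lines(path))
os.remove(path)
print(lines)  # ['alpha', 'beta']
```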

- All builtin functions that do not add value to the language as a
language should move into a separate module (e.g. map, zip, sum, etc.,
versus callable, isinstance, the builtin type constructors) and not be
so over-emphasized.
map, zip, sum, etc. are used in a large amount of production code.
Moving them somewhere else would break every piece of code that uses
them. This is not going to happen soon, possibly ever.

If you don't like their location, create a module that modifies
__builtins__ that gets imported by site.py. Something like the
following would work...

#my_confused_builtins.py
import __builtin__

move_me = ['map', ...]
for name in move_me:
    globals()[name] = getattr(__builtin__, name)
    delattr(__builtin__, name)

#include the following line in site.py
import my_confused_builtins

- all interfaces (not types!) - e.g. containers (__get/set/delitem) -
should be named 'interfaces' - now this is vaguely called 'customization'.
I don't know why they initially decided on "customization" for a name,
perhaps it is because we are customizing the behavior of various
operations on new objects.

FYI, almost all of the __ops__ are listed in the documentation for the
operator module.
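As a concrete illustration of these informal "interfaces": a class that defines nothing but __len__ and __getitem__ already behaves as a container (hypothetical example; iteration falls back to the old __getitem__ protocol):

```python
class Squares(object):
    # Implements only the sequence 'interface': __len__ and __getitem__.
    def __init__(self, n):
        self.n = n
    def __len__(self):
        return self.n
    def __getitem__(self, i):
        if not 0 <= i < self.n:
            raise IndexError(i)
        return i * i

s = Squares(4)
print(list(s))  # [0, 1, 4, 9] - iteration via the __getitem__ protocol
print(4 in s)   # True - membership works the same way
```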

- How do I check if x is iterable?
try:
    iter(x)
except TypeError:
    pass  # x is not iterable
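Wrapped into a reusable predicate (a minimal sketch; the helper name is made up):

```python
def is_iterable(x):
    # iter() raises TypeError for non-iterables; that's the whole test.
    try:
        iter(x)
    except TypeError:
        return False
    return True

print(is_iterable([1, 2]))  # True
print(is_iterable('abc'))   # True - strings are iterable too
print(is_iterable(42))      # False
```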
- (i see this goes in python2.4, PEP289) [snip comments]
PEP 289 has been accepted, so your comments on it are late.

- IMO most funcs that now need a list or dict should be happy with a
proper iterable. (Eventually, funcs needing only an iterable over
single values could take *args instead, and for key-value tuples
**kwargs instead - but this isn't that important.)
Where any iterable makes sense (and not just a list), Python is moving
in that direction, and if all goes well, (I believe) that will be the
case in the future.

For objects that need a dictionary, it is usually because they use a
dictionary for its primary purpose: looking up and storing arbitrary
values. Certainly a dictionary can be built from any sequence or
iterable of pairs, but when people say dictionary, they usually mean
something with (approximately) O(1) access time for any element that
can be queried and written to multiple times (you can only traverse an
iterable once, and can't write to it at all).
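The dict constructor itself already illustrates the "accept any iterable" direction: it is happy with a lazy iterable of pairs, and sum() with any iterable of values:

```python
# dict() accepts any iterable of key/value pairs - no list required.
pairs = ((c, ord(c)) for c in 'abc')   # a generator, consumed once
d = dict(pairs)
print(d['a'], d['c'])  # 97 99

# sum() likewise takes any iterable:
print(sum(x * x for x in range(4)))  # 14
```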

- i would like to be able to control somehow what constructors are
available for example to eval() - so i can patch them (operator *) - so
things like [snip limitation argument]
Check the compiler module. When you have an AST, you can do all the
checks your heart desires.

There may be a project to create a restricted Python execution
environment for such tasks, but a better idea is to not allow eval()
or exec at all.
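A sketch of the whitelist idea, using the modern ast module (which replaced the old compiler package mentioned above). The node list below is illustrative only, not a vetted safe set:

```python
import ast

# Whitelist of AST node types we allow; everything else is rejected.
ALLOWED = (ast.Expression, ast.BinOp, ast.UnaryOp, ast.Constant,
           ast.Add, ast.Sub, ast.Mult, ast.Div, ast.USub)

def restricted_eval(expr):
    tree = ast.parse(expr, mode='eval')
    for node in ast.walk(tree):
        if not isinstance(node, ALLOWED):
            raise ValueError('disallowed node: %s' % type(node).__name__)
    return eval(compile(tree, '<expr>', 'eval'))

print(restricted_eval('2 * (3 + 4)'))  # 14
# restricted_eval('__import__("os")') raises ValueError (Call not allowed)
```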

- I want the list head/tail notation:
a, b, *c = somelisthere
Apart from all else, it helps future maintenance and readability of the
program, and this IS important if Python is to be used for serious
things in serious places. Ah, I might like a symmetrical thing for
dicts, but you will not like it:
'a':var1, 'b':var2, **rest = dictionaryhere
I doubt Python is going here. If you want head/tail of a list...

def head_tail_list(lst, count=1):
    return tuple(lst[:count]) + (lst[count:],)

a,b,c = head_tail_list(somelisthere, 2)
If you want the head/tail of a dict...

def head_tail_dict(dct, *items):
    return tuple([dct.pop(i) for i in items] + [dct])

var1, var2, rest = head_tail_dict(dictionaryhere, 'a', 'b')
Personally, I think that the syntax you give is ugly, and have had
little need to do anything close to what you describe. I've also not
seen anyone who uses lisp-like car/cdr operations in Python. One thing
you should remember is that Python lists are actually arrays. Pulling
an element off the front is very wasteful (computationally).

Also considering that the functional equivalent to what you want is a
1-liner in either case, I see little reason for any of this
functionality to be included with Python now or in the future.
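Worth knowing in hindsight: Python 3.0 did eventually add exactly the list form of this request via PEP 3132 (extended iterable unpacking); the dict form never landed. A sketch in Python 3 syntax:

```python
# PEP 3132 (Python 3.0) added starred-target unpacking:
a, b, *rest = [1, 2, 3, 4, 5]
print(a, b, rest)  # 1 2 [3, 4, 5]

# The star can go anywhere in the target list:
*init, last = 'spam eggs ham'.split()
print(init, last)  # ['spam', 'eggs'] ham
```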

[btw, anyone interested in very thin framework for UI-via-html-forms?
all is in python, even the form description, html etc.
it's 75%-done, with most things showable.]


Sounds neat. Perhaps you should create a sourceforge.net project for it.

- Josiah
Jul 18 '05 #2

svilen <sd*****@sistechnology.com> writes:
if i can help to do something of these or others, or you want ideas of
how/when/why something can be implemented for the python, call
... there's plenty of experience in a lot of languages and languages
in general, as well human-machine related things, even psychology. And
i DO want to support and make better the good thing what python is.


While your comments are interesting, you may not find a receptive
audience here because as a group we tend to be rather inflexible.

I would suggest taking your requirements over to comp.lang.lisp.

I think you will find a number of individuals over there who will be
more than willing to assist you with them.

--
KBK
Jul 18 '05 #3

Josiah Carlson wrote:
- all builtin functions that do not add value to the language as
language should move into separate module (e.g. map, zip, sum etc...
vs callable, isinstance, builtin type-constructors) and not be so
over-emphasized.


map, zip, sum, etc. are used in a large amount of production code.
Moving them somewhere else would break every piece of code that uses
them. This is not going to happen soon, possibly ever.


I'm a bit confused, anyway. When did functional programming become
outdated?

map() is easily as important as iterators and generators.

-- Chris.

Jul 18 '05 #4

Chris Gonnerman <ch*************@newcenturycomputers.net> writes:
map, zip, sum, etc. are used in a large amount of production
code. Moving them somewhere else would break every piece of code
that uses them. This is not going to happen soon, possibly ever.


I'm a bit confused, anyway. When did functional programming become
outdated?


Preferred style these days is to use list comprehensions instead of
map and zip.
Jul 18 '05 #5

In article <7x************@ruckus.brouhaha.com>,
Paul Rubin <http://ph****@NOSPAM.invalid> wrote:
Chris Gonnerman <ch*************@newcenturycomputers.net> writes:
> map, zip, sum, etc. are used in a large amount of production
> code. Moving them somewhere else would break every piece of code
> that uses them. This is not going to happen soon, possibly ever.


I'm a bit confused, anyway. When did functional programming become
outdated?


Preferred style these days is to use list comprehensions instead of
map and zip.


zip? Do you mean filter?

Regards. Mel.
Jul 18 '05 #6

In message <dVGXAls/KH*******@the-wire.com>, Mel Wilson wrote:
In article <7x************@ruckus.brouhaha.com>,
Paul Rubin <http://ph****@NOSPAM.invalid> wrote:
Preferred style these days is to use list comprehensions instead of
map and zip.


zip? Do you mean filter?


From 'Learning Python' by Mark Lutz and David Ascher:

<quote>

The built-in zip function allows us to use for loops to visit multiple
sequences in _parallel_. In basic operation, zip takes one or more
sequences, and returns a list of tuples that pair up parallel items taken
from its arguments. For example, suppose we're working with two lists:
L1 = [1,2,3,4]
L2 = [5,6,7,8]
To combine the items in these lists, we can use zip:
>>> zip(L1, L2)
[(1, 5), (2, 6), (3, 7), (4, 8)]

Such a result may be useful in other contexts. When wedded with the for
loop, though, it supports parallel iterations:
>>> for (x,y) in zip(L1, L2):
...     print x, y, '--', x+y
...
1 5 -- 6
2 6 -- 8
3 7 -- 10
4 8 -- 12

</quote>

It might be useful to think of the items in each list coming together like
the teeth of a zipper. But it's better than that: you're not constrained to
two sets of teeth:
>>> L3 = [9, 10, 11, 12]
>>> zip(L1, L2, L3)
[(1, 5, 9), (2, 6, 10), (3, 7, 11), (4, 8, 12)]

--
Garry Knight
ga*********@gmx.net ICQ 126351135
Linux registered user 182025
Jul 18 '05 #7

On Sun, 21 Mar 2004 00:07:50 +0000, Garry Knight <ga*********@gmx.net>
wrote:

It might be useful to think of the items in each list coming together like
the teeth of a zipper. But it's better than that: you're not constrained to
two sets of teeth:
L3 = [9, 10, 11, 12]
zip(L1,L2,L3)

[(1, 5, 9), (2, 6, 10), (3, 7, 11), (4, 8, 12)]


I have found zip quite useful.

But I think it is a case of Python being for consenting adults. If the
lists are of unequal length, zip pairs up only the first 'n' items of
each, where 'n' is the number of items in the shortest list. It assumes
that is your intention, and there is no warning that you have attempted
to zip lists of unequal length. When the lists come together from
different sections of the code, are built dynamically, and have corner
cases to worry about - well, something unexpected could slip through
unless there is explicit testing somewhere along the route.

map behaves differently, pairing None with the extra items of the
longer list when the lists are of unequal length. Which I think makes
it easier to catch errors.
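The difference is easy to demonstrate. Note that map's None-padding was a Python-2-only spelling (map(None, L1, L2)); the sketch below shows the same padding behavior via the itertools module instead:

```python
import itertools

L1 = [1, 2, 3, 4]
L2 = [5, 6]

# zip silently truncates to the shortest input:
print(list(zip(L1, L2)))  # [(1, 5), (2, 6)]

# The padding behaviour described above (Python 2's map(None, L1, L2))
# is spelled zip_longest in itertools:
print(list(itertools.zip_longest(L1, L2)))
# [(1, 5), (2, 6), (3, None), (4, None)]
```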

I guess I am at the stage of understanding some of the not-fully-obvious
"traps". Sometimes, even, before I fall into them.

Art
Jul 18 '05 #8
