Bytes IT Community

__delitem__ affecting performance

Hi,

I was performing some timing tests on a class that inherits from the
built-in list, and got some curious results:

import timeit

class MyList(list):
    def __init__(self):
        list.__init__(self)
        self[:] = [0, 0, 0]

    def __delitem__(self, index):
        print 'deleting'

ml = MyList()

def test():
    global ml
    ml[0] += 0
    ml[1] += 0
    ml[2] += 0

t = timeit.Timer("test()", "from __main__ import test")
print t.timeit()

>4.11651382676
import timeit

class MyList(list):
    def __init__(self):
        list.__init__(self)
        self[:] = [0, 0, 0]

ml = MyList()

def test():
    global ml
    ml[0] += 0
    ml[1] += 0
    ml[2] += 0

t = timeit.Timer("test()", "from __main__ import test")
print t.timeit()

>2.23268591383
Does anybody know why defining __delitem__ is causing the code to run
slower? It is not being called, so I don't see why it would affect
performance. Overriding other sequence operators like __delslice__ does
not exhibit this behavior.
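A quick way to confirm that the override really is never invoked by the benchmark statements is to count calls instead of printing. This is a minimal Python 3 sketch (the thread used Python 2.4, so `print` becomes a function); the class name `Tracker` is illustrative:

```python
class Tracker(list):
    calls = 0  # counts __delitem__ invocations

    def __delitem__(self, index):
        Tracker.calls += 1  # count instead of printing 'deleting'

ml = Tracker([0, 0, 0])
# The augmented assignments only use __getitem__ and __setitem__:
ml[0] += 0
ml[1] += 0
ml[2] += 0
print(Tracker.calls)  # → 0: the override is never called
```

So the slowdown cannot come from the method body itself; it must come from how the method's mere presence changes attribute dispatch.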

The speed difference doesn't really bother me, but I am curious.

I used Python 2.4 for this test.

-Karl
Oct 19 '06 #1
1 Reply


Karl H. wrote:

> Does anybody know why defining __delitem__ is causing the code to run
> slower? It is not being called, so I don't see why it would affect
> performance.

Probably because overriding portions of the internal sequence slot API
(tp_as_sequence) means that Python needs to do full dispatch for all
members of that API, instead of keeping things at the C level.
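This explanation can still be observed in modern CPython: item assignment and item deletion share one type slot, so defining `__delitem__` forces `ml[i] += 0` (which performs a get and a set) through Python-level `__setitem__` lookup instead of the direct C path. A minimal Python 3 sketch, with illustrative class names `Plain` and `WithDel`; absolute numbers will differ from the 2.4 figures above, but `WithDel` typically comes out slower:

```python
import timeit

class Plain(list):
    """list subclass with no sequence-slot overrides."""
    pass

class WithDel(list):
    """Defining __delitem__ replaces the shared item-assignment
    slot with a Python-level dispatcher, slowing __setitem__ too."""
    def __delitem__(self, index):
        pass  # never called by the benchmark below

setup = "ml = cls([0, 0, 0])"
stmt = "ml[0] += 0; ml[1] += 0; ml[2] += 0"

for cls in (Plain, WithDel):
    t = timeit.timeit(stmt, setup=setup,
                      globals={'cls': cls}, number=100_000)
    print(cls.__name__, round(t, 4))
```

Functionally the two classes behave identically for gets and sets; only the dispatch route, and hence the timing, changes.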

</F>

Oct 19 '06 #2
