Bytes IT Community

python shutting down sloooooooowly/tuning dictionaries

Is there a way to speed up shutting down Python from within a Python program?
Sometimes shutting down takes more than 10 times as long as the actual
run of the program.

The programs are fairly simple (searching/organizing large boardgame
databases) but use a lot of memory (1-6GB). The memory is mostly used
for simple structures like trees or relations. Typically there will be
a few large dictionaries and many small dictionaries/sets/arrays/lists.

Since the entire database is too large, I typically run the same
program 10-100 times on smaller parts. The annoying bit is that
finishing one instance before starting the next can take a long time.
I have started using a shell script that checks whether certain files
have been written and then kills the program from the OS to speed
things up. But this solution is far too ugly. There should be a better
way, but I don't seem to be able to find one.
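(One workaround, assuming the slow part is the interpreter freeing millions of objects at exit: `os._exit()` terminates the process immediately, skipping all of that cleanup. The sketch below is a minimal illustration, not a polished solution -- note that `os._exit()` does not flush buffers or run `atexit` handlers, so flush explicitly first.)

```python
import os
import sys

def finish_fast(status=0):
    """Exit the process without tearing down Python objects.

    Interpreter shutdown spends its time deallocating every live
    object (slow with multi-GB dictionaries and trees).  os._exit()
    bypasses that entirely.  It also skips buffer flushing and
    atexit handlers, so flush output streams by hand first.
    """
    sys.stdout.flush()
    sys.stderr.flush()
    os._exit(status)
```

Calling `finish_fast()` right after the result files are written should make the shell-script-and-kill hack unnecessary.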

A second question: is it possible to control how dictionaries are
resized? Resizing seems to always double the size, but for a
dictionary using more than half of the remaining memory this is a
problem. I would rather have a slightly over-full dictionary in
main memory than a sparse dictionary partially swapped out.

I would like to be able to restrict the maximum growth of each resize
operation to around 100MB or so.
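(As far as I know, CPython exposes no way to tune the dict growth factor. One workaround is to shard a huge mapping across many fixed sub-dictionaries, so each resize only ever doubles one shard -- a bounded fraction of the total. The `ShardedDict` class below is a hypothetical sketch of that idea, not a standard library facility.)

```python
class ShardedDict:
    """Split one huge mapping into many smaller dicts.

    Each shard resizes independently, so a single resize doubles
    only 1/nshards of the table instead of the whole thing.  With
    enough shards, the largest single growth step stays bounded.
    """

    def __init__(self, nshards=64):
        self._shards = [{} for _ in range(nshards)]

    def _shard(self, key):
        # Route each key to a shard by its hash.
        return self._shards[hash(key) % len(self._shards)]

    def __setitem__(self, key, value):
        self._shard(key)[key] = value

    def __getitem__(self, key):
        return self._shard(key)[key]

    def __contains__(self, key):
        return key in self._shard(key)

    def __len__(self):
        return sum(len(s) for s in self._shards)
```

For a ~6 GB table, something like 64-256 shards would keep each resize step in the ballpark of the 100 MB you mention.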

TIA.

- Till
Jul 18 '05 #1
