Bytes IT Community

Help with large data set and fatal Python error

I'm running a process that uses Python Numeric arrays extensively and also does some calculations in C. All objects are created in Python or through the Numeric library and then passed to the C routines. I'm dealing with very large sets of data.
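"Fatal Python error: deallocating None" is typically caused by a C routine that returns `Py_None` without incrementing its reference count; each call then drains None's refcount until it reaches zero and the interpreter aborts. One way to confirm this from the Python side is to watch `sys.getrefcount(None)` across iterations of the main loop. A minimal sketch, standard library only (`process_chunk` is a hypothetical stand-in for the real Numeric/C routine):

```python
import sys

def none_refcount(label=""):
    """Report None's current reference count.

    If a C extension leaks references to None (e.g. it returns
    Py_None without calling Py_INCREF first), this number drops
    steadily each iteration until the interpreter aborts with
    "Fatal Python error: deallocating None".

    Note: on CPython 3.12+ None is immortal, so the count is a
    large constant there; on the older interpreters discussed in
    this thread it genuinely moves.
    """
    count = sys.getrefcount(None)
    print("%srefcount(None) = %d" % (label and label + ": ", count))
    return count

# Hypothetical main loop -- replace process_chunk with the real
# C routine being debugged and compare counts across iterations.
def process_chunk(i):
    return i * i

before = none_refcount("start")
for i in range(5):
    process_chunk(i)
after = none_refcount("end")
# A large, steady drop between `before` and `after` points at the C code.
```

A steady decrease of roughly one per loop iteration localizes the leak to whatever that iteration calls in C; the fix on the C side is to return with `Py_INCREF(Py_None); return Py_None;` rather than a bare `return Py_None;`.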

I'm getting the following exceptions in the early part of the program:

Exception exceptions.AssertionError: <exceptions.AssertionError instance at
0x02639C10> in <function remove at 0x025CF5B0> ignored

...and then at about the 1060th iteration of the main processing loop I get the following error:

Fatal Python error: deallocating None

1. Is there any way to catch this error and close my database connection so my uncommitted transactions are saved to the db?
2. When I profile the larger dataset, the .prof file does not get generated. Is there any way to incrementally build the .prof file?
3. Any idea what these errors mean, and how can I debug/catch them?
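On question 1: a fatal Python error aborts the interpreter outright, so no Python-level handler gets a chance to run. Ordinary exceptions can be caught, though, and committing periodically bounds how much work a crash can lose. A minimal sketch using `sqlite3` as a stand-in for whatever DB-API module is actually in use (`run_batch`, `rows`, and the `results` table are all hypothetical names):

```python
import sqlite3

def run_batch(conn, rows, commit_every=100):
    """Process rows, committing every `commit_every` inserts so an
    abort loses at most that many rows of work.

    The commit in `finally` covers ordinary exceptions on the way
    out; nothing in pure Python can intercept a true fatal error,
    so the periodic checkpoint commits are the real safety net.
    """
    cur = conn.cursor()
    try:
        for i, row in enumerate(rows):
            cur.execute("INSERT INTO results VALUES (?)", (row,))
            if (i + 1) % commit_every == 0:
                conn.commit()   # checkpoint: survives a later crash
    finally:
        conn.commit()           # flush the tail on normal/exception exit

# Stand-in usage with an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (value INTEGER)")
run_batch(conn, range(10), commit_every=3)
n = conn.execute("SELECT COUNT(*) FROM results").fetchone()[0]
```

The same checkpointing idea applies to question 2: rather than relying on the profiler's single dump at exit, calling `Profile.dump_stats(filename)` (available on `cProfile.Profile` in later Pythons) every N iterations leaves a usable .prof file on disk even if the process dies partway through.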


Jul 18 '05 #1