Bytes IT Community

Python not freeing memory (?)

I have a Bayesian simulation package that keeps running into memory
allocation problems. I have a feeling this has something to do with
Python (2.5.1.1) not freeing memory. The program essentially
iterates n times, each time proposing new arrays (numpy) of values
that are evaluated using statistical likelihoods. All variables in
the loop are re-assigned at each iteration, so there should not be
a leak. Nevertheless, at approximately the same iteration every
time, I run into malloc errors:

Iteration 36200 at 5647.58165097
Iteration 36300 at 5664.8412981
Iteration 36400 at 5681.71009493
Python(28344,0xa000d000) malloc: *** vm_allocate(size=8421376) failed (error code=3)
Python(28344,0xa000d000) malloc: *** error: can't allocate region
Python(28344,0xa000d000) malloc: *** set a breakpoint in szone_error to debug
Traceback (most recent call last):
  File "/Users/chris/EURING/detection/detection2.py", line 285, in <module>
    run()
  File "/Users/chris/EURING/detection/detection2.py", line 268, in run
    results = sampler.sample(iterations=iterations, burn=burn, thin=thin)
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/PyMC/MCMC.py", line 3021, in sample
    parameter.propose(debug)
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/PyMC/MCMC.py", line 768, in propose
    if not self._sampler.test():
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/PyMC/MCMC.py", line 2899, in test
    self()
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/PyMC/MCMC.py", line 2562, in __call__
    return self.model()
  File "/Users/chris/EURING/detection/detection2.py", line 238, in model
    self.pd = invlogit(self.beta0 + self.beta1 * self.wind + self.beta2 * (self.cloud>0) + beta3[self.series-1])
MemoryError

Is there *any* way of getting around this?
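For reference, this is the pattern I would expect to keep memory flat, since rebinding a name each iteration should free the previous array immediately via refcounting (a minimal sketch, with hypothetical array sizes, not the actual model code):

```python
import gc
import numpy as np

def loop(iterations=500):
    values = None
    for _ in range(iterations):
        # Rebinding 'values' drops the only reference to the
        # previous array, so CPython frees it right away.
        values = np.random.random(10000)
    return values

before = len(gc.get_objects())
loop()
after = len(gc.get_objects())
# 'after' should be close to 'before': no per-iteration growth.
```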

Aug 12 '07 #1
2 Replies


> Is there *any* way of getting around this?

Sure: Determine the bug, and fix it. A prerequisite
is that you have the source code of all extension modules
which you are using, but that seems to be the case.

If you want others to help you in finding the bug, you
need to provide more detail, e.g. a specific piece of
code that reproducibly wastes memory. If you want to
study how Python objects are allocated and released,
you need to create a debug build of Python (and all
extension modules), and start, e.g., with looking at the
value of sys.gettotalrefcount() over time.
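A sketch of such a monitor; note that sys.gettotalrefcount() exists only in a --with-pydebug build, so on a release build the garbage collector's tracked-object count is a rougher but still useful proxy:

```python
import gc
import sys

def refcount_snapshot():
    # sys.gettotalrefcount is only compiled into debug builds;
    # fall back to counting the objects the cycle collector tracks.
    if hasattr(sys, 'gettotalrefcount'):
        return sys.gettotalrefcount()
    return len(gc.get_objects())

# Log a snapshot every iteration; steady growth points at leaked objects.
baseline = refcount_snapshot()
```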

HTH,
Martin
Aug 12 '07 #2

P: n/a
Martin v. Löwis <martin <at> v.loewis.de> writes:
> If you want others to help you in finding the bug, you
> need to provide more detail, e.g. a specific piece of
> code that reproducibly wastes memory. If you want to
> study how Python objects are allocated and released,
> you need to create a debug build of Python (and all
> extension modules), and start, e.g., with looking at the
> value of sys.gettotalrefcount() over time.
I tried monitoring the refcount at every iteration, but it
does not change; at the same time, the memory use by the
python process increases. This is why I suspected that
python was not returning memory.
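To quantify that growth alongside the refcount, one thing I can log each iteration is the process's peak resident set size; a sketch using only the standard library (the resource module is Unix-only, which covers the Mac this runs on):

```python
import resource

def peak_rss():
    # ru_maxrss is the peak resident set size of this process:
    # reported in kilobytes on Linux, in bytes on Mac OS X.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

# Inside the sampling loop, e.g. every 100 iterations:
#     print(iteration, peak_rss())
# A steadily climbing value with a flat refcount suggests memory
# held outside ordinary Python objects (e.g. in an extension module).
```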

Below is the method that gets called iteratively; the *_like methods
are statistical likelihoods implemented in f2py. I don't see
anything that is obviously responsible:

def model(self):
    # Specification of joint log-posterior

    #alpha3 = concatenate(([0], self.alpha3))

    # Linear model for surfacing wait time
    #self.lamda = exp(self.alpha0 + self.alpha1 * self.wind + self.alpha2 * self.air + alpha3[self.series-1])

    gamma3 = concatenate(([0], self.gamma3))

    # Linear model for at-surface probability
    self.theta = invlogit(self.gamma0 + self.gamma1 * self.wind + self.gamma2 * self.air + gamma3[self.series-1])

    x, n, theta = transpose([[z[1], sum(z), t] for z, t in zip(self.downup, self.theta) if type(z)!=type(0.0)])

    # Binomial likelihood of available animals
    self.binomial_like(x, n, theta, name='theta')

    # Probability of availability (per survey)
    self.pa = 1.0 - (1 - self.theta)**10

    beta3 = concatenate(([0], self.beta3))

    # Linear model for probability of detection
    self.pd = invlogit(self.beta0 + self.beta1 * self.wind + self.beta2 * (self.cloud>0) + beta3[self.series-1])

    # Binomial likelihood of detection
    self.binomial_like(self.obs, self.present, self.pd * self.pa, name='pd')

    zeta1 = concatenate(([0], self.zeta1))

    # Probability of presence
    self.pp = invlogit(self.zeta0 + zeta1[self.series-1] + self.zeta2 * self.intake + self.zeta3 * (self.discharge - self.intake))

    # Binomial likelihood of presence
    self.binomial_like(self.present, self.absent + self.present, self.pp, name='pp')

    # Correct flight counts for detection
    self.N_flight = self.count / (self.pd * self.pa)

    # Aggregate counts by series
    N_series = [self.N_flight[self.series==i+1] for i in range(6)]

    # Log-normal likelihood for N
    #sum(self.lognormal_like(X, log(N), T) for X, N, T in zip(N_series, self.N, self.T))
    for N, mu in zip(N_series, self.N):
        self.poisson_like(N, mu)

Thanks in advance for anything that you might suggest.

cf

Aug 13 '07 #3

This discussion thread is closed.