Bytes IT Community

pyMPI memory leak

Hi,
I have been using pyMPI to parallelize my code and found that the
function mpi.send() leaks memory badly, so it is not really usable
for communicating large amounts of data. It actually fails once the
leak accumulates to more than 2 GB. I wonder whether others have had
the same experience or whether I did something wrong. I compiled
Python 2.4, MPICH 1.2.6, and pyMPI 2.1b4 on an Opteron cluster
running Rocks 3.3.
Here is a small test script, run on 2 CPUs, that demonstrates the
memory leak:

import mpi

n = 10000
i = 0
data = [0] * 40000
while i < n:
    if mpi.rank == 1:
        mpi.send(data, 0)
    elif mpi.rank == 0:
        msg, status = mpi.recv()
    i += 1  # originally "n += 1", which would never terminate
If one watches the memory usage with 'top', one can see that one
process uses a small, constant amount of memory (the receiver,
rank 0), while the other process (the sender, rank 1) uses more and
more.
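For anyone reproducing this, instead of watching 'top' by hand, a small helper can log each rank's resident memory from inside the loop. This is a sketch, not part of pyMPI: it is Linux-only (it reads /proc/self/status) and the function name rss_kb is my own.

```python
def rss_kb():
    """Return this process's resident set size in kB (Linux only),
    parsed from the VmRSS line of /proc/self/status."""
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                # line looks like: "VmRSS:    1234 kB"
                return int(line.split()[1])
    return 0

# Usage sketch inside the test loop above, e.g. every 500 iterations:
#     if i % 500 == 0:
#         print mpi.rank, i, rss_kb()   # Python 2.4 print statement
```

On the sender (rank 1) the logged VmRSS value should climb steadily if the leak is real, while the receiver's value stays flat.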

Jul 18 '05 #1
