Bytes IT Community

Long CLI-process consumes too much memory

hi,

I am using an adapted (CLI) version of the MediaWiki wikitext-to-HTML parser on several thousand articles. For performance reasons we don't want to start the PHP interpreter once per article, so we loop over the articles inside a single PHP script that calls the parser.

The problem is that some memory is never freed: for roughly every 100 articles, memory usage grows by about 10 MB. We recreate the Parser object in each iteration to avoid accumulating article data, but maybe there are still references to some or all of it. Is there a way to make sure that no references to $parser (or parts of it) remain, so that it really gets freed?
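A minimal sketch of the usual approach, assuming a simplified stand-in for the real loop (the `Parser` class and `$articles` list here are hypothetical placeholders, not the actual MediaWiki API): drop every variable that points at the object with `unset()`, and PHP's reference counting frees it as soon as the count hits zero. The caveat is reference *cycles* (e.g. a parser whose sub-objects point back at it): refcounting alone never frees those, and the cycle collector (`gc_collect_cycles()`) only exists from PHP 5.3 on, so on older interpreters cyclic structures leak until the process exits.

```php
<?php
// Hypothetical stand-in for the adapted MediaWiki parser class.
class Parser {
    public $output;
}

// Placeholder article list; in the real script this would be
// thousands of titles read from the wiki database or from files.
$articles = array('Article_1', 'Article_2', 'Article_3');

foreach ($articles as $title) {
    $parser = new Parser();
    $parser->output = 'parsed: ' . $title;  // stand-in for the real parse step

    // ... write $parser->output to disk here ...

    // Drop our reference. If nothing else points at the object and it
    // contains no reference cycles, refcounting frees it (and all of
    // its sub-objects whose counts also drop to zero) immediately.
    unset($parser);
}
```

Things that commonly keep a "discarded" parser alive despite the `unset()`: global caches or registries the parser registered itself in, static class properties, and values captured in long-lived arrays. Each of those holds its own reference, so the refcount never reaches zero.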

Can I recursively free memory of an object including all subobjects?

Finally, is there a way to debug memory usage (display by object)?
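On the last question: stock PHP has no built-in per-object memory breakdown, but `memory_get_usage()` lets you measure growth around each iteration and spot which step leaks; for a real per-allocation view you would need an external profiler such as Xdebug. A small sketch of the measuring technique (the ~100 KB string is just a placeholder allocation):

```php
<?php
$baseline = memory_get_usage();

// Placeholder for one loop iteration's work: allocate roughly 100 KB.
$data = str_repeat('x', 100000);

$grown = memory_get_usage() - $baseline;
printf("allocated roughly %d bytes\n", $grown);

// After releasing the data, usage should fall back near the baseline;
// if it does not, something is still holding a reference.
unset($data);
printf("still %d bytes above baseline\n", memory_get_usage() - $baseline);
```

In the article loop, printing `memory_get_usage()` once per iteration makes the leak rate visible directly: steady output means the parser is being freed, a monotonic climb means references survive the `unset()`.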

thanks!

--
Felix Natter

Feb 22 '07 #1

This discussion thread is closed; replies have been disabled.