Bytes IT Community

Python memory usage

Hi All,

Just wondering when i run the following code;

for i in range(1000000):
    print i

the memory usage of Python spikes, and when the range(..) loop finishes
executing, the memory usage does not drop back down. Is there a way of
freeing the memory that range(..) allocated?

I found this document but the fix seems too complicated.

http://www.python.org/pycon/2005/pap...hon-memory.pdf

Cheers

Nov 8 '06 #1
13 Replies


On 7 Nov 2006 21:42:31 -0800, placid <Bu****@gmail.com> wrote:
Hi All,

Just wondering when i run the following code;

for i in range(1000000):
    print i
The problem with that is that all the memory is used by the list
returned by range(), which won't be freed until the for loop exits.

try this
>>> import itertools
>>> for i in itertools.count(1000000):
...     print i

that uses an iterator which I believe will bring down the memory usage
but will kill your CPU :)
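If the goal is just to iterate a fixed number of times without materializing a list, bounding the counter with itertools.islice keeps the lazy behaviour but still terminates (a sketch, not from the original reply):

```python
import itertools

# Bound the counter with islice: values are still produced lazily
# (no million-element list is built), but the loop terminates
for i in itertools.islice(itertools.count(), 1000000):
    pass  # replace with real work, e.g. print i
```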

Nov 8 '06 #2

On Tuesday 07 November 2006 22:42, placid wrote:
Hi All,

Just wondering when i run the following code;

for i in range(1000000):
    print i

the memory usage of Python spikes and when the range(..) block finishes
execution the memory usage does not drop down. Is there a way of
freeing this memory that range(..) allocated?

I found this document but the fix seems too complicated.

http://www.python.org/pycon/2005/pap...hon-memory.pdf

Cheers
Change range to xrange. It will run faster and use almost no memory by
comparison. I know the point you are getting at about releasing memory;
however, in this case there is no reason to allocate the memory to begin with.

Strangely enough, on my Python 2.4 install (Kubuntu Edgy), about half the
memory gets released as soon as the call finishes; however, if I run the
same call again, it only goes back up to the memory usage it was at before
the memory was released. So some of the memory is returned to the system
and some is reused by Python later.
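For reference, xrange achieves this by producing each number on demand instead of building the whole list up front; a minimal generator sketch of the same idea (a hypothetical helper, not xrange's actual C implementation):

```python
def lazy_range(n):
    # Minimal sketch of the idea behind xrange: yield 0..n-1 one at
    # a time instead of allocating an n-element list up front
    i = 0
    while i < n:
        yield i
        i += 1

for i in lazy_range(1000000):
    pass  # memory stays flat: only one integer is live at a time
```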
Nov 8 '06 #3


William Heymann wrote:
On Tuesday 07 November 2006 22:42, placid wrote:
Hi All,

Just wondering when i run the following code;

for i in range(1000000):
    print i

the memory usage of Python spikes and when the range(..) block finishes
execution the memory usage does not drop down. Is there a way of
freeing this memory that range(..) allocated?

I found this document but the fix seems too complicated.

http://www.python.org/pycon/2005/pap...hon-memory.pdf

Cheers

Change range to xrange. It will run faster and use almost no memory by
comparison. I know the point you are getting at about releasing memory;
however, in this case there is no reason to allocate the memory to begin with.
Thanks for that; it has fixed some of the memory problems. Just
wondering: if I continuously create different BeautifulSoup objects
within an xrange() loop, when does the memory get released for each
object? After the xrange() loop finishes, or on the next iteration of
the loop?

Cheers
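In CPython, an object is freed as soon as its last reference goes away, so in a loop like that the previous object is released when the loop variable is rebound on the next iteration (assuming nothing else still references it). A small sketch, with a hypothetical Tracker class standing in for a BeautifulSoup object:

```python
class Tracker(object):
    # Hypothetical stand-in for a BeautifulSoup object; __del__
    # runs when the last reference to an instance disappears
    freed = 0

    def __del__(self):
        Tracker.freed += 1

for i in range(3):
    obj = Tracker()  # rebinding 'obj' drops the previous instance

obj = None  # release the final instance as well
print(Tracker.freed)  # 3 in CPython: each instance freed once unreachable
```

The same applies whether the loop uses range() or xrange(); the loop construct itself doesn't keep the objects alive.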

Nov 8 '06 #4

placid wrote:
Hi All,

Just wondering when i run the following code;

for i in range(1000000):
    print i

the memory usage of Python spikes and when the range(..) block finishes
execution the memory usage does not drop down. Is there a way of
freeing this memory that range(..) allocated?
Python maintains a freelist for integers which is never freed (I don't
believe this has changed in 2.5). Normally this isn't an issue since
the number of distinct integers in simultaneous use is small (assuming
you aren't executing the above snippet).
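Related to this, CPython also keeps a cache of small integers (roughly -5 through 256) that are shared rather than freshly allocated; a quick check of both behaviours (CPython-specific, not guaranteed by the language):

```python
# int() calls avoid compile-time constant folding; in CPython both
# calls return the same cached object for small values
a = int('256')
b = int('256')
print(a is b)  # True in CPython: small integers are shared

# Larger integers come from the freelist-backed allocator and are
# generally distinct objects
c = int('100000')
d = int('100000')
print(c is d)
```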

-Mike

Nov 8 '06 #5


Klaas wrote:
placid wrote:
Hi All,

Just wondering when i run the following code;

for i in range(1000000):
    print i

the memory usage of Python spikes and when the range(..) block finishes
execution the memory usage does not drop down. Is there a way of
freeing this memory that range(..) allocated?

Python maintains a freelist for integers which is never freed (I don't
believe this has changed in 2.5). Normally this isn't an issue since
the number of distinct integers in simultaneous use is small (assuming
you aren't executing the above snippet).
Actually, I am executing that code snippet, and creating BeautifulSoup
objects in the range() (now xrange()) loop.

Cheers

Nov 8 '06 #6


placid wrote:
Actually i am executing that code snippet and creating BeautifulSoup
objects in the range() (now xrange() ) code block.
Right; I was referring specifically to abominations like
range(1000000), not looping over an incrementing integer.

-Mike

Nov 9 '06 #7

(hello group)

On Nov 9, 8:38 pm, "Klaas" <mike.kl...@gmail.com> wrote:
I was referring specifically to abominations like range(1000000)
However, there are plenty of valid reasons to allocate huge lists of
integers. This issue has been worked on:
http://evanjones.ca/python-memory.html
http://evanjones.ca/python-memory-part3.html

My understanding is that the patch allows most objects to be released
back to the OS, but can't help the problem for integers. I could be
mistaken. But on a clean Python 2.5:

x=range(10000000)
x=None

The problem exists for floats too, so for a less contrived example:

x=[random.weibullvariate(7.0,2.0) for i in xrange(10000000)]
x=None

Both leave the Python process bloated in my environment. Is this
problem a good candidate for the FAQ?
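If the full list of floats isn't actually needed afterwards, a generator expression sidesteps the bloat by never materializing the list; a sketch reusing the weibullvariate call from the example above:

```python
import random

# sum() consumes the values one at a time, so no million-element
# list of floats is ever materialized (use xrange here on Python 2)
total = sum(random.weibullvariate(7.0, 2.0) for i in range(1000000))
print(total > 0)  # True: Weibull variates are positive
```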

--Joseph

Nov 13 '06 #8

velotron wrote:
x=range(10000000)
x=None

The problem exists for floats too, so for a less contrived example:

x=[random.weibullvariate(7.0,2.0) for i in xrange(10000000)]
x=None

Both leave the Python process bloated in my environment. Is this
problem a good candidate for the FAQ?
http://effbot.org/pyfaq/why-doesnt-p...a-large-object

</F>

Nov 13 '06 #9

On Mon, 13 Nov 2006 20:46:58 +0100,
Fredrik Lundh <fr*****@pythonware.com> wrote:

http://effbot.org/pyfaq/why-doesnt-p...a-large-object

</F>
Is it still true with Python 2.5?

I mean, [http://evanjones.ca/python-memory.html] should fix this
behaviour, shouldn't it?

Jonathan
Nov 13 '06 #10

Jonathan Ballet wrote:
>http://effbot.org/pyfaq/why-doesnt-p...a-large-object

Is it still true with Python 2.5?

I mean, [http://evanjones.ca/python-memory.html] should fix this
behaviour, shouldn't it?
not really -- that change just means that Python's object allocator will
return memory chunks to the C allocator if the chunks become empty, but
as the FAQ entry says, there are no guarantees that anything will be
returned at all. it all depends on your application's memory allocation
patterns.

(did you read Evan's presentation material, btw?)

</F>

Nov 13 '06 #11

velotron wrote:
On Nov 9, 8:38 pm, "Klaas" <mike.kl...@gmail.com> wrote:
I was referring specifically to abominations like range(1000000)

However, there are plenty of valid reasons to allocate huge lists of
integers.
I'm sure there are some; I doubt there are plenty. Care to name a few?
This issue has been worked on:
http://evanjones.ca/python-memory.html
http://evanjones.ca/python-memory-part3.html

My understanding is that the patch allows most objects to be released
back to the OS, but can't help the problem for integers. I could be
mistaken.
Integers use their own allocator and as such aren't affected by Evan's
patch.
But on a clean Python 2.5:

x=range(10000000)
x=None

The problem exists for floats too, so for a less contrived example:

x=[random.weibullvariate(7.0,2.0) for i in xrange(10000000)]
x=None

Both leave the Python process bloated in my environment. Is this
problem a good candidate for the FAQ?
I think floats use obmalloc, so I'm slightly surprised you don't see
differences. I know that Evan's patch imposes conditions on freeing
obmalloc arenas, so you could be seeing effects of that.

-Mike

Nov 13 '06 #12

Klaas wrote:
I think floats use obmalloc so I'm slightly surprised you don't see
differences.
as noted in the FAQ I just posted a link to, floats also use a free list
(using pretty much identical code to that used for integers).

see comments in Objects/intobject.c (quoted below) and
Objects/floatobject.c for details.

</F>

/* Integers are quite normal objects, to make object handling uniform.
(Using odd pointers to represent integers would save much space
but require extra checks for this special case throughout the code.)
Since a typical Python program spends much of its time allocating
and deallocating integers, these operations should be very fast.
Therefore we use a dedicated allocation scheme with a much lower
overhead (in space and time) than straight malloc(): a simple
dedicated free list, filled when necessary with memory from malloc().

block_list is a singly-linked list of all PyIntBlocks ever allocated,
linked via their next members. PyIntBlocks are never returned to the
system before shutdown (PyInt_Fini).

free_list is a singly-linked list of available PyIntObjects, linked
via abuse of their ob_type members.
*/

#define BLOCK_SIZE 1000 /* 1K less typical malloc overhead */
#define BHEAD_SIZE 8 /* Enough for a 64-bit pointer */

Nov 13 '06 #13

On Mon, 13 Nov 2006 21:30:35 +0100,
Fredrik Lundh <fr*****@pythonware.com> wrote:
Jonathan Ballet wrote:
http://effbot.org/pyfaq/why-doesnt-p...a-large-object
Is it still true with Python 2.5 ?

I mean, [http://evanjones.ca/python-memory.html] should fix this
behaviour, doesn't it ?
not really -- that change just means that Python's object allocator
will return memory chunks to the C allocator if the chunks become
empty, but as the FAQ entry says, there are no guarantees that
anything will be returned at all. it all depends on your
application's memory allocation patterns.
Ah, OK, I thought memory was more "directly freed" to the system (but I
misread the FAQ).

Are there any documents on good application memory allocation
patterns?

(did you read Evan's presentation material, btw?)
I re-read it (thanks for mentioning it), but the diagrams on pages 7, 8
and 9 are not very clear to me.

(Snipped some questions, since I found the answers while writing them.)

Where are the pools which are not completely free stored? (Not in the
'usedpools' nor the 'freepools' objects.) (Ah, this is the
partially_allocated_arenas list, I guess :) )
Are 'usedpools' and 'freepools' arenas?

Small objects are stored in free blocks, so a block has a length of 256
bytes, I guess. Can its size change (smaller, maybe larger)?
So, if I am correct:
- I create a new 'small' object -> Python allocates a new arena, and
stores my object in a free block in one of the pools of this arena
- this arena is stored in the partially_allocated_arenas list
(Python 2.5)
- allocating more objects fills all the blocks of every pool of the
arena -> all pools go one by one into the 'usedpools' object
- del-eting every created object in my Python program frees every
block of every pool of the arena (with a lot of luck :) ), and the pools
go into the 'freepools' object
- if all pools from an arena are freed, the arena is freed
- else, the arena stays allocated, in order to be re-used
Thanks,
Jonathan
Nov 13 '06 #14
