Hi!

I would like to create a pretty big list of lists: a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths
(so no arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but it's not soooo fast. Is there a way to do this without looping?

David.
Dr Mephesto <dn****@googlemail.com> writes:
> Hi!
> I would like to create a pretty big list of lists: a list 3,000,000
> long, each entry containing 5 empty lists. My application will append
> data to each of the 5 sublists, so they will be of varying lengths
> (so no arrays!).
> Does anyone know the most efficient way to do this? I have tried:
> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
> but it's not soooo fast. Is there a way to do this without looping?
You can do:
[[[],[],[],[],[]]] * 3000000
although I don't know if it performs any better than what you already
have.
Paul Rudin wrote:
> Dr Mephesto <dn****@googlemail.com> writes:
>> Hi!
>> I would like to create a pretty big list of lists: a list 3,000,000
>> long, each entry containing 5 empty lists. My application will append
>> data to each of the 5 sublists, so they will be of varying lengths
>> (so no arrays!).
>> Does anyone know the most efficient way to do this? I have tried:
>> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
>> but it's not soooo fast. Is there a way to do this without looping?
>
> You can do:
>
> [[[],[],[],[],[]]] * 3000000
>
> although I don't know if it performs any better than what you already
> have.

You are aware that this is hugely different, because the nested lists are
references, not new instances? Thus the outcome is most probably (given the
gazillion of times people stumbled over this) not the desired one...

Diez
Paul Rudin wrote:
> Dr Mephesto <dn****@googlemail.com> writes:
>> I would like to create a pretty big list of lists: a list 3,000,000
>> long, each entry containing 5 empty lists. My application will append
>> data to each of the 5 sublists, so they will be of varying lengths
>> (so no arrays!).
>> Does anyone know the most efficient way to do this? I have tried:
>> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
>> but it's not soooo fast. Is there a way to do this without looping?
>
> You can do:
>
> [[[],[],[],[],[]]] * 3000000
>
> although I don't know if it performs any better than what you already
> have.

Actually, that produces a list of 3000000 references to the same
5-element list. A reduced example:

>>> lst = [[[],[],[],[],[]]] * 3
>>> lst[1][1].append(42)
>>> print lst
[[[], [42], [], [], []], [[], [42], [], [], []], [[], [42], [], [], []]]

--
--Bryan
"Diez B. Roggisch" <de***@nospam.web.dewrites:
Paul Rudin wrote:
>Dr Mephesto <dn****@googlemail.comwrites:
>>Hi!
I would like to create a pretty big list of lists; a list 3,000,000 long, each entry containing 5 empty lists. My application will append data each of the 5 sublists, so they will be of varying lengths (so no arrays!).
Does anyone know the most efficient way to do this? I have tried:
list = [[[],[],[],[],[]] for _ in xrange(3000000)]
but its not soooo fast. Is there a way to do this without looping?
You can do:
[[[],[],[],[],[]]] * 3000000
although I don't know if it performs any better than what you already have.
You are aware that this is hugely different, because the nested lists are
references, not new instances? Thus the outcome is most probably (given the
gazillion of times people stumbled over this) not the desired one...
Err, yes sorry. I should try to avoid posting before having coffee in
the mornings. | | |
yep, that's why I'm asking :)

On Sep 5, 12:22 pm, "Diez B. Roggisch" <de...@nospam.web.de> wrote:
> Paul Rudin wrote:
>> Dr Mephesto <dnh...@googlemail.com> writes:
>>> Hi!
>>> I would like to create a pretty big list of lists: a list 3,000,000
>>> long, each entry containing 5 empty lists. My application will append
>>> data to each of the 5 sublists, so they will be of varying lengths
>>> (so no arrays!).
>>> Does anyone know the most efficient way to do this? I have tried:
>>> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
>>> but it's not soooo fast. Is there a way to do this without looping?
>>
>> You can do:
>>
>> [[[],[],[],[],[]]] * 3000000
>>
>> although I don't know if it performs any better than what you already
>> have.
>
> You are aware that this is hugely different, because the nested lists are
> references, not new instances? Thus the outcome is most probably (given the
> gazillion of times people stumbled over this) not the desired one...
>
> Diez
In article <11*********************@k79g2000hse.googlegroups.com>,
Dr Mephesto <dn****@googlemail.com> wrote:
> I would like to create a pretty big list of lists: a list 3,000,000
> long, each entry containing 5 empty lists. My application will append
> data to each of the 5 sublists, so they will be of varying lengths
> (so no arrays!).

Why do you want to pre-create this? Why not just create the big list and
sublists as you append data to the sublists?
--
Aahz (aa**@pythoncraft.com)  <*>  http://www.pythoncraft.com/

"Many customs in this life persist because they ease friction and promote
productivity as a result of universal agreement, and whether they are
precisely the optimal choices is much less important." --Henry Spencer
http://www.lysator.liu.se/c/ten-commandments.html
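
[A minimal sketch of that create-on-demand idea; the data/new_record
names are illustrative only, not something from the thread:]

data = []

def new_record():
    # grow the outer list only when a record is actually needed,
    # instead of pre-allocating 3,000,000 entries up front
    rec = ([], [], [], [], [])
    data.append(rec)
    return rec

rec = new_record()
rec[0].append('first value')    # each sublist still grows independently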
On Sep 5, 7:50 pm, Dr Mephesto <dnh...@googlemail.com> wrote:
> Hi!
> I would like to create a pretty big list of lists: a list 3,000,000
> long, each entry containing 5 empty lists. My application will append
> data to each of the 5 sublists, so they will be of varying lengths
> (so no arrays!).

Will each and every one of the 3,000,000 slots be used? If not, you may
be much better off storage-wise if you used a dictionary instead of a
list, at the cost of slower access.

Cheers,
John
Dr Mephesto <dn****@googlemail.com> writes:
> I would like to create a pretty big list of lists: a list 3,000,000
> long, each entry containing 5 empty lists. My application will
> append data to each of the 5 sublists, so they will be of varying
> lengths (so no arrays!).
> Does anyone know the most efficient way to do this? I have tried:
> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
You might want to use a tuple as the container for the lower-level
lists -- it's more compact and costs less allocation-wise.
But the real problem is not list allocation vs tuple allocation, nor
is it looping in Python; surprisingly, it's the GC. Notice this:
$ python
Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
[GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import time
>>> t0 = time.time(); l = [([],[],[],[],[]) for _ in xrange(3000000)]; t1 = time.time()
>>> t1 - t0
143.89971613883972
Now, with the GC disabled:
$ python
Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
[GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import gc
>>> gc.disable()
>>> import time
>>> t0 = time.time(); l = [([],[],[],[],[]) for _ in xrange(3000000)]; t1 = time.time()
>>> t1 - t0
2.9048631191253662
The speed difference is staggering, almost 50-fold. I suspect GC
degrades the (amortized) linear-time list building into quadratic
time. Since you allocate all the small lists, the GC gets invoked
every 700 or so allocations, and has to visit more and more objects in
each pass. I'm not sure if this can be fixed (shouldn't the
generational GC only have to visit the freshly created objects rather
than all of them?), but it has been noticed on this group before.
If you're building large data structures and don't need to reclaim
cyclical references, I suggest turning GC off, at least during
construction.
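
[For concreteness, a minimal sketch of that advice; the try/finally is
an addition so that collection is re-enabled even if construction fails,
and the variable names are just illustrative:]

import gc

gc.disable()                 # suspend cyclic GC during bulk construction
try:
    data = [([], [], [], [], []) for _ in xrange(3000000)]
finally:
    gc.enable()              # restore normal collection afterwards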
On 6 Sep., 01:34, "Delaney, Timothy (Tim)" <tdela...@avaya.com> wrote:
> Hrvoje Niksic wrote:
>> Dr Mephesto <dnh...@googlemail.com> writes:
>>> I would like to create a pretty big list of lists: a list 3,000,000
>>> long, each entry containing 5 empty lists. My application will
>>> append data to each of the 5 sublists, so they will be of varying
>>> lengths (so no arrays!).
>>> Does anyone know the most efficient way to do this? I have tried:
>>> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
>>
>> If you're building large data structures and don't need to reclaim
>> cyclical references, I suggest turning GC off, at least during
>> construction.
>
> This is good advice, but another question is whether you really want
> such a list. You may well be better off with a database of some kind -
> they're designed for manipulating large amounts of data.
>
> Tim Delaney

I need some real speed! A database is waaay too slow for the algorithm
I'm using, and because the sublists are of varying size, I don't think
I can use an array...
On Sep 6, 12:47 am, Dr Mephesto <dnh...@googlemail.com> wrote:
> I need some real speed! A database is waaay too slow for the algorithm
> I'm using, and because the sublists are of varying size, I don't think
> I can use an array...

How about a defaultdict approach?

from collections import defaultdict

dataArray = defaultdict(lambda: [[], [], [], [], []])
dataArray[1001][3].append('x')
dataArray[42000][2].append('y')
for k in sorted(dataArray.keys()):
    print "%6d : %s" % (k, dataArray[k])

prints:

  1001 : [[], [], [], ['x'], []]
 42000 : [[], [], ['y'], [], []]

-- Paul
Dr Mephesto <dn****@googlemail.com> writes:
> I need some real speed!

Is the speed with the GC turned off sufficient for your usage?
On 6 Sep., 09:30, Paul McGuire <pt...@austin.rr.com> wrote:
> On Sep 6, 12:47 am, Dr Mephesto <dnh...@googlemail.com> wrote:
>> I need some real speed! A database is waaay too slow for the algorithm
>> I'm using, and because the sublists are of varying size, I don't think
>> I can use an array...
>
> How about a defaultdict approach?
>
> from collections import defaultdict
>
> dataArray = defaultdict(lambda: [[], [], [], [], []])
> dataArray[1001][3].append('x')
> dataArray[42000][2].append('y')
> for k in sorted(dataArray.keys()):
>     print "%6d : %s" % (k, dataArray[k])
>
> prints:
>
>   1001 : [[], [], [], ['x'], []]
>  42000 : [[], [], ['y'], [], []]
>
> -- Paul

hey, that defaultdict thing looks pretty cool...

what's the overhead like for using a dictionary in python?

dave
On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <dn****@googlemail.com>
wrote:
> hey, that defaultdict thing looks pretty cool...
> what's the overhead like for using a dictionary in python?

Dictionaries are heavily optimized in Python. Access time is O(1), and
adding/removing elements is amortized O(1) (that is, constant time unless
it has to grow/shrink some internal structures.)
--
Gabriel Genellina
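
[A rough way to see that constant-time behaviour in practice; a sketch
only, since absolute timings are machine-dependent and the dict sizes
here are arbitrary:]

import timeit

# average lookup time should stay roughly flat as the dict grows
for n in (10**3, 10**5, 10**6):
    t = timeit.Timer("d[k]",
                     "d = dict.fromkeys(xrange(%d)); k = %d" % (n, n // 2))
    print n, min(t.repeat(3, 1000000))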
On Sep 8, 3:33 am, "Gabriel Genellina" <gagsl-...@yahoo.com.ar> wrote:
> On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <dnh...@googlemail.com>
> wrote:
>> hey, that defaultdict thing looks pretty cool...
>> what's the overhead like for using a dictionary in python?
>
> Dictionaries are heavily optimized in Python. Access time is O(1), and
> adding/removing elements is amortized O(1) (that is, constant time unless
> it has to grow/shrink some internal structures.)
> --
> Gabriel Genellina

well, I want to (maybe) have a dictionary where the value is a list of
5 lists. And I want to add a LOT of data to these lists: tens of
millions of pieces of data. Will this be a big problem? I can just try
it out in practice on Monday too :)

thanks
Dr Mephesto wrote:
> On Sep 8, 3:33 am, "Gabriel Genellina" <gagsl-...@yahoo.com.ar> wrote:
>> On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <dnh...@googlemail.com>
>> wrote:
>>> hey, that defaultdict thing looks pretty cool...
>>> what's the overhead like for using a dictionary in python?
>>
>> Dictionaries are heavily optimized in Python. Access time is O(1), and
>> adding/removing elements is amortized O(1) (that is, constant time unless
>> it has to grow/shrink some internal structures.)
>> --
>> Gabriel Genellina
>
> well, I want to (maybe) have a dictionary where the value is a list of
> 5 lists. And I want to add a LOT of data to these lists: tens of
> millions of pieces of data. Will this be a big problem? I can just try
> it out in practice on Monday too :)
>
> thanks

from collections import defaultdict
myDict = defaultdict(lambda: [[], [], [], [], []])  # setup assumed from the earlier post
someKey = 1001

targetList = myDict[someKey]        # This takes normal dict access time
for j in xrange(5):
    for i in xrange(50000000):      # Add a LOT of data to targetList
        targetList[j].append(i)     # This takes normal list append time
Dr Mephesto <dn****@googlemail.com> writes:
> well, I want to (maybe) have a dictionary where the value is a list of
> 5 lists. And I want to add a LOT of data to these lists: tens of
> millions of pieces of data. Will this be a big problem? I can just try
> it out in practice on Monday too :)

Yes, that may be a problem, both because of the amount of memory
required and because of how the GC works. You may want to turn off
the GC while building these lists. Otherwise, think of some other
strategy, like files on disk.
Dr Mephesto wrote:
> Hi!
> I would like to create a pretty big list of lists: a list 3,000,000
> long, each entry containing 5 empty lists. My application will append
> data to each of the 5 sublists, so they will be of varying lengths
> (so no arrays!).
> Does anyone know the most efficient way to do this?

Hem... Did you consider the fact that RAM is not an unlimited resource?

Let's do some simple math (please someone correct me if I'm going off
the road): if a Python (empty) list object requires 256 bytes (if I
refer to some old post by GvR, it's probably more - 384 bytes at least.
Some Python guru around?), you'd need (1 + (3000000 * 5)) * 256 bytes
just to build this list of lists. Which would make something around
3.8 GB. Not counting all other needed memory...

FWIW, run the following code:

# eatallramthenswap.py
d = {}
for i in xrange(3000000):
    d[i] = ([], [], [], [], [])

And monitor what happens with top...
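
[One way to sanity-check such per-object estimates is sys.getsizeof;
note this assumes Python 2.6 or later, since it did not exist in the
2.5 interpreter discussed above:]

import sys

# getsizeof reports only the object's own footprint, not anything it
# references, so these are lower bounds on the real cost
print sys.getsizeof([])                     # one empty list
print sys.getsizeof(([], [], [], [], []))   # the containing 5-tuple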
On Sep 8, 8:06 pm, Bruno Desthuilliers
<bdesth.quelquech...@free.quelquepart.fr> wrote:
> Dr Mephesto wrote:
>> Hi!
>> I would like to create a pretty big list of lists: a list 3,000,000
>> long, each entry containing 5 empty lists. My application will append
>> data to each of the 5 sublists, so they will be of varying lengths
>> (so no arrays!).
>> Does anyone know the most efficient way to do this?
>
> Hem... Did you consider the fact that RAM is not an unlimited resource?
>
> Let's do some simple math (please someone correct me if I'm going off
> the road): if a Python (empty) list object requires 256 bytes (if I
> refer to some old post by GvR, it's probably more - 384 bytes at least.
> Some Python guru around?), you'd need (1 + (3000000 * 5)) * 256 bytes
> just to build this list of lists. Which would make something around
> 3.8 GB. Not counting all other needed memory...
>
> FWIW, run the following code:
>
> # eatallramthenswap.py
> d = {}
> for i in xrange(3000000):
>     d[i] = ([], [], [], [], [])
>
> And monitor what happens with top...

Unused RAM is wasted RAM :)

I tried using MySQL, and it was too slow. And I have 4 GB anyway...
Dr Mephesto wrote:
> On Sep 8, 8:06 pm, Bruno Desthuilliers
> <bdesth.quelquech...@free.quelquepart.fr> wrote:
>> Dr Mephesto wrote:
>>> Hi!
>>> I would like to create a pretty big list of lists: a list 3,000,000
>>> long, each entry containing 5 empty lists. My application will append
>>> data to each of the 5 sublists, so they will be of varying lengths
>>> (so no arrays!).
>>> Does anyone know the most efficient way to do this?
>>
>> Hem... Did you consider the fact that RAM is not an unlimited resource?
>>
>> Let's do some simple math (please someone correct me if I'm going off
>> the road): if a Python (empty) list object requires 256 bytes (if I
>> refer to some old post by GvR, it's probably more - 384 bytes at least.
>> Some Python guru around?), you'd need (1 + (3000000 * 5)) * 256 bytes
>> just to build this list of lists. Which would make something around
>> 3.8 GB. Not counting all other needed memory...
>>
>> FWIW, run the following code:
>>
>> # eatallramthenswap.py
>> d = {}
>> for i in xrange(3000000):
>>     d[i] = ([], [], [], [], [])
>>
>> And monitor what happens with top...
>
> Unused RAM is wasted RAM :)

Indeed. But when your app eats all RAM and swap and brings the system
down, users are usually a bit unhappy !-)

> I tried using MySQL, and it was too slow.

Possibly.

> And I have 4 GB anyway...

*You* have 4 GB. Yes, fine. But:

1/ Please take time to re-read my post - the 3.8 GB figure is based on a
very optimistic estimate (256 bytes) of the size of an empty list. If
you choose the (probably much closer to reality) estimate of 384 bytes,
then you need (1 + (3000000 * 5)) * 384 bytes, which makes =~ 5.8 GB -
more than what *you* have. BTW, please remember that your OS and the
Python interpreter are going to eat some of these 4 GB, and that you
intend to actually *store* something - object references - in these
lists. Even with a few shared references, this means you'll need RAM
space for *at least* 3000000 * 5 *more* Python objects (which makes *1*
object per list...), which will *at minima* use about the same amount
of RAM as the list of lists itself. That takes us to something like
10 GB... for *1* object per list. And I of course suppose you plan to
store much more than 1 object per list !-)

2/ Now ask yourself how many users of your application will have enough
RAM to run it...

So IMVHO, the question is not how to build such a list in less than x
minutes, but how to *not* build such a list. IOW, do you *really* need
to store all that stuff in RAM?
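
[For completeness, a tiny sketch of one "not all in RAM" alternative
using the standard shelve module; purely illustrative, and whether it
is fast enough depends entirely on the access pattern:]

import shelve

# keep records on disk, loading only the ones currently being touched
db = shelve.open('records.db')
rec = db.get('1001', ([], [], [], [], []))
rec[3].append('x')
db['1001'] = rec    # write the updated record back to disk
db.close()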