Bytes | Software Development & Data Engineering Community

creating really big lists

Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but it's not soooo fast. Is there a way to do this without looping?

David.

Sep 5 '07 #1
19 replies, 8586 views
Dr Mephesto <dn****@googlemail.com> writes:
Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but its not soooo fast. Is there a way to do this without looping?
You can do:

[[[],[],[],[],[]]] * 3000000

although I don't know if it performs any better than what you already
have.
Sep 5 '07 #2
Paul Rudin wrote:
Dr Mephesto <dn****@googlemail.com> writes:
>Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but its not soooo fast. Is there a way to do this without looping?

You can do:

[[[],[],[],[],[]]] * 3000000

although I don't know if it performs any better than what you already
have.
You are aware that this is hugely different, because the nested lists are
references, not new instances? Thus the outcome is most probably (given the
gazillion of times people stumbled over this) not the desired one...

Diez
Sep 5 '07 #3
Paul Rudin wrote:
Dr writes:
>I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but its not soooo fast. Is there a way to do this without looping?

You can do:

[[[],[],[],[],[]]] * 3000000

although I don't know if it performs any better than what you already
have.
Actually, that produces list of 3000000 references to the same
5-element list. A reduced example:
>>> lst = [[[],[],[],[],[]]] * 3
>>> lst[1][1].append(42)
>>> print lst
[[[], [42], [], [], []], [[], [42], [], [], []], [[], [42], [], [], []]]
--
--Bryan
Sep 5 '07 #4
"Diez B. Roggisch" <de***@nospam.web.de> writes:
Paul Rudin wrote:
>Dr Mephesto <dn****@googlemail.com> writes:
>>Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but its not soooo fast. Is there a way to do this without looping?

You can do:

[[[],[],[],[],[]]] * 3000000

although I don't know if it performs any better than what you already
have.

You are aware that this is hugely different, because the nested lists are
references, not new instances? Thus the outcome is most probably (given the
gazillion of times people stumbled over this) not the desired one...
Err, yes sorry. I should try to avoid posting before having coffee in
the mornings.
Sep 5 '07 #5
yep, thats why I'm asking :)

On Sep 5, 12:22 pm, "Diez B. Roggisch" <de...@nospam.web.de> wrote:
Paul Rudin wrote:
Dr Mephesto <dnh...@googlemail.com> writes:
Hi!
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).
Does anyone know the most efficient way to do this? I have tried:
list = [[[],[],[],[],[]] for _ in xrange(3000000)]
but its not soooo fast. Is there a way to do this without looping?
You can do:
[[[],[],[],[],[]]] * 3000000
although I don't know if it performs any better than what you already
have.

You are aware that this is hugely different, because the nested lists are
references, not new instances? Thus the outcome is most probably (given the
gazillion of times people stumbled over this) not the desired one...

Diez

Sep 5 '07 #6
In article <11*********************@k79g2000hse.googlegroups.com>,
Dr Mephesto <dn****@googlemail.com> wrote:
>
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).
Why do you want to pre-create this? Why not just create the big list and
sublists as you append data to the sublists?
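One way to read this suggestion is to create each entry only when data first arrives for it. A minimal sketch (it swaps the outer list for a dict so entries can be created out of order, which is an assumption about the access pattern; names like `append_value` are illustrative, not from the thread):

```python
# Create each 5-sublist record lazily, only on first use.
data = {}

def append_value(index, slot, value):
    # setdefault builds the entry on first access; later calls reuse it
    entry = data.setdefault(index, ([], [], [], [], []))
    entry[slot].append(value)

append_value(1001, 3, 'x')
append_value(1001, 3, 'y')

# Only the touched index exists; the untouched 2,999,999 cost nothing.
```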
--
Aahz (aa**@pythoncraft.com) <* http://www.pythoncraft.com/

"Many customs in this life persist because they ease friction and promote
productivity as a result of universal agreement, and whether they are
precisely the optimal choices is much less important." --Henry Spencer
http://www.lysator.liu.se/c/ten-commandments.html
Sep 5 '07 #7
On Sep 5, 7:50 pm, Dr Mephesto <dnh...@googlemail.com> wrote:
Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).
Will each and every one of the 3,000,000 slots be used? If not, you may be
much better off storage-wise if you used a dictionary instead of a
list, at the cost of slower access.

Cheers,
John

Sep 5 '07 #8
Dr Mephesto <dn****@googlemail.com> writes:
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will
append data each of the 5 sublists, so they will be of varying
lengths (so no arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]
You might want to use a tuple as the container for the lower-level
lists -- it's more compact and costs less allocation-wise.

But the real problem is not list allocation vs tuple allocation, nor
is it looping in Python; surprisingly, it's the GC. Notice this:

$ python
Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
[GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import time
>>> t0 = time.time(); l = [([],[],[],[],[]) for _ in xrange(3000000)]; t1 = time.time()
>>> t1 - t0
143.89971613883972

Now, with the GC disabled:
$ python
Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
[GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import gc
>>> gc.disable()
>>> import time
>>> t0 = time.time(); l = [([],[],[],[],[]) for _ in xrange(3000000)]; t1 = time.time()
>>> t1 - t0
2.9048631191253662

The speed difference is staggering, almost 50-fold. I suspect GC
degrades the (amortized) linear-time list building into quadratic
time. Since you allocate all the small lists, the GC gets invoked
every 700 or so allocations, and has to visit more and more objects in
each pass. I'm not sure if this can be fixed (shouldn't the
generational GC only have to visit the freshly created objects rather
than all of them?), but it has been noticed on this group before.

If you're building large data structures and don't need to reclaim
cyclical references, I suggest turning GC off, at least during
construction.
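The turn-GC-off-during-construction advice can be wrapped so collection is always re-enabled, even if building raises (a sketch in modern-Python syntax; the 2.5 sessions above would use `xrange`):

```python
import gc

def build_big_list(n):
    # Disable cyclic GC while allocating millions of small containers;
    # reference counting still frees garbage, only cycle detection pauses.
    gc.disable()
    try:
        return [([], [], [], [], []) for _ in range(n)]
    finally:
        gc.enable()   # never leave the collector off on an error path

big = build_big_list(100000)   # small n here, just to demonstrate
```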
Sep 5 '07 #9
On 6 Sep., 01:34, "Delaney, Timothy (Tim)" <tdela...@avaya.com> wrote:
Hrvoje Niksic wrote:
Dr Mephesto <dnh...@googlemail.com> writes:
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will
append data each of the 5 sublists, so they will be of varying
lengths (so no arrays!).
Does anyone know the most efficient way to do this? I have tried:
list = [[[],[],[],[],[]] for _ in xrange(3000000)]
If you're building large data structures and don't need to reclaim
cyclical references, I suggest turning GC off, at least during
construction.

This is good advice, but another question is whether you really want
such a list. You may well be better off with a database of some kind -
they're designed for manipulating large amounts of data.

Tim Delaney
I need some real speed! A database is waaay too slow for the algorithm
I'm using, and because the sublists are of varying size, I don't think I
can use an array...

Sep 6 '07 #10
On Sep 6, 12:47 am, Dr Mephesto <dnh...@googlemail.com> wrote:
>
I need some real speed! a database is waaay to slow for the algorithm
im using. and because the sublists are of varying size, i dont think I
can use an array...
How about a defaultdict approach?

from collections import defaultdict

dataArray = defaultdict(lambda : [[],[],[],[],[]])
dataArray[1001][3].append('x')
dataArray[42000][2].append('y')

for k in sorted(dataArray.keys()):
    print "%6d : %s" % (k, dataArray[k])

prints:
  1001 : [[], [], [], ['x'], []]
 42000 : [[], [], ['y'], [], []]

-- Paul
Sep 6 '07 #11
Dr Mephesto <dn****@googlemail.com> writes:
I need some real speed!
Is the speed with the GC turned off sufficient for your usage?
Sep 6 '07 #12
On 6 Sep., 09:30, Paul McGuire <pt...@austin.rr.com> wrote:
On Sep 6, 12:47 am, Dr Mephesto <dnh...@googlemail.com> wrote:
I need some real speed! a database is waaay to slow for the algorithm
im using. and because the sublists are of varying size, i dont think I
can use an array...

How about a defaultdict approach?

from collections import defaultdict

dataArray = defaultdict(lambda : [[],[],[],[],[]])
dataArray[1001][3].append('x')
dataArray[42000][2].append('y')

for k in sorted(dataArray.keys()):
    print "%6d : %s" % (k, dataArray[k])

prints:
  1001 : [[], [], [], ['x'], []]
 42000 : [[], [], ['y'], [], []]

-- Paul
hey, that defaultdict thing looks pretty cool...

what's the overhead like for using a dictionary in Python?

dave

Sep 7 '07 #13
On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <dn****@googlemail.com>
wrote:
hey, that defaultdict thing looks pretty cool...

whats the overhead like for using a dictionary in python?
Dictionaries are heavily optimized in Python. Access time is O(1),
adding/removing elements is amortized O(1) (that is, constant time unless
it has to grow/shrink some internal structures.)

--
Gabriel Genellina

Sep 8 '07 #14
On Sep 8, 3:33 am, "Gabriel Genellina" <gagsl-...@yahoo.com.ar> wrote:
On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <dnh...@googlemail.com>
wrote:
hey, that defaultdict thing looks pretty cool...
whats the overhead like for using a dictionary in python?

Dictionaries are heavily optimized in Python. Access time is O(1),
adding/removing elements is amortized O(1) (that is, constant time unless
it has to grow/shrink some internal structures.)

--
Gabriel Genellina
well, I want to (maybe) have a dictionary where the value is a list of
5 lists. And I want to add a LOT of data to these lists: tens of
millions of pieces of data. Will this be a big problem? I can just try
it out in practice on Monday too :)

thanks

Sep 8 '07 #15
Dr Mephesto wrote:
On Sep 8, 3:33 am, "Gabriel Genellina" <gagsl-...@yahoo.com.ar> wrote:
>On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <dnh...@googlemail.com>
wrote:
>>hey, that defaultdict thing looks pretty cool...
whats the overhead like for using a dictionary in python?
Dictionaries are heavily optimized in Python. Access time is O(1),
adding/removing elements is amortized O(1) (that is, constant time unless
it has to grow/shrink some internal structures.)

--
Gabriel Genellina

well, I want to (maybe) have a dictionary where the value is a list of
5 lists. And I want to add a LOT of data to these lists. 10´s of
millions of pieces of data. Will this be a big problem? I can just try
it out in practice on monday too :)

thanks

targetList = myDict[someKey]          # This takes normal dict access time
for j in xrange(5):
    for i in xrange(50000000):        # Add a LOT of data to targetList
        targetList[j].append(i)       # This takes normal list access time

Sep 8 '07 #16
Dr Mephesto <dn****@googlemail.com> writes:
well, I want to (maybe) have a dictionary where the value is a list of
5 lists. And I want to add a LOT of data to these lists. 10´s of
millions of pieces of data. Will this be a big problem? I can just try
it out in practice on monday too :)
Yes, that may be a problem, both because of the amount of memory
required and because of how the GC works. You may want to turn off
the GC while building these lists. Otherwise, think of some other
strategy, like files on disk.
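The files-on-disk strategy could look like this `shelve` sketch (illustrative only: the string-key layout and the `writeback=True` choice are my assumptions, not something from the thread; `writeback` trades memory for convenient in-place mutation):

```python
import os
import shelve
import tempfile

# A shelf maps string keys to pickled values, so the 5-sublist records
# live on disk instead of RAM. writeback=True caches mutated entries
# so in-place appends are written back on sync()/close().
path = os.path.join(tempfile.mkdtemp(), 'records')
db = shelve.open(path, writeback=True)

db['1001'] = ([], [], [], [], [])
db['1001'][3].append('x')     # mutates the cached entry

db.sync()                     # flush cached entries to disk
value = db['1001']
db.close()
```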
Sep 8 '07 #17
Dr Mephesto wrote:
Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this?
Hem... Did you consider the fact that RAM is not an unlimited resource?

Let's do some simple math (please someone correct me if I'm going off
the road): if a Python (empty) list object required 256 bits (if I refer
to some old post by GvR, it's probably more - 384 bytes at least. Some
Python guru around ?), you'd need (1 + (3000000 * 5)) * 256 bits just to
build this list of lists. Which would make something around 3 Gb. Not
counting all other needed memory...

FWIW, run the following code:

# eatallramthenswap.py
d = {}
for i in xrange(3000000):
    d[i] = ([], [], [], [], [])

And monitor what happens with top...
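The per-object sizes being estimated here can be measured directly with `sys.getsizeof` (available from Python 2.6; a sketch only - the exact numbers vary by CPython version, build, and pointer width, and `getsizeof` reports shallow sizes):

```python
import sys

# Shallow sizes of the building blocks; 64-bit CPython figures differ
# from the estimates above, but the order of magnitude is the point.
empty_list_size = sys.getsizeof([])
entry_size = sys.getsizeof(([], [], [], [], [])) + 5 * empty_list_size

# Rough total for 3,000,000 entries, ignoring the outer list and any data:
total_bytes = 3000000 * entry_size
print("one entry: %d bytes, all entries: ~%.1f GB"
      % (entry_size, total_bytes / 1e9))
```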

Sep 10 '07 #18
On Sep 8, 8:06 pm, Bruno Desthuilliers
<bdesth.quelquech...@free.quelquepart.fr> wrote:
Dr Mephesto wrote:
Hi!
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).
Does anyone know the most efficient way to do this?

Hem... Did you consider the fact that RAM is not an unlimited resource?

Let's do some simple math (please someone correct me if I'm going off
the road): if a Python (empty) list object requires 256 bytes (if I refer
to some old post by GvR, it's probably more - 384 bytes at least. Some
Python guru around?), you'd need (1 + (3000000 * 5)) * 256 bytes just to
build this list of lists. Which would make something around 3.8 GB. Not
counting all other needed memory...

FWIW, run the following code:

# eatallramthenswap.py
d = {}
for i in xrange(3000000):
    d[i] = ([], [], [], [], [])

And monitor what happens with top...
Unused ram is wasted ram :)

I tried using MySQL, and it was too slow. And I have 4gb anyway...

Sep 11 '07 #19
Dr Mephesto wrote:
On Sep 8, 8:06 pm, Bruno Desthuilliers
<bdesth.quelquech...@free.quelquepart.fr> wrote:
>>Dr Mephesto wrote:

>>>Hi!
>>>I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data each of the 5 sublists, so they will be of varying lengths (so no
arrays!).
>>>Does anyone know the most efficient way to do this?

Hem... Did you consider the fact that RAM is not an unlimited resource?

Let's do some simple math (please someone correct me if I'm going off
the road): if a Python (empty) list object requires 256 bytes (if I refer
to some old post by GvR, it's probably more - 384 bytes at least. Some
Python guru around?), you'd need (1 + (3000000 * 5)) * 256 bytes just to
build this list of lists. Which would make something around 3.8 GB. Not
counting all other needed memory...

FWIW, run the following code:

# eatallramthenswap.py
d = {}
for i in xrange(3000000):
    d[i] = ([], [], [], [], [])

And monitor what happens with top...


Unused ram is wasted ram :)
Indeed. But when your app eats all RAM and swap and brings the system
down, users are usually a bit unhappy !-)
I tried using MySQL, and it was too slow.
Possibly.
and I have 4gb anyway...
*You* have 4gb. Yes, fine. But:

1/ please take time to re-read my post - that estimate used a very
optimistic 256 bytes for the size of an empty list. If you
choose the (probably much closer to reality) estimate of 384 bytes, then
you need (1 + (3000000 * 5)) * 384 bytes, which makes =~ 5.8 GB. More than what
*you* have. BTW, please remember that your OS and the Python interpreter
are going to eat some of these 4 gb, and that you intend to actually
*store* something - object references - in these lists. Even if you do
have a few shared references, this means that you'll have to have RAM
space for *at least* 3000000 * 5 *more* Python objects (which makes *1*
object per list...). Which will *at minimum* use about the same amount of
RAM as the list of lists itself. Which takes us to something like 10
GB... for *1* object per list. I of course suppose you plan to store
much more than 1 object per list !-)

2/ now ask yourself how many users of your application will have enough
RAM to run it...

So IMVHO, the question is not how to build such a list in less than x
minutes, but how to *not* build such a list. IOW, do you *really* need
to store all that stuff in RAM ?
Sep 11 '07 #20

This thread has been closed and replies have been disabled. Please start a new discussion.
