
very large dictionary

Hello,

I tried to load a 6.8G large dictionary on a server that has 128G of
memory. I got a memory error. I used Python 2.5.2. How can I load my
data?

Simon
Aug 1 '08 #1
On Fri, 01 Aug 2008 00:46:09 -0700, Simon Strobl wrote:
I tried to load a 6.8G large dictionary on a server that has 128G of
memory. I got a memory error. I used Python 2.5.2. How can I load my
data?
What does "load a dictionary" mean? Was it saved with the `pickle`
module?

How about using a database instead of a dictionary?
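
For instance, a minimal sketch of the database route using the standard sqlite3 module (shipped with Python 2.5); the file and table names below are only placeholders:

import sqlite3

conn = sqlite3.connect('bigrams.db')   # data lives on disk, not in RAM
conn.execute('CREATE TABLE IF NOT EXISTS bigrams (key TEXT PRIMARY KEY, count INTEGER)')

# insert in batches instead of building one giant in-memory literal
conn.executemany('INSERT OR REPLACE INTO bigrams VALUES (?, ?)',
                 [(', dk', 28893), (', dk.au', 854)])
conn.commit()

# look up a single key without loading anything else
row = conn.execute('SELECT count FROM bigrams WHERE key = ?', (', dk',)).fetchone()
if row:
    print row[0]    # -> 28893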

Ciao,
Marc 'BlackJack' Rintsch
Aug 1 '08 #2
What does "load a dictionary" mean?

I had a file bigrams.py with a content like below:

bigrams = {
", djy" : 75 ,
", djz" : 57 ,
", djzoom" : 165 ,
", dk" : 28893 ,
", dk.au" : 854 ,
", dk.b." : 3668 ,
....

}

In another file I said:

from bigrams import bigrams
How about using a database instead of a dictionary?
If there is no other way to do it, I will have to learn how to use
databases in Python. I would prefer to be able to use the same type of
scripts with data of all sizes, though.
Aug 1 '08 #3
Simon Strobl:
I had a file bigrams.py with a content like below:
bigrams = {
", djy" : 75 ,
", djz" : 57 ,
", djzoom" : 165 ,
", dk" : 28893 ,
", dk.au" : 854 ,
", dk.b." : 3668 ,
...
}
In another file I said:
from bigrams import bigrams
Probably there's a limit on module size here. You can try to
change your data format on disk, creating a text file like this:
", djy" 75
", djz" 57
", djzoom" 165
....
Then in a module you can create an empty dict, read the lines of the
data with:
somedict = {}
for line in somefile:                     # somefile is the open data file
    part, n = line.rsplit(" ", 1)         # split off the count at the end
    somedict[part.strip('"')] = int(n)

Otherwise you may have to use a BigTable, a DB, etc.

If there is no other way to do it, I will have to learn how to use
databases in Python. I would prefer to be able to use the same type of
scripts with data of all sizes, though.
I understand. I don't know if there are documented limits for
dicts in 64-bit Python.

Bye,
bearophile
Aug 1 '08 #4
Simon Strobl <Si**********@gmail.com> wrote:
>I tried to load a 6.8G large dictionary on a server that has 128G of
memory. I got a memory error. I used Python 2.5.2. How can I load my
data?
Let's just eliminate one thing here: this server is running a
64-bit OS, isn't it? Because if it's a 32-bit OS, the blunt
answer is "You can't, no matter how much physical memory you
have" and you're going to have to go down the database route
(or some approach which stores the mapping on disk and only
loads items into memory on demand).
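
One standard-library example of the "stored on disk, loaded on demand" approach is the shelve module; a rough sketch, with an illustrative file name:

import shelve

db = shelve.open('bigrams.shelf')   # backed by a file, not held in RAM
db[', dk'] = 28893                  # keys must be strings; values are pickled
db[', dk.au'] = 854
db.sync()

print db[', dk']                    # only this entry is read back from disk
db.close()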

--
\S -- si***@chiark.greenend.org.uk -- http://www.chaos.org.uk/~sion/
"Frankly I have no feelings towards penguins one way or the other"
-- Arthur C. Clarke
her nu becomeþ se bera eadward ofdun hlæddre heafdes bæce bump bump bump
Aug 1 '08 #5
On Fri, 01 Aug 2008 14:47:17 +0100, Sion Arrowsmith wrote:
Simon Strobl <Si**********@gmail.com> wrote:
>>I tried to load a 6.8G large dictionary on a server that has 128G of
memory. I got a memory error. I used Python 2.5.2. How can I load my
data?

Let's just eliminate one thing here: this server is running a 64-bit OS,
isn't it? Because if it's a 32-bit OS, the blunt answer is "You can't,
no matter how much physical memory you have" and you're going to have to
go down the database route (or some approach which stores the mapping on
disk and only loads items into memory on demand).
I very much doubt he has 128GB of main memory and is running a 32-bit OS.
Aug 1 '08 #6
Simon Strobl wrote:
Hello,

I tried to load a 6.8G large dictionary on a server that has 128G of
memory. I got a memory error. I used Python 2.5.2. How can I load my
data?

Simon
Take a look at the Python bsddb module. Using btree tables is fast, and
it has the benefit that once the table is open, the programming interface
is identical to a normal dictionary.

http://docs.python.org/lib/bsddb-objects.html
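
A rough sketch of what that could look like, assuming the counts have already been dumped to a tab-separated text file (all names below are illustrative); note that bsddb stores keys and values as plain strings:

import bsddb

db = bsddb.btopen('bigrams.bt', 'c')    # 'c' creates the file if needed

# load once from a "key<TAB>count" text dump; later runs just reopen the file
for line in open('bigrams.txt'):
    key, count = line.rsplit('\t', 1)
    db[key] = count.strip()             # values are stored as strings
db.sync()

print int(db[', dk'])                   # dict-style lookup (assuming ', dk' was in the dump)
db.close()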

Sean
Aug 2 '08 #8
On Fri, 01 Aug 2008 00:46:09 -0700, Simon Strobl wrote:
Hello,

I tried to load a 6.8G large dictionary on a server that has 128G of
memory. I got a memory error. I used Python 2.5.2. How can I load my
data?
How do you know the dictionary takes 6.8G?

I'm going to guess an answer to my own question. In a later post, Simon
wrote:

[quote]
I had a file bigrams.py with a content like below:

bigrams = {
", djy" : 75 ,
", djz" : 57 ,
", djzoom" : 165 ,
", dk" : 28893 ,
", dk.au" : 854 ,
", dk.b." : 3668 ,
....

}
[end quote]
I'm guessing that the file is 6.8G of *text*. How much memory will it
take to import that? I don't know, but probably a lot more than 6.8G. The
compiler has to read the whole file in one giant piece, analyze it,
create all the string and int objects, and only then can it create the
dict. By my back-of-the-envelope calculations, the pointers alone will
require about 5GB, never mind the objects they point to.

I suggest trying to store your data as data, not as Python code. Create a
text file "bigrams.txt" with one key/value pair per line, like this:

djy : 75
djz : 57
djzoom : 165
dk : 28893
....

Then load it like so:

bigrams = {}
for line in open('bigrams.txt', 'r'):
    key, value = line.split(':')
    bigrams[key.strip()] = int(value.strip())
This will be slower, but because it only needs to read the data one line
at a time, it might succeed where trying to slurp all 6.8G in one piece
will fail.

--
Steven
Aug 2 '08 #9
On Fri, 1 Aug 2008 01:05:07 -0700 (PDT), Simon Strobl <Si**********@gmail.com> wrote:
>What does "load a dictionary" mean?

I had a file bigrams.py with a content like below:

bigrams = {
", djy" : 75 ,
", djz" : 57 ,
", djzoom" : 165 ,
", dk" : 28893 ,
", dk.au" : 854 ,
", dk.b." : 3668 ,
...

}

In another file I said:

from bigrams import bigrams
>How about using a database instead of a dictionary?

If there is no other way to do it, I will have to learn how to use
databases in Python.
If you use Berkeley DB ("import bsddb"), you don't have to learn much.
These databases look very much like string-to-string dictionaries, only
they are disk-backed.

(I assume here that Berkeley DB supports 7GB data sets.)
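
Since the mapping is string-to-string, existing dict-style code mostly keeps working as long as the counts are converted at the boundary; a tiny sketch, with an illustrative file name:

import bsddb

db = bsddb.btopen('bigrams.db', 'c')
db[', dk'] = str(28893)              # store counts as strings
if db.has_key(', dk'):               # dict-style membership test
    print int(db[', dk'])            # convert back to int when reading
db.close()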

/Jorgen

--
// Jorgen Grahn <grahn@ Ph'nglui mglw'nafh Cthulhu
\X/ snipabacken.se R'lyeh wgah'nagl fhtagn!
Aug 3 '08 #10
