
Pysqlite tables in RAM

Hi all,

I'm relatively new to Python and am facing a problem with database access.

I want to store my data in a database on disk. I also want to be able to
load the tables into RAM whenever I have a lot of disk accesses, and commit
the changes back to the database afterwards. There is the option of keeping
the data in RAM by connecting to :memory: instead of a DB file. The problem
with this is that the data is lost every time you close the connection to
the database. Could somebody suggest a way to load the tables into RAM as
tables, and not as lists or dictionaries?

Thanks,
Simba

Sep 29 '06 #1
6 Replies


Ranjitha wrote:
> Hi all,
>
> I'm relatively new to Python

And to databases?

> and am facing a problem with database access.
>
> I want to store my data in a database on disk. I also want to be able
> to load the tables into RAM whenever I have a lot of disk accesses,
> and commit the changes back to the database afterwards.

This should be the database's duty, not yours. Serious RDBMSs are highly
optimized wrt caching and file I/O, and there is very little chance you
can do anything better by yourself.

> There is the option of keeping the data in RAM by connecting to
> :memory: instead of a DB file. The problem with this is that the data
> is lost every time you close the connection to the database.

Seems quite obvious !-)

> Could somebody suggest a way to load the tables into RAM as tables,
> and not as lists or dictionaries?

There's nothing like a "table" in Python's builtin datatypes !-)

More seriously: don't bother.

Focus first on writing correct code. Then, *if* and *when* you *really*
have a performance problem, *first* use a profiler to check where the
*real* problem is. If it then happens that SQLite is the bottleneck, try
switching to a real RDBMS like PostgreSQL.
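For example, a minimal sketch (cProfile ships with Python 2.5;
do_database_work is just a placeholder name):

import cProfile

def do_database_work():
    # ... the database-heavy code you suspect is slow ...
    pass

# run it under the profiler; the most expensive calls are listed first
cProfile.run("do_database_work()", sort="cumulative")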

Remember the 3 golden rules about optimisation:
1/ don't optimize
2/ don't optimize
3/ for the experts only: don't optimize
My 2 cents...
--
bruno desthuilliers
"Premature optimization is the root of all evil."
(some wise guy)
Sep 29 '06 #2

Ranjitha wrote:
> I want to store my data in a database on disk. I also want to be able
> to load the tables into RAM whenever I have a lot of disk accesses,
> and commit the changes back to the database afterwards. [...] Could
> somebody suggest a way to load the tables into RAM as tables, and not
> as lists or dictionaries?
As long as your data isn't too voluminous, it's quite practical to read
it from a SQLite database on disk into a SQLite database in memory,
given the same table structures in each.

But if the data isn't too voluminous, then probably any
halfway-reasonable database will also cache it effectively. So you may
win less performance than you expect.
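
For example, something along these lines (an untested sketch: "mydata.db"
is a placeholder path, indexes and triggers aren't copied, and SQLite's
internal tables are skipped):

import sqlite3  # bundled with Python 2.5; older setups can use pysqlite2.dbapi2

# open an in-memory database and attach the on-disk file to it
mem = sqlite3.connect(":memory:")
mem.execute("ATTACH DATABASE 'mydata.db' AS disk")

# recreate each ordinary disk table in memory and copy its rows across
tables = mem.execute(
    "SELECT name, sql FROM disk.sqlite_master "
    "WHERE type = 'table' AND name NOT LIKE 'sqlite_%'").fetchall()
for name, sql in tables:
    mem.execute(sql)  # the stored CREATE TABLE statement, run against :memory:
    mem.execute("INSERT INTO %s SELECT * FROM disk.%s" % (name, name))
mem.commit()

# ... work on the fast in-memory tables here ...

# write the results back by reversing the copy, then detach
for name, sql in tables:
    mem.execute("DELETE FROM disk.%s" % name)
    mem.execute("INSERT INTO disk.%s SELECT * FROM %s" % (name, name))
mem.commit()
mem.execute("DETACH DATABASE disk")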

regards
Steve
--
Steve Holden +44 150 684 7255 +1 800 494 3119
Holden Web LLC/Ltd http://www.holdenweb.com
Skype: holdenweb http://holdenweb.blogspot.com
Recent Ramblings http://del.icio.us/steve.holden

Sep 29 '06 #3

Ranjitha wrote:
> I'm relatively new to Python and am facing a problem with database
> access. [...]
Just a side note: judging from your subject line, are you using the
pysqlite extension? If you are, note that SQLite support comes with the
standard library in Python 2.5 as sqlite3, in case you want to switch.
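
A try/except import (a small sketch) lets the same code run with either one:

try:
    from sqlite3 import dbapi2 as sqlite  # standard library, Python 2.5+
except ImportError:
    from pysqlite2 import dbapi2 as sqlite  # the separate pysqlite 2.x extension

con = sqlite.connect(":memory:")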
Sep 29 '06 #4

Ranjitha wrote:
> I want to store my data in a database on disk. I also want to be able
> to load the tables into RAM whenever I have a lot of disk accesses,
> and commit the changes back to the database afterwards.
Using the cache_size and synchronous pragmas sounds like a better way to
trade reliability against speed/memory use, e.g. (filling in a connection
so the snippet is self-contained; "mydata.db" is a placeholder path, and
cache_size counts pages of roughly 1.5 kB each):

import sqlite3

cur = sqlite3.connect("mydata.db").cursor()

table_memory = 100000000  # bytes
cur.execute("pragma cache_size = %d;" % (table_memory / 1500))

# ...

cur.execute("pragma synchronous = off;")  # faster, but risky if the machine crashes
# do lots of stuff
cur.execute("pragma synchronous = full;")  # back to the safe default

for more on this, see: http://www.sqlite.org/pragma.html

</F>

Sep 29 '06 #5


Fredrik Lundh wrote:
> Using the cache_size and synchronous pragmas sounds like a better way
> to trade reliability against speed/memory use. [...]
>
> for more on this, see: http://www.sqlite.org/pragma.html
Thanks a lot for the help, but could you please elaborate on this? I'm
finding it difficult to follow the link you sent me; the jargon is all
too new for me.

Thanks,
Ranjitha

Oct 6 '06 #6


Ranjitha wrote:
> Fredrik Lundh wrote:
>> Using the cache_size and synchronous pragmas sounds like a better
>> way to trade reliability against speed/memory use. [...]
>> for more on this, see: http://www.sqlite.org/pragma.html
>
> Thanks a lot for the help, but could you please elaborate on this?
> I'm finding it difficult to follow the link you sent me; the jargon
> is all too new for me.
For a start, you don't need to read the whole page, just the sections
on "cache_size" [the more memory in the cache, the faster it will run
-- up to a point, which you can find by experiment] and "synchronous"
[off = faster but risky, full = slower but safe].

It would help greatly if you said which particular words or phrases you
don't understand, and gave some assurance that you have made *some*
effort to help yourself, e.g. googled the puzzling words/phrases ...

Have you actually tried doing what Fredrik suggested?

Oct 6 '06 #7
