
Per-Table Transaction Isolation Level?

Hi

I'd like to know if there is a way to specify different transaction
isolation levels for different tables in the database. The reason I'm asking
this (rather bizarre-sounding, I know ;-) ) question is the following:

I'm importing about 2 million records into my application each day (the
data is more or less fully replaced each day). My importer updates only
a few tables (about 5-10), but reads a lot of other tables (10 or so)
while importing. Those (read-only, meta-information) tables contain
information on how to import the data, and on what reports to calculate
from the imported data.

My import sometimes crashes because the meta-information tables are
changed while importing (e.g., I pass an id to a function, the function
does some calculations, then tries to select the row with the given id,
but fails because the row was deleted in the meantime). I understand
that the standard approach to this problem is to set the transaction
isolation level to "serializable", thus avoiding non-repeatable reads.

But since the import is a lengthy operation (a few hours), I don't want
to run it in a serializable transaction, since that would force me to
import "in a loop" until no serialization error occurs while importing.

But since it's only the meta-information tables for which I want to
avoid non-repeatable reads, and since those are read-only anyway (for my
importer), I wouldn't have to fear "serialization errors" if I accessed
only those tables in serializable mode (read-only transactions never
trigger serialization errors).

I know I could simulate something like that using dblink, but if
possible I'd prefer a simpler approach (using dblink would mean that I
need to rewrite large parts of the import, since it's mostly stored procedures).
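
For reference, the dblink variant would look roughly like this (the
connection string, table, column and id are made up for illustration):

    -- run the meta-information reads over a second, serializable connection
    SELECT dblink_connect('meta', 'dbname=mydb');
    SELECT dblink_exec('meta', 'BEGIN');
    SELECT dblink_exec('meta', 'SET TRANSACTION ISOLATION LEVEL SERIALIZABLE');
    SELECT * FROM dblink('meta', 'SELECT target_report FROM import_rules WHERE id = 42')
        AS t(target_report text);
    SELECT dblink_exec('meta', 'COMMIT');
    SELECT dblink_disconnect('meta');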

greetings, Florian Pflug


Nov 23 '05 #1
3 Replies


On Tue, Nov 09, 2004 at 04:34:16AM +0100, Florian G. Pflug wrote:
> My import sometimes crashes because the meta-information tables are
> changed while importing (e.g., I pass an id to a function, the function
> does some calculations, then tries to select the row with the given id,
> but fails because the row was deleted in the meantime). I understand
> that the standard approach to this problem is to set the transaction
> isolation level to "serializable", thus avoiding non-repeatable reads.


Sounds like you could use savepoints to be able to retry without
starting from scratch:

- function gets the Id
- savepoint foo
- do something with Id
- try to get row == Id
- if it doesn't exist, rollback to foo, go to top
- release foo
- go to top
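
In plain SQL the savepoint dance looks roughly like this (import_one() and
the id are hypothetical; the retry loop itself lives in the calling code):

    BEGIN;
    -- ... earlier import work ...
    SAVEPOINT meta_lookup;
    SELECT import_one(42);            -- may fail if the meta row vanished
    -- on failure: undo only this step, pick up a fresh id and try again
    ROLLBACK TO SAVEPOINT meta_lookup;
    -- on success: keep the work and carry on
    RELEASE SAVEPOINT meta_lookup;
    -- ... rest of the import ...
    COMMIT;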

--
Alvaro Herrera (<alvherre[a]dcc.uchile.cl>)
"Uno combate cuando es necesario... ¡no cuando está de humor!
El humor es para el ganado, o para hacer el amor, o para tocar el
baliset. No para combatir." (Gurney Halleck)

Nov 23 '05 #2

> I'd like to know if there is a way to specify different transaction
> isolation levels for different tables in the db.

Simply set up a connection for each transaction isolation level
you need and read the appropriate data from whichever
connection is suitable.
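
Each connection can then set its own default isolation level, e.g. (a sketch):

    -- on the connection reading the meta-information tables:
    SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL SERIALIZABLE;

    -- on the connection doing the bulk import, the default is fine:
    SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED;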

Karsten
--
GPG key ID E4071346 @ wwwkeys.pgp.net
E167 67FD A291 2BEA 73BD 4537 78B9 A9F9 E407 1346


Nov 23 '05 #3

On Tue, Nov 09, 2004 at 04:34:16AM +0100, Florian G. Pflug wrote:
> My import sometimes crashes because the meta-information tables are
> changed while importing (e.g., I pass an id to a function, the function
> does some calculations, then tries to select the row with the given id,
> but fails because the row was deleted in the meantime). I understand
> that the standard approach to this problem is to set the transaction
> isolation level to "serializable", thus avoiding non-repeatable reads.

I solved a similar problem by opening two connections to the
database: one to do the read-only queries, one solely to import data. This
also had the nice property that an error on one connection doesn't affect the
other.

Different connections could run at different isolation levels if
necessary.
--
Martijn van Oosterhout <kl*****@svana.org> http://svana.org/kleptog/
Patent. n. Genius is 5% inspiration and 95% perspiration. A patent is a
tool for doing 5% of the work and then sitting around waiting for someone
else to do the other 95% so you can sue them.



Nov 23 '05 #4
