
User-specific sequences...

In a previous thread, it was determined that rather than
use a table-wide sequence for my primary key, I might consider
using a compound primary key of customer_id and customer_seq,
where customer_seq would be a sequence starting at, say,
1 and counting up separately for each customer ... thereby making
it difficult to guess the total number of rows in the
table simply by looking at the currval of the table's sequence.

My question is...what is the best way to implement this?

----

1) Have a table which stores the current sequence value for
each customer and perform an update each time an id is pulled?

...
SELECT customer_seq INTO my_customer_seq
FROM customer_seq_table
WHERE customer_id = in_customer_id
FOR UPDATE;

UPDATE customer_seq_table
SET customer_seq = customer_seq + 1
WHERE customer_id = in_customer_id;

RETURN (my_customer_seq);
...

I guess the FOR UPDATE gives me the row-level lock I need to
ensure that I avoid the race condition between read and update?
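
Fleshed out, I imagine option 1 would look something like this (the
function name next_customer_seq is just what I'd call it, and it assumes
every customer already has a seed row in customer_seq_table):

CREATE TABLE customer_seq_table (
    customer_id  integer PRIMARY KEY,
    customer_seq integer NOT NULL DEFAULT 1
);

CREATE OR REPLACE FUNCTION next_customer_seq(in_customer_id integer)
RETURNS integer AS '
DECLARE
    my_customer_seq integer;
BEGIN
    -- FOR UPDATE takes a row-level lock on the row for this customer,
    -- so a concurrent call for the same customer blocks here until
    -- this transaction commits; that closes the read/update race.
    SELECT customer_seq INTO my_customer_seq
      FROM customer_seq_table
     WHERE customer_id = in_customer_id
       FOR UPDATE;

    UPDATE customer_seq_table
       SET customer_seq = customer_seq + 1
     WHERE customer_id = in_customer_id;

    RETURN my_customer_seq;
END;
' LANGUAGE plpgsql;

The one wrinkle is that the SELECT ... FOR UPDATE finds nothing for a
customer with no row yet, so each customer needs a seed row up front
(or the function has to handle NOT FOUND).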

----

2) Use max value in table plus one...

INSERT INTO some_table (customer_id, customer_seq, ...)
VALUES (in_customer_id,
(
SELECT COALESCE(MAX(customer_seq), 0) + 1
FROM some_table
WHERE customer_id = in_customer_id
), ... );

I like this approach because I don't have to maintain a separate
table, but it might be a lot slower. I don't know if it's safe
from the race condition problem of simultaneous reads, though.

----

3) Something else I haven't thought of...

----

With the sequences built into PostgreSQL, I can use functions like
CURRVAL and NEXTVAL and I'm guaranteed safety under concurrent use,
right? I'd like my solution to have some of these properties as well,
since in common usage this PK (customer_id, customer_seq) will be an
auto-generated value that I'll also be using as a foreign key, so I
need to retrieve it as soon as it gets created.
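
Just so it's clear what I mean, the built-in behaviour I'd like to
imitate is something like this (table and sequence names are made up):

CREATE SEQUENCE orders_id_seq;

CREATE TABLE orders (
    id          integer PRIMARY KEY DEFAULT nextval('orders_id_seq'),
    customer_id integer
);

-- nextval() hands out values atomically across sessions, and currval()
-- returns the value this session most recently got from nextval(),
-- so concurrent inserts do not collide or see one another.
INSERT INTO orders (customer_id) VALUES (42);
SELECT currval('orders_id_seq');  -- the id just generated, ready to use as an FK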

Anybody got a good, clean solution that doesn't involve a lot of
table/trigger/function surgery? If it's clean, I might employ the
technique on other tables as well and so I'd rather avoid having
5 functions, 3 triggers, and 2 new tables every time I need to do
something like this.

----------
Dante




Nov 22 '05 #1


It would be nice if we had a "Read Uncommitted" transaction level for this,
hint ;-). In this scenario you will probably have to have your app check
for errors and retry if it gets an abort.

I would do as follows:
create table widgets (
    customer_id int references customers,
    widget_id int,
    primary key (customer_id, widget_id)
);

Then I would have a trigger set to assign widget_id to (Select
max(widget_id) + 1 from widgets where customer_id = new.customer_id);

Then if by chance you have 2 duplicate submissions, you will get an error
you need to handle in your app by retrying the insert.
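
Roughly, the trigger I have in mind would look like this (the function
and trigger names are just examples):

CREATE OR REPLACE FUNCTION assign_widget_id() RETURNS trigger AS '
BEGIN
    -- Compute the next per-customer id; COALESCE covers the first
    -- widget for a customer, when MAX() would return NULL.
    SELECT COALESCE(MAX(widget_id), 0) + 1 INTO NEW.widget_id
      FROM widgets
     WHERE customer_id = NEW.customer_id;
    RETURN NEW;
END;
' LANGUAGE plpgsql;

CREATE TRIGGER widgets_assign_id
    BEFORE INSERT ON widgets
    FOR EACH ROW EXECUTE PROCEDURE assign_widget_id();

If two concurrent inserts for the same customer both compute the same
widget_id, the primary key on (customer_id, widget_id) rejects the second
one, and that is the error the app retries.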

Best Wishes,
Chris Travers


Nov 22 '05 #2
