Bytes | Developer Community
Insert Multiple Fails Completely If any Duplicates Exist

Joe
Hey,

I'm going to give some background on my situation in case anyone can
point out a way around my problem altogether. For the problem itself,
please skip to the bottom of the post. Thanks.

I've been having some problems with database performance. Several
threads are constantly attempting INSERTs of new records into a large
table - hundreds of thousands of records, which is large by my
standards :-) The table has a VARCHAR column tagged as UNIQUE, and
inserts are fairly slow. I'm relying on the INSERT to fail on
duplicates; I'm not performing any checking before I attempt the
INSERT (I figured an additional SELECT would slow things down even
more, but any advice to the contrary would be appreciated).

While these back-end threads are constantly banging away with their
INSERTs, front-end users are hitting the DB with searches, so I end
up with several threads simultaneously attempting very expensive DB
operations. The result is that the front end that users see is
DREADFULLY slow.

My proposed speedup is to modify the behavior of the back-end threads
so that they don't constantly attempt INSERTs of single entries, but
instead build up a hashtable of local entries and, at some interval,
attempt a multi-row insert, something like:

INSERT INTO table (col1, col2, col3) VALUES (x1, y1, z1), (x2, y2, z2), ...

While I do expect to take a big hit when these multi-row inserts
launch, in the time between them I expect front-end performance to be
significantly improved.
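[Editor's note: the buffer-and-flush idea above can be sketched in Python. This is a minimal illustration, not the poster's actual code; the table name, columns, and buffer contents are hypothetical, and SQLite is used as a stand-in for MySQL since both accept multi-row VALUES lists.]

```python
import sqlite3

# In-memory database standing in for the real MySQL table; the UNIQUE
# constraint mirrors the VARCHAR column described in the post.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (name TEXT UNIQUE, val INTEGER)")

# Hypothetical local buffer accumulated by a back-end thread between flushes.
buffer = {"a": 1, "b": 2, "c": 3}

# Flush the buffer in one multi-row statement instead of one INSERT per row.
rows = list(buffer.items())
placeholders = ", ".join(["(?, ?)"] * len(rows))
params = [value for row in rows for value in row]
conn.execute(f"INSERT INTO entries (name, val) VALUES {placeholders}", params)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM entries").fetchone()[0])  # 3
```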

The problem with this approach is that the multi-row insert may
contain one or more new entries that would violate the UNIQUE
constraint on my table's VARCHAR column. I would like duplicates to
be ignored on insert, but all genuinely new entries to be added.
Sadly, from what I've seen, once a duplicate violation for any of the
new entries is found, the call fails, leaving my newly unique entries
NOT ADDED. Is there any way to get the call to work as I want? Doing
many single-row inserts sounds like it would be much slower (judging
from what I've read), so I'd rather not resort to that.

I don't have a ton of DB experience (which may be obvious), so if
I've overlooked something simple, please let me know. If there's a
better way to improve performance other than these multi-row inserts,
that would be nice to know about too. Thanks for any insight,
suggestions, etc.

joe.
Jul 20 '05 #1
Joe wrote:
once a duplicate violation for any of the new entries
is found, the call fails, leaving my newly UNIQUE entries NOT ADDED.


Read about the "IGNORE" keyword on this page:

http://dev.mysql.com/doc/mysql/en/INSERT.html
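[Editor's note: in MySQL the fix is `INSERT IGNORE INTO ...`, which skips rows that violate a unique key instead of failing the whole statement. A small runnable sketch of the same behavior, using SQLite's equivalent `INSERT OR IGNORE` via Python's sqlite3 module (table and values are hypothetical):]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (name TEXT UNIQUE)")
conn.execute("INSERT INTO entries VALUES ('a')")  # pre-existing row

# Batch containing one duplicate ('a') and two new values. With a plain
# INSERT the duplicate would abort the statement; with IGNORE the
# duplicate is silently skipped and the new rows are still added.
conn.executemany("INSERT OR IGNORE INTO entries VALUES (?)",
                 [("a",), ("b",), ("c",)])
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM entries").fetchone()[0])  # 3
```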

Regards,
Bill K.
Jul 20 '05 #2
Joe
Wow, couldn't have been more obvious... I don't know how I missed that. Thanks.

joe.

Jul 20 '05 #3
