Opinions about Insertion Technique

Looking for some insight from the professionals about how they handle
row inserts, specifically single-row inserts through a stored
procedure versus bulk inserts.

One argument comes from people who say all inserts (and, I guess, updates
and deletions) should go through stored procedures. The reasoning is that
the developers who code the client side have no reason to understand
HOW the data is stored, just that it is. Another issue is an insert
that deals with multiple tables: it would be very easy for the
developer to forget a step. That last point also applies to business
logic. In my case, adding a security to our SecurityMaster can touch 1
to 4 tables depending on the type of security. Also, certain fields
are required while others are set to null depending on the type.
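
To make that concrete, here is a rough sketch of the kind of proc I mean.
The table names, columns and type codes are invented for illustration, not
our real schema:

    CREATE PROCEDURE dbo.usp_AddSecurity
        @Symbol       varchar(10),
        @SecurityName varchar(100),
        @SecurityType int,                   -- e.g. 1 = equity, 2 = bond (made-up codes)
        @CouponRate   decimal(9,4) = NULL    -- required only for bonds
    AS
    BEGIN
        -- The business rule lives here, not in every client app.
        IF @SecurityType = 2 AND @CouponRate IS NULL
        BEGIN
            RAISERROR('Coupon rate is required for bonds.', 16, 1)
            RETURN 1
        END

        BEGIN TRANSACTION

        -- Every security gets a master row.
        INSERT INTO dbo.SecurityMaster (Symbol, SecurityName, SecurityType)
        VALUES (@Symbol, @SecurityName, @SecurityType)
        IF @@ERROR <> 0 GOTO Fail

        -- Only some types touch the detail table, so the proc decides, not the caller.
        IF @SecurityType = 2
        BEGIN
            INSERT INTO dbo.BondDetail (Symbol, CouponRate)
            VALUES (@Symbol, @CouponRate)
            IF @@ERROR <> 0 GOTO Fail
        END

        COMMIT TRANSACTION
        RETURN 0

    Fail:
        ROLLBACK TRANSACTION
        RETURN 2
    END

The client just calls the one proc and cannot forget the detail table or
the type-specific rules.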

Because a stored procedure cannot be passed datasets, only scalar
values, when you need to deal with multiple (i.e. bulk) rows you are
stuck using cursors. This post is NOT about the pros and cons of
cursors. There are plenty of those threads on the boards (some of them
probably started by me, showing my understanding, or more correctly my
lack of understanding, of the way to do things). Stored procedures also
give you the ability to abort and/or log inserts that cannot happen
because of constraints and/or business rule failures.

Another approach is to write code (not accessible from outside the
database) that handles bulk inserts. You would need to write in rules
to "extract" or "exclude" rows that do not match constraints or
business rules, otherwise ALL the inserts would fail because of one bad
row. I guess you could put the "potential" rows into a temp table,
apply your rules to the temp table, and delete/move rows that would
fail. Any rows left can then be bulk inserted. (You could also use the
rows that were moved to another temp table for logging why they
failed.)
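
Roughly what I have in mind, with made-up table names and only one rule shown:

    -- Land the candidate rows in a temp table first.
    SELECT * INTO #Staging FROM dbo.IncomingSecurities

    CREATE TABLE #Rejected (Symbol varchar(10), Reason varchar(200))

    -- Move rows that would break a rule into the reject table, with the reason.
    INSERT INTO #Rejected (Symbol, Reason)
    SELECT s.Symbol, 'Symbol already exists in SecurityMaster'
    FROM #Staging s
    WHERE EXISTS (SELECT 1 FROM dbo.SecurityMaster m WHERE m.Symbol = s.Symbol)

    DELETE s
    FROM #Staging s
    WHERE EXISTS (SELECT 1 FROM dbo.SecurityMaster m WHERE m.Symbol = s.Symbol)

    -- Whatever survives goes in as one set-based insert,
    -- and #Rejected is the log of why the rest were excluded.
    INSERT INTO dbo.SecurityMaster (Symbol, SecurityName, SecurityType)
    SELECT Symbol, SecurityName, SecurityType
    FROM #Staging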

So that leaves us with two possible ways to get data into the system:
a single-row approach for client apps and a bulk approach for
internal use. But that leaves us with another problem. You now have
business logic in TWO separate areas, and you have to remember to modify
code or fix bugs in multiple locations.

For those who are still reading my post, my question is...

How do you handle this? What is the approach you take?
Jul 20 '05 #1
Jay

What we have is a series of core stored procedures for single-row
inserts/updates/deletes. As you point out, these may do extra
validation/processing etc. (examples include security checking,
auditing, optimistic locking, and business rule validation). These are
auto-generated.

In addition, we sometimes do bulk inserts from another table or data
source. To ensure we don't duplicate validation/processing AND to
ensure all data goes through the one filter before being committed, we
simply call the exact same stored procedure multiple times, typically
from within a cursor, though it can also be done from code (in our case C#).
Depending on the application, this may be done as a transaction block.
We typically let the stored proc do the validation of the data and log
any rows that failed to insert.
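
The shape of it is roughly this (the proc and table names are illustrative,
not our actual ones):

    DECLARE @Symbol varchar(10), @Name varchar(100), @Type int, @rc int

    DECLARE load_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT Symbol, SecurityName, SecurityType FROM dbo.StagingSecurities

    OPEN load_cur
    FETCH NEXT FROM load_cur INTO @Symbol, @Name, @Type

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- The same proc the single-row clients use, so validation lives in one place.
        EXEC @rc = dbo.usp_AddSecurity @Symbol, @Name, @Type

        -- The proc returns non-zero when validation fails; log it and carry on.
        IF @rc <> 0
            INSERT INTO dbo.LoadErrors (Symbol, ErrorCode, LoggedAt)
            VALUES (@Symbol, @rc, GETDATE())

        FETCH NEXT FROM load_cur INTO @Symbol, @Name, @Type
    END

    CLOSE load_cur
    DEALLOCATE load_cur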

This works like a charm for us and I have used this approach in a
number of successful projects.

However, this bulk insert approach does not scale very well, and if you
do a lot of batch-style processing you really need to be looking at
customised scripts/stored procedures to get the best performance.
Jul 20 '05 #2
Jason (Ja*******@hotmail.com) writes:
Because a stored procedure cannot be passed datasets but only scalar
values,


Actually, this is not completely true anymore. With SQL 2000 we got
the rowset provider OPENXML. So you can send in a complex dataset with
data for umpteen tables in one single procedure call.

It is probably still not as efficient as bulk insert, but bulk insert has
its limitations in validation etc. You can use a staging table, but
that comes with a cost of course.
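
A minimal sketch of the idea, with the table and element names invented
for the example:

    DECLARE @doc nvarchar(4000), @hdoc int
    SET @doc = N'<Securities>
                   <Security Symbol="ABC" Name="ABC Corp" Type="1"/>
                   <Security Symbol="XYZ" Name="XYZ Ltd" Type="2"/>
                 </Securities>'

    -- In a real procedure @doc would arrive as an ntext/nvarchar parameter.
    EXEC sp_xml_preparedocument @hdoc OUTPUT, @doc

    -- Shred the XML into a rowset and insert it with one set-based statement.
    INSERT INTO dbo.SecurityMaster (Symbol, SecurityName, SecurityType)
    SELECT Symbol, Name, Type
    FROM OPENXML(@hdoc, '/Securities/Security', 1)   -- 1 = attribute-centric mapping
         WITH (Symbol varchar(10),
               Name   varchar(100),
               Type   int)

    EXEC sp_xml_removedocument @hdoc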

--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #3
I use a combination, depending on what I want to achieve. If I am doing
a routine 'bulk' load, then that runs split into adequately sized
batches (to minimise the impact on transactions and logs). But if I have
a user wanting to enter data, I like the stored proc approach for all
sorts of reasons.
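
For the batch splitting, a sketch of the SQL 2000-style approach I mean
(table names invented):

    -- Insert in chunks so each statement, and its log usage, stays small.
    SET ROWCOUNT 10000

    WHILE 1 = 1
    BEGIN
        INSERT INTO dbo.SecurityMaster (Symbol, SecurityName, SecurityType)
        SELECT s.Symbol, s.SecurityName, s.SecurityType
        FROM dbo.StagingSecurities s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.SecurityMaster m
                          WHERE m.Symbol = s.Symbol)

        -- Nothing left to insert: done.
        IF @@ROWCOUNT = 0 BREAK
    END

    SET ROWCOUNT 0   -- back to unlimited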
Jul 20 '05 #4
My personal experience is that business logic gets more and more
complicated as a project develops. The users want to change this,
tweak that, etc. And stored procedures are the only sure-fire way I
can accomplish all of the users' requests. I do try to "modularize"
common code. For example, a user wants to view a dataset. The web
page calls the viewdataSP. Inside the viewdataSP, it calls a
userAccessSP. Erland has a pretty good article about how to call
stored procedures from a stored procedure.
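
Roughly like this; the proc names come from my example, and the tables and
the access check itself are invented:

    CREATE PROCEDURE dbo.userAccessSP
        @UserId    int,
        @HasAccess bit OUTPUT
    AS
    BEGIN
        -- The shared access check, written once.
        SET @HasAccess = 0
        IF EXISTS (SELECT 1 FROM dbo.UserPermission
                   WHERE UserId = @UserId AND CanView = 1)
            SET @HasAccess = 1
        RETURN 0
    END
    GO

    CREATE PROCEDURE dbo.viewdataSP
        @UserId int
    AS
    BEGIN
        DECLARE @HasAccess bit

        -- Reuse the common check instead of copy-pasting it.
        EXEC dbo.userAccessSP @UserId, @HasAccess OUTPUT

        IF @HasAccess = 0
        BEGIN
            RAISERROR('User does not have view access.', 16, 1)
            RETURN 1
        END

        SELECT Symbol, SecurityName FROM dbo.SecurityMaster
        RETURN 0
    END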
Jul 20 '05 #5
