> Kevin,
See inline:
SQL Server 2000 & C# 2005
Currently, if I want to execute a batch of INSERTs, I join all
the inserts together with ";" and then execute them at once in a single call.
There is no reason to do this. The SqlDataAdapter will do this
for you, and you don't have to string all of those statements
together, worrying about the upper limit (the number of parameters
that can be passed in one shot).
For more information on how to enable batch processing, see the
section of the framework documentation titled "Performing Batch
Updates with a DataAdapter", located at (watch for line wrap):
http://msdn2.microsoft.com/en-us/library/kbbwt18a.aspx
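To make that concrete, here is a minimal sketch of DataAdapter batch updates, assuming a hypothetical Orders(OrderId, Total) table and a placeholder connection string — adjust both for your schema:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class BatchInsertSketch
{
    static void Main()
    {
        string connStr = "..."; // placeholder: your connection string here
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT OrderId, Total FROM Orders", conn);

            // Explicit InsertCommand with parameters bound to source columns.
            SqlCommand insert = new SqlCommand(
                "INSERT INTO Orders (OrderId, Total) VALUES (@OrderId, @Total)", conn);
            insert.Parameters.Add("@OrderId", SqlDbType.Int, 0, "OrderId");
            insert.Parameters.Add("@Total", SqlDbType.Money, 0, "Total");
            // Required for batching: don't refresh each DataRow per statement.
            insert.UpdatedRowSource = UpdateRowSource.None;
            adapter.InsertCommand = insert;

            // Any value > 1 enables batching; 0 means "largest batch possible".
            adapter.UpdateBatchSize = 100;

            DataTable table = new DataTable("Orders");
            adapter.FillSchema(table, SchemaType.Source);
            for (int i = 0; i < 100; i++)
            {
                DataRow row = table.NewRow();
                row["OrderId"] = i;
                row["Total"] = i * 1.5m;
                table.Rows.Add(row);
            }

            // One round trip for the whole batch instead of 100 round trips.
            adapter.Update(table);
        }
    }
}
```

Note that UpdateBatchSize and UpdatedRowSource = None are the two settings the batching feature hinges on; without them the adapter falls back to one round trip per row.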
However, I am considering the use of parameterised queries. I'm
wondering if there are any major performance issues here: say I have
100 INSERTs, now I've got 100 individual Command objects.
Why will you have 100 different command objects? Unless you are
targeting 100 different tables, you should be able to factor it down
to a few queries and parameterize them.
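In other words, one parameterised command can be prepared once and executed many times with different values. A minimal sketch (the Orders table and column names are assumptions for illustration):

```csharp
using System.Data;
using System.Data.SqlClient;

class ReuseCommandSketch
{
    // Insert many rows with ONE command object, not one per row.
    static void InsertMany(SqlConnection conn, int[] ids, decimal[] totals)
    {
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO Orders (OrderId, Total) VALUES (@OrderId, @Total)", conn))
        {
            SqlParameter pId = cmd.Parameters.Add("@OrderId", SqlDbType.Int);
            SqlParameter pTotal = cmd.Parameters.Add("@Total", SqlDbType.Money);
            cmd.Prepare(); // the plan is compiled once and reused

            for (int i = 0; i < ids.Length; i++)
            {
                pId.Value = ids[i];       // only the parameter values change
                pTotal.Value = totals[i];
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```

The key point: 100 rows does not mean 100 command objects — it means one command, one prepared plan, and 100 sets of parameter values.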
Any ideas? I would've thought it's not going to hit performance
massively, but I wanted some insight into how SQL Server/.NET would
handle a transaction of around 100 commands.
Well, it depends on the nature of the work that you are doing. If
it is simply inserts, updates, and deletes that are not too complex, then
it should handle this just fine.
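For completeness, here is a sketch of running ~100 such statements under one SqlTransaction so they commit or roll back together (table and column names are illustrative assumptions):

```csharp
using System;
using System.Data.SqlClient;

class TransactionSketch
{
    static void Run(SqlConnection conn)
    {
        SqlTransaction tx = conn.BeginTransaction();
        try
        {
            for (int i = 0; i < 100; i++)
            {
                using (SqlCommand cmd = new SqlCommand(
                    "INSERT INTO Orders (OrderId) VALUES (@OrderId)", conn, tx))
                {
                    cmd.Parameters.AddWithValue("@OrderId", i);
                    cmd.ExecuteNonQuery();
                }
            }
            tx.Commit();
        }
        catch
        {
            tx.Rollback(); // none of the 100 inserts persist on failure
            throw;
        }
    }
}
```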
We are building a framework. Basically, we are using business objects that
are completely disconnected from the database, and we have a data layer that
updates the database from the business objects. There could be 100 [different]
types of business object in an array that is sent to the data layer; it then
constructs all of the SQL statements and sends the lot off to SQL Server
in one big concatenation of SQL strings.
What I want to do is convert this behaviour of constructing SQL strings to
use parameterised queries instead. The only thing is, I need to be sure that
a) it's a good idea, b) there is no performance hit, and c) it isn't going
to be more hassle than it's worth.
HTMS
Kev