
SqlCommand slow on INSERT


I have a C# program that loops through a table on a DB2 database.

On each iteration it assigns data to the values in a SqlParameter
collection. The command text is an INSERT statement against a SQL Server
database, run with .ExecuteNonQuery(). I enclosed the loop in a
SqlTransaction and commit it at the end.

I timed the program and it inserts about 70 records a second, which I
think is somewhat slow, so I set up some Debug.WriteLine calls to show
where the time was being spent.

The DataReader loop to the DB2 table is instantaneous. Almost 0s spent
getting each record. Same with assigning values.

The slow step is the actual execution of the SqlCommand. However, I
also ran a SQL trace and monitored the execution of the statement on the
server. It took 0s to execute. The SqlCommand itself is adding an
extra 0.01s to 0.03s, which adds up over the course of hundreds of
thousands of records.

So the only overhead is running .ExecuteNonQuery on the SqlCommand
object(!) Is there any way to reduce or minimize this overhead, or a
setting that can affect performance?

I mean, if my external source and target are both running at 0s, my
code shouldn't be adding overhead just to run a command!
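
Roughly, the loop looks like this (a sketch; the table, column names,
types, and connection string are placeholders, and db2Reader is the
DataReader over the DB2 source):

// Requires: using System.Data; using System.Data.SqlClient;
using (SqlConnection conn = new SqlConnection(sqlConnectionString))
{
    conn.Open();
    SqlTransaction tx = conn.BeginTransaction();

    SqlCommand cmd = new SqlCommand(
        "INSERT INTO TargetTable (Col1, Col2) VALUES (@p1, @p2)", conn, tx);
    cmd.Parameters.Add("@p1", SqlDbType.Int);
    cmd.Parameters.Add("@p2", SqlDbType.VarChar, 50);

    while (db2Reader.Read())
    {
        cmd.Parameters["@p1"].Value = db2Reader.GetInt32(0);
        cmd.Parameters["@p2"].Value = db2Reader.GetString(1);
        cmd.ExecuteNonQuery();   // one round trip per row -- the slow step
    }

    tx.Commit();
}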
Feb 24 '06
> It's the fundamental difference in the mechanism. First, each INSERT
statement is sent as text to the server, not as raw data. The SQL Server
compiler then needs to compile the INSERT statement(s) and generate a
query plan.


...Which then causes me to question: are you Prepare()-ing the INSERT
statement? I know you mentioned you were using parameters, but I think the
speed bonus does not occur if the statement is not prepared (because, as
Bill said, the INSERT statements have to be parsed each time).

At least with Oracle it makes a huge difference in speed.
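
What I have in mind is roughly this (a sketch; names and types are made
up, and conn/tx are the open SqlConnection and its active SqlTransaction).
Note that SqlCommand.Prepare() expects every parameter to have an explicit
type, and a size for variable-length types:

SqlCommand cmd = new SqlCommand(
    "INSERT INTO TargetTable (Col1, Col2) VALUES (@p1, @p2)", conn, tx);
cmd.Parameters.Add("@p1", SqlDbType.Int);           // explicit type
cmd.Parameters.Add("@p2", SqlDbType.VarChar, 50);   // explicit type and size
cmd.Prepare();   // parse and plan once on the server; executions reuse it

// ...inside the loop only the parameter values change before each
// ExecuteNonQuery() call.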
Feb 24 '06 #11

Yes, I call the .Prepare() method before entering the loop.

Gabriel Magaña wrote:
It's the fundamental difference in the mechanism. First, each INSERT
statement is sent as text to the server, not as raw data. The SQL Server
compiler then needs to compile the INSERT statement(s) and generate a
query plan.

...Which then causes me to question: are you Prepare()-ing the INSERT
statement? I know you mentioned you were using parameters, but I think the
speed bonus does not occur if the statement is not prepared (because, as
Bill said, the INSERT statements have to be parsed each time).

At least with Oracle it makes a huge difference in speed.

Feb 24 '06 #12

Shouldn't using the .Prepare() method (which I do) eliminate that overhead?

Also, I enclose the loop in a SqlTransaction.

Doesn't that mean that all the log entries are written when the
transaction is committed?

William (Bill) Vaughn wrote:
It's the fundamental difference in the mechanism. First, each INSERT
statement is sent as text to the server, not as raw data. The SQL Server
compiler then needs to compile the INSERT statement(s) and generate a query
plan. Nope, this does not take long, but it takes time. It then logs the
operation to the TL (which can't be disabled) and then to the database. At
that point the constraints are checked, the indexes are built and any RI
checks are made.

In the case of BCP, the protocol (which is proprietary and subject to
change) opens a channel, sends the meta data (once), and the server starts
an agent that simply writes the inbound data stream (binary) to the rows in
the target table. It requires very little overhead--90% of which can't be
disabled.
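
If I follow, the managed counterpart to that bulk-load interface in
.NET 2.0 is SqlBulkCopy, which could consume my DB2 DataReader directly --
something like this (the connection string and table name are placeholders):

using (SqlBulkCopy bulk = new SqlBulkCopy(sqlConnectionString))
{
    bulk.DestinationTableName = "TargetTable";
    bulk.BatchSize = 1000;           // rows sent per batch
    bulk.BulkCopyTimeout = 0;        // no timeout on a long-running load
    // Map columns by name if the source and destination names differ:
    // bulk.ColumnMappings.Add("SRC_COL", "DestCol");
    bulk.WriteToServer(db2Reader);   // streams the DB2 reader straight in
}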

Feb 24 '06 #13
> Also, I enclose the loop in a SqlTransaction.
Doesn't that mean that all the log entries are written when the
transaction is committed?


I might be mistaken about SQL Server (I'm certified as an Oracle DBA, but I
only know SQL Server superficially), but what happens in Oracle (somewhat
simplified) is that the DB itself is updated during a transaction, and the
log keeps the original data in it. That way, when you commit a transaction
there is very little cost in terms of I/O; there is only a cost if you
roll back the transaction (which is assumed to happen very few times). I
would not be surprised at all if SQL Server worked this way too, since, if
the assumption that the ratio of transaction commits to rollbacks is
extremely high holds, transactions take up very little extra overhead.

The only problem is when a transaction involves a great many operations:
the log system gets bogged down, since a very big part of the log is "vital"
and could be needed in case of a rollback. For the stuff I do normally, I
have noticed that committing transactions every couple hundred record
insertions is a good mid-point performance-wise. But that number is highly
dependent on the server hardware and also on the size of the records being
inserted. Inserting 100,000 records in a single transaction is just asking
for trouble, in other words.
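
The batched-commit idea, sketched against the loop from the original post
(the 500 here is arbitrary; conn, cmd, and db2Reader are as in that loop):

int batchSize = 500;   // tune for your hardware and row size
int rowCount = 0;
SqlTransaction tx = conn.BeginTransaction();
cmd.Transaction = tx;

while (db2Reader.Read())
{
    // ...assign parameter values from db2Reader...
    cmd.ExecuteNonQuery();

    if (++rowCount % batchSize == 0)
    {
        tx.Commit();                   // flush this batch to the log
        tx = conn.BeginTransaction();  // start the next batch
        cmd.Transaction = tx;
    }
}
tx.Commit();   // commit the final partial batch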
Feb 24 '06 #14

I guess I might start believing that the transaction log is the
additional overhead, except that I would expect that to be reflected in
the duration reported in the SQL trace.

But it isn't! The duration for each INSERT statement is 0 -- or less
than 1 millisecond! Wouldn't the time to write to the transaction log
be reflected in the duration for each INSERT?
Gabriel Magaña wrote:
Also, I enclose the loop in a SqlTransaction.
Doesn't that mean that all the log entries are written when the
transaction is committed?

I might be mistaken about SQL Server (I'm certified as an Oracle DBA, but I
only know SQL Server superficially), but what happens in Oracle (somewhat
simplified) is that the DB itself is updated during a transaction, and the
log keeps the original data in it. That way, when you commit a transaction
there is very little cost in terms of I/O; there is only a cost if you
roll back the transaction (which is assumed to happen very few times). I
would not be surprised at all if SQL Server worked this way too, since, if
the assumption that the ratio of transaction commits to rollbacks is
extremely high holds, transactions take up very little extra overhead.

The only problem is when a transaction involves a great many operations:
the log system gets bogged down, since a very big part of the log is "vital"
and could be needed in case of a rollback. For the stuff I do normally, I
have noticed that committing transactions every couple hundred record
insertions is a good mid-point performance-wise. But that number is highly
dependent on the server hardware and also on the size of the records being
inserted. Inserting 100,000 records in a single transaction is just asking
for trouble, in other words.

Feb 25 '06 #15
> I guess I might start believing that the transaction log is the additional
overhead, except that I would expect that to be reflected in the duration
reported in the SQL trace.
But it isn't! The duration for each INSERT statement is 0 -- or less
than 1 millisecond! Wouldn't the time to write to the transaction log be
reflected in the duration for each INSERT?


Writing to the transaction log at insert time is not the only I/O done on
the transaction log. There is another I/O done when you commit the
transaction, to tell the DB that the logs are OK to be recycled, since you
have just committed the transaction (and hence, cannot roll back anymore).

You might try splitting into several smaller transactions (with their own
commits, of course) to see if you get any speed improvements.

But to answer your question, I do not know exactly how SQL trace works, so I
do not know if it includes updates to the log.

Bypassing the log is dangerous though... You would not be able to roll back
(so that means no transactions either), nor would you be able to recover
from data corruption should the DB crash during the insert operation. You
may as well use DBF files instead of SQL Server; inserts to those are
lightning fast. I have done this in the past where speed is absolutely of
the essence and data loss is no big deal... But there are lots of
downsides. I guess it's the same as using goto statements: only people who
fully understand what they are doing should be messing with it!
Feb 25 '06 #16
No. You still don't understand. BCP requires no compile overhead and uses
direct IO to the DBMS instead of "logical" writes. And no, we don't usually
write to a permanent table so the Transaction Log issues are not a factor.
When we're ready to post the data to the live tables we run a SP that edits
the data and does a proper INSERT.
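
In managed-code terms that two-step approach looks roughly like this (a
sketch; the staging table, stored procedure, and connection string names
are made up, and SqlBulkCopy is the .NET 2.0 wrapper over the same
bulk-load interface):

// Requires: using System.Data; using System.Data.SqlClient;
using (SqlConnection conn = new SqlConnection(sqlConnectionString))
{
    conn.Open();

    // 1. Bulk-load the raw rows into a staging table.
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "Staging_Import";
        bulk.WriteToServer(db2Reader);
    }

    // 2. Let a stored procedure edit/validate the rows and do the real INSERT.
    using (SqlCommand post = new SqlCommand("usp_PostStagedRows", conn))
    {
        post.CommandType = CommandType.StoredProcedure;
        post.CommandTimeout = 0;   // the set-based INSERT may take a while
        post.ExecuteNonQuery();
    }
}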

--
______________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
____________________________________

"John Bailo" <ja*****@texeme .com> wrote in message
news:15******** *************** *******@speakea sy.net...

Shouldn't using a .Prepare() method (which I do) eliminate that overhead?

Also, I enclose the loop in a SqlTransaction.

Doesn't that mean that all the log entries are written when the
transaction is committed?

William (Bill) Vaughn wrote:
It's the fundamental difference in the mechanism. First, each INSERT
statement is sent as text to the server, not as raw data. The SQL Server
compiler then needs to compile the INSERT statement(s) and generate a
query plan. Nope, this does not take long, but it takes time. It then
logs the operation to the TL (which can't be disabled) and then to the
database. At that point the constraints are checked, the indexes are
built and any RI checks are made.

In the case of BCP, the protocol (which is proprietary and subject to
change) opens a channel, sends the meta data (once), and the server
starts an agent that simply writes the inbound data stream (binary) to
the rows in the target table. It requires very little overhead--90% of
which can't be disabled.


Feb 25 '06 #17

Would writing to a #temp table, or to a table variable (and then
transferring the data to a regular table with INSERT...SELECT) be faster
than regular INSERTs to a table?
William (Bill) Vaughn wrote:
No. You still don't understand. BCP requires no compile overhead and uses
direct IO to the DBMS instead of "logical" writes. And no, we don't usually
write to a permanent table so the Transaction Log issues are not a factor.
When we're ready to post the data to the live tables we run a SP that edits
the data and does a proper INSERT.


Feb 28 '06 #18
Basically, yes. We generally transfer to "temporary" tables in the database
(not to tempdb) and perform intelligent INSERT/UPDATE statements from there.
It's far faster.

--
______________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
____________________________________

"John Bailo" <ja*****@texeme .com> wrote in message
news:B6******** ************@sp eakeasy.net...

Would writing to a #temp table, or to a table variable (and then
transferring the data to a regular table with INSERT...SELECT) be faster
than regular INSERTs to a table?
William (Bill) Vaughn wrote:
No. You still don't understand. BCP requires no compile overhead and uses
direct IO to the DBMS instead of "logical" writes. And no, we don't
usually write to a permanent table so the Transaction Log issues are not
a factor. When we're ready to post the data to the live tables we run a
SP that edits the data and does a proper INSERT.

Feb 28 '06 #19

Hmmm... well, there's no reason I can't do that as an intermediate step
in my code.

Do I have to create the temporary table with the same command that I run
my INSERTs on?

Or can I just create two SqlCommands and run them on the same
SqlConnection? Will the table remain in memory if I do that?

And then I'll need to do the transfer from the temp table to the real
table after that.
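
Something like this is what I have in mind (a sketch with made-up names;
my understanding is that a #temp table lives for the life of the open
connection, so separate SqlCommands on the same SqlConnection should all
see it):

using (SqlConnection conn = new SqlConnection(sqlConnectionString))
{
    conn.Open();

    new SqlCommand("CREATE TABLE #staging (Col1 int, Col2 varchar(50))", conn)
        .ExecuteNonQuery();

    SqlCommand insert = new SqlCommand(
        "INSERT INTO #staging (Col1, Col2) VALUES (@p1, @p2)", conn);
    insert.Parameters.Add("@p1", SqlDbType.Int);
    insert.Parameters.Add("@p2", SqlDbType.VarChar, 50);
    // ...loop over the DB2 reader, assign values, ExecuteNonQuery...

    new SqlCommand(
        "INSERT INTO TargetTable (Col1, Col2) SELECT Col1, Col2 FROM #staging",
        conn).ExecuteNonQuery();
}   // the #temp table is dropped when the connection closes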

William (Bill) Vaughn wrote:
Basically, yes. We generally transfer to "temporary" tables in the database
(not to tempdb) and perform intelligent INSERT/UPDATE statements from there.
It's far faster.


Feb 28 '06 #20


