I have a C# program that loops through a table on a DB2 database.
On each iteration it assigns data to values in the SqlParameter
collection. The command text is an INSERT statement against a SQL Server
database, run with ExecuteNonQuery(). I enclosed the loop in a
SqlTransaction and commit it at the end.
I timed the program and it inserts about 70 records a second, which I
think is sort of slow, so I set up some Debug.WriteLine calls to show where
the time was being spent.
The DataReader loop to the DB2 table is instantaneous. Almost 0s spent
getting each record. Same with assigning values.
The slow step is the actual execution of the SqlCommand. However, I
also ran a SQL Trace and monitored the execution of the statement on the
server. It took 0s to execute. The SqlCommand itself is adding an
extra 0.01s to 0.03s, which can add up over the course of hundreds of
thousands of records.
So the only overhead is running ExecuteNonQuery() on the SqlCommand
object(!) Is there any way to reduce or minimize this overhead, or a
setting that can affect performance?
I mean, if my external source and target are running at 0s, my code
shouldn't be adding overhead to run a command!
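For reference, a minimal sketch of the pattern described above. Connection strings, table, and column names are placeholders, and the DB2 side is assumed to be read through ODBC (any provider with a DataReader works the same way):

```csharp
using System.Data.Odbc;
using System.Data.SqlClient;

using (OdbcConnection db2 = new OdbcConnection("...DB2 connection string..."))
using (SqlConnection sql = new SqlConnection("...SQL Server connection string..."))
{
    db2.Open();
    sql.Open();

    SqlTransaction tx = sql.BeginTransaction();
    SqlCommand insert = new SqlCommand(
        "INSERT INTO dbo.TargetTable (Col1) VALUES (@col1)", sql, tx); // hypothetical names
    insert.Parameters.Add("@col1", System.Data.SqlDbType.VarChar, 100);

    OdbcCommand select = new OdbcCommand("SELECT COL1 FROM SOURCE_TABLE", db2);
    using (OdbcDataReader reader = select.ExecuteReader())
    {
        while (reader.Read())
        {
            insert.Parameters["@col1"].Value = reader.GetString(0);
            insert.ExecuteNonQuery(); // one network round trip per row -- the slow step
        }
    }
    tx.Commit();
}
```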
I have a Model A Ford that does not climb hills that well when towing my 65'
boat. What should I do?
Ah, ADO.NET (like any of the data access interfaces) is not designed to do
bulk inserts. While version 2.0 is faster than ever (with batch mode), even
the fastest DAI INSERT loop is still orders of magnitude slower than bulk
copy. Using SqlBulkCopy or DTS I can move 500,000 rows in about 20 seconds.
--
_____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker www.betav.com/blog/billva www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
___________________________________
"John Bailo" <ja*****@texeme.com> wrote in message
news:tf********************@speakeasy.net... <snip>
> I have a Model A Ford that does not climb hills that well when towing my 65' boat. What should I do?
Put smaller tires on the Model A.
I'm not John, but thanks for the tip on SqlBulkCopy! Some of us don't get
very many chances to peek over the trenches to find the new jewels in 2.0.
-Mike
"William (Bill) Vaughn" wrote: <snip>
Ok, I'm curious.
Just why is BULK INSERT so fast? I mean, does it not go through the
SQL DBMS itself and somehow write directly to the .mdf file? What
is the mechanism it uses?
Also, running my code on a quad proc (SQL 2000, W2K) I am getting 5,000
records in 10-14s (or about 450 to 500 recs per second). Not too shabby!
William (Bill) Vaughn wrote: <snip>
One thing I found that seems to make a little difference is using the
.Prepare() method on my SqlCommand.
This seems to take a SqlCommand and turn it into a server-side prepared
statement, /on-the-fly/, for SqlCommands that need to be reused.
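A minimal sketch of the pattern, with hypothetical table, column, and connection-string placeholders. Note that Prepare() requires every variable-length parameter to have an explicit size set before the call:

```csharp
using System.Data;
using System.Data.SqlClient;

// Stand-in data for the rows coming from the DB2 reader.
string[] names = { "Alpha", "Beta", "Gamma" };

using (SqlConnection conn = new SqlConnection("...connection string..."))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand(
        "INSERT INTO dbo.TargetTable (Name) VALUES (@name)", conn); // hypothetical table

    // Variable-length parameters need an explicit size or Prepare() throws.
    cmd.Parameters.Add("@name", SqlDbType.VarChar, 50);
    cmd.Prepare(); // the plan is compiled once on the server

    foreach (string name in names)
    {
        cmd.Parameters["@name"].Value = name;
        cmd.ExecuteNonQuery(); // each call reuses the prepared plan
    }
}
```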
John Bailo wrote: <snip>
John Bailo wrote: Ok, I'm curious.
Just why is BULK INSERT so fast? I mean, does it not go through the SQL DBMS itself and somehow write directly to the .mdf file? What is the mechanism it uses?
Well, I'm not a SQL Server expert or anything, but I believe bulk
inserts bypass the transaction log. That probably accounts for some of
the difference anyway.
Also, running my code on a quad proc (SQL 2000, W2K) I am getting 5,000 records in 10-14s ( or about 450 to 500 recs per second ). Not too shabby!
Too bad I can't programmatically disable the transaction log or request
SQL not to log my inserts...
Brian Gideon wrote: <snip>
The bulk copy interface and functionality have been in place since the Sybase
versions... a long time. In the Model A days we had to use the BCP utility,
but in SS7 and later we could use one of the SQL Server management "object"
libraries (SQLDMO, SQLSMO) to call bulk copy functionality. In SS 2000 (or
earlier?) it appeared as T-SQL functions too.
Yes, bulk copy turns off the TL. It assumes that you're copying to a temp
table (or a permanent work table). Generally, these tables don't have
indexes or other constraints to slow down the import process. BCP also uses
special TDS stream packets to move the data. It's THE way to go when moving
data from/to the server. DTS leverages this technology by permitting you to
set up a scripted BCP operation that can transform (edit) the data as it's
moved from any data source with a provider or driver (.NET, OLE DB, ODBC,
text, tight-string-with-two-cans).
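The T-SQL route Bill mentions can be driven from C# as well. A sketch, with a hypothetical table and data file; note that BULK INSERT runs on the server, so the file path is resolved on the SQL Server machine, not the client:

```csharp
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection("...connection string..."))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand(
        @"BULK INSERT dbo.TargetTable
          FROM 'C:\loads\extract.dat'
          WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', TABLOCK)", conn);
    cmd.CommandTimeout = 0;   // large loads can exceed the default 30s
    cmd.ExecuteNonQuery();    // the server reads the file and loads the rows
}
```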
hth
"John Bailo" <ja*****@texeme.com> wrote in message
news:mv********************@speakeasy.net... <snip>
The thing is, that answer really doesn't address my question.
As I mentioned, when I run the Trace utility, the duration of each
insert command is 0s.
The time goes to the wait before the SqlCommand executes and after
it returns.
So, to me, if I were to string together a bunch of INSERT statements
that represent the changing parameters of the INSERT, it should run
lightning fast.
My question still stands: why is there so much overhead in the
SqlCommand object? Or is it the transmission time to send the
command over the network? Or the conversion to the TDS protocol?
If either of those factors could be reduced, then I could send my
INSERTs through almost instantly.
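One way to test that round-trip theory (a sketch, with hypothetical table and column names) is to concatenate several parameterized INSERTs into one CommandText, so many rows travel in a single round trip:

```csharp
using System.Data.SqlClient;
using System.Text;

// Build one command containing N parameterized INSERTs -- one network
// round trip instead of N. Table/column names and data are stand-ins.
const int batchSize = 100;
StringBuilder sql = new StringBuilder();

using (SqlConnection conn = new SqlConnection("...connection string..."))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand();
    cmd.Connection = conn;

    for (int i = 0; i < batchSize; i++)
    {
        sql.AppendFormat("INSERT INTO dbo.TargetTable (Name) VALUES (@p{0});", i);
        cmd.Parameters.AddWithValue("@p" + i, "row" + i); // stand-in row data
    }

    cmd.CommandText = sql.ToString();
    cmd.ExecuteNonQuery(); // all rows inserted in a single round trip
}
```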
William (Bill) Vaughn wrote: <snip>
It's the fundamental difference in the mechanism. First, each INSERT
statement is sent as text to the server, not as raw data. The SQL Server
compiler then needs to compile the INSERT statement(s) and generate a query
plan. Nope, this does not take long, but it takes time. It then logs the
operation to the TL (which can't be disabled) and then writes to the database.
At that point the constraints are checked, the indexes are updated and any RI
checks are made.
In the case of BCP, the protocol (which is proprietary and subject to
change) opens a channel, sends the metadata (once), and the server starts
an agent that simply writes the inbound data stream (binary) to the rows in
the target table. It skips most of the overhead described above--90% of
which can't be disabled on the normal INSERT path.
"John Bailo" <ja*****@texeme.com> wrote in message
news:Z4**********************@speakeasy.net... <snip>