Bytes | Software Development & Data Engineering Community

bulk insert in C# (?)

How can I use ADO.NET to insert lots of records in a very fast way?

Thanks!
Nov 16 '05 #1
10 replies · 31,668 views
Not sure what your exact problem is; however, here is a general solution:

You can use the BULK INSERT SQL statement to insert large volumes of records from a text file. To call BULK INSERT from ADO.NET, you can use a SqlCommand object and execute it like any other T-SQL statement.

Example:
----------
private void Test()
{
    SqlConnection conn = null;

    try
    {
        conn = new SqlConnection("Data Source=MYSERVER;Initial Catalog=Northwind;Integrated Security=SSPI");

        SqlCommand command = new SqlCommand();

        conn.Open();

        command.Connection = conn;
        // Note the trailing spaces on each fragment: without them the
        // concatenated pieces run together into invalid SQL. The verbatim
        // strings (@"...") keep '\n' as the two characters BULK INSERT expects.
        command.CommandText =
            "BULK INSERT Northwind.dbo.[Order Details] " +
            @"FROM 'f:\orders\lineitem.tbl' " +
            "WITH " +
            "(" +
            "FIELDTERMINATOR = '|', " +
            @"ROWTERMINATOR = ':\n', " +
            "FIRE_TRIGGERS" +
            ")";

        command.ExecuteNonQuery();
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
    finally
    {
        if (conn != null)
            conn.Close();
    }
}

"Daniel P." wrote:
How can I use ADO.NET to insert lots of records in a very fast way?

Thanks!

Nov 16 '05 #2
You can also use the .Update method of a DataAdapter object to load a DataTable into a database table.

http://msdn.microsoft.com/library/de...pdatetopic.asp
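A minimal sketch of that approach, assuming a hypothetical LineItems table (the SqlCommandBuilder generates the INSERT command from the SELECT):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch only: connString and the LineItems table are illustrative.
static void LoadTable(string connString, DataTable newRows)
{
    using (SqlConnection conn = new SqlConnection(connString))
    {
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT OrderId, Product FROM LineItems", conn);
        SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

        // Update() walks the rows with RowState == Added and issues
        // one INSERT per row.
        adapter.Update(newRows);
    }
}
```

Keep in mind each added row still costs one round trip to the server.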

"Daniel P." wrote:
How can I use ADO.NET to insert lots of records in a very fast way?

Thanks!

Nov 16 '05 #3
Lenn,

This method really isn't too fast for inserting a large number of records.
The following comes from the article you cite:

"It should be noted that these statements are not performed as a batch
process; each row is updated individually."

Another possible way of doing batch updates is to batch them yourself. IOW,
you can parse the file to create INSERT statements, concatenate many (100's
or possibly 1000's) of these statements together by placing a semicolon
between each statement, and then execute the concatenated statement. This is
a really fast way of inserting records in a batch since it greatly reduces
the number of trips to the DB. If an error occurs, though, it's almost
impossible to isolate which record caused the problem.
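A sketch of that idea against a hypothetical LineItems table (note that building SQL by string concatenation offers none of the escaping safety of parameterized commands, so the escaping below is deliberately crude):

```csharp
using System.Data.SqlClient;
using System.Text;

// Sketch: concatenate many INSERTs and send them in one round trip.
static void InsertBatch(string connString, string[][] rows)
{
    StringBuilder batch = new StringBuilder();
    foreach (string[] fields in rows)
    {
        batch.AppendFormat(
            "INSERT INTO LineItems (OrderId, Product) VALUES ({0}, '{1}');",
            int.Parse(fields[0]),          // force the numeric field to parse
            fields[1].Replace("'", "''")); // crude single-quote escaping
    }

    using (SqlConnection conn = new SqlConnection(connString))
    {
        conn.Open();
        // One ExecuteNonQuery for the whole batch instead of one per row.
        new SqlCommand(batch.ToString(), conn).ExecuteNonQuery();
    }
}
```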

Good luck.

Scott
Also you can use .Update method of dataAdapter object to load datatable to a db table.

http://msdn.microsoft.com/library/de...pdatetopic.asp
Nov 16 '05 #4
Probably won't help you today, but in ADO.NET 2.0 (Whidbey) they are
supposed to be adding bulk operations (so you would just be able to use
the data adapter, but set a batch-size property). I saw a sample once of
the actual SQL that was generated. It was very similar to what you
describe, Lenn. I think they separated each insert with a GO statement.
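In the Whidbey previews that surfaces as an UpdateBatchSize property on the data adapter; a rough sketch (property name and behavior are from the beta, so details may change):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch only: table and column names are illustrative.
static void BatchedUpdate(string connString, DataTable table)
{
    using (SqlConnection conn = new SqlConnection(connString))
    {
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT OrderId, Product FROM LineItems", conn);
        SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

        adapter.UpdateBatchSize = 100; // group up to 100 inserts per round trip
        adapter.Update(table);         // instead of one trip per row
    }
}
```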
"Scott" <me@me.com> wrote in message
news:ud**************@TK2MSFTNGP09.phx.gbl...
Lenn,

This method really isn't too fast for inserting a large number of records.
<snip>

Nov 16 '05 #5
Quick correction: It was Scott who described the SQL. Sorry.
"J.Marsch" <je****@ctcdeveloper.com> wrote in message
news:eP**************@TK2MSFTNGP11.phx.gbl...
Probably won't help you today, but in ADO.Net 2.0 (Whidbey), they are
supposed to be adding bulk operations. (so you would just be able to use
the data adapter, but set a batch size property). I saw a sample once of
the actual SQL that was generated. It was very similar to what you
describe, Lenn. I think that they separated each insert with a GO
statement.


Nov 16 '05 #6
I thought about that, but from what I know BULK INSERT needs the file to be
on the SQL Server machine, or accessible to it on a network share. This
cannot happen in my environment.

"SoKool" <So****@discussions.microsoft.com> wrote in message
news:05**********************************@microsof t.com...
Not sure what your exact problem is, however here is a general solution:
<snip>

Nov 16 '05 #7
I decided to use an INSERT statement prepared with @parameter names inside,
then set the params and call ExecuteNonQuery for each record. It is not as
fast as the previous implementation in C++ and DB-Library, but it works fine
for the time being.
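That approach looks roughly like the following (table and column names are made up for illustration):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: prepare one parameterized INSERT, execute it per record.
static void InsertRows(string connString, string[][] rows)
{
    using (SqlConnection conn = new SqlConnection(connString))
    {
        conn.Open();
        SqlCommand cmd = new SqlCommand(
            "INSERT INTO LineItems (OrderId, Product) VALUES (@orderId, @product)",
            conn);
        cmd.Parameters.Add("@orderId", SqlDbType.Int);
        cmd.Parameters.Add("@product", SqlDbType.NVarChar, 100);
        cmd.Prepare(); // compile the statement once, reuse it for every row

        foreach (string[] fields in rows)
        {
            cmd.Parameters["@orderId"].Value = int.Parse(fields[0]);
            cmd.Parameters["@product"].Value = fields[1];
            cmd.ExecuteNonQuery(); // still one round trip per record
        }
    }
}
```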

Thanks!

"Scott" <me@me.com> wrote in message
news:ud**************@TK2MSFTNGP09.phx.gbl...
Lenn,

This method really isn't too fast for inserting a large number of records.
<snip>

Nov 16 '05 #8
Might be too late to be of help... and I'm not sure if you're using SQL
Server... but:

In the past, I have inserted many records into one or more tables with a
single call by creating a sproc capable of processing an XML input
parameter (text or ntext data type).

I found this technique useful, too, when having to insert into or update a
table with many fields of data. Instead of having to deal with all of the
fields by hand, I could use a loop in the C# form to gather up all the data
field names and values, build an XML document, and feed it to the sproc.

If done right, you don't have to update the sproc each time you add or
change fields in the table.
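A sketch of the client side of that technique; the sproc name, XML shape, and columns are all hypothetical, and on SQL Server 2000 the sproc body would typically shred the parameter with sp_xml_preparedocument/OPENXML:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Security; // SecurityElement.Escape for attribute values
using System.Text;

// Sketch: gather all rows into one XML document and hand it to a sproc.
static void InsertViaXml(string connString, string[][] rows)
{
    StringBuilder xml = new StringBuilder("<rows>");
    foreach (string[] fields in rows)
    {
        xml.AppendFormat("<row orderId=\"{0}\" product=\"{1}\"/>",
            fields[0], SecurityElement.Escape(fields[1]));
    }
    xml.Append("</rows>");

    using (SqlConnection conn = new SqlConnection(connString))
    {
        conn.Open();
        SqlCommand cmd = new SqlCommand("dbo.ImportLineItems", conn);
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@xml", SqlDbType.NText).Value = xml.ToString();
        cmd.ExecuteNonQuery(); // the sproc does all the INSERTs in one call
    }
}
```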

Nov 16 '05 #9
This is a great suggestion -- thanks!

Scott
In the past, I have inserted many records into one to many tables with a
single call by creating a sproc capable of processing a XML input
parameter (text or ntext data type).

Nov 16 '05 #10
Cool idea! I may use it.

Thanks Kenneth!
"Kenneth Courville" <krcourville@-nospam-msn.com> wrote in message
news:eP**************@TK2MSFTNGP09.phx.gbl...
Might be too late to be of help... and I'm not sure if you're using SQL
Server...but..
<snip>

Nov 16 '05 #11

This thread has been closed and replies have been disabled. Please start a new discussion.
