
bulk insert in C# (?)

How can I use ADO.NET to insert lots of records in a very fast way?

Thanks!
Nov 16 '05 #1


Not sure what your exact problem is; however, here is a general solution:

You can use the BULK INSERT T-SQL statement to insert large volumes of records from a text file. To call BULK INSERT from ADO.NET, you can use a SqlCommand object to execute it like any other statement.

Example:
----------
private void Test()
{
    try
    {
        using (SqlConnection conn = new SqlConnection(
            "Data Source=MYSERVER;Initial Catalog=Northwind;Integrated Security=SSPI"))
        {
            conn.Open();

            SqlCommand command = conn.CreateCommand();

            // Note the trailing spaces: without them the concatenated
            // fragments run together into invalid T-SQL.
            command.CommandText =
                "BULK INSERT Northwind.dbo.[Order Details] " +
                @"FROM 'f:\orders\lineitem.tbl' " +
                "WITH " +
                "( " +
                "FIELDTERMINATOR = '|', " +
                @"ROWTERMINATOR = ':\n', " +
                "FIRE_TRIGGERS " +
                ")";

            command.ExecuteNonQuery();
        }
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
}

"Daniel P." wrote:
How can I use ADO.NET to insert lots of records in a very fast way?

Thanks!

Nov 16 '05 #2

You can also use the Update method of a DataAdapter object to load a DataTable into a database table.

http://msdn.microsoft.com/library/de...pdatetopic.asp
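For illustration, a minimal sketch of that approach, assuming the rows to insert are already sitting in a DataTable; the table name and connection string here are made up, not from the original post:

// Requires System.Data and System.Data.SqlClient.
private static void SaveWithAdapter(DataTable newRows)
{
    string connStr =
        "Data Source=MYSERVER;Initial Catalog=Northwind;Integrated Security=SSPI";

    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlDataAdapter adapter =
        new SqlDataAdapter("SELECT * FROM dbo.TargetTable", conn))
    {
        // Let the command builder derive the INSERT statement from the SELECT.
        SqlCommandBuilder builder = new SqlCommandBuilder(adapter);
        adapter.InsertCommand = builder.GetInsertCommand();

        // Update issues one INSERT per row that is in the Added state.
        adapter.Update(newRows);
    }
}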

"Daniel P." wrote:
How can I use ADO.NET to insert lots of records in a very fast way?

Thanks!

Nov 16 '05 #3

Lenn,

This method really isn't too fast for inserting a large number of records.
The following comes from the article you cite:

"It should be noted that these statements are not performed as a batch
process; each row is updated individually."

Another possible way of doing batch updates is to batch them yourself. IOW,
you can parse the file to create INSERT statements. You can concatenate many
(hundreds or possibly thousands) of these statements by placing a semicolon
between each one and then execute the concatenated batch. This is a really
fast way of inserting records in a batch, since it greatly reduces the number
of round trips to the DB. If an error occurs, though, it's almost impossible
to isolate which record caused the problem.
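For example, a rough sketch of that idea; the file layout, target table and column names are invented for illustration, and real code should escape the values (or switch to parameters) to avoid quoting and injection problems:

// Requires System.Text, System.IO and System.Data.SqlClient.
// The connection is assumed to be open already.
private static void BatchInsert(string path, SqlConnection conn)
{
    StringBuilder batch = new StringBuilder();
    int count = 0;

    foreach (string line in File.ReadAllLines(path))
    {
        string[] fields = line.Split('|');
        batch.AppendFormat(
            "INSERT INTO dbo.LineItems (OrderId, ProductId, Qty) " +
            "VALUES ({0}, {1}, {2});",
            int.Parse(fields[0]), int.Parse(fields[1]), int.Parse(fields[2]));

        // Ship the batch every 500 statements so it stays a manageable size.
        if (++count % 500 == 0)
        {
            using (SqlCommand cmd = new SqlCommand(batch.ToString(), conn))
                cmd.ExecuteNonQuery();
            batch.Length = 0;
        }
    }

    // Send whatever is left over.
    if (batch.Length > 0)
    {
        using (SqlCommand cmd = new SqlCommand(batch.ToString(), conn))
            cmd.ExecuteNonQuery();
    }
}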

Good luck.

Scott
Also you can use .Update method of dataAdapter object to load datatable to a db table.

http://msdn.microsoft.com/library/de...pdatetopic.asp
Nov 16 '05 #4

Probably won't help you today, but in ADO.NET 2.0 (Whidbey) they are
supposed to be adding bulk operations, so you would just be able to use the
data adapter and set a batch-size property. I saw a sample once of the actual
SQL that was generated. It was very similar to what you describe, Lenn. I
think they separated each insert with a GO statement.
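For what it's worth, a sketch of how that is shaping up in the Whidbey previews; the UpdateBatchSize property and the UpdatedRowSource requirement are taken from the ADO.NET 2.0 preview material, and the table name is made up, so treat the details as provisional:

// Requires System.Data and System.Data.SqlClient (ADO.NET 2.0).
private static void BatchedAdapterUpdate(DataTable newRows, string connStr)
{
    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlDataAdapter adapter =
        new SqlDataAdapter("SELECT * FROM dbo.TargetTable", conn))
    {
        SqlCommandBuilder builder = new SqlCommandBuilder(adapter);
        adapter.InsertCommand = builder.GetInsertCommand();

        // Batched commands must not try to fetch results back per row.
        adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.None;

        // Group up to 100 inserts into a single round trip.
        adapter.UpdateBatchSize = 100;

        adapter.Update(newRows);
    }
}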
"Scott" <me@me.com> wrote in message
news:ud**************@TK2MSFTNGP09.phx.gbl...

Nov 16 '05 #5

Quick correction: It was Scott who described the SQL. Sorry.
"J.Marsch" <je****@ctcdeveloper.com> wrote in message
news:eP**************@TK2MSFTNGP11.phx.gbl...
Nov 16 '05 #6

I thought about that, but from what I know BULK INSERT needs the file to be
on the SQL Server machine or accessible to it on a network share. This cannot
happen in my environment.

"SoKool" <So****@discussions.microsoft.com> wrote in message
news:05**********************************@microsoft.com...

Nov 16 '05 #7

I decided to use an INSERT statement prepared with @parameter names inside,
then set the parameters and call ExecuteNonQuery for each record. It is not as
fast as the previous implementation in C++ and DbLib, but it works fine for
the time being.
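Roughly, the pattern looks like this (the table and column names here are just placeholders for illustration):

// Requires System.Data, System.Data.SqlClient and System.Collections.Generic.
// The connection is assumed to be open already.
private static void InsertRows(SqlConnection conn, IEnumerable<string[]> rows)
{
    using (SqlCommand cmd = new SqlCommand(
        "INSERT INTO dbo.LineItems (OrderId, ProductId, Qty) " +
        "VALUES (@OrderId, @ProductId, @Qty)", conn))
    {
        cmd.Parameters.Add("@OrderId", SqlDbType.Int);
        cmd.Parameters.Add("@ProductId", SqlDbType.Int);
        cmd.Parameters.Add("@Qty", SqlDbType.Int);
        cmd.Prepare(); // prepare once, execute once per record

        foreach (string[] fields in rows)
        {
            cmd.Parameters["@OrderId"].Value = int.Parse(fields[0]);
            cmd.Parameters["@ProductId"].Value = int.Parse(fields[1]);
            cmd.Parameters["@Qty"].Value = int.Parse(fields[2]);
            cmd.ExecuteNonQuery();
        }
    }
}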

Thanks!

"Scott" <me@me.com> wrote in message
news:ud**************@TK2MSFTNGP09.phx.gbl...

Nov 16 '05 #8

Might be too late to be of help... and I'm not sure if you're using SQL
Server...but..

In the past, I have inserted many records into one to many tables with a
single call by creating a sproc capable of processing an XML input
parameter (text or ntext data type).

I found this technique useful, too, when having to insert into or update a
table with many fields of data. Instead of having to deal with all of the
fields, I could use a loop in the C# form to gather up all the data
field names and values, build an XML document and feed it to a sproc.

If done right, you don't have to update the sproc each time you add or
change fields in the table.
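A hedged sketch of what the client side might look like; the sproc name, XML shape and column handling are all assumptions (on SQL Server 2000 the sproc would typically shred the document with sp_xml_preparedocument and OPENXML):

// Requires System.Text, System.Data and System.Data.SqlClient.
// The connection is assumed to be open already.
private static void InsertViaXml(SqlConnection conn, DataTable rows)
{
    // One attribute per column, so the sproc does not need to change when
    // columns are added. Real code should XML-escape the values (e.g. with
    // XmlTextWriter) instead of raw string concatenation.
    StringBuilder xml = new StringBuilder("<rows>");
    foreach (DataRow row in rows.Rows)
    {
        xml.Append("<row");
        foreach (DataColumn col in rows.Columns)
        {
            xml.AppendFormat(" {0}=\"{1}\"", col.ColumnName, row[col]);
        }
        xml.Append("/>");
    }
    xml.Append("</rows>");

    using (SqlCommand cmd = new SqlCommand("dbo.InsertRowsFromXml", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@xml", SqlDbType.NText).Value = xml.ToString();
        cmd.ExecuteNonQuery();
    }
}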

Nov 16 '05 #9

This is a great suggestion -- thanks!

Scott
In the past, I have inserted many records into one to many tables with a
single call by creating a sproc capable of processing a XML input
parameter (text or ntext data type).

Nov 16 '05 #10

Cool idea! I may use it.

Thanks Kenneth!
"Kenneth Courville" <krcourville@-nospam-msn.com> wrote in message
news:eP**************@TK2MSFTNGP09.phx.gbl...

Nov 16 '05 #11
