moonriver wrote:
Right now I'm developing an application that retrieves over 30,000 records from a binary file and then loads them into a SQL Server DB. So far I load those records one by one, but the performance is very poor: it takes more than 6 hours to finish the loading.
Could I make use of the bulk-insert mechanism of SQL Server from C# code to reduce the loading time? Or are there other ways to improve performance?
Hi moonriver,
As long as ADO.NET doesn't provide an equivalent to IRowsetFastLoad
(OLEDB), you'll have to use BULK INSERT, bcp or DTS.
This article might help you decide which tool to use:
http://msdn.microsoft.com/library/en...t_bcp_67oh.asp
And this page
http://longhorn.msdn.microsoft.com/l...sqlclient.aspx
shows that bulk-insert functionality might be built into .NET 2.0.
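For the curious, here is a rough sketch of what that upcoming API looks like in the .NET 2.0 previews. This assumes the System.Data.SqlClient.SqlBulkCopy class as shown there; the table name "myData" and the DataTable variable are placeholders, and details may change before release:

```csharp
using System.Data;
using System.Data.SqlClient;

// table is a DataTable you filled while parsing the binary file;
// connectionString and "myData" are hypothetical placeholders.
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "myData";
        bulk.BatchSize = 1000;        // commit in batches of 1000 rows
        bulk.WriteToServer(table);    // streams all rows to the server
    }
}
```

The nice part is that you never have to write the data out to a text file first, which the BULK INSERT and bcp approaches below both require.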
[BULK INSERT]
Reference of the BULK INSERT-Command:
http://msdn.microsoft.com/library/?u...ba-bz_4fec.asp
This should be easy to use in C#, something like:
SqlCommand cmd = new SqlCommand();
cmd.Connection = connection; // an open SqlConnection
cmd.CommandText = "BULK INSERT myData\nFROM '"
+ fileName + "'\nWITH (BATCHSIZE="
+ sqlCount.ToString()
+ ", FIELDTERMINATOR=';')";
//Console.WriteLine(cmd.CommandText);
cmd.ExecuteNonQuery();
[bcp]
bcp can be started from C# with System.Diagnostics.Process.Start.
Reference:
http://msdn.microsoft.com/library/?u...p_bcp_61et.asp
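To illustrate, a minimal sketch of shelling out to bcp from C#; the database, table, server name and switches here are placeholder assumptions (bcp.exe must be reachable on the PATH, -c is character mode, -t; sets the field terminator, -T uses a trusted connection):

```csharp
using System;
using System.Diagnostics;

// fileName is the data file you exported; everything after "in" is
// hypothetical and must match your own server and table.
ProcessStartInfo psi = new ProcessStartInfo("bcp",
    "myDB..myData in \"" + fileName + "\" -c -t; -S myServer -T");
psi.UseShellExecute = false;

Process p = Process.Start(psi);
p.WaitForExit();                    // block until the load finishes
if (p.ExitCode != 0)
    Console.WriteLine("bcp failed with exit code " + p.ExitCode);
```

Checking the exit code matters, because bcp reports load errors that way rather than through an exception.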
[DTS]
AFAIK there is no DTS object library (class) for .NET, so one solution
is to use the DTS objects through COM interop.
An Example is here:
http://sqldev.net/dts/ExecutePackage.htm
Another option would be to create a job with the DTS package as Step 1.
Then you could use sp_start_job through a stored procedure that the
application executes through ADO.NET's command object.
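Calling it directly from ADO.NET is also only a few lines; a sketch, assuming a SQL Server Agent job named "LoadMyData" (the job name and connection string are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    // msdb.dbo.sp_start_job kicks off the Agent job that runs the DTS package
    SqlCommand cmd = new SqlCommand("msdb.dbo.sp_start_job", conn);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@job_name", "LoadMyData");
    cmd.ExecuteNonQuery();
}
```

Keep in mind that sp_start_job returns as soon as the job has started, not when it has finished, so the load runs asynchronously from the application's point of view.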
HTH!
Cheers
Arne Janning