
Most efficient way to insert 1,000,000 records?

Hi,

I need to insert more than 500,000 records at the end of the day in a C#
application, and I need to finish as quickly as possible.

I created a stored procedure and call it from ADO.NET to insert the rows one
by one. It is fairly slow (it seems slower than using a DTS package to import
from a file).

Just a general question: in ADO.NET, what is the MOST efficient way to do this
work? I normally do it as described above. I am using .NET Framework 1.1 and
SQL Server 2000.

Thanks a lot

Chris
Jan 4 '06 #1
Chris,

I would send the parameters over in batches... Unfortunately, ADO.NET
doesn't get command batching for SQL Server until .NET 2.0.

Because of this, I would try to use DTS. What is the format of the
file to begin with? You might get much faster results that way.

Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:20******** *************** ***********@mic rosoft.com...
Hi,

I need to insert more than 500,000 records at the end of the day in a C#
application. I need to finish it as soon as possible.

I created a stored procedure and called it from ADO to insert one by one.
It
is kind of slow (seems slower than using a DTS package to import from a
file).

Just a general question, in ADO, what will be the MOST efficient way to do
this work. I normally do it as I described. I am using .NET framework 1.1
and SQL Server 2000.

Thanks a lot

Chris

Jan 4 '06 #2

"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:20******** *************** ***********@mic rosoft.com...
Hi,

I need to insert more than 500,000 records at the end of the day in a C#
application. I need to finish it as soon as possible.

I created a stored procedure and called it from ADO to insert one by one.
It
is kind of slow (seems slower than using a DTS package to import from a
file).

Just a general question, in ADO, what will be the MOST efficient way to do
this work. I normally do it as I described. I am using .NET framework 1.1
and SQL Server 2000.

Thanks a lot

Chris


The most efficient and fastest way is to use SQL Server's bcp utility; you can
also use T-SQL BULK INSERT, but anything else will be much slower.
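
As a minimal sketch only (the server, database, table and file path below are
placeholders, the account needs bulk-load rights, and the file path is read by
the SQL Server service, not by the client), a BULK INSERT can be issued from
ADO.NET 1.1 over an ordinary connection:

using System;
using System.Data.SqlClient;

class BulkInsertExample
{
    static void Main()
    {
        // T-SQL BULK INSERT; the file must be reachable from the SQL Server machine.
        string sql =
            "BULK INSERT MyDb.dbo.DailyData " +
            "FROM 'C:\\data\\01_04_2006_Data.csv' " +
            "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', BATCHSIZE = 10000)";

        using (SqlConnection conn = new SqlConnection(
            "server=MYSERVER;database=MyDb;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0; // a large load can easily exceed the default 30 seconds
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}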

Willy.
Jan 4 '06 #3
Hi,
"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:20******** *************** ***********@mic rosoft.com...
Hi,

I need to insert more than 500,000 records at the end of the day in a C#
application. I need to finish it as soon as possible.
In a C# app or in a DB used by a C# app?

> Just a general question: in ADO.NET, what is the MOST efficient way to do
> this work? I normally do it as described above. I am using .NET Framework 1.1
> and SQL Server 2000.


In short, none; ADO.NET is not intended for this. You have two options: use
bulk copy (bcp.exe), which comes with SQL Server, or, as Paldino suggested
(this is also my recommended solution), use DTS. DTS can be scripted from C#,
so you will have the flexibility to change it as needed.
Just create a DTS package from Enterprise Manager, save the package to disk,
and later you can load and execute it.

Let me know if you need code; I do this in a couple of deployments.
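
As a rough illustration only (this assumes a COM reference to the Microsoft
DTSPackage Object Library, the package name and file path are made up, and the
exact interop types and member signatures may vary in your project), loading
and running a saved package from C# looks something like this:

using System;
using DTS; // interop assembly generated from the DTSPackage Object Library (dtspkg.dll)

class RunDtsPackage
{
    static void Main()
    {
        Package2Class package = new Package2Class();
        try
        {
            // Load a package that was saved from Enterprise Manager
            // as a structured storage file on disk.
            package.LoadFromStorageFile(
                @"C:\packages\ImportDaily.dts", // file the package was saved to
                null,                           // password, if any
                null, null,                     // package / version GUIDs (null = latest)
                "ImportDaily",                  // package name inside the file
                null);

            package.Execute();
        }
        finally
        {
            package.UnInitialize();
        }
    }
}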

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation
Jan 4 '06 #4
Thanks. I did not know that I could script DTS from C#. Can you tell me how
to do that?
One thing I often need is to import a data file with a customized name, like
mm_dd_yy_file.csv. I cannot figure out how to do that in Enterprise Manager.
If I write a C# app to do this job, do you think I can convert the app to a
DTS script?

Chris

"Ignacio Machin ( .NET/ C# MVP )" wrote:
Hi,
"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:20******** *************** ***********@mic rosoft.com...
Hi,

I need to insert more than 500,000 records at the end of the day in a C#
application. I need to finish it as soon as possible.


In a C# app or in a DB used by a C# app?
I created a stored procedure and called it from ADO to insert one by one.
It
is kind of slow (seems slower than using a DTS package to import from a
file).

Just a general question, in ADO, what will be the MOST efficient way to do
this work. I normally do it as I described. I am using .NET framework 1.1
and SQL Server 2000.


In short, none, ADO.NET is not intented for this. You have two options, use
bulk copy (bcp.exe ) that comes with SQL server or as Paldino suggested
(that is also my recommended solution) using DTS, DTS can be scripted from
C# so you will have flexibility to change it as needed.
Just create a DTS from enterprise manager, select write package to disk and
later you can load and execute it.

Let me know if you need code, I do this in a couple of deployments.

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation

Jan 4 '06 #5

I am still wondering whether I should ask the firm to upgrade our
development system from .NET 1.1 to .NET 2.0. It is costly, considering the
licences for Visual Studio and third-party packages.

In .NET 2.0 ADO.NET, can I gain much better performance using "command
batching"? What exactly does command batching mean in .NET 2.0? Could you
give me an example?

Thanks

Chris
"Nicholas Paldino [.NET/C# MVP]" wrote:
Chris,

I would send the parameters over in batches... Unfortunately, ADO.NET
doesn't get command batching for SQL Server until .NET 2.0.

Because of this, I would try and use DTS. What is the format of the
file to begin with? You might have much faster results doing that...

Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard. caspershouse.co m

"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:20******** *************** ***********@mic rosoft.com...
Hi,

I need to insert more than 500,000 records at the end of the day in a C#
application. I need to finish it as soon as possible.

I created a stored procedure and called it from ADO to insert one by one.
It
is kind of slow (seems slower than using a DTS package to import from a
file).

Just a general question, in ADO, what will be the MOST efficient way to do
this work. I normally do it as I described. I am using .NET framework 1.1
and SQL Server 2000.

Thanks a lot

Chris


Jan 4 '06 #6
Chris,

    If you use command batching, then internally the SqlCommand will create
a multiple-statement command, with the appropriate parameters, and pass that
to SQL Server. I think the maximum number of parameters that can be handled
this way is 2100.

    The benefit of this is that it causes fewer round trips between the
server and your code, which always improves performance.

    However, if your data is already in a machine-readable file, chances are
Willy's suggestion to use the BULK INSERT command would work. Either that, or
use DTS.
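
    On .NET 1.1 you can approximate batching by hand: concatenate a group of
parameterized INSERT statements into a single SqlCommand so that one
ExecuteNonQuery covers many rows. A rough sketch only; the table and column
names (dbo.DailyData, Name, Value) are placeholders and the connection is
assumed to be open:

using System;
using System.Data.SqlClient;
using System.Text;

class ManualBatchInsert
{
    // Sends the rows in groups, one multi-statement command per round trip.
    // Keep (rows per batch) x (parameters per row) under the ~2100 parameter limit.
    static void InsertAll(SqlConnection conn, string[][] rows)
    {
        const int batchSize = 500; // 2 parameters per row -> 1000 parameters per batch

        for (int start = 0; start < rows.Length; start += batchSize)
        {
            int count = Math.Min(batchSize, rows.Length - start);

            StringBuilder sql = new StringBuilder();
            SqlCommand cmd = new SqlCommand();
            cmd.Connection = conn;

            for (int i = 0; i < count; i++)
            {
                sql.Append("INSERT INTO dbo.DailyData (Name, Value) VALUES (@n" + i + ", @v" + i + ");");
                cmd.Parameters.Add("@n" + i, rows[start + i][0]);
                cmd.Parameters.Add("@v" + i, rows[start + i][1]);
            }

            cmd.CommandText = sql.ToString();
            cmd.ExecuteNonQuery(); // one round trip for the whole batch
        }
    }
}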
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:E8******** *************** ***********@mic rosoft.com...

I am still wondering whether I should ask the firm to convert our
development system from .NET 1.1 to .NET 2.0. It is costly considering the
licences related to studio and third party packages.

In .NET 2.0 ADO, can I gain much better performance using "command
batching"? WHat does it exactly mean of command batching in .NET 2? COuld
you give me an example?

Thanks

Chris
"Nicholas Paldino [.NET/C# MVP]" wrote:
Chris,

I would send the parameters over in batches... Unfortunately,
ADO.NET
doesn't get command batching for SQL Server until .NET 2.0.

Because of this, I would try and use DTS. What is the format of the
file to begin with? You might have much faster results doing that...

Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard. caspershouse.co m

"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:20******** *************** ***********@mic rosoft.com...
> Hi,
>
> I need to insert more than 500,000 records at the end of the day in a
> C#
> application. I need to finish it as soon as possible.
>
> I created a stored procedure and called it from ADO to insert one by
> one.
> It
> is kind of slow (seems slower than using a DTS package to import from a
> file).
>
> Just a general question, in ADO, what will be the MOST efficient way to
> do
> this work. I normally do it as I described. I am using .NET framework
> 1.1
> and SQL Server 2000.
>
> Thanks a lot
>
> Chris


Jan 4 '06 #7
If your data file is a simple CSV file, it takes two minutes to build a
command-line script that executes the bcp command using the data file as-is
as the input file. Really, there is nothing in the world of SQL Server that
is faster than bcp; it is specially designed for bulk inserts.
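
For example (the server, database, table and file names here are only
placeholders; -c selects character mode, -t sets the field terminator,
-T uses a trusted connection and -b sets the commit batch size):

bcp MyDb.dbo.DailyData in C:\data\01_04_2006_Data.csv -c -t, -S MYSERVER -T -b 10000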
Willy.

"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:11******** *************** ***********@mic rosoft.com...
Thanks. I do not know that I can create DTS script from C#. Can you tell
me
know how to do that?
One common thing I often need is to import data file with customized name,
like mm_dd_yy_file.c sv. I cannot figure out how to do that in Enerprise
Manager. If I write a C# app to do this job, do you think that I can
convert
the app to a DTS script?

Chris

"Ignacio Machin ( .NET/ C# MVP )" wrote:
Hi,
"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:20******** *************** ***********@mic rosoft.com...
> Hi,
>
> I need to insert more than 500,000 records at the end of the day in a
> C#
> application. I need to finish it as soon as possible.


In a C# app or in a DB used by a C# app?
> I created a stored procedure and called it from ADO to insert one by
> one.
> It
> is kind of slow (seems slower than using a DTS package to import from a
> file).
>
> Just a general question, in ADO, what will be the MOST efficient way to
> do
> this work. I normally do it as I described. I am using .NET framework
> 1.1
> and SQL Server 2000.


In short, none, ADO.NET is not intented for this. You have two options,
use
bulk copy (bcp.exe ) that comes with SQL server or as Paldino suggested
(that is also my recommended solution) using DTS, DTS can be scripted
from
C# so you will have flexibility to change it as needed.
Just create a DTS from enterprise manager, select write package to disk
and
later you can load and execute it.

Let me know if you need code, I do this in a couple of deployments.

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation

Jan 4 '06 #8
Thanks for the advice. I rarely use bcp. If the file name stays unchanged, I
usually use DTS. However, if the file name changes daily (for example,
01_04_2006_Data.csv today and 01_05_2006_Data.csv tomorrow), it is hard for
me to handle either way.

I do not want to rename the file to a fixed name with a script and then
import it, since that could import stale data. However, I do not know how to
script a changing name like this for bcp. That is why I am wondering whether
I can translate a C# app into a DTS package or script, if possible. At least
I can code it in C#.

Any suggestions?
Thanks

Chris

"Willy Denoyette [MVP]" wrote:
If your data file is a simple csv file, it takes two minutes to build a
commandline script that executes the bcp command using the datafile as is as
input file. Really, there is nothing in the world of SQL server that is
faster than bcp, it's specially designed for bulk inserts.
Willy.

"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:11******** *************** ***********@mic rosoft.com...
Thanks. I do not know that I can create DTS script from C#. Can you tell
me
know how to do that?
One common thing I often need is to import data file with customized name,
like mm_dd_yy_file.c sv. I cannot figure out how to do that in Enerprise
Manager. If I write a C# app to do this job, do you think that I can
convert
the app to a DTS script?

Chris

"Ignacio Machin ( .NET/ C# MVP )" wrote:
Hi,
"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:20******** *************** ***********@mic rosoft.com...
> Hi,
>
> I need to insert more than 500,000 records at the end of the day in a
> C#
> application. I need to finish it as soon as possible.

In a C# app or in a DB used by a C# app?

> I created a stored procedure and called it from ADO to insert one by
> one.
> It
> is kind of slow (seems slower than using a DTS package to import from a
> file).
>
> Just a general question, in ADO, what will be the MOST efficient way to
> do
> this work. I normally do it as I described. I am using .NET framework
> 1.1
> and SQL Server 2000.

In short, none, ADO.NET is not intented for this. You have two options,
use
bulk copy (bcp.exe ) that comes with SQL server or as Paldino suggested
(that is also my recommended solution) using DTS, DTS can be scripted
from
C# so you will have flexibility to change it as needed.
Just create a DTS from enterprise manager, select write package to disk
and
later you can load and execute it.

Let me know if you need code, I do this in a couple of deployments.

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation


Jan 4 '06 #9
Nothing stops you from launching the bcp.exe command from C# using
Process.Start. That way you can supply all the command-line arguments you
need, such as the input file name.
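
For example, something along these lines builds the date-stamped file name and
runs bcp (purely a sketch; the server, database, table and directory are
placeholders, and bcp.exe is assumed to be on the PATH):

using System;
using System.Diagnostics;

class RunBcp
{
    static void Main()
    {
        // Build today's file name, e.g. 01_04_2006_Data.csv
        string fileName = DateTime.Today.ToString("MM_dd_yyyy") + "_Data.csv";
        string filePath = @"C:\data\" + fileName;

        string args = "MyDb.dbo.DailyData in \"" + filePath + "\" -c -t, -S MYSERVER -T -b 10000";

        ProcessStartInfo psi = new ProcessStartInfo("bcp.exe", args);
        psi.UseShellExecute = false;
        psi.CreateNoWindow = true;

        Process proc = Process.Start(psi);
        proc.WaitForExit();

        if (proc.ExitCode != 0)
            Console.WriteLine("bcp failed with exit code " + proc.ExitCode);
    }
}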

Willy.

"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:CA******** *************** ***********@mic rosoft.com...
Thanks for the advice. I rarely use bcp. If I have a file with name
unchnaged, I usually use DTS. However, if I have file name changed daily,
for
example, for today, as 01_04_2006_Data .csv, and 01_05_2006_data .csv for
tomorrow, it is hard for me to handle in either way.

I do not want to rename the file to a fixed name using a script then
import,
since it could import stale data. However, I do not know how to script a
changing name like this for a bcp script. That is why I am wondering
whether
I can translate a C# app to a DTS or script, if possible. At least, I can
code it in C#.

Any suggestions?
Thanks

Chris

"Willy Denoyette [MVP]" wrote:
If your data file is a simple csv file, it takes two minutes to build a
commandline script that executes the bcp command using the datafile as is
as
input file. Really, there is nothing in the world of SQL server that is
faster than bcp, it's specially designed for bulk inserts.
Willy.

"chrisben" <ch******@discu ssions.microsof t.com> wrote in message
news:11******** *************** ***********@mic rosoft.com...
> Thanks. I do not know that I can create DTS script from C#. Can you
> tell
> me
> know how to do that?
> One common thing I often need is to import data file with customized
> name,
> like mm_dd_yy_file.c sv. I cannot figure out how to do that in Enerprise
> Manager. If I write a C# app to do this job, do you think that I can
> convert
> the app to a DTS script?
>
> Chris
>
> "Ignacio Machin ( .NET/ C# MVP )" wrote:
>
>> Hi,
>>
>>
>> "chrisben" <ch******@discu ssions.microsof t.com> wrote in message
>> news:20******** *************** ***********@mic rosoft.com...
>> > Hi,
>> >
>> > I need to insert more than 500,000 records at the end of the day in
>> > a
>> > C#
>> > application. I need to finish it as soon as possible.
>>
>> In a C# app or in a DB used by a C# app?
>>
>> > I created a stored procedure and called it from ADO to insert one by
>> > one.
>> > It
>> > is kind of slow (seems slower than using a DTS package to import
>> > from a
>> > file).
>> >
>> > Just a general question, in ADO, what will be the MOST efficient way
>> > to
>> > do
>> > this work. I normally do it as I described. I am using .NET
>> > framework
>> > 1.1
>> > and SQL Server 2000.
>>
>> In short, none, ADO.NET is not intented for this. You have two
>> options,
>> use
>> bulk copy (bcp.exe ) that comes with SQL server or as Paldino
>> suggested
>> (that is also my recommended solution) using DTS, DTS can be scripted
>> from
>> C# so you will have flexibility to change it as needed.
>>
>>
>> Just create a DTS from enterprise manager, select write package to
>> disk
>> and
>> later you can load and execute it.
>>
>> Let me know if you need code, I do this in a couple of deployments.
>>
>>
>>
>> --
>> Ignacio Machin,
>> ignacio.machin AT dot.state.fl.us
>> Florida Department Of Transportation
>>
>>
>>


Jan 4 '06 #10
