(an******@gmail.com) writes:
> You're right, I removed the drop and re-create and it's definitely
> slower when data already exists. So how would you suggest loading this
> data?
I can't really give good suggestions about data that I don't know
anything about.
> The text file contains 11 different types of rows. Each type of row
> goes to a separate table, so I need to read each line, determine its
> type, parse it, and insert it into the appropriate table.
>
> Can BCP handle this? DTS? Or your XML idea?
If the data has a consistent format, you could load the lot into a
staging table and then distribute the data from there.
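Something along these lines, as a sketch only: I'm assuming the first
two characters of each line identify the row type, and the staging
table, target table, columns and connection string are all invented.

using System;
using System.Data.SqlClient;

class StagingLoad
{
    static void Main()
    {
        // Assumed staging table: CREATE TABLE Staging (RawLine varchar(8000))
        using (SqlConnection conn = new SqlConnection(
                   "Server=.;Database=MyDb;Integrated Security=SSPI"))
        {
            conn.Open();

            // Load the whole file in one statement. The path is as seen
            // from the server. The pipe terminator is an assumption: pick
            // a character that never occurs in the data, so each line
            // lands whole in RawLine.
            SqlCommand bulk = new SqlCommand(
                "BULK INSERT Staging FROM 'C:\\data\\input.txt' " +
                "WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\\n')", conn);
            bulk.CommandTimeout = 0;
            bulk.ExecuteNonQuery();

            // Distribute one row type at a time; repeat for all 11 types.
            // Offsets and lengths are invented for illustration.
            SqlCommand dist = new SqlCommand(
                "INSERT TypeATable (Col1, Col2) " +
                "SELECT SUBSTRING(RawLine, 3, 10), SUBSTRING(RawLine, 13, 5) " +
                "FROM Staging WHERE SUBSTRING(RawLine, 1, 2) = 'A1'", conn);
            dist.ExecuteNonQuery();
        }
    }
}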
You could also write a new file for each row type and then bulk-load
each of these files into its table.
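A sketch of the splitting step in C#, again assuming a two-character
type code at the start of each line (a Hashtable, since .NET 1.1 has
no generic dictionary):

using System.Collections;
using System.IO;

class FileSplitter
{
    static void Main()
    {
        Hashtable writers = new Hashtable();   // row type -> StreamWriter

        using (StreamReader reader = new StreamReader(@"C:\data\input.txt"))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // Assumption: the first two characters give the row type.
                string rowType = line.Substring(0, 2);
                StreamWriter w = (StreamWriter) writers[rowType];
                if (w == null)
                {
                    // One output file per row type, e.g. C:\data\type_A1.txt
                    w = new StreamWriter(@"C:\data\type_" + rowType + ".txt");
                    writers[rowType] = w;
                }
                w.WriteLine(line);
            }
        }

        foreach (StreamWriter w in writers.Values)
            w.Close();
    }
}

Each output file can then go straight into its target table with bcp
or BULK INSERT.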
It's possible that a Data Pump task in DTS could do all this out
of the box, but I don't know DTS.
The XML idea would require you to parse the file and build an XML
document from it. You wouldn't have to build 11 XML documents, though.
(Although that might be easier than building one big one.)
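On SQL 2000 the server side of that would be sp_xml_preparedocument
plus OPENXML. A client-side sketch, with an invented procedure name,
element layout and connection string:

using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Text;

class XmlLoad
{
    static void Main()
    {
        StringBuilder doc = new StringBuilder("<rows>");

        using (StreamReader reader = new StreamReader(@"C:\data\input.txt"))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // Escape the raw line so it is legal as an attribute
                // value; & must be replaced first.
                string v = line.Replace("&", "&amp;")
                               .Replace("<", "&lt;")
                               .Replace("\"", "&quot;");
                // Assumption: 2-char alphanumeric type code per line.
                doc.Append("<r t=\"" + line.Substring(0, 2) +
                           "\" v=\"" + v + "\"/>");
            }
        }
        doc.Append("</rows>");

        using (SqlConnection conn = new SqlConnection(
                   "Server=.;Database=MyDb;Integrated Security=SSPI"))
        {
            conn.Open();

            // Invented procedure: it would call sp_xml_preparedocument
            // on @doc and then INSERT ... SELECT FROM OPENXML(...) once
            // per row type.
            SqlCommand cmd = new SqlCommand("load_from_xml", conn);
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@doc", SqlDbType.NText).Value = doc.ToString();
            cmd.ExecuteNonQuery();
        }
    }
}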
It's also possible to bulk-load from variables, but not in C# with
ADO.NET 1.1.
So how big did you make the database before you started loading? With
two million records, you should have at least 100 MB for both data and
log.
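If it was left at the defaults, you can grow it once up front rather
than paying for repeated autogrow during the load. The logical file
names below are guesses; check yours with sp_helpfile:

using System.Data.SqlClient;

class Presize
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
                   "Server=.;Database=master;Integrated Security=SSPI"))
        {
            conn.Open();
            // 100 MB just follows the estimate above; adjust to taste.
            new SqlCommand("ALTER DATABASE MyDb MODIFY FILE " +
                "(NAME = MyDb_Data, SIZE = 100MB)", conn).ExecuteNonQuery();
            new SqlCommand("ALTER DATABASE MyDb MODIFY FILE " +
                "(NAME = MyDb_Log, SIZE = 100MB)", conn).ExecuteNonQuery();
        }
    }
}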
By the way, how do you call the stored procedure? You are using
CommandType.StoredProcedure, aren't you?
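For reference, a parameterised call looks like this (procedure and
parameter names invented):

using System.Data;
using System.Data.SqlClient;

class ProcCall
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
                   "Server=.;Database=MyDb;Integrated Security=SSPI"))
        {
            conn.Open();

            // CommandType.StoredProcedure makes ADO.NET issue an RPC call
            // rather than sending an "EXEC ..." text batch that has to be
            // parsed on every execution.
            SqlCommand cmd = new SqlCommand("insert_type_a_row", conn);
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@col1", SqlDbType.VarChar, 10).Value = "abc";
            cmd.Parameters.Add("@col2", SqlDbType.Int).Value = 42;
            cmd.ExecuteNonQuery();
        }
    }
}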
--
Erland Sommarskog, SQL Server MVP,
es****@sommarskog.se
Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp