Bytes IT Community

Loading TXT to DB (VB)

I have large TEXT files (less than 50MB) that I want to load into
MSSQL. Each line has to be extracted, parsed and then inserted into the
DB. What's the best way of doing this?

Currently I extract each line, parse it and then INSERT it into the DB.
I am not using any DataSet/DataGrids. I have one db connection always
open for this process. The problem is that this takes almost half an
hour to finish. I'm wondering if using a DataSet would be beneficial
for me.

Any help would be appreciated!

Nov 21 '05 #1
5 Replies


You didn't mention which version of SQL Server you are using, but do you
need to go through an application at all? You could instead set up a
DTS package in SQL Server that imports the text file into your database.

To run the DTS package, you could schedule it with SQL Server's Job
Scheduler, or invoke the DTS objects from your .NET application.

Nov 21 '05 #2

"Data" <da***********@gmail.com> wrote in message
news:11**********************@g43g2000cwa.googlegroups.com...
> I have large TEXT files (less than 50MB) that I want to load into
> MSSQL. Each line has to be extracted, parsed and then inserted into
> the DB. What's the best way of doing this?


For anything more than a few thousand rows at a time, I'd tackle this
as a two-stage process:

1) Read the file, parse the data and write it to /another/ file,
ready for...
2) Use the bcp utility to read this second file and slam it into
SQL Server - much, *much* faster.
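A minimal sketch of stage 1, written in Python for brevity (the thread's actual code is VB.NET). The pipe-delimited input format and every file, server and table name below are illustrative assumptions, not details from the thread; adapt parse() to the real record layout. Stage 2 is the bcp command in the trailing comment:

```python
# Stage 1: read the raw file, parse each line, and write a clean
# tab-delimited staging file for bcp to load.
# NOTE: input format and file names are hypothetical examples.

def parse(line):
    """Split one raw record into its column values (assumed pipe-delimited)."""
    return line.rstrip("\n").split("|")

def stage(src_path, dst_path):
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            if not line.strip():
                continue  # skip blank lines
            dst.write("\t".join(parse(line)) + "\n")

# Stage 2, run from a command prompt (server and table names are placeholders):
#   bcp MyDb.dbo.WorkTable in staged.txt -c -t "\t" -S myserver -T
```

Because stage 1 is plain sequential file I/O with no database round trips, it stays fast even on very large files; all the set-based work then happens inside bcp.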

HTH,
Phill W.
Nov 21 '05 #3

> I have large TEXT files (less than 50MB) that I want to load into
> MSSQL. Each line has to be extracted, parsed and then inserted into the
> DB. What's the best way of doing this?
Look up the BULK INSERT statement in SQL Server Books Online.

Make a "work table" that has the same structure as your TXT file: a
real, permanent table, not a temp table.
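As a rough sketch of that idea (the table, columns and file path are invented for illustration; the column layout must mirror the file's fields exactly):

```sql
-- Hypothetical work table mirroring the TXT file's layout.
CREATE TABLE dbo.WorkTable (
    Col1 varchar(255),
    Col2 varchar(255),
    Col3 varchar(255)
)

BULK INSERT dbo.WorkTable
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
```

Once the raw rows are in the work table, the parsing/cleanup can be done as set-based SQL before moving them to the destination tables.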
> I am not using any DataSet/DataGrids. I have one db connection always
> open for this process. The problem is that this takes almost half an
> hour to finish.


The system that I inherited processes .txt files in excess of 500MB in less
than a half hour. Our SQL Server is a 4 CPU XEON with 3 gigs of RAM. We
load several dozen of these files all at once, on a quarterly basis.

--
Peace & happy computing,

Mike Labosh, MCSD

"When you kill a man, you're a murderer.
Kill many, and you're a conqueror.
Kill them all and you're a god." -- Dave Mustaine
Nov 21 '05 #4

The server is MSSQL. And unfortunately, I can't use bcp or DTS because
there are several text files that require parsing and are contingent
upon each other.

The solution has to work in this order:

File > Manipulate contents > DB

Nov 21 '05 #5

Don't worry about this. I will post the method that I used so that
others can benefit from it.

I inserted the text file's contents into a single-column DataSet table
and then parsed the rows one by one into the database. With this
approach I was able to load a 32MB text file in about 7 minutes.

By comparison, my original method, which read each line straight from
the file, parsed it and then inserted it into the db, took around 20
minutes.
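For anyone finding this thread later: a large share of the cost in the row-at-a-time approach is usually one round trip (and one implicit transaction) per INSERT, so committing in batches inside an explicit transaction speeds things up even without bcp or BULK INSERT. Here is a sketch of that batching pattern using Python's built-in sqlite3 purely to illustrate; the thread's actual target is MSSQL via ADO.NET, and the record layout and table names are invented:

```python
import sqlite3

BATCH = 1000  # commit every 1000 rows instead of every row

def parse(line):
    # hypothetical record layout: pipe-delimited name|value pairs
    name, value = line.rstrip("\n").split("|")
    return name, value

def load(path, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS work (name TEXT, value TEXT)")
    cur = conn.cursor()
    n = 0
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue  # skip blank lines
            cur.execute("INSERT INTO work (name, value) VALUES (?, ?)",
                        parse(line))
            n += 1
            if n % BATCH == 0:
                conn.commit()  # flush a full batch
    conn.commit()  # flush the final partial batch
    return n
```

The same shape carries over to ADO.NET: open one connection, begin a transaction, execute the parameterised INSERTs in a loop, and commit every N rows.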

Nov 21 '05 #6
