
Best approach

Hello All,

Before I start writing the code, I would like to figure out the best approach to this problem. I want to create a process that imports a pipe-delimited (|) text file and, looping through the entire file, puts each field into a SQL Server database. The text file is exported from a different system; I download it via FTP and then process the data for the proposed system. Currently I use DTS to process the data, but the user wants more control outside of DTS.

Initially, I thought of creating a DataTable, parsing out the individual fields, and then writing the DataTable to the database (a rough sketch follows the list below). The problems I see are:
- the file is 80K and has over 200,000 records, so looping seems resource-intensive for an ASP.NET page
- the file must reside in a virtual directory
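
Roughly, this is what I have in mind. The file path, column names, and types below are just placeholders, not my real schema:

using System;
using System.Data;
using System.IO;

class PipeFileParser
{
    // Sketch only: "Field1".."Field3" are placeholder columns.
    static DataTable LoadPipeFile(string path)
    {
        DataTable table = new DataTable("Import");
        table.Columns.Add("Field1", typeof(string));
        table.Columns.Add("Field2", typeof(string));
        table.Columns.Add("Field3", typeof(string));

        using (StreamReader reader = new StreamReader(path))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // Split each record on the pipe delimiter.
                string[] fields = line.Split('|');
                table.Rows.Add(fields); // string[] is passed as the row's values
            }
        }
        return table;
    }
}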

One other thought was to create a web service that would consume the file and automatically import it into the database. Is that reasonable for a file of this size, and how would I go about doing it?
Any and all help would be greatly appreciated! Thanks ahead of time!

Sep 27 '06 #1
1 Reply


Take a look at the new SqlBulkCopy class in ADO.NET 2.0.
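
Something along these lines should work, assuming you have loaded the file into a DataTable first. The connection string, destination table name, and batch size below are placeholders for your own values:

using System.Data;
using System.Data.SqlClient;

class BulkImporter
{
    // Sketch only: substitute your real connection string and table name.
    static void BulkInsert(DataTable table, string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.ImportTable";
                bulk.BatchSize = 5000; // commit in batches rather than all at once
                bulk.WriteToServer(table);
            }
        }
    }
}

WriteToServer pushes the entire DataTable to SQL Server as one bulk operation, so you avoid issuing 200,000 individual INSERT statements from your ASP.NET page.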
Peter

--
Co-founder, Eggheadcafe.com developer portal:
http://www.eggheadcafe.com
UnBlog:
http://petesbloggerama.blogspot.com


"Shawn Ferguson" wrote:
Hello All,

Before I starting writing the code, I would like to figure out what the best approach to this problem would be. I want to create a process that imports a text file (Pipe | delimited), then after looping through the entire file, puts each field into a SQL Server database. The text file is exported from a different system, then I download it via FTP to process the data for the proposed system. Currently, I used DTS to process the data, but the user wants more control outside of DTS.

Initially, I thought of creating a data table, parsing out the individual fields, then writing the data table to the database. The problems I see are :
- the file is 80K has over 200,000 records, so looping seems resource intensive for an ASP.nET page
- file must reside on a virtual directory

One other thought was to create a webservice that would consume the file and automatically import it into the database, is this reasonable for a file that size, how would I go about doing this?
Any and all help would be greatly appreciated! Thanks ahead of time!
Sep 27 '06 #2
