Insert more than 40,000 records

Hi,
My requirement is to read a CSV file that has around 40,000 records and insert them into the database.
I read the CSV and store it in a collection object, which is passed to the web service.

My client doesn't want a bulk insert of all the records at once and wants to insert 1,000 records at a time.

What would be the best way to do that?
1. Pass the entire collection object to the web service, then pass an XML stream from the web service to SQL Server, split it in SQL Server, and insert.
2. Pass 1,000 records at a time from the application to the web service and bulk insert each batch.

Expecting replies ASAP.
Thanks for your help.
Jun 4 '08 #1
debasisdas
I believe the second is the better option.
Jun 4 '08 #2
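As a minimal sketch of option 2: the idea is to walk the collection and flush every 1,000 rows as one batched insert, committing per batch. The CsvRow type, table name, and column names below are placeholders, and in the original setup the insert itself would live inside the web service rather than the client application.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class BatchInsertSketch {

    private static final int BATCH_SIZE = 1000;

    // Hypothetical record type standing in for one parsed CSV row.
    public record CsvRow(int id, String name) {}

    // Inserts the rows in batches of 1,000, committing after each batch.
    public static void insertInBatches(List<CsvRow> rows, String jdbcUrl,
                                       String user, String password) throws Exception {
        String sql = "INSERT INTO target_table (id, name) VALUES (?, ?)"; // assumed table/columns
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            conn.setAutoCommit(false);
            int count = 0;
            for (CsvRow row : rows) {
                ps.setInt(1, row.id());
                ps.setString(2, row.name());
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch(); // send the current 1,000 rows
                    conn.commit();     // keep each batch in its own transaction
                }
            }
            ps.executeBatch();         // flush any remaining rows (fewer than 1,000)
            conn.commit();
        }
    }
}
```

Keeping each batch in its own transaction means a failure partway through only rolls back the current 1,000 rows, not the whole 40,000-row load.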

