Bytes IT Community

Dataset with large data...PLEASE HELP

Hi all,

I am working on a Windows-based application that uses SQL Server 2000 as its
database. A few tables in the application (the parent tables) are populated by
a separate application.

My application fetches data from the parent tables and writes it into the
application's own tables (the child tables). I am using a DataSet to fetch
data from the parent tables and to insert/update rows in the child tables.

The problem is that the record counts are very high (3-4 million rows), so the
data is too large: the process takes an extremely long time to complete, and
CPU utilization on the application server shoots up to its maximum.

What would be the best way to handle this?
1. Should I use a DataRepeater instead of a DataSet? Or
2. Should I process the data in chunks? If so, how can I do that?

Or is there some other way I can process the data?

Thanks
PAL

Jul 11 '07 #1
2 Replies



pa****@gmail.com wrote in message
news:11**********************@g4g2000hsf.googlegroups.com...
[original question quoted above; snipped]
Use the SqlCommand object with a data reader, dynamic SQL statements, or calls
to stored procedures.
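A minimal sketch of the data-reader approach in C#. The connection string,
table, and column names here are illustrative assumptions, not from the
thread; the point is that a SqlDataReader streams one row at a time instead
of buffering millions of rows in memory the way a DataSet does:

```csharp
using System;
using System.Data.SqlClient;

class ReaderExample
{
    static void Main()
    {
        // Hypothetical connection string; adjust for your environment.
        string connStr = "Server=.;Database=AppDb;Integrated Security=SSPI;";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Stream rows from the parent table; nothing is cached
            // client-side beyond the current row.
            SqlCommand cmd = new SqlCommand(
                "SELECT Id, Name FROM ParentTable", conn);

            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    string name = reader.GetString(1);
                    // ... process the row and write it to the child table ...
                }
            }
        }
    }
}
```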

Jul 12 '07 #2

I am not sure I understand exactly what you are doing with the data.

In any case, a DataSet loads everything into memory, which is why you are
running into trouble.

If you are just pumping data from one table into another with some processing
in between, use bulk copy (SqlBulkCopy or bcp) for the inserts. You can then
read from a data reader and stream the rows straight through, which should
solve your problem.
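A sketch of streaming a data reader straight into SqlBulkCopy (requires
.NET 2.0 or later; connection strings and table names are hypothetical):

```csharp
using System.Data.SqlClient;

class BulkCopyExample
{
    static void Main()
    {
        string srcConnStr = "Server=.;Database=SourceDb;Integrated Security=SSPI;";
        string dstConnStr = "Server=.;Database=AppDb;Integrated Security=SSPI;";

        using (SqlConnection src = new SqlConnection(srcConnStr))
        using (SqlConnection dst = new SqlConnection(dstConnStr))
        {
            src.Open();
            dst.Open();

            SqlCommand cmd = new SqlCommand(
                "SELECT Id, Name FROM ParentTable", src);

            using (SqlDataReader reader = cmd.ExecuteReader())
            using (SqlBulkCopy bulk = new SqlBulkCopy(dst))
            {
                bulk.DestinationTableName = "ChildTable";
                bulk.BatchSize = 10000;    // commit in chunks
                bulk.BulkCopyTimeout = 0;  // no timeout for long runs

                // Rows flow from the reader directly into the bulk
                // insert without being buffered in a DataSet.
                bulk.WriteToServer(reader);
            }
        }
    }
}
```

Setting BatchSize keeps each transaction bounded, so a failure late in the
run does not roll back the entire multi-million-row copy.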

Updates have to be issued individually, but even then you can batch them, up
to the command string length limit, and execute them in groups. For example,
save all the updates to a file; when the read pass is complete, read the
updates back and run them.
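One way to batch the updates as described above (a sketch; the table name,
flush threshold, and statement shape are illustrative assumptions):

```csharp
using System.Data.SqlClient;
using System.Text;

class BatchedUpdates
{
    static void Main()
    {
        // Hypothetical connection string; adjust for your environment.
        string connStr = "Server=.;Database=AppDb;Integrated Security=SSPI;";
        StringBuilder batch = new StringBuilder();
        int count = 0;

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Accumulate individual UPDATE statements and send them to
            // the server in groups instead of one round trip per row.
            for (int id = 1; id <= 100000; id++)
            {
                batch.AppendFormat(
                    "UPDATE ChildTable SET Processed = 1 WHERE Id = {0};", id);
                count++;

                if (count == 500)  // flush every 500 statements
                {
                    new SqlCommand(batch.ToString(), conn).ExecuteNonQuery();
                    batch.Length = 0;
                    count = 0;
                }
            }

            if (count > 0)  // flush the remainder
            {
                new SqlCommand(batch.ToString(), conn).ExecuteNonQuery();
            }
        }
    }
}
```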

pa****@gmail.com wrote in message
news:11**********************@g4g2000hsf.googlegroups.com...
[original question quoted above; snipped]

Jul 12 '07 #3
