Bytes | Developer Community
Inserting large amounts of data

Does anyone have ideas on the best way to move large amounts of data
between tables? I am doing several simple insert/select statements
from a staging table to several holding tables, but because of the
volume it is taking an extraordinary amount of time. I considered
using cursors but have read that may not be the best thing for this
situation. Any thoughts?

--
Posted using the http://www.dbforumz.com interface, at author's request
Articles individually checked for conformance to usenet standards
Topic URL: http://www.dbforumz.com/General-Disc...ict254055.html
Visit Topic URL to contact author (reg. req'd). Report abuse: http://www.dbforumz.com/eform.php?p=877392
Sep 8 '05 #1
Have you looked at third party ETL tools and DTS? In particular,
Google Sunopsis if you have to bring in data from several sources.

Sep 8 '05 #2

"oshanahan" <Us************@dbForumz.com> wrote in message
news:4_***************************************@dbf orumz.com...
Does anyone have ideas on the best way to move large amounts of data
between tables? I am doing several simple insert/select statements
from a staging table to several holding tables, but because of the
volume it is taking an extraordinary amount of time. I considered
using cursors but have read that may not be the best thing for this
situation. Any thoughts?

For SQL Server, DTS, bcp or bulkcopy.

Do NOT use cursors.

And if you can, drop indexes first and build them later.
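
Since the copy is a plain insert/select, the biggest win is usually keeping the target minimally indexed during the load. A rough sketch of the drop/load/rebuild pattern (all table, column, and index names below are made up for illustration; adjust to the real schema):

```sql
-- 1. Drop nonclustered indexes on the target before the big load
--    (SQL 2000 DROP INDEX syntax: table.index).
DROP INDEX holding_table.ix_holding_col1

-- 2. Do the copy as one set-based statement, not row by row.
INSERT INTO holding_table (col1, col2)
SELECT col1, col2
FROM   staging_table

-- 3. Rebuild the indexes after the data is in.
CREATE INDEX ix_holding_col1 ON holding_table (col1)
```

Rebuilding one index over the finished table is far cheaper than maintaining it incrementally for every inserted row.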
Sep 8 '05 #3
"" wrote:
Have you looked at third party ETL tools and DTS? In
particular,
Google Sunopsis if you have to bring in data from several
sources.


Thanks for your response.
I'm using DTS to populate a staging table with raw streams of data (a
lot of it) from one source. I thought about embedding the SQL
somewhere in the VB script that DTS uses. Our resident DTS man said
he didn't think that was possible here.
The problem essentially is that a large field on the staging table
must be substringed to populate other tables, which have no
relationship to the staging table. The substringing is where the
performance drag is, but there is no way around that. I'm looking to
somehow shave a little time on each populate transaction to help cut
down processing.

Sep 8 '05 #4
oshanahan (Us************@dbForumz.com) writes:
I'm using DTS to populate a staging table with raw streams of data (a
lot of it) from one source. I thought about embedding the SQL
somewhere in the VB script that DTS uses. Our resident DTS man said
he didn't think that was possible here.
The problem essentially is that a large field on the staging table
must be substringed to populate other tables, which have no
relationship to the staging table. The substringing is where the
performance drag is, but there is no way around that. I'm looking to
somehow shave a little time on each populate transaction to help cut
down processing.


Had you been on SQL 2005, you could have written a user-defined
function in C# or VB to decode this large field. It is not unlikely
that it would be faster than the SQL built-ins.

Using a cursor to load rows one by one is definitely not a good idea.
What sometimes can be a good idea is to do, say, 10000 at a time. Batching
can be achieved with SET ROWCOUNT or TOP, but also by using
ranges in the source data. It is important that the selection of a
batch follows a clustered index, or else the selection itself will kill performance.
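
A minimal sketch of such a range-based batch loop (SQL 2000 style, using SET ROWCOUNT; the table and column names are hypothetical, and it assumes the staging table has a clustered index on an ascending id column):

```sql
-- Copy 10000 rows per batch, walking the clustered key forward so
-- each SELECT is a cheap range seek rather than a full scan.
DECLARE @last int
SET @last = 0

SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
   INSERT INTO holding_table (id, col1)
   SELECT id, SUBSTRING(big_field, 1, 20)   -- the expensive decode
   FROM   staging_table
   WHERE  id > @last
   ORDER  BY id

   IF @@ROWCOUNT = 0 BREAK                  -- nothing left to copy

   SELECT @last = MAX(id) FROM holding_table
END
SET ROWCOUNT 0                              -- always reset when done
```

Each batch commits on its own, which keeps the transaction log from ballooning the way one giant insert would.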

But this is more of interest if you get problems with the transaction
log. When the problem is with decoding a field, I don't think batching
is going to help you much. Do you need to have the data in the table
when you substring the field? Can't you substring the field before you
load it into the database? Doing the substringing in T-SQL is not
optimal for performance.
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp

Sep 8 '05 #5

This discussion thread is closed

Replies have been disabled for this discussion.
