Bytes | Software Development & Data Engineering Community

Inserting large amounts of data

Does anyone have ideas on the best way to move large amounts of data
between tables? I am doing several simple insert/select statements
from a staging table to several holding tables, but because of the
volume it is taking an extraordinary amount of time. I considered
using cursors but have read that may not be the best thing for this
situation. Any thoughts?

--
Posted using the http://www.dbforumz.com interface, at author's request
Articles individually checked for conformance to usenet standards
Topic URL: http://www.dbforumz.com/General-Disc...ict254055.html
Visit Topic URL to contact author (reg. req'd). Report abuse: http://www.dbforumz.com/eform.php?p=877392
Sep 8 '05 #1
Have you looked at third party ETL tools and DTS? In particular,
Google Sunopsis if you have to bring in data from several sources.

Sep 8 '05 #2

"oshanahan" <Us************@dbForumz.com> wrote in message
news:4_***************************************@dbforumz.com...
Does anyone have ideas on the best way to move large amounts of data
between tables? I am doing several simple insert/select statements
from a staging table to several holding tables, but because of the
volume it is taking an extraordinary amount of time. I considered
using cursors but have read that may not be the best thing for this
situation. Any thoughts?

For SQL Server: DTS, bcp, or BULK INSERT.

Do NOT use cursors.

And if you can, drop indexes first and build them later.
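For example, a rough sketch of that pattern (table and index names here are invented for illustration; the real schema will differ):

```sql
-- 1. Drop nonclustered indexes on the target before the big load
--    (SQL 2000 syntax: DROP INDEX table.index)
DROP INDEX Holding1.IX_Holding1_Key

-- 2. Do the load as a single set-based statement
INSERT INTO Holding1 (KeyCol, Col1)
SELECT KeyCol, Col1
FROM   Staging

-- 3. Rebuild the index once the data is in
CREATE INDEX IX_Holding1_Key ON Holding1 (KeyCol)
```

Rebuilding one index over the finished table is usually much cheaper than maintaining it row by row during the insert.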
Sep 8 '05 #3
"" wrote:
Have you looked at third party ETL tools and DTS? In particular,
Google Sunopsis if you have to bring in data from several sources.


Thanks for your response.
I'm using DTS to populate a staging table with raw streams of data (a
lot of it) from one source. I thought about embedding the SQL
somewhere in the VB script that DTS uses. Our resident DTS man said
he didn't think that was possible here.
The problem essentially is that a large field on the staging table
must be substringed to populate other tables, which have no
relationship to the staging table. The substringing is where the
performance drag is, but there is no way around that. I’m looking to
somehow shave a little time on each populate transaction to help cut
down processing.
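If the layout of the large field is fixed-width, the set-based version of that populate looks something like this (all names and offsets below are made up; the point is one INSERT...SELECT per target table rather than row-at-a-time work):

```sql
-- Staging.RawField is the large field; the SUBSTRING offsets
-- are hypothetical placeholders for the real record layout.
INSERT INTO Holding1 (PartA, PartB)
SELECT SUBSTRING(RawField, 1, 10),
       SUBSTRING(RawField, 11, 20)
FROM   Staging
```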

Sep 8 '05 #4
oshanahan (Us************@dbForumz.com) writes:
I'm using DTS to populate a staging table with raw streams of data (a
lot of it) from one source. I thought about embedding the SQL
somewhere in the VB script that DTS uses. Our resident DTS man said
he didn't think that was possible here.
The problem essentially is that a large field on the staging table
must be substringed to populate other tables, which have no
relationship to the staging table. The substringing is where the
performance drag is, but there is no way around that. I'm looking to
somehow shave a little time on each populate transaction to help cut
down processing.


Had you been on SQL 2005, you could have written a user-defined
function in C# or VB to decode this large field. It is quite possible
that that would be faster than the T-SQL built-ins.

Using a cursor to handle rows one by one is definitely not a good idea.
What sometimes can be a good idea is to do, say, 10000 rows at a time.
Batching can be achieved with SET ROWCOUNT or TOP, but also by using
ranges in the source data. What is important here is that the selection
of a batch follows a clustered index, or else the selection itself will
kill performance.
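A batching loop of that kind might look like this on SQL 2000 (a sketch only: Id is assumed to be the clustered-index column, and all names are invented):

```sql
DECLARE @last_id int
SET @last_id = 0
SET ROWCOUNT 10000           -- batch size (SQL 2000 idiom; TOP on later versions)
WHILE 1 = 1
BEGIN
    INSERT INTO Holding1 (Id, PartA)
    SELECT Id, SUBSTRING(RawField, 1, 10)
    FROM   Staging
    WHERE  Id > @last_id     -- range predicate follows the clustered index
    ORDER BY Id
    IF @@ROWCOUNT = 0 BREAK
    SELECT @last_id = MAX(Id) FROM Holding1
END
SET ROWCOUNT 0               -- always reset, or later statements are truncated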

But this is mainly of interest if you get problems with the transaction
log. When the problem is with decoding a field, I don't think batching
is going to help you much. Do you need to have the data in the table
when you substring the field? Can't you substring the field before you
load it into the database? Doing the substringing in T-SQL is not
optimal for performance.
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp

Sep 8 '05 #5


