Bytes IT Community

ETL: OLTP -> DATA Store

Greetings all, I was wondering if any of you would share some of your
experiences with loading a data store from an OLTP source. We are using
Analysis Services in a BI product that requires data to be pulled from
one of our products, an OLTP database. The design is to first run an ETL
process from the OLTP source into an operational data store; from there,
Analysis Services pulls its data to do its thing. For small OLTP
databases (< 1 GB) the stored procs I have written to do the extraction
work well; they are relatively fast and efficient. However, we have a
few databases of around 10 GB, and the load could end up taking several
hours. During such a long load the OLTP source may be in use, and I want
to avoid blocking writers; on the other hand, if I were to use
"SELECT ... WITH (NOLOCK)" I could get dirty data brought over. I could
use BCP or bulk copy for some of the big tables, but I wanted to see
whether anyone has dealt with this issue and what their resolution was
for their specific problem. My hope is that by seeing how others have
handled this, I will be able to architect a solution for my own problem.

Regards, TFD.
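For illustration, the two extraction options mentioned above look roughly like this (table, database, and server names here are made up for the sketch):

```
-- Option 1: read without taking shared locks. This avoids blocking
-- writers on the OLTP side, but can return uncommitted ("dirty") rows
-- that may later be rolled back.
SELECT CustomerID, OrderDate, Amount
FROM dbo.Orders WITH (NOLOCK);

-- Option 2: bulk-export a large table with the bcp utility from the
-- command line, then bulk-load the file into the data store.
-- bcp MyOltpDb.dbo.Orders out Orders.bcp -n -S myserver -T
-- bcp MyDataStore.dbo.Orders in Orders.bcp -n -S myserver -T
```

Note that bcp reads through the normal locking mechanism unless told otherwise, so by itself it does not solve the blocking-versus-dirty-reads tradeoff; it mainly speeds up moving the data.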

Jul 23 '05 #1
1 Reply

LineVoltageHalogen (tr****************@yahoo.com) writes:
> During this long load the OLTP source may be in use and I want to
> avoid write blocks, or if I were to use "select ... NOLOCK" I could
> get dirty data brought over.


A common solution when you have long-running reports is to have a report
server which is fed by log shipping or replication, or simply by
backup/restore, depending on how fresh you need the data to be.

You are not doing reports, but you are running a long-running stored
procedure, so it's a similar scenario.

Of course, the backup/restore approach would add some time to the ETL
process.
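As a rough sketch of the backup/restore variant (database names, file paths, and logical file names below are invented and would need to match your own database):

```
-- On the OLTP server: take a full backup.
BACKUP DATABASE MyOltpDb
TO DISK = 'D:\backup\MyOltpDb.bak'
WITH INIT;

-- On the ETL/report server: restore the backup under another name,
-- then run the extraction procs against the copy, with no impact on
-- the live OLTP writers.
RESTORE DATABASE MyOltpDb_Copy
FROM DISK = 'D:\backup\MyOltpDb.bak'
WITH MOVE 'MyOltpDb'     TO 'D:\data\MyOltpDb_Copy.mdf',
     MOVE 'MyOltpDb_log' TO 'D:\data\MyOltpDb_Copy.ldf',
     REPLACE;
```

The copy is transactionally consistent as of the backup, which sidesteps both the blocking and the NOLOCK dirty-read problems, at the cost of data freshness and the time the backup/restore itself takes.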
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 23 '05 #2

This discussion thread is closed