
Questions regarding large MDC tables

Hi all,

I'm currently investigating the use of MDC tables for large data warehouse tables.
My scenario:

A fact table with 1,000 million rows distributed over 12 partitions (3 physical hosts with 4 logical partitions each).
The overall size of the table is 350 GB. Each night 1.5 million new rows will be added, and approximately the same number of old rows will be deleted (roll-in/roll-out with SQL INSERT/DELETE).
The table is stored in an SMS tablespace with a 16K page size and an extent size of 64 pages.
The tablespace has 6 containers on each partition, each container on a separate IBM ESS array. The prefetch size is 384 (6 containers * 64 pages), and prefetching behaves very well with these settings (DB2_PARALLEL_IO is set). DB2 is V8.1 ESE (DPF) FP5 and runs on AIX 5.2.
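
For reference, the current layout corresponds roughly to the following definitions. The bufferpool name and container paths are placeholders, not our actual configuration; in DPF each partition would use its own container paths (for example via the $N database partition expression):

CREATE BUFFERPOOL bp16k PAGESIZE 16K;   -- size left at its default here

CREATE TABLESPACE ts_fact
  PAGESIZE 16K
  MANAGED BY SYSTEM
  USING ('/db2/c1/fact', '/db2/c2/fact', '/db2/c3/fact',
         '/db2/c4/fact', '/db2/c5/fact', '/db2/c6/fact')
  EXTENTSIZE 64
  PREFETCHSIZE 384      -- 6 containers * 64 pages
  BUFFERPOOL bp16k;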

We figured out that for our chosen MDC dimensions we would have to use an extent size of 2 pages, otherwise we would waste too much space. This extent size gives us headaches:

- What is an optimal prefetch size here?
With a prefetch size of 12 (6 containers * extent size 2), each prefetcher will only read 32 KB of data from one container.
With a prefetch size of 384 (which is optimal from a disk layout point of view), will DB2 start 192 prefetchers (which would certainly be overkill)?
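
For context, here is a sketch of the kind of definition we have in mind, with a separate small-extent tablespace; the table, column, and container names are made up for illustration:

CREATE TABLESPACE ts_fact_mdc
  PAGESIZE 16K
  MANAGED BY SYSTEM
  USING ('/db2/c1/fact_mdc', '/db2/c2/fact_mdc', '/db2/c3/fact_mdc',
         '/db2/c4/fact_mdc', '/db2/c5/fact_mdc', '/db2/c6/fact_mdc')
  EXTENTSIZE 2
  PREFETCHSIZE 12       -- or 384; exactly the open question above
  BUFFERPOOL bp16k;

-- Each distinct (sale_date, region_id) combination gets its own cell,
-- and a cell is allocated in whole extents, hence the small extent size.
CREATE TABLE fact_sales (
  sale_date   DATE           NOT NULL,
  region_id   INTEGER        NOT NULL,
  cust_id     BIGINT         NOT NULL,
  amount      DECIMAL(15,2)
)
IN ts_fact_mdc
PARTITIONING KEY (cust_id)
ORGANIZE BY DIMENSIONS (sale_date, region_id);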

Further:
Does anybody have experience with MDC tables for large warehouse tables and is willing to share it?
Especially performance experience when inserting/deleting during the roll-in/roll-out of daily data?

Unfortunately I do not have an adequate environment to test all these issues, so any comments are highly appreciated.

TIA
Joachim

PS: Feel free to send comments by email to joklassen at web dot de
Nov 12 '05 #1

Try the new advisor in V8.2. You can simply download DB2 V8.2 onto your
laptop, mimic the stats and then apply the proposal to DB2 V8.1.5.
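
For illustration, the statistics can be mimicked with db2look's mimic mode; a rough sketch, where database, schema, table, and file names are placeholders:

-- On the production (V8.1 FP5) system: extract the DDL (-e) plus the
-- UPDATE statements that reproduce the catalog statistics (-m).
db2look -d PRODDB -z FACTSCHEMA -t FACT_TABLE -e -m -o fact_mimic.sql

-- On the V8.2 test instance: replay the script to create the objects
-- and apply the mimicked statistics before running the advisor.
db2 -tvf fact_mimic.sql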

Cheers
Serge

--
Serge Rielau
DB2 SQL Compiler Development
IBM Toronto Lab
Nov 12 '05 #2
Serge,
thanks for the quick reply.

I already followed your suggestion, but with no luck: the Design Advisor makes no recommendations for MDC.
Does that mean that MDC is not recommended at all for my scenario, or do I have to refine my workload input (which at the moment is very basic, with 3 queries and 1 INSERT statement)?

Thanks again
Joachim

"Serge Rielau" <sr*****@ca.ibm.com> schrieb im Newsbeitrag
news:35*************@individual.net...
Joachim Klassen wrote:
Hi all,

I'm currently investigating the use of MDC Tables for large data
warehouse
tables.
My scenario:

A fact table with 1000 Million Rows distributed on 12 Partitions (3
physical hosts with 4 logical partitions each).
Overall size of table is 350 GB . Each night 1.5 Million new rows will be
added
and approx. the same amount of old records will be deleted (Roll in/Roll
out with SQL INSERT/DELETE).
The table is stored in SMS tablespace with 16K Pagesize and 64 Pages
Extentsize.
The tablespace has 6 containers on each partition. Each container is on a
separate IBM ESS array. Prefetchsize is 384 (6 containers * 64 pages).
Prefetch behaves very well
with these settings (DB2_PARALLEL_IO is set). DB2 is V8.1 ESE (DPF) FP5
and runs on AIX 5.2.

We figured out that for our choosen MDC dimensions we will have to use an
extentsize of 2 pages otherwise we would waste too much space. This
extentsize gives us headaches:

- What is an optimal prefetchsize here?
With prefetchsize of 12 (6 containers * extentsize 2) each prefetcher
will
only read 32 KB of data from one container.
With a prefetchsize of 384 (which is optimal from a disk layout point of
view) will DB2 start 192 prefetchers (that would be certainly overkill)?

Further:
Does anybody have experiences with MDC tables for large warehouse tables
and
is willing to share them?
Especially performance experiences when inserting/deleting during the
roll-in/roll-out of daily data ?

Unfortunately I do not have an adequate environment to test all these
issues - so any comments are highly appreciated.

TIA
Joachim

PS: Feel free to send comments by email to joklassen at web dot de

Try the new advisor in V8.2. You can simply download DB2 V8.2 onto your
laptop, mimic the stats and then apply the proposal to DB2 V8.1.5.

Cheers
Serge

--
Serge Rielau
DB2 SQL Compiler Development
IBM Toronto Lab

Nov 12 '05 #3
Joachim,

Sure it does. The V8.2 advisor handles indexes, partitioning, MQTs, and MDC. Are you sure you tried V8.2?
MDC will not improve your insert performance. It can help with delete (roll-out) and query performance.
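
For illustration, the roll-out benefit comes from a DELETE whose predicate is restricted to MDC dimension columns; table and column names below are made up:

-- Roll out one day of data along the MDC date dimension.
-- Because sale_date is an MDC dimension, DB2 can locate the affected
-- cells through the block index rather than via a row-level (RID) index.
DELETE FROM fact_sales
WHERE sale_date = DATE('2005-11-01');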

Cheers
Serge
--
Serge Rielau
DB2 SQL Compiler Development
IBM Toronto Lab
Nov 12 '05 #4
"Serge Rielau" <sr*****@ca.ibm.com> schrieb im Newsbeitrag
news:35*************@individual.net...
Joachim,

Sure it does. The V8.2 advisor handles Indexes, Partitioning, MQT, and
MDC. You sure you tried V8.2?
MDC will not improve your insert performance. It can help with delete
(rollout) and query.

Cheers
Serge
--
Serge Rielau
DB2 SQL Compiler Development
IBM Toronto Lab

Serge,
Yes, I tried V8.2 (V8.1 FP7). I will retry with FP8 and a refined workload.
But maybe I am using it wrong. Here is what I've done so far:
- captured the objects' DDL (tables and indexes) and statistics from the original tables via db2look
- recreated the objects on my laptop in a non-partitioned instance (maybe that's the problem)
- inserted a few thousand records and ran RUNSTATS to get entries in SYSCOLDIST etc., so that I could update them later
- updated the stats with the original values
- started the Design Advisor GUI and defined a workload consisting of 3 typical queries and 1 insert
- started the recommendations (a command-line equivalent of this step is sketched below)
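
For reference, a command-line equivalent of the advisor step might look like this; database and file names are placeholders, not my actual setup:

-- workload.sql holds the 3 queries and the INSERT, each terminated by ';'.
-- -m MICP asks for MQT (M), index (I), MDC (C) and partitioning (P) advice.
db2advis -d TESTDB -i workload.sql -m MICP -o advisor_ddl.sql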

I thought that MDC would not help with insert performance (more likely the opposite?), which is why we are considering LOAD FROM CURSOR (sketched below).
Our main goal is an improvement for roll-out; queries already perform well.
But the more information I find about MDC in a partitioned environment, the more I doubt that it will help us in this particular scenario.
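
For the roll-in side, the LOAD FROM CURSOR approach mentioned above would look roughly like this; the staging table, the column list, and the NONRECOVERABLE choice are assumptions for the sketch:

-- Run both statements in the same CLP session.
-- Declare a cursor over the staged daily rows, then LOAD them into the
-- fact table; NONRECOVERABLE avoids the backup-pending state that a
-- COPY NO load would otherwise leave on the table space.
DECLARE new_rows CURSOR FOR
  SELECT sale_date, region_id, cust_id, amount FROM staging.daily_sales;
LOAD FROM new_rows OF CURSOR
  INSERT INTO fact_sales
  NONRECOVERABLE;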

Thanks for your comments
Joachim
Nov 12 '05 #5
