Q: Analysis Services Cube Design

I have a (hopefully typical) problem when it comes to cube design. We
store millions of product records every year, broken down by
month/quarter. Each product can be assigned to various hierarchical
classification groups, etc. The data in an OLTP DB occupies roughly
100 GB for a typical year.

We're looking at breaking this out into OLAP to provide faster access
to the data in various configurations and groupings. This is not a
problem, as this is the intended use for Analysis Services.

The problem is that we apply projection factors to the product prices
and quantities. This would be fine if it happened only once; however,
it happens every quarter (don't ask why). The projection factors
change four times a year, and they affect all historical product
records.

This presents a challenge: to aggregate the data into a useful
configuration in the cubes, you throw out the detail data, but that
means throwing out the prices and quantities that are needed to apply
the projections.

So if you have Product A at $10 and Product B at $20, and roll both up
into Category X, you'll have $30, but you'll lose the ability to apply
a projection factor of .5 to Product A and .78 to Product B. They're
rolled up.
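
To make the arithmetic concrete, here is a toy sketch of the problem
(Python purely for illustration; the numbers are the ones above, and
the data layout is made up):

# Detail-level prices for the example products above.
detail_prices = {"Product A": 10.0, "Product B": 20.0}

# This quarter's projection factors, keyed by product.
factors = {"Product A": 0.5, "Product B": 0.78}

# Applying the factors at the detail level and then rolling up
# gives 10*0.5 + 20*0.78 = 20.6.
projected_total = sum(price * factors[product]
                      for product, price in detail_prices.items())

# Once the detail is aggregated into Category X, only the $30 total is
# left, and no single factor applied to 30 can reproduce 20.6 (or
# whatever next quarter's factors would require).
rolled_up_total = sum(detail_prices.values())

print(projected_total, rolled_up_total)  # ~20.6 vs 30.0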

I don't want to regenerate the cubes every 3 months. That's absurd.
But we can't live without the ability to project the prices/quantities
at the product (detail) level. So how can this be achieved when the
other cubes are created at a higher level, with fewer details and sums
of the detail data?

My initial guess is that we have to update the product data, and then
reaggregate all the other data that is built upon that product data.
Is there any other way to apply math to the data on the way out?

Thanks in advance!

Regards,

Zach
Jul 20 '05 #1
zn*****@hpis.net (znelson) wrote in message news:<f5**************************@posting.google.com>...

You might want to post this in microsoft.public.sqlserver.olap - you
may get a better answer there.

Simon
Jul 20 '05 #2
znelson wrote:

> This presents a challenge: to aggregate the data into a useful
> configuration in the cubes, you throw out the detail data, but that
> means throwing out the prices and quantities that are needed to apply
> the projections.
Not necessarily.

> So if you have Product A at $10 and Product B at $20, and roll both up
> into Category X, you'll have $30, but you'll lose the ability to apply
> a projection factor of .5 to Product A and .78 to Product B. They're
> rolled up.
They're rolled up, yes, but you can also have the detail data in the
cube, depending on your dimension structure. For example, if your
hierarchy has the actual products in it, i.e. your hierarchy in this
example would be:

Category X
    Product A
    Product B
Category Y
    Product C
    Product D
etc.

then the product-level detail is still present in the cube.
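
A rough sketch of why keeping the products as leaf members matters: the
projection factors can then be applied at the product level and rolled
up from there. (Python just to illustrate the idea; in Analysis
Services the equivalent would be the product level in the dimension
with a factor available per product. All names and numbers below are
made up.)

# Leaf-level facts, keyed by (category, product): (price, quantity).
facts = {
    ("Category X", "Product A"): (10.0, 100),
    ("Category X", "Product B"): (20.0, 50),
    ("Category Y", "Product C"): (15.0, 80),
}

# The current quarter's projection factors, keyed by product.
factors = {"Product A": 0.5, "Product B": 0.78, "Product C": 1.1}

# Apply each factor at the product (leaf) level, then roll up to the
# category level. Because the leaves are still available, a new set of
# quarterly factors only changes this calculation, not the stored
# detail.
category_totals = {}
for (category, product), (price, quantity) in facts.items():
    projected = price * quantity * factors[product]
    category_totals[category] = category_totals.get(category, 0.0) + projected

print(category_totals)
# Category X: 10*100*0.5 + 20*50*0.78 = 1280
# Category Y: 15*80*1.1 = 1320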

> I don't want to regenerate the cubes every 3 months. That's absurd.
If you want the historical data to reflect the new projections then I
don't see that you have a choice. Plus, you'll be updating your cube
with new sales data anyway.
> But we can't live without the ability to project the prices/quantities
> at the product (detail) level. So how can this be achieved when the
> other cubes are created at a higher level, with fewer details and sums
> of the detail data?
>
> My initial guess is that we have to update the product data, and then
> reaggregate all the other data that is built upon that product data.
> Is there any other way to apply math to the data on the way out?


Unfortunately I really don't understand enough about your scenario or
how the projections are used to be able to offer any suggestions beyond
what I've already mentioned above. I hope that helps you some.

Zach Wells

Jul 20 '05 #3
