Discussion:
Could a cube calculation be cached somehow?
Knut Ivar
2007-03-13 08:35:08 UTC
Hi,
I have a benchmark calculation that shows the values for each Group in a
given Period, e.g. Group A in the period 2006m10.

This calculation is heavy and runs very slowly. My question is not about
optimizing the query itself, but whether it is possible to cache the value
from this query somewhere, or somehow, in AS2005. When a producer first
produces a report for this group for that specific period, I would like to
cache the value and reuse it when other users later run reports that use the
same calculated benchmark value; the query would then run significantly
faster.

The query can take approximately 1 minute, but when I run it again it gets
the value from the cache, and that takes approximately 0-1 seconds.

Thanks for any suggestions.

Kind Regards
Knut Ivar
Peter Yang [MSFT]
2007-03-14 02:15:21 UTC
Hello Knut,

I understand that you'd like to know whether it is possible to cache cube
calculations somehow, so that clients can get the report more quickly after
the first query. If I'm off base, please let me know.

Analysis Services does not save calculated results in the query results
cache, so these calculations must be re-executed each time they are
requested. Simple calculations are performed very quickly, but when complex
calculations are involved, Analysis Services may consume a significant
amount of time resolving the queries that invoke them.

I suggest that you use the following guidelines when working with complex
query-time calculations:

• Ensure that the additional functionality provided through the use of
complex query time calculations justifies the cost in query responsiveness.
In particular, it is best whenever possible to avoid calculations that
recursively operate on cells or members, or scan large numbers of members
to obtain the desired result.

• When you use a calculated member on the Measures dimension, set the Non
Empty Behavior property to the name of a measure, in order to treat the
calculated measure as empty if the specified measure is empty (see the
sketch after this list). Otherwise, computations that include the NON EMPTY
keyword can be slow and consume a lot of client resources, because the
calculation is performed for every row or column just to determine whether
the result is empty. On sparse cubes, the computation time can be cut
dramatically.

• Consider shifting the performance impact of complex calculations to the
processing phase. You can accomplish this by using views or computed
columns in the relational database or by using SQL queries to define
computed columns in the Measures dimension.
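
For example, here is a minimal sketch of the Non Empty Behavior guideline
(the measure names are made up for illustration and are not from your cube):

// Treat the calculated measure as empty whenever [Sales Count] is empty,
// so NON EMPTY can skip evaluating the calculation for those cells.
CREATE MEMBER CURRENTCUBE.[MEASURES].[Avg Sales Price] AS
    [Measures].[Sales Amount] / [Measures].[Sales Count],
    NON_EMPTY_BEHAVIOR = { [Measures].[Sales Count] },
    FORMAT_STRING = "### ###.0", VISIBLE = 1;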

You can find the above information in the following articles:

Microsoft SQL Server 2000 Analysis Services Performance Guide
http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/ansvcspg.mspx

Microsoft SQL Server 2005 Analysis Services Performance Guide
http://download.microsoft.com/download/8/5/e/85eea4fa-b3bb-4426-97d0-7f7151b2011c/SSAS2005PerfGuide.doc

You may want to consider caching the query result at the application level,
so that it can be re-used if the same query is submitted to the application
again. For example, Reporting Services can cache a report whose backend data
source is Analysis Services. If the report parameters are the same, the
report server may return the result to clients directly from the cache,
without performing data retrieval and report processing again. This can
improve reporting performance when there are similar report requests.

Report Caching in Reporting Services
http://msdn2.microsoft.com/en-us/library/ms155927.aspx

Best Regards,

Peter Yang
MCSE2000/2003, MCSA, MCDBA
Microsoft Online Community Support
This posting is provided "AS IS" with no warranties, and confers no rights.
Knut Ivar
2007-03-14 12:09:16 UTC
Thank you, Peter, those are some great suggestions.

This is the kind of benchmark calculation I struggle with. It looks like
this:
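// Sorts all HarvestAnimalID leaf members by PricePerAnimal (descending),
// keeps the top members until their cumulative total reaches 10% of the
// whole, then averages PricePerAnimal over that set -- TopPercent must
// scan every leaf member to do this.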
CREATE MEMBER CURRENTCUBE.[MEASURES].PricePerAnimalTop10pctCompany AS
    avg(
        toppercent(
            ([Producer].[RegionProducer].[All],
             [ProducerGroup].[ProducerGroup].[All],
             [HarvestAnimal].[HarvestAnimalID].[HarvestAnimalID].members),
            10,
            [Measures].[PricePerAnimal]),
        [Measures].[PricePerAnimal]),
    FORMAT_STRING = "### ###.0", VISIBLE = 1;

FACT TABLE:
HarvestAnimalID  ProducerID  DeliveryDate  PricePerAnimal  ProducerGroupID
15322            781122      2006-08-24    840,11          2
67857            728122      2006-08-24    840,11          4
88802            919925      2006-09-18    837,05          3
36267            999125      2006-09-18    837,05          2
88942            781705      2006-09-18    1128,83         1
85014            761212      2006-09-14    1118,39         4
.... this fact table could contain millions of rows ....
> In particular, it is best whenever possible to avoid calculations that
> recursively operate on cells or members, or scan large numbers of members
> to obtain the desired result.
I think that could explain why the calculation is slow: it drills down to
the lowest level, the fact table. And I probably cannot expect it to be
fast either, since there is no caching of the result.

So my next step is to build a workaround for this slow calculation. My
thought is to create an assembly and let the calculated benchmark member get
its value via a database lookup against a precalculated result table in the
SQL database. That result table could contain my benchmark values, such as
the price for the best 10%, and so on. Do you think that could be an option
for us? For example, something like the sketch below.
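
If the precalculated result table were instead added to the cube as its own
measure group, the benchmark member could become a plain lookup. A rough
sketch (the [Precalc Top10 Price] measure name is just an assumption, it
presumes such a measure group exists):

// All the heavy TopPercent/Avg work happens up front, outside the cube;
// the calculated member only reads the precalculated value.
CREATE MEMBER CURRENTCUBE.[MEASURES].PricePerAnimalTop10pctCompany AS
    [Measures].[Precalc Top10 Price],
    FORMAT_STRING = "### ###.0", VISIBLE = 1;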


Thank you

Kind Regards
Knut Ivar
Peter Yang [MSFT]
2007-03-15 06:12:42 UTC
Hello Knut,

Thank you for your reply and the additional information explaining your
concerns.

I agree with you that pre-calculation/aggregation at the source table level
should benefit performance a lot in your situation. However, this will
certainly add load during the pre-calculation/aggregation step.

Please let us know if you have any further feedback or comments. Thanks.

Regards,

Peter Yang
MCSE2000/2003, MCSA, MCDBA
Microsoft Online Partner Support

This posting is provided "AS IS" with no warranties, and confers no rights.