Over the last two years I have worked on advanced modeling with Analysis Services. One result of this was the paper “The many-to-many revolution” I released in September 2006. Since then, I have received a lot of feedback and I have been working on other evolutions that I will publish in the future (I need to find the time to write!). However, there is a background question that I would like to discuss with other Analysis Services developers. I hope that some of my blog readers will contribute to the discussion.

The general question is: is it better to give the end user more analysis features by adding attributes and measures to the UDM model on the server, or by defining client-side MDX formulas?

Let’s look at a few examples just to explain the need. Imagine a user who wants to see the year-to-date value of a measure, or the year-over-year percentage difference. This seems like a basic requirement of any analysis, but it doesn’t come for free. Assuming the user doesn’t know MDX, you need either an OLAP client able to generate the necessary MDX query or some calculated members on the server that offer the required feature. The Time Intelligence Wizard provides one way (with many limitations, in my opinion) to obtain the desired result. I use a more sophisticated and flexible approach (requiring fewer calculated members on the server) that could be the subject of a future post, but it is not relevant to this discussion.
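To make the server-side option concrete, here is a minimal sketch of what such calculated members might look like in a cube’s MDX Script. The dimension, hierarchy, and measure names ([Date].[Calendar], [Calendar Year], [Sales Amount]) are assumptions in the style of the Adventure Works sample cube; adapt them to your own model. The YTD function also assumes the [Date] dimension is properly typed as a time dimension.

```mdx
// Hypothetical year-to-date measure, assuming a [Date].[Calendar]
// hierarchy and a [Sales Amount] measure
CREATE MEMBER CURRENTCUBE.[Measures].[Sales Amount YTD] AS
    Aggregate(
        YTD( [Date].[Calendar].CurrentMember ),
        [Measures].[Sales Amount]
    ),
    FORMAT_STRING = "Currency";

// Hypothetical year-over-year percentage difference, guarding
// against division by zero in the prior-year value
CREATE MEMBER CURRENTCUBE.[Measures].[Sales Amount YoY %] AS
    IIF(
        ( ParallelPeriod( [Date].[Calendar].[Calendar Year] ),
          [Measures].[Sales Amount] ) = 0,
        NULL,
        ( [Measures].[Sales Amount]
          - ( ParallelPeriod( [Date].[Calendar].[Calendar Year] ),
              [Measures].[Sales Amount] ) )
        / ( ParallelPeriod( [Date].[Calendar].[Calendar Year] ),
            [Measures].[Sales Amount] )
    ),
    FORMAT_STRING = "Percent";
```

Once defined in the MDX Script, these members show up in every client automatically, which is exactly the trade-off discussed below.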

We can compare the pros and cons of each approach.

  • Server-side calculated members / MDX Scripts
    • Pros
      • Full availability on any client
      • One single source of the “truth”
      • Query execution engine cache can be shared by different client sessions
    • Cons
      • Must be defined in advance
      • Might raise the complexity of the model
  • Client-side MDX formulas / calculated members
    • Pros
      • Higher flexibility (the user chooses what she wants whenever she wants)
      • Simpler cube model (simplifies cube maintenance)
    • Cons
      • Query execution engine cache cannot be shared by different queries
      • Users might define similar calculations with the same name but different formulas, losing the single source of the “truth”
      • Requires an OLAP client able to define calculated members (Excel has many limitations in this area)
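For comparison, the client-side alternative would be a query-scoped calculated member defined in a WITH clause, generated by the OLAP client on the user’s behalf. Again, the cube, hierarchy, and measure names are hypothetical, borrowed from the Adventure Works sample:

```mdx
// Hypothetical client-generated query: the YTD member exists only
// for the duration of this query and its cache cannot be shared
WITH MEMBER [Measures].[Sales Amount YTD] AS
    Aggregate(
        YTD( [Date].[Calendar].CurrentMember ),
        [Measures].[Sales Amount]
    )
SELECT
    { [Measures].[Sales Amount],
      [Measures].[Sales Amount YTD] } ON COLUMNS,
    [Date].[Calendar].[Month].Members ON ROWS
FROM [Adventure Works]
```

Note that a WITH MEMBER is scoped to the single query, which is why the formula engine cache cannot be reused across queries, as listed in the cons above.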

It is clear that I’m a strong supporter of the “put it in the model” approach. I can better control optimization and performance, and I limit the need to train end users. That said, I feel that sometimes this approach is very expensive. For example, when you have several Date/Time dimensions (normally implemented as role-playing dimensions) it can be hard to provide a consistent set of calculations on the server (each role-playing dimension is potentially a multiplier of the resulting calculated members).
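The multiplier effect is easy to see in a sketch. Assuming hypothetical role-playing dimensions named [Order Date] and [Ship Date], a single YTD calculation has to be repeated once per role (and likewise for every other time-intelligence calculation):

```mdx
// One YTD member per role-playing Date dimension:
// N calculations x M roles = N*M calculated members
CREATE MEMBER CURRENTCUBE.[Measures].[Sales YTD by Order Date] AS
    Aggregate(
        YTD( [Order Date].[Calendar].CurrentMember ),
        [Measures].[Sales Amount]
    );

CREATE MEMBER CURRENTCUBE.[Measures].[Sales YTD by Ship Date] AS
    Aggregate(
        YTD( [Ship Date].[Calendar].CurrentMember ),
        [Measures].[Sales Amount]
    );
```

With, say, five calculations and three role-playing dimensions, the measure list already grows by fifteen members, which is where the model complexity discussed below comes from.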

Please note that raising model complexity is often the result of improvements that provide a better and simpler end-user experience. What I find difficult is finding the right balance. An overly complex model can have an undesired side effect: nobody dares to change anything in the model, resulting in new (and redundant) cubes when new requirements arise and the original developer has moved on to another project.

What do you think about it? What is your experience? Do you weigh the pros and cons differently?

Let me know… your comments are welcome.

NOTE: Meet us at SQLBI booth #532 at the Microsoft BI Conference this week!