After many years of helping companies around the world create both small and large data models with SQL Server Analysis Services Tabular, I have seen a common performance issue that is underestimated at design time. The VertiPaq engine in SSAS Tabular is amazingly fast: you can have billions of rows in a table and query performance is still excellent. However, under certain conditions, queries over tables with just a few million rows can be very slow. Why is that?

Sometimes the problem is caused by DAX expressions that can be optimized. But if the problem is in the storage engine (something you can measure easily with DAX Studio), then you might be in bigger trouble. However, if you are not materializing too much (a topic for another post, or for the Optimizing DAX course), chances are that you are paying the price of expensive relationships in your data model.
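
To see where the time goes, you can run the query in DAX Studio with Server Timings enabled and compare storage engine (SE) and formula engine (FE) time. The query below is just a minimal sketch, assuming a hypothetical model with a Sales fact table related to a Customer dimension through a CustomerKey column; if most of the elapsed time is SE time spent scanning Sales through the relationship, optimizing the DAX expression alone will not help much.

```dax
-- Hypothetical model: Sales[CustomerKey] -> Customer[CustomerKey].
-- Run in DAX Studio with Server Timings enabled: a high share of
-- Storage Engine (SE) time points to the relationship traversal,
-- not to the DAX expression itself.
EVALUATE
ADDCOLUMNS (
    VALUES ( Customer[Country] ),
    "Total Sales", CALCULATE ( SUM ( Sales[SalesAmount] ) )
)
```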

The rule of thumb is very simple: a relationship using a column that has more than 10 million unique values will likely be slow (hopefully this will improve in future versions of Analysis Services – this information is correct for SSAS 2012/2014). You might observe slower performance starting at 1 million unique values in the column defining the relationship. As a consequence, if you have a star schema and a large dimension, you have to consider specific optimizations (watch my session at Microsoft Ignite to get some hints about that).
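
A quick way to check whether a relationship falls into this range is to count the unique values of the column that defines it. Again, this is only a sketch based on the same hypothetical Customer dimension keyed on CustomerKey; a result in the millions is a warning sign.

```dax
-- Cardinality of the hypothetical relationship column Customer[CustomerKey].
-- A count above a few million suggests that queries crossing this
-- relationship may pay a noticeable price in the storage engine.
EVALUATE
ROW (
    "CustomerKey cardinality", DISTINCTCOUNT ( Customer[CustomerKey] )
)
```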

If you want to know more, read my article on SQLBI about the Costs of Relationships in DAX, with a more complete discussion of the problem and a few measurements of the timings involved.