
Automate documentation of SQL Server Analysis Server Tabular Model

March 13, 2018 by Divya Agrawal

Documenting things has always been uncomfortable, especially when the system is dynamic in nature, and I was once in exactly that position. We were developing a large number of SQL Server Analysis Services Tabular models, and documenting 50 or 60 models manually is a big effort. In addition, development constantly demands changes and enhancements, each of which must be reflected in the documentation, making it a continuous process. Updating the documentation by hand is time consuming, and it can easily fall behind or become inconsistent with the code, which is a critical issue when the system is in production for years.

I decided to automate the documentation of the SSAS models, both to ease the manual effort and to save time. It also helps end users see and review the latest changes to the models quickly, without involving the development team.

Solution

I used C#.NET to build the documentation tool, referring to MSDN for the libraries used to fetch the detailed properties of the model. Microsoft provides the Tabular Object Model (TOM) library, which extracts all the metadata and properties of a model from SQL Server 2016 onwards; it is an extension of the AMO library used to extract Multidimensional cube metadata. The library ships in the Microsoft.AnalysisServices.Tabular.dll assembly.

Overview

An SSAS Tabular model is a database that runs in-memory, or in DirectQuery mode accessing data directly from backend relational data sources. The model database is essentially a JSON-based object definition that can be accessed via the TOM object library.

Logically, in a Tabular model everything is driven from a Model object, which acts as the root and is in turn a descendant of a Database (the same as in Multidimensional). The different objects exposed via the TOM library are described on MSDN.

As part of the design documentation, we mainly need four objects:

  1. Tables and their relationships with other tables
  2. Partitions and their corresponding tables
  3. Columns and their corresponding tables
  4. Perspectives and their corresponding tables

Input

Now, what should be the input to the tool? To make the tool friendly for any end user, the input should be as simple as the name of the Tabular model to be documented, together with the name of the server where the model resides. It should also be able to generate documentation for multiple models at one time:

  1. Server Name
  2. Model Names in a csv file

Internal Process Overview

Once we receive the input, the code steps are as follows.

Get server connectivity for the models in order to read the metadata.

  1. Connect to the server provided. Ensure the user has sufficient access to the server.
    1. Create a new server object
    2. Use the Connect method to connect to the server, where the input parameter is “Provider=MSOLAP;Data Source=<servername>”
  2. Iterate over the models one by one and extract the four main objects mentioned above
    1. Read the input csv file in which the names of the models to be documented are provided
    2. Iterate over the model names and find each model on the server
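A minimal sketch of these steps, assuming a console tool that takes the server name and the CSV path as command-line arguments (the argument positions and class name are illustrative, not the original tool's interface):

```csharp
using System;
using System.IO;
using Microsoft.AnalysisServices.Tabular;

class ModelDocumenter
{
    static void Main(string[] args)
    {
        string serverName = args[0];   // e.g. "localhost\\TABULAR"
        string csvPath = args[1];      // CSV file: one model name per line

        var server = new Server();
        // Connect using the MSOLAP provider, as described in step 1.2
        server.Connect("Provider=MSOLAP;Data Source=" + serverName);

        foreach (string line in File.ReadAllLines(csvPath))
        {
            string modelName = line.Trim();
            Database db = server.Databases.FindByName(modelName);
            if (db == null)
            {
                Console.WriteLine($"Model '{modelName}' not found on {serverName}");
                continue;
            }
            Model model = db.Model;
            // ... extract tables, partitions, columns and perspectives here ...
        }
        server.Disconnect();
    }
}
```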

Tables and their relationships with other tables

To get the first main output report, for tables, follow the steps below:

  1. Once the model name is available, iterate over the tables present in the model.
    1. Fetch all the tables in the model using the mdl.Model.Tables.ToArray() property and iterate over them one by one using a Table object.
    2. Get the logical table name, i.e. the friendly or displayed name given to the table in the model, using the <Table object>.Name property.
    3. Get the table description, which can serve as a data dictionary definition for the table, using the <Table object>.Description property.
    4. Get the physical (actual) table name from the annotations on the model using the <Table object>.Annotations[“_TM_ExtProp_DbTableName”].Value property.
    5. Get the source query of the physical table used in the model using the <Table object>.Annotations[“_TM_ExtProp_QueryDefinition”].Value property. Get whether the table is hidden using the <Table object>.IsHidden property.

    6. The complete code for these steps is shown below

    7. If the table is a Calculated Table, use the partition source property
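Put together, the table-level steps above can be sketched as follows (a sketch, not the original code; the null-propagating Find calls are my addition, to guard against missing annotations):

```csharp
// Inside the per-model loop: "model" is the Model object of the current database
foreach (Table tbl in model.Tables)
{
    string logicalName = tbl.Name;           // friendly / displayed table name
    string description = tbl.Description;    // data dictionary definition
    bool isHidden = tbl.IsHidden;

    // Physical table name and source query live in annotations
    string physicalName = tbl.Annotations.Find("_TM_ExtProp_DbTableName")?.Value;
    string sourceQuery = tbl.Annotations.Find("_TM_ExtProp_QueryDefinition")?.Value;

    // Calculated tables carry their DAX definition on the partition source instead
    if (sourceQuery == null && tbl.Partitions[0].Source is CalculatedPartitionSource calc)
        sourceQuery = calc.Expression;
}
```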
  2. Get all the relationships of the table by iterating over the Model.Relationships property.
    1. Compare the relationship’s “To” table name with the logical table name extracted in step 1.
    2. If they match, get the relationship name and the “From” table name using the <relationship object>.Name and <relationship object>.FromTable.Name properties.
    3. Get the physical table name of the “From” table using the <relationship object>.FromTable.Annotations[“_TM_ExtProp_DbTableName”].Value property.
    4. Get the logical key name of the “From” table used in joining the tables using the <relationship object>.FromColumn.Name property.
    5. Get the physical key name of the “From” table used in the join, casting according to the column type (Data or Calculated Table):
    6. ((Microsoft.AnalysisServices.Tabular.DataColumn)rel.FromColumn).SourceColumn
    7. or ((Microsoft.AnalysisServices.Tabular.CalculatedTableColumn)rel.FromColumn).SourceColumn
    8. Repeat steps 2 to 7 for the “To” table properties.
    9. Get the cardinality of the relationship using the <relationship object>.FromCardinality and <relationship object>.ToCardinality properties.
    10. One important property of a relationship is the join type, based on the <relationship object>.RelyOnReferentialIntegrity property: if the value is true, the join is an “Inner Join”; otherwise it is an “Outer Join”.
    11. Get whether the relationship is active using the <relationship object>.IsActive property.

Complete code for getting the relationships is as follows
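As before, the following is a sketch rather than the original code; it assumes the model's relationships are single-column relationships, which is what the casts in the steps above imply:

```csharp
// Inside the per-model loop; requires "using System.Linq;"
foreach (SingleColumnRelationship rel in
         model.Relationships.OfType<SingleColumnRelationship>())
{
    string name = rel.Name;
    string fromTable = rel.FromTable.Name;
    string toTable = rel.ToTable.Name;

    // Physical key name depends on the column type (data vs. calculated table column)
    string fromKey = rel.FromColumn is DataColumn dc
        ? dc.SourceColumn
        : (rel.FromColumn as CalculatedTableColumn)?.SourceColumn ?? rel.FromColumn.Name;

    string joinType = rel.RelyOnReferentialIntegrity ? "Inner Join" : "Outer Join";
    bool isActive = rel.IsActive;
    var fromCardinality = rel.FromCardinality;   // One or Many
    var toCardinality = rel.ToCardinality;
}
```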

Partitions and their corresponding tables

Once the table metadata is extracted, we need to go one level further down, to the partitions. Let’s look at the properties required.

  1. For each table in the model there exists at least one partition; there may be more than one.
  2. In the same iterative loop over tables, after a single table is fetched, get all the partitions of the table using the <Table object>.Partitions property
    1. Get the properties of a partition, such as the partition name, using the <Partition object>.Name property
    2. Get the partition source type, which can be one of three kinds, using the <Partition object>.SourceType property
      1. Query – Data in this partition is retrieved by executing a query against a DataSource. The DataSource must be a data source defined in the model.bim file.
      2. Calculated – Data in this partition is populated by executing a calculated expression.
      3. None – Data in this partition is populated by pushing a rowset of data to the server as part of the Refresh operation.
    3. If the partition has a query source, get its query using ((Microsoft.AnalysisServices.Tabular.QueryPartitionSource)<Partition object>.Source).Query

Complete code for step 2 is as below
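A sketch of the partition loop under the same assumptions as before:

```csharp
// Inside the per-table loop: "tbl" is the current Table object
foreach (Partition p in tbl.Partitions)
{
    string partitionName = p.Name;
    string sourceType = p.SourceType.ToString();   // Query, Calculated or None

    string definition = null;
    if (p.Source is QueryPartitionSource qps)
        definition = qps.Query;            // SQL sent to the backend data source
    else if (p.Source is CalculatedPartitionSource cps)
        definition = cps.Expression;       // DAX expression for calculated partitions
}
```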

Columns and their corresponding tables

The next level is to get the column definitions present in the table.

  1. Columns come in two flavors – columns proper, i.e. the attributes of a table, and measures, i.e. summarized or calculated values of a column, typically in a fact table.
  2. Once we have a table of the model, in the same iterative loop we can fetch all its columns using the <Table object>.Columns property.
    1. Get the column properties based on the column type:
      1. DataColumn – for regular columns in regular tables
      2. CalculatedColumn – for columns backed by a DAX expression
      3. CalculatedTableColumn – for regular columns in calculated tables
      4. RowNumberColumn – a special type of column created internally by SSAS for every table
    2. Based on the column type, fetch the physical column name using the ((Microsoft.AnalysisServices.Tabular.DataColumn)<Column object>).SourceColumn property; change the cast to CalculatedColumn or CalculatedTableColumn according to the column type.
    3. Get the logical column name, i.e. the friendly name given to the column, using the <Column object>.Name property.
    4. Get the display folder, a logical grouping of attributes shown as a folder when the model is browsed, using the <Column object>.DisplayFolder property.
    5. Get the format string, source type, and aggregation properties using the FormatString, SourceProviderType, and SummarizeBy properties.
    6. Get the basic properties such as description, hidden flag, and data type using the Description, IsHidden, and DataType properties.

  3. Complete code for step 2 is as follows
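A sketch of the column loop, skipping the internal RowNumber column, which has no user-facing metadata:

```csharp
// Inside the per-table loop: iterate the columns of the current table
foreach (Column col in tbl.Columns)
{
    if (col.Type == ColumnType.RowNumber)
        continue;   // internal SSAS column, not worth documenting

    // Physical column name (or DAX formula) depends on the column type
    string physicalName;
    switch (col)
    {
        case DataColumn dc:            physicalName = dc.SourceColumn; break;
        case CalculatedTableColumn tc: physicalName = tc.SourceColumn; break;
        case CalculatedColumn cc:      physicalName = cc.Expression;   break;
        default:                       physicalName = col.Name;        break;
    }

    string logicalName = col.Name;
    string displayFolder = col.DisplayFolder;
    string formatString = col.FormatString;
    string dataType = col.DataType.ToString();
    string summarizeBy = col.SummarizeBy.ToString();
    string description = col.Description;
    bool isHidden = col.IsHidden;
}
```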

  4. After the columns, get the measure metadata from the table object by iterating over the <Table object>.Measures property.
  5. Extract all the properties, such as measure name, data type, display folder, expression (the calculation used), format string, description, and hidden flag, using the code below.
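The measure loop might look like this (again a sketch, using the TOM properties named above):

```csharp
// Inside the per-table loop: measures of the current table
foreach (Measure m in tbl.Measures)
{
    string name = m.Name;
    string expression = m.Expression;        // the DAX calculation
    string dataType = m.DataType.ToString();
    string displayFolder = m.DisplayFolder;
    string formatString = m.FormatString;
    string description = m.Description;
    bool isHidden = m.IsHidden;
}
```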

This completes the table-level iterative loop; continue the loop until the metadata of all the tables has been extracted and saved in a DataTable. The DataTable is then exported to Excel.

Perspectives and their corresponding tables

After the tables, we can extract the perspectives of the model, which shows which table belongs to which perspective. Here we extract only the tables in each perspective, but this can be extended to also get the columns exposed in the perspective.

  1. Iterate over all the perspectives of the model using the <Model object>.Perspectives property
  2. Get the table names in each perspective using the <Perspective object>.PerspectiveTables property

Here is the complete code to get the perspectives
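A sketch of the perspective loop:

```csharp
// Inside the per-model loop
foreach (Perspective p in model.Perspectives)
{
    foreach (PerspectiveTable pt in p.PerspectiveTables)
    {
        Console.WriteLine($"{p.Name}\t{pt.Name}");
        // pt.PerspectiveColumns could be iterated here to also document
        // the columns exposed in the perspective
    }
}
```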

Output

I extracted all four main objects above into different DataTables and then exported them to Excel as separate worksheets, which provides good documentation for end users. There are many references available for writing data to Excel; my version is as follows.

Create a method to write data to Excel that takes a DataTable as input and writes each to a different sheet.
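The original export code is not shown here; as one possible approach, here is a sketch using the open-source ClosedXML library (my choice of library, not necessarily the author's), which can add a DataTable directly as a worksheet:

```csharp
using System.Data;
using ClosedXML.Excel;

static class ExcelWriter
{
    // Writes each DataTable to its own worksheet, named after DataTable.TableName
    public static void WriteToExcel(string filePath, params DataTable[] tables)
    {
        using (var workbook = new XLWorkbook())
        {
            foreach (DataTable dt in tables)
                workbook.Worksheets.Add(dt, dt.TableName);
            workbook.SaveAs(filePath);
        }
    }
}
```

Called, for example, as ExcelWriter.WriteToExcel("ModelDoc.xlsx", tablesDt, columnsDt, partitionsDt, perspectivesDt), where each argument is one of the DataTables built above (the variable names are illustrative).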

Screenshot sample of the output file

Sheet Model Tables – table information and their relationships

Sheet Model Columns – column information with their corresponding tables

Sheet Partitions – partition information with their corresponding tables

Sheet Perspectives – perspectives and their corresponding tables

Conclusion

Automating the documentation of the model into an Excel file makes it very easy for any type of user to view the model and use it accordingly. The tool is built entirely from references available on MSDN and can be enhanced further to add more objects and properties.

Feel free to contact me for the complete code of the tool, or to suggest any feedback.

See more

For SSAS cube documentation, consider ApexSQL Doc, a tool that offers the possibility of documenting both Multidimensional and Tabular databases in different output formats.




About Divya Agrawal

Divya has extensive experience in SQL Server database and Business Intelligence technologies, from design to development. She is passionate about accelerating development and has contributed many user-friendly and configurable utilities. She has designed effective solutions and resolved business-critical issues using skilled SQL queries and techniques.
