Previously in our “What’s New in IBM Cognos 10.2” article, we touched upon the new IBM Cognos OLAP capability known as Dynamic Cubes. In this article, we will discuss what it is, when it is useful, and how to put it into practice.

What are Cognos Dynamic Cubes?

Cognos Dynamic Cubes supplement existing Cognos BI solutions and are designed to enable high-performance interactive analysis over terabytes of data contained in enterprise data warehouses. They leverage the existing Dynamic Query Layer to work with star or snowflake schema modeled data warehouses, providing in-memory relational OLAP functionality with aggregate awareness. Cognos Dynamic Cubes are tightly integrated into the overall Cognos Suite for a seamless user experience.
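
To make the in-memory ROLAP idea concrete, here is a minimal, purely illustrative sketch (all names are hypothetical — this is not Cognos internals): an engine aggregates a star-schema fact table once per grouping, then answers repeated dimensional queries from an in-memory data cache instead of returning to the warehouse.

```python
from collections import defaultdict

# Hypothetical star-schema fact rows: (year key, region key, measure value)
FACT_ROWS = [
    ("2023", "East", 100.0),
    ("2023", "West", 250.0),
    ("2024", "East", 175.0),
]

class MiniCube:
    """Toy in-memory cube: caches aggregated results per grouping."""

    def __init__(self, rows):
        self.rows = rows
        self.data_cache = {}   # grouping key -> aggregated result
        self.hits = 0
        self.misses = 0

    def query(self, group_by):
        """Sum the measure by the requested dimension indexes (0=year, 1=region)."""
        key = tuple(sorted(group_by))
        if key in self.data_cache:       # served from memory: no warehouse trip
            self.hits += 1
            return self.data_cache[key]
        self.misses += 1
        result = defaultdict(float)
        for row in self.rows:
            result[tuple(row[i] for i in key)] += row[-1]
        self.data_cache[key] = dict(result)
        return self.data_cache[key]

cube = MiniCube(FACT_ROWS)
by_year = cube.query([0])        # computed once from the fact rows...
by_year_again = cube.query([0])  # ...then answered from the data cache
```

The real product manages several distinct caches (data, result set, member) and far larger volumes, but the principle is the same: repeat requests are answered from memory rather than from the relational source.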

Why should I leverage Cognos Dynamic Cubes?

Cognos Dynamic Cubes were developed to address the challenge of high-performance, low-latency interactive analysis and reporting against terabytes of data. DMR (Dimensionally Modeled Relational) and OOR (OLAP Over Relational) solutions have traditionally performed well with low to medium data volumes; however, when applied against data sources with higher row counts (20–25 million rows and over), performance degrades and user satisfaction – a critical metric of Business Intelligence success – decreases.

How do Dynamic Cubes compare to other Cognos OLAP Technologies?

Cognos offers a variety of data solutions for different data requirements, with each technology designed to tackle specific data problems.

IBM Cognos TM1 – MOLAP Technology built to support write-back and handle medium data volumes. Aggregation happens on the fly, which can impact performance with very large data sets and high user volumes.

IBM Cognos PowerCubes – MOLAP Technology that makes use of pre-aggregation. The cube is “static”, in that it has no active connection to the data source. Designed to allow interactive analysis of data contained in operational/transactional data sources. Moving data into the cube introduces some data latency.

DMR – OOR Technology to allow for dimensional data exploration over low data volumes. Not suitable for medium to large data sets.

Dynamic Cubes – Allows the creation of a dynamic cube data source that is preloaded with dimensions. Optimal for interactive analysis over very large data sets. Aggregate aware, with extensive in-memory caching (through DQM) that offers solid performance. Minimizes data movement between relational data sources and the dynamic cubes engine. Works only with star or snowflake schema data sources.

What are the Requirements?

  • Memory – Because Dynamic Cubes store data in memory, sufficient server RAM is essential to support the application.
  • 64-bit Report Server enabled – Even on a 64-bit OS, the default setting for the report server is 32-bit. You will need to change this to 64-bit. Note that with 64-bit enabled you can only run DQM-enabled reports.
  • Cognos 10.2 or newer
  • Supported Databases (in the current 10.2 release) include: IBM DB2, IBM Netezza, Microsoft SQL Server, Oracle, Teradata
  • IBM Cognos Cube Designer
  • To better understand the hardware requirements for implementing Dynamic Cubes, refer to the Understanding Hardware Requirements for Dynamic Cubes guide available from IBM.

What is the Workflow for Implementing a Dynamic Cube?

After ensuring that the requirements in the previous section are met, you will need to leverage Cognos Cube Designer to start the workflow process. For those of you with experience using IBM Cognos Framework Manager and IBM Cognos Transformer (the OLAP modeling tool for Cognos PowerPlay cubes), you will find many similarities.

Set up your data source connection to your relational database

  • Dynamic Cubes will leverage the DQM (Dynamic Query Mode) data sources and connections that you have established in your Cognos BI environment.

Import metadata from your relational database source

  • This is the start of the modeling process, much like the process Transformer and Framework Manager users follow.

Model the Dynamic Cube:

  • This step of the workflow involves identifying measures, dimensions, keys, captions and levels, and then establishing the appropriate relationships between them.
  • This step also allows for the creation of calculated members through the use of expressions.

Model Aggregate Cubes:

  • If your imported relational data source includes aggregate tables, aggregate cubes can be modeled to provide aggregate awareness to your Dynamic Cube and redirect queries as appropriate, reducing latency.

Define Security

  • In this step, you define security for dimensions, measures or cube views.

Deploy your Dynamic Cube

  • In this step of the workflow, you publish your Dynamic Cube into your Cognos BI environment, providing appropriate credentials for the cube. Without appropriate credentials, the cube will not start.
  • You will also leverage IBM Cognos Administration to assign the Dynamic Cube to the Query Service.
  • Finally, the Dynamic Cube needs to be started before it can be leveraged.
  • A Quick-Deploy option is available from within Cube Designer; however, this may not be appropriate for all environments.

Maintenance and Optimization:

  • Once you’ve published your Dynamic Cubes and allowed your user community to access them, it’s important to monitor their performance. You can leverage the Metrics window within IBM Cognos Administration to monitor key metrics, including:

  • Average Successful Request Time – this outlines the average time for a successful report execution
  • Data Cache Hit Rate – this profiles the utilization of your data cache and response time performance of requests against your Dynamic Cube
  • Result Set Cache Hit Rate – this metric profiles result set cache reuse; a higher hit rate means more of your report results can be reused.
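
The two hit-rate metrics above reduce to the same simple ratio: the fraction of requests served from cache. A small sketch of that calculation (the counter names here are illustrative, not actual Cognos metric identifiers):

```python
def hit_rate(hits, misses):
    """Fraction of requests served from cache; 0.0 when no requests seen."""
    total = hits + misses
    return hits / total if total else 0.0

# e.g. 1,800 of 2,000 data-cache lookups served from memory = 90% hit rate
data_cache_rate = hit_rate(1800, 200)
```

A persistently low hit rate suggests the cache is undersized for the workload, or that the workload's grouping patterns are too varied for the aggregates currently in place.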

Optionally, once you’ve created and published your Dynamic Cube, you can leverage the Aggregate Advisor tool within IBM Cognos Dynamic Query Analyzer. It reviews usage profiles from the Dynamic Cube logs and recommends additional aggregate tables that can be created in your relational data source to optimize performance.

You should run the Aggregate Advisor periodically to verify that the aggregate tables you have created still match the needs of your user community, as reflected in the logs generated by the Dynamic Cubes.
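
Conceptually, this kind of advisor mines the workload for frequently requested grouping-column combinations that have no matching aggregate yet. The sketch below illustrates that idea under stated assumptions (a hypothetical log format; this is not the actual Aggregate Advisor algorithm):

```python
from collections import Counter

# Hypothetical workload log: the grouping columns of each report query.
WORKLOAD = [
    ("year", "region"),
    ("year", "region"),
    ("year", "product"),
    ("year", "region"),
]
# Grains already covered by an aggregate table in the warehouse.
EXISTING_AGGREGATES = {frozenset({"year", "product"})}

def recommend(workload, existing, min_count=2):
    """Suggest grouping sets queried at least min_count times with no aggregate yet."""
    counts = Counter(frozenset(g) for g in workload)
    return [set(g) for g, n in counts.most_common()
            if n >= min_count and g not in existing]

suggestions = recommend(WORKLOAD, EXISTING_AGGREGATES)
# the frequently queried (year, region) grain is flagged as a candidate
```

Re-running this kind of analysis as the workload drifts is exactly why the article recommends running the Aggregate Advisor on a periodic basis.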

What are the Licensing Requirements?

According to IBM, no additional license roles need to be purchased in order to use Dynamic Cubes. Existing roles such as “Administrator” and “Modeler” apply as usual, as Cognos Dynamic Cubes is part of the BI query layer.

For organizations on a PVU pricing model, be aware that there may be an impact on the number of cores in use, as this is a memory-intensive technology and a larger server may be needed to support a growing application.