
IBM Cognos Dynamic Cubes for Existing PowerPlay Customers

February 3, 2014 / by Ironside Group

You may have heard about the new Dynamic Cubes feature available in IBM Cognos. New technology is great, but if you have a library of existing PowerPlay cubes in your environment, you need to decide if it makes sense to migrate these cubes to Dynamic Cubes. Where does this new technology fit in your environment? What should you use to develop new cube capabilities if you choose to pursue them? This article will lay out the pros and cons of each approach in use case form to help you decide how best to leverage Cognos in your organization.

First of all, why use a cube technology at all? The answer is speed. Cubes are, at their most basic, a data caching technology enabling “analysis at the speed of thought” ¹. This means that cubes allow very fast reporting and analysis, such as drill-down and “slice and dice”. However, both PowerPlay cubes and Dynamic Cubes will do this quite nicely. For this reason, it is important to consider the differences and similarities between the two to find out which structure best supports your goals.

Cube Architecture Comparison

Let’s take a look at the architecture of both options.

PowerPlay cubes are designed and built using Transformer. Transformer can read a large number of data sources and types and then build one or more PowerPlay cubes based on the same general design. These cubes are files that reside on the BI servers or network drives in your environment, and Cognos reads them from disk; a single cube can be read by all servers within a multi-server environment. Cubes are somewhat limited in size, with a practical maximum of roughly 25 million source rows or 6 GB per cube (although source-row count is not the best measure of a cube’s size and complexity). PowerPlay cubes can also provide additional levels of reporting through a series of cubes (including cube groups) and cube-to-cube drill-through. After a cube is built, the data within it remains static until the cube is built again. While this can cause your data to drift from the source systems, it can occasionally be useful for archiving a view of the data as of a certain point in time if your data warehouse lacks a well-constructed archiving system.

PowerPlay cubes can be time-consuming to build, with 4-6 hour cube builds being fairly common. For this reason, many organizations use dedicated Transformer servers to load balance.

In contrast, Dynamic Cubes are stored in memory, residing on the Cognos report servers. Because they sit at this point in the infrastructure, they can both take advantage of data warehouse summary tables and recommend new ones based on actual usage. Holding this position also means that Dynamic Cubes require an underlying star schema database from one of the major database vendors to function correctly.
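To make the star schema requirement concrete, here is a minimal sketch of the shape Dynamic Cubes expect: one fact table keyed into surrounding dimension tables. All table and column names here are hypothetical, and SQLite stands in for whichever major database vendor you actually use.

```python
import sqlite3

# Hypothetical minimal star schema: a fact table with foreign keys into
# dimension tables. Dynamic Cubes model directly against tables like these.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, product_name TEXT, color TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales  (
    product_id   INTEGER REFERENCES dim_product(product_id),
    date_id      INTEGER REFERENCES dim_date(date_id),
    sales_amount REAL
);
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Red"), (2, "Gadget", "Blue")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20140101, 2014, 1), (20140201, 2014, 2)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20140101, 100.0), (1, 20140201, 50.0), (2, 20140101, 75.0)])

# A typical slice the cube engine answers from its in-memory cache:
# total sales by product across all dates.
rows = cur.execute("""
    SELECT p.product_name, SUM(f.sales_amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.product_name ORDER BY p.product_name
""").fetchall()
print(rows)  # [('Gadget', 75.0), ('Widget', 150.0)]
```

If your warehouse is not already organized this way, expect some restructuring work before Dynamic Cubes becomes an option.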

Because you get the best performance when all queries against a cube share the same cache, you typically deploy a cube onto a specific dispatcher. 64-bit dispatchers are recommended for large cubes, as they provide the best performance. If your environment includes relational, compatibility-mode (32-bit) reporting, which is common, you’ll need a mix of 32-bit and 64-bit dispatchers, plus some management around routing queries to the appropriate dispatcher.

Since the cubes reside in memory, the BI servers need a great deal of RAM, along with 64-bit operating systems and Cognos installations, in order to address the memory space. The benefit is very fast query speed against very large (terabyte-scale) cubes. This is not to say that the entire cube must fit in available RAM: Dynamic Cubes also use smart caching and summary tables to optimize performance.

Modeling Comparison

Anyone modeling PowerPlay cubes in Transformer will feel comfortable modeling Dynamic Cubes in the Cube Designer tool. One notable difference is that Dynamic Cubes are modeled directly against data warehouse tables, and do not use or require Framework Manager packages to access data.

Also, dimensions can be modeled independently of cubes in the Dynamic Cube structure. When a dimension is complete, you simply apply it to one or more cubes, maintaining consistency across them. The same process works for ongoing maintenance: if you add a new hierarchy, you model it once and it applies to every cube that shares the dimension. As you can see, this greatly reduces modeling lead time, since you have access to a wealth of shared information.

Security views are also created in Cube Designer. Security can be applied at the member, dimension, or measure level. Dynamic Cubes also allow for table-based security if you need it, which lets the modeler create a database query that drives the security views granted to users. Updates to security access can then be handled automatically via database records.
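Conceptually, table-based security amounts to a lookup table mapping users to the dimension members they may see. The sketch below is a hypothetical illustration of that idea, not Cognos’s actual implementation; the users, dimension, and members are all invented.

```python
# Hypothetical security table: each row grants one user visibility into one
# member of one dimension. In Cognos this would live in the database and be
# read by the query the modeler defines in Cube Designer.
security_rows = [("alice", "Region", "East"), ("alice", "Region", "West"),
                 ("bob",   "Region", "East")]

# Index the grants by (user, dimension) for fast filtering.
allowed = {}
for user, dim, member in security_rows:
    allowed.setdefault((user, dim), set()).add(member)

def visible_members(user, dim, members):
    """Return only the members this user is entitled to see."""
    return [m for m in members if m in allowed.get((user, dim), set())]

print(visible_members("bob", "Region", ["East", "West", "North"]))  # ['East']
```

The practical benefit is the one the article notes: adding a row to the security table grants access, with no remodeling or cube redeployment required.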

Other Considerations

There are a number of other key differences between Dynamic Cubes and PowerPlay cubes. For instance, Dynamic Cubes let you create Virtual Cubes. As the name implies, these are cubes derived from two or more underlying physical cubes. Below are two examples of how you might use them.

One case is a five-year sales history cube. One physical cube holds five years of history through the end of the prior month. The second cube holds the current month’s sales data, which has a much faster refresh cycle. Put together, they give a full view of five years of history up through the current date. Another case is a physical cube containing currency exchange rates that is updated daily. This cube can be used within several virtual cubes (such as sales, inventory, and open orders) to analyze business activity across multiple currencies, all with a single currency update.
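Conceptually, the first case works like a union of the two cubes’ cells: queries see one continuous history even though the underlying data refreshes on different schedules. A minimal sketch, with hypothetical figures:

```python
# Hypothetical sketch of a virtual cube as a union of two physical cubes'
# cells, keyed by (month, product). Figures are invented for illustration.
history_cube = {("2013-11", "Widget"): 500.0,   # refreshed monthly
                ("2013-12", "Widget"): 620.0}
current_cube = {("2014-01", "Widget"): 210.0}   # refreshed nightly

# The virtual cube presents both as a single queryable structure; on a key
# collision the more frequently refreshed cube would win here.
virtual_cube = {**history_cube, **current_cube}

# A query spanning the full period now crosses both underlying cubes.
total = sum(value for (month, product), value in virtual_cube.items()
            if product == "Widget")
print(total)  # 1330.0
```

The point of the design is that only the small current-month cube needs a frequent refresh, while the large history cube stays untouched until month end.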

There are also many options for optimizing and tuning performance within Dynamic Cubes. Two of the more notable options are the Aggregate Advisor and Aggregate Cubes. The Aggregate Advisor is a tool to help the modeler design in-database aggregate tables and aggregate cubes, based on the data and usage patterns. The IBM Cognos BI server can then seamlessly direct specific queries to the most efficient data source.

An important note about cube builds is that the cubes are only available for use once all the members are loaded. For timing estimates, IBM projects a rate of 27,000 members/second for member loading and 54,000 rows/second for aggregates (see the quote below). The best performance comes after all the aggregates are loaded.

For optimal IBM Cognos Dynamic Cube performance, in-memory aggregates must be loaded. Although the throughput for loading aggregates is highly dependent on the cube complexity, a throughput of 54,000 rows/second can be used to help assess the time required to load aggregates.²

This rate compares quite favorably to typical cube build times with Transformer.
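Those published throughput figures make rough load-time estimates a matter of simple division. The cube sizes below are hypothetical inputs chosen for illustration, not measured values:

```python
# Back-of-the-envelope load-time estimate using IBM's published throughput
# figures: 27,000 members/second, 54,000 aggregate rows/second.
MEMBERS_PER_SEC = 27_000
AGG_ROWS_PER_SEC = 54_000

def estimate_load_seconds(member_count: int, aggregate_rows: int) -> float:
    """Time until the cube is available (member load) plus time until it is
    fully warmed (in-memory aggregate load)."""
    return member_count / MEMBERS_PER_SEC + aggregate_rows / AGG_ROWS_PER_SEC

# e.g. a hypothetical cube with 10 million members and 50 million aggregate rows:
secs = estimate_load_seconds(10_000_000, 50_000_000)
print(f"{secs / 60:.1f} minutes")  # 21.6 minutes
```

Roughly 22 minutes to full warm-up for a cube that size, versus the 4–6 hour Transformer builds mentioned earlier, which is the favorable comparison the article is drawing. As the IBM quote notes, actual throughput depends heavily on cube complexity.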

Dynamic Cubes also let you model recursive (multi-level parent-child) relationships. If you need to model employee-manager, bill of materials, or work breakdown structure relationships with ragged hierarchies, you now have a great tool for it.
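The data shape behind such a hierarchy is a simple self-referencing table of (member, parent) rows. The sketch below, with an invented employee-manager table, shows how each member’s path to the root falls out of a short walk; note that the hierarchy is ragged, since members sit at different depths:

```python
# Hypothetical employee-manager table: each row is (employee, manager),
# with None marking the root. This is the recursive, ragged shape that a
# parent-child hierarchy models.
rows = [("Ann", None), ("Bob", "Ann"), ("Cara", "Ann"), ("Dev", "Bob")]
parent = dict(rows)

def path_to_root(member: str) -> list[str]:
    """Walk the parent pointers up to the root, returning root-first order."""
    path = [member]
    while parent[member] is not None:
        member = parent[member]
        path.append(member)
    return path[::-1]

print(path_to_root("Dev"))   # ['Ann', 'Bob', 'Dev']  (depth 3)
print(path_to_root("Cara"))  # ['Ann', 'Cara']        (depth 2 - ragged)
```

In PowerPlay, hierarchies like this had to be flattened into a fixed number of levels up front; modeling the parent-child relationship directly avoids that.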

Dynamic Cubes also support multiple attributes per member, which PowerPlay cubes do not. In PowerPlay, this limitation forced attributes such as product color or customer market segment to be modeled as separate dimensions. With Dynamic Cubes, attributes can be analyzed without pre-defining them as dimensions, allowing for richer analysis.

PowerPlay Studio is a favorite tool of some users, and obviously only works with PowerPlay cubes. However, between Cognos Workspace Advanced and Analysis Studio, there are good alternatives available.

Both PowerPlay cubes and Dynamic Cubes are read-only cubes. If your application calls for write-back or what-if analysis, you should consider TM1 as a solution instead.

Conclusion

PowerPlay cubes still have their supporters, and continue to be an integral part of IBM’s cubing strategy. That said, if you are outgrowing the limits of PowerPlay cubes, particularly in size, performance, or specific capabilities, Dynamic Cubes provide the headroom you need to handle rapidly growing data volumes and requirements.

The sweet spot for Dynamic Cubes seems to be large cubes that are accessed and/or updated regularly. If a cube is only accessed a few times a month, you may want to consider Dynamic Query Mode (DQM) or PowerPlay cubes, which are not as memory-intensive.

There are clear architecture differences between the two approaches, and whichever option you choose will have a significant impact on your BI system architecture. For this reason, it is useful to know that IBM provides multiple options to accommodate your specific reporting and data handling needs.

More on sizing: IBM Business Analytics Proven Practices: Dynamic Cubes Hardware Sizing Recommendations

Performance: IBM Business Analytics Proven Practices: IBM Business Intelligence Pattern with DB2 BLU Acceleration Performance Considerations

___________________________________________

 ¹ Cubes are not typically a primary data source, although I’ve certainly seen PowerPlay used that way. A common case of this is spreadsheet-based budgets imported into a cube for budget vs actuals comparisons. Ideally, though, if the spreadsheet is important enough to report on, it should be brought into the data warehouse.

² IBM Business Analytics Proven Practices: IBM Business Intelligence Pattern with DB2 BLU Acceleration Performance Considerations
