Tag Archive for: Framework Manager

Determinants can play a crucial role in the overall performance and consistency of your Framework Manager model but remain one of the most confusing aspects of the application to most developers. This article will attempt to end the confusion.

Determinants are used so that a table stored at one grain (level of detail) can behave as if it were stored at another grain. They are primarily used for dimension tables where fact tables join to the dimension at more than one level. (There are other cases where you could use them, but they are less common and fairly specific situations.)

The Situation

Let’s use the example of a date dimension table with day-level grain. If all the fact tables join at the day level, the most detailed level, then you do not need determinants. But as many of us know from experience, this is not always the case. Fact tables are often aggregated or stored at different levels of granularity for a number of reasons.

The Problem

The trouble arises when you wish to join to the dimension table at a level that is not the lowest level. Consider a monthly forecast fact table which is at the month level of detail (one row per month). A join on month_id (e.g. 2009-12) would return 28 to 31 records (depending on the month) from the date dimension and throw off the calculations. Determinants solve this problem.

The SQL

Often when modeling, it’s useful to think about the SQL code you would like to generate. Without determinants, the incorrect SQL code would look something like this.

SELECT
F.FORCAST_VALUE,
D.MONTH_ID,
D.MONTH_NAME
FROM SALES_FORECAST F INNER JOIN DATE_DIM D ON
F.MONTH_ID = D.MONTH_ID

This query will retrieve up to 31 records for each of the sales forecast records. Applying aggregate functions such as Sum or Count would therefore produce incorrect results. What you would like to generate instead is something along the following lines, which creates a single row per month and then joins it to the fact table.

SELECT
F.FORCAST_VALUE,
D1.MONTH_ID,
D1.MONTH_NAME
FROM SALES_FORECAST F INNER JOIN
( SELECT DISTINCT
D.MONTH_ID,
D.MONTH_NAME
FROM DATE_DIM D ) AS D1
ON F.MONTH_ID = D1.MONTH_ID

As shown above, the trick is to understand which columns in the dimension table are determined by the month_id, and are therefore unique for each key value. This is exactly what determinants do for you.

Unraveling the Mystery in Framework Manager

Following Cognos best practices, determinants should be defined in the same layer of the model where the joins are defined.

Here we see a date dimension with four levels: Year, Quarter, Month, and Day.

This means we can define up to four determinants in the query subject, depending on the granularity of the fact tables present in your model. The first three levels (Year, Quarter, and Month) should be set to “Group By”, as they do not define a unique row within the table, and Framework Manager needs to be made aware that values will have to be grouped to that level. In other words, the SQL needs to “group by” a column or columns in order to uniquely identify a row for that level of detail (such as Month or Year). The Day level (often called the leaf level) should be set to “Uniquely Identified”, as it does uniquely identify any row within the dimension table. While there can be several levels of “group by” determinants, there is typically only one uniquely identified determinant, defined by the unique key of the table. The “uniquely identified” determinant by definition contains all the non-key columns as attributes, and is set automatically at table import time, if it can be determined.

The Key section identifies the column or columns which uniquely identify a level. Ideally this is a single column, but in some cases it may need to include more than one column, for example if your Year values and Month values (1-12) are stored in separate columns. In short, the key is whatever set of columns is necessary to uniquely identify that level.
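To make the composite-key case concrete, here is a rough sketch (the YEAR_NUM and MONTH_NUM column names are illustrative, not columns from the example table above) of the distinct row set that such a two-column month key would need to produce:

SELECT DISTINCT
D.YEAR_NUM,
D.MONTH_NUM,
D.MONTH_NAME
FROM DATE_DIM D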

Using our aforementioned table, the setup would look like this:

The Attributes section identifies all the other columns which are distinct at that level. For example, at the month_id level (e.g. 2009-12), columns such as month name, month starting date, and number of days in the month are all distinct at that level. Items from a lower level, such as date or day-of-week, are not included at that level.

Technically, the order of the determinants does not imply levels in the dimension. However, columns used in a query are matched against the determinants from the top down, which is very important for understanding the SQL that will be generated for your report. If your report uses Year, Quarter, and Month, the query will group by the columns making up the Year key, Quarter key, and Month key. But if the report uses just Year and Month (and not Quarter), then the group by will omit the Quarter key.
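As a rough illustration of this top-down matching, assuming Year and Month keys named YEAR_ID and MONTH_ID (illustrative names), a report that uses Year and Month but not Quarter would group along these lines, with no Quarter key present:

SELECT
D.YEAR_ID,
D.MONTH_ID,
MIN(D.MONTH_NAME) AS MONTH_NAME
FROM DATE_DIM D
GROUP BY
D.YEAR_ID,
D.MONTH_ID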

How Many Levels Are Needed?

Do we need all 4 levels of determinants? Keep in mind that determinants are used to join to dimensions at levels higher than the leaf level of the dimension. In this case, we’re joining at the month level (via month_id). Unless there are additional joins at the year or quarter level, we do not strictly need to specify those determinants. Remember that year and quarter are uniquely defined by the month_id as well, and so should be included as attributes related to the month, as shown.

The Result

Following these steps, the SQL below will be generated for your report. The highlighted section is produced by the determinant settings. Notice how it groups by MONTH_ID and uses the min function to guarantee uniqueness at that level. (No, it doesn’t trust you enough to simply do a SELECT DISTINCT.) The second level of group by is the normal report aggregation by report row. The result is that the join is done correctly, with each monthly fact record joined to one dimension record at the appropriate level, producing the correct values in the report.
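Here is a representative sketch of the kind of SQL the determinant settings produce, following the description above and reusing the table and column names from the earlier examples (the exact SQL will vary by Cognos version and data source):

SELECT
D1.MONTH_ID,
D1.MONTH_NAME,
SUM(F.FORCAST_VALUE) AS FORCAST_VALUE
FROM SALES_FORECAST F INNER JOIN
( SELECT
D.MONTH_ID,
MIN(D.MONTH_NAME) AS MONTH_NAME
FROM DATE_DIM D
GROUP BY D.MONTH_ID ) AS D1
ON F.MONTH_ID = D1.MONTH_ID
GROUP BY
D1.MONTH_ID,
D1.MONTH_NAME

The inner query corresponds to the determinant-generated section: it groups by MONTH_ID and applies min to the month attributes to guarantee one row per month, while the outer group by is the normal report aggregation.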

This blog is an updated version of a previously released post, which can be found here.

Ironside is happy to announce the release of our new, enhanced, and fully revised Framework Manager curriculum. This update combines customer suggestions with expert insights from Ironside instructors and consultants to form the most comprehensive Framework Manager course available today. These two classes represent the current best-of-breed thinking around metadata modeling in the BI space.

Built around the latest IBM Cognos 10.2.1 technology, these two classes (Fundamentals and Advanced) provide a cumulative 5-day program that gradually builds a complete understanding of the metadata modeling process. Each chapter contains conceptual information and hands-on, practical demos and workshops.

New Course Features

Using the original Ironside Framework Manager curriculum as a foundation, we have developed many additions to it that further flesh out the intricacies of metadata modeling and supplement the already existing content. Below are some highlights of new material included in the Framework Manager Fundamentals and Advanced courses:

• Overview content describing in more depth how Framework Manager fits into and interacts with Cognos as a whole.

• Detailed descriptions of different database design structures and how best to make those structures available through Framework Manager.

• Key reporting requirements and how gathering them can set your project up for success.

• Deepened explorations of the core modeling layers and the objects used within them.

• Rich new chapters investigating DMR options in Framework Manager, determinants, and security.

• Full discussions of parameter maps, session parameters, and implementing them in calculations and filters.

• Enhanced content around accessing the SQL back end that governs how data appears in Framework Manager.

• Crucial performance tips and strategies for working in multi-modeler environments.

Development Process

The top priority during the development process was to create a course that provides an all-inclusive and cohesive understanding of all the Framework Manager features and functionality, covering the entire metadata modeling process from end to end. To achieve this, we enlisted the guidance of our technical trainers, who have years of Cognos and technical training experience, as well as our technical consultants, to provide real-world, practical modeling perspectives.

Course Structure

Below is a listing of the modules and topics covered in the Framework Manager curriculum.

Framework Manager curriculum

Upcoming Framework Manager Classes

Below is a listing of several of the upcoming public Framework Manager courses.

IBM Cognos 10.2.1 Framework Manager Fundamentals and Advanced Training Courses- Ohio

Mon Apr 21, 2014, 8:30 AM – Fri Apr 25, 2014, 4:30 PM

Worthington, OH

IBM Cognos 10.2.1 Framework Manager Fundamentals and Advanced Training Courses- Detroit

Mon Apr 28, 2014, 8:30 AM – Fri May 2, 2014, 4:30 PM

Detroit, MI

IBM Cognos 10.2.1 Framework Manager Fundamentals and Advanced Training Courses- Miami

Mon May 12, 2014, 8:30 AM – Fri May 16, 2014, 4:30 PM

Doral, FL

IBM Cognos 10.2.1 Framework Manager Fundamentals and Advanced Training Courses – Atlanta

Mon Jun 2, 2014, 8:30 AM – Fri Jun 6, 2014, 4:30 PM

Atlanta, GA

IBM Cognos 10.2.1 Framework Manager Fundamentals and Advanced Training Courses – DC

Mon Jun 16, 2014, 8:30 AM – Fri Jun 20, 2014, 4:30 PM

Washington, DC

In continuing with our Report Studio Cookbook series, this month’s cookbook recipe will show you the steps for altering report XML to make bulk changes. This is most commonly used when name changes are made to the FM Model and abandoned data items result. Read more

IBM SPSS offers a variety of integration options with other enterprise solutions, such as enterprise data warehouses, file systems, and business intelligence applications. In this article, we focus specifically on the integration with the IBM Cognos BI environment. As most of you are aware, IBM SPSS leverages the power of predictive analytics by allowing users to examine the current state of their business while also providing a view of the future using advanced analytics techniques. With the IBM Cognos BI integration point, the results are immediately available for IBM Cognos reporting, allowing for easy distribution to broad user communities.

In our previous article, we described the general workflow for importing and exporting IBM Cognos packages from IBM SPSS Modeler. In this article, we will provide a simple case study using the standard IBM Cognos sample set to further illustrate this integration.

Note:

• The integration is only available in IBM SPSS Modeler version 14.1 or higher.

• The supported IBM Cognos environments are version 8.4 and higher.

Providing Data to IBM SPSS from IBM Cognos

The IBM Cognos BI source node enables data miners to read data directly from IBM Cognos Framework Manager packages, including relational, dimensionally modeled relational (DMR), and OLAP-sourced packages.

In this case study, we select the Go Sales (query) package from IBM Cognos samples and use the following items from the query subject as our source data:

Go Sales (query)

  • Sales (query)

    • Sales

      • Revenue

    • Time

      • Year
      • Month Key
      • Month
      • Date

Before choosing a Framework Manager package to import the data, the user needs to establish a connection to the IBM Cognos server by providing the dispatcher URL and user credentials.

Conduct Data Mining Analysis in IBM SPSS Modeler

The source data we bring in from IBM Cognos contains revenue data for years 2004-2007. The objective of this study is to use advanced modeling techniques to discover trends behind the revenue data and predict future revenues. The following stream in IBM SPSS Modeler uses time series analysis to analyze the patterns in the monthly historical data and project those patterns to determine a range within which future values of the series are likely to fall.

In this example, we extend the forecast of the expected revenue to the next four time periods. The following output from IBM SPSS is a time series graph showing the predicted (green line) vs. actual (blue line) revenue for each month from Jan 2004 through Jul 2007, and the projected revenue for Aug through Nov 2007.

Exporting IBM SPSS Predictive Results into IBM Cognos BI

The end results from an IBM SPSS Modeler stream can be exported back to the IBM Cognos BI environment, including both the transformed and scored data based on predictive modeling. In this case study, we will export the scored dataset of time series analysis into a database table called “Predicted_Sales” and generate an IBM Cognos package on the Cognos server reflecting the new database table.

Once the export is complete, we can use IBM Cognos Report Studio to create a visual report based on the exported data, including the predictions and confidence interval values, for consumption by a broad user base.
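Conceptually, the data such a report retrieves from the new package corresponds to a query along these lines (the Predicted_Sales table name comes from the export step above; the column names are assumed purely for illustration):

SELECT
P.MONTH_KEY,
P.REVENUE,
P.PREDICTED_REVENUE,
P.LOWER_CONFIDENCE,
P.UPPER_CONFIDENCE
FROM PREDICTED_SALES P
ORDER BY P.MONTH_KEY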

Through this case study, we showcase how a user can combine the predictive analytic capabilities of IBM SPSS Modeler with business intelligence features of IBM Cognos BI. For any additional questions regarding the integration and its implementation, please contact us at the Ironside Group.

One of the features of IBM SPSS Modeler 14.1 is that it can now directly integrate with your IBM Cognos BI environment to leverage the power of predictive analytics. If you are new to IBM SPSS, then feel free to reference our previous newsletter article showcasing all the components of IBM SPSS. This month’s tech tip will detail how the IBM SPSS Modeler integration is accomplished.

Before we get started with the tech tip, let’s review the SPSS Modeler integration architecture at a high level.

1. Use an IBM Cognos BI source node to read data directly from a Cognos 8/10 Framework Manager package. This allows analysts to dive into the data mining process using friendly IBM Cognos package items rather than use underlying database tables that they may or may not be intimately familiar with.

2. Once the data mining process is completed, export the results back to a database table and create a Framework Manager package to be published on the IBM Cognos server for additional analysis and reporting.

Importing from Cognos Packages

1. Launch IBM SPSS Modeler and create a new stream.

2. From the Sources node tab, drag and drop the IBM Cognos BI node into the stream. Right click on the node and choose Edit.

3. Click on Edit within the Connection box to enter the Cognos connection information. Click OK to establish the connection.

4. Enter the following:

  • Cognos server URL: Enter the dispatcher URL.
  • Mode: Choose Set Credentials to log in with a Cognos username and password, or Use Anonymous Connection to log in as an anonymous user (only if anonymous login is enabled on the Cognos side).
  • Namespace: Enter the namespace used to authenticate the Cognos user.
  • User name/Password: Cognos login credentials.

5. Click on Edit within the Package box to select a Cognos package.

6. Once a package is chosen, the content will show on the left hand side of the dialog. Choose one or multiple items to bring them to the right side. All chosen items will be imported as columns to SPSS.

Exporting Cognos Packages

1. From the Export node tab, drag and drop the IBM Cognos BI Export node into the stream.

2. Right click on the node and choose Edit.

3. Click on Edit within the Connection box to enter the Cognos connection information. This is similar to what you have entered in the Cognos BI source node.

4. Once the connection is established, click on the Refresh button next to the Data Source to choose a data source where the information will be written.

5. Click on Edit next to the Folder box to choose a location where the package will be stored.

6. Enter a package name and choose to publish it now or create an action script to publish it later.

7. On the left side, choose ODBC connection to enter the ODBC connection info to the database.

8. Click on the drop down arrow next to the data source box to select or create the database connection. This connection must point to the same data source that you selected in the Cognos connection tab. If the connection has not already been created, the user needs to create it in ODBC Data Source Administrator on the Windows machine. If you are connecting through an SPSS server, the data connection needs to be created on the SPSS server.

9. Enter the table name and select how you would like the information to be added to the table. Edit additional information as needed.

10. Click Run to publish the package to the Cognos server.

For any additional questions regarding SPSS Modeler and its implementation, please Contact Us.

In September, IBM introduced Cognos version 10.1.1, the latest update to Cognos 10. Among the significant enhancements in the software is the ability to use Dynamic Query Mode with relational databases.

Dynamic Query Mode (DQM) is an intelligent data caching mechanism which can dramatically improve the performance of OLAP-style queries. With the prior version, DQM supported cubes built from TM1, Essbase, and SAP BW. With version 10.1.1, IBM Cognos adds support for Microsoft SQL Server Analysis Services cubes and the most common relational databases, specifically DB2, Netezza, SQL Server, Teradata, and Oracle. Read more

While Framework Manager didn’t receive the dramatic changes in IBM Cognos 10 that some other areas did, there are still some interesting and useful new features, which might have slipped under the radar with the new release.

The main new features are:

  • The ability to create “Durable Models”
  • Model Design Accelerator
  • Dynamic Query Mode

Read more

The Ironside Group is excited to once again publicly offer both our Fundamentals and Advanced Framework Manager training courses in back-to-back fashion. As with all of our course offerings, this curriculum is led by our veteran consultants, who have decades of in-depth, real-world technical experience with IBM Cognos software. Take advantage of this special combined offering and become a Framework Manager expert in a week’s time.

Course: IBM Cognos 8 Framework Manager Fundamentals (View Description)
When: March 7-9, 2011

Course: IBM Cognos 8 Framework Manager Advanced (View Description)
When: March 10-11, 2011

Where: Ironside Group Lexington, MA (Boston)
Read more

The Ironside Group is excited to once again publicly offer both our Fundamentals and Advanced Framework Manager training courses in back-to-back fashion. As with all of our course offerings, this curriculum is led by our veteran consultants, who have decades of in-depth, real-world technical experience with IBM Cognos software. Take advantage of this special combined offering and become a Framework Manager expert in a week’s time.

Course: IBM Cognos 8 Framework Manager Fundamentals (View Description)
When: January 10-12, 2011

Course: IBM Cognos 8 Framework Manager Advanced (View Description)
When: January 12-14, 2011

Where: Ironside Group Lexington, MA (Boston)
Read more

As consultants, we have found that the majority of struggling IBM Cognos implementations we encounter are due either to poor framework model design or, more often, to a flawed database architecture. The latter can present itself in a number of ways, but in the worst cases we’ve discovered reporting applications built upon highly normalized OLTP systems, which are ineffective and detrimental to both the analytical and operational performance of an organization’s information systems. Another common case is an implementation of Cognos upon an existing data warehouse where users are provided with unfettered ad-hoc access to the data source for the first time, exposing previously unforeseen or unknown data quality issues. Read more