Posts

Last week at Analytics University, IBM formally announced the release of the next major version of Cognos Analytics, v11.1.

IBM has hinted at the inclusion of “smarts” for “augmented analytics” and improvements in the usability of this new version over the past year. Our expectation was that these improvements would continue to “modernize” Cognos and help address some of the competitive pressures that organizations with legacy investments have been encountering in recent years.

Well before IBM acquired Cognos, the Cognos name was synonymous with powerful, trusted enterprise business intelligence and managed reporting. With its ability to scale to the largest enterprises, a robust, governed metadata layer that made it possible to report against a vast array of data sources, powerful reporting capabilities that churned out highly complex managed BI solutions, and ad-hoc reporting and analysis against those same governed sources, IBM Cognos was the answer for almost every enterprise reporting need. Until it wasn’t.

The last several years have been an interesting journey for organizations and teams leveraging Cognos for analytics. During that time, visual data discovery tools made a significant impact. Of late, however, we have seen the pendulum swing back to concepts introduced by enterprise BI tools long ago.¹ What’s old is new again.

When these new tools arrived, they challenged both the status quo and what many of us saw as an ideal solution to the localized, ungoverned, manually intensive, and often error-prone data manipulation (i.e. “shadow analytics”) processes of the past. If we think back to the dawn of the modern business intelligence age in the mid-1990s, we realize that these challenges are exactly what tools like Cognos were developed to solve.

Many Finance professionals face the daily task of solving problems. Whether those problems come from IT Services, the Business Intelligence team, or even the next cube over, we never seem to fix enough of them.

The first, and most important, step in solving a problem is identifying its cause, a practice known as Root Cause Analysis. The “5 Whys” methodology is one of the most efficient ways of homing in on and articulating the root cause of almost any dilemma you might come across.
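The mechanics of the technique are simple enough to sketch in a few lines: starting from an observed problem, ask “why?” repeatedly (classically five times) until no deeper cause can be found. The example causal chain below is hypothetical, purely to illustrate the walk.

```python
def five_whys(problem, ask_why, max_depth=5):
    """Walk the causal chain; ask_why returns a deeper cause or None."""
    chain = [problem]
    for _ in range(max_depth):
        cause = ask_why(chain[-1])
        if cause is None:  # no deeper cause found: candidate root reached
            break
        chain.append(cause)
    return chain  # chain[-1] is the candidate root cause

# Hypothetical example: a monthly report arrived late.
causes = {
    "Report was late": "Data load finished late",
    "Data load finished late": "Source extract was re-run",
    "Source extract was re-run": "Validation check failed",
    "Validation check failed": "Upstream schema changed without notice",
}
chain = five_whys("Report was late", causes.get)
print(chain[-1])  # → Upstream schema changed without notice
```

The point of capping the walk at five is pragmatic, not magical: in practice the chain usually bottoms out in a process or communication gap within a handful of steps.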

One size does not fit all. Try as they might, no single BI platform can offer every capability that users require. With organizational complexity increasing and demand for self-service analytics growing, it has become commonplace, even recommended, for organizations to maintain multiple BI platforms to serve people in diverse roles across the organization.

Governance is the ongoing practice of creating and managing processes, policies, and information, encompassing the strategies, activities, skills, organizations, and technologies that accelerate business outcomes. It also involves defining the organizations, roles, and responsibilities that perform this management. In our experience, many organizations address governance once, often without completing the necessary tasks. Organizations that excel in data and analytics governance manage it continuously.

According to The Economist, data is the new oil: the world’s most valuable resource. The volume of data that organizations can capture, store, and analyze has changed how they approach innovation, and analytics has become a true competitive differentiator.

Unfortunately, business analysts, data scientists, and other line-of-business users performing self-service analytics spend the majority of their time preparing data for analysis rather than actually garnering and sharing the insights to be found in it (1), even with the help of self-service data prep tools like Alteryx, Trifacta, and Tableau’s Maestro (coming soon).

Ironside’s Greg Bonnette was featured in NRF’s magazine STORES in the September 2017 article “Mapping Crime Risks: Location data helps specialty retailer minimize loss and maximize sales,” written by Liz Parks.

Being the sole data champion within your organization can present difficulties when you’re vying for limited company resources and attention from the “powers that be.” No doubt, you may find the role frustrating at times. Yet you may also find it extremely rewarding, because it gives you a great deal of responsibility and offers you the opportunity to achieve the goal every data champion aspires to: gaining user buy-in for the data insights you’ve unlocked.

Ironside’s Ray Haddad published an article on LinkedIn titled “Applications of Nonlinearity in Life and Customer Analytics” in August 2017.

What’s the best way to hold on to your customers in a competitive marketplace?

That’s a classic business question, and in this article Ray Haddad uses an innovative methodology to provide surprising answers. Ray, an Account Executive in Data and Analytics Strategy at Ironside, initially poses a hypothetical question to serve as an example of this methodology in action: Are you inclined to think that increasing your driving speed from 60 MPH to 120 MPH will save you more time in getting to your destination than increasing it from 10 MPH to 20 MPH?
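The intuition behind Ray’s hypothetical can be checked with a few lines of arithmetic: over a fixed distance, travel time is distance divided by speed, so equal doublings of speed save very different amounts of time. A minimal sketch, using illustrative numbers that are not from the article:

```python
def minutes_saved(distance_miles, old_mph, new_mph):
    """Minutes saved by raising speed from old_mph to new_mph over a fixed distance."""
    return 60 * (distance_miles / old_mph - distance_miles / new_mph)

trip = 60  # a 60-mile trip, chosen for round numbers
low = minutes_saved(trip, 10, 20)    # 360 min -> 180 min: 180 minutes saved
high = minutes_saved(trip, 60, 120)  # 60 min -> 30 min: only 30 minutes saved
print(low, high)  # → 180.0 30.0
```

Doubling the low speed saves six times as much time as doubling the high speed, which is exactly the kind of nonlinearity the article’s title refers to.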

Portfolio Items