Evolve or Die: Business Intelligence’s Future


Business intelligence has been around for a long time. From decision support systems in the 1960s through Ralph Kimball’s books on dimensional modeling in the 1990s, the core concepts of the discipline are decades old. As these concepts and the products built around them mature, more advanced techniques and technologies emerge that redefine what we thought we knew about the business intelligence space and its future. Developments like the cloud, data visualization tools, and predictive analytics, for instance, are changing the way businesses evaluate their data and make decisions from it.

The Cloud


The cloud is a popular topic of discussion among IT professionals, with questions like “will the cloud replace the data center?” commonly heard in many a water cooler conversation. That’s an interesting question from an infrastructure standpoint, but it’s less relevant when talking about BI. Whether a data warehouse exists on premises or in the cloud, it’s still the same warehouse. What’s interesting here is the speed and flexibility that cloud implementations can offer.

Cloud instances can be created, expanded, and eliminated very quickly and very easily. For the most part, payment is based on usage, so companies aren’t paying for unused capacity. BI platforms can leverage this flexibility to more quickly handle expanded demand, or to create new solutions to answer previously unknown requirements. BI is always an iterative process: providing answers to questions always leads to more (and better) questions. Putting BI in the cloud can make that process faster. If you aren’t exploring cloud solutions, you should – they’re cheap to look into and easy to implement.

Data Visualization


Data visualization tools have emerged over the last 20 years as product offerings – the same capabilities existed earlier, but weren’t brought together in the same way. With these tools, the focus is on simple and fast analysis of data by individual users. Tools like Tableau and Spotfire load data into memory and use OLAP-style reporting to quickly produce graphical visualization of data. They appeal to business power users who are able to load data, process it, and produce results without IT assistance (or resistance, as the case may be). This has led to a high rate of adoption, because when the business side of the house is clamoring for a tool it tends to get picked up more quickly.

The challenges around data visualization tools are related to their strengths. Because they focus on local handling of data, they are less adept at managing data across the enterprise. They are also less focused than traditional BI on data integration and manipulation – often a large number of spreadsheets springs up to feed these tools, with all the data integrity and consistency issues that brings. Lastly, data visualization is extremely useful, but it is not the only type of reporting that organizations require. Considering these tools a replacement for an enterprise BI tool like Cognos is a mistake – each has different, complementary capabilities. That said, visualization as a module within an enterprise BI system can add a lot, especially when displaying large numbers of data points, as location intelligence solutions do.

Recently, data visualization tools and the cloud have started to converge. Products like Domo and Watson Analytics stage data and provide analytics from the cloud. This can help with data sourcing issues and makes distribution much easier. These tools are a good option for striking a balance between end user power and overall management of the data and reporting.

Big Data and the Data Lake


The concept of a less organized “dumping ground” for data has been around for a long time. A properly modeled dimensional data warehouse represents a great deal of planning, analysis, and development. Such “first-class” treatment simply isn’t possible for all data (and when it is attempted, it usually proves cost-prohibitive). An operational data store, for example, which holds data from the same source systems the warehouse uses, is a good compromise between making data available and making it easy to access.

The advent of big data and unstructured data in particular has made such a concept even more important. Companies now have access to vast streams of data – social media feeds, online records, feedback from comments, etc. – that either didn’t exist before or were too vast to process. Thus, the concept of the data lake arose – a vast pool of data that analysts can draw from as needed.

In the past, limits on storage and processing made such a repository impractical. Now, storage is cheaper and processing is easier to handle using big data technologies (Hadoop, Spark, etc.). What has not changed, however, is the need to analyze and structure this data into a format that can be used for true analytics. Big data provides the methods to create this structure, but someone with technical skills still needs to apply those methods to provide something that the business can use. This concept of putting a data analyst into the business information loop requires a much more agile/iterative approach to projects. While data may go from a data lake into a warehouse, more often it will go into a temporary analysis or an aspect of a predictive model. As more types of data emerge, more ways of using them emerge as well.
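To make that structuring step concrete, here is a minimal sketch in Python. The records and field names are hypothetical – a stand-in for the semi-structured events (social media posts, comments, and so on) an analyst might pull from a data lake and flatten into uniform rows the business can actually query:

```python
import json

# Hypothetical raw "data lake" records: semi-structured JSON events
# with inconsistent fields, as might arrive from a social media feed.
raw_events = [
    '{"user": "alice", "action": "comment", "text": "Great product!"}',
    '{"user": "bob", "action": "share"}',
    '{"user": "alice", "action": "comment", "text": "Shipping was slow."}',
]

def structure_events(lines):
    """Flatten raw JSON events into uniform rows an analyst can query."""
    rows = []
    for line in lines:
        event = json.loads(line)
        rows.append({
            "user": event.get("user", "unknown"),
            "action": event.get("action", "unknown"),
            "has_text": "text" in event,  # normalize an optional field
        })
    return rows

rows = structure_events(raw_events)
```

Real big data tools operate at vastly larger scale, but the work is the same in spirit: someone with technical skills decides what structure the raw data should take before it becomes useful.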

Predictive Analytics


Statistical analysis software is not new, but its use in business is growing rapidly as the software and the skills required to use it become more common. As businesses master historical, backward-looking BI, they start to look toward the future. Predictive analytics lets a business leverage its historical knowledge to make better decisions going forward.

The basic concepts of predictive analytics are simple. Historical data is fed into a model that produces insights about that data. Those insights can then be applied to new data as it arrives – the better the model, the more accurate the insights. For example, data on consumer fraud can be used to predict new instances of fraud as (or before) they happen. Additionally, customer data can be grouped to find types of customers who prefer different types of promotions or products. Crimes can even be prevented through targeted analysis of where incidents occurred most in the past. These are only a few applications. The list of ways in which you can use this technology goes on and on.
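As an illustration of that train-then-apply loop, here is a deliberately simple sketch in Python. The “model” is just a per-category fraud rate learned from hypothetical historical transactions – a stand-in for the far richer models real tools build – but the shape of the process is the same: fit on history, then score new data as it arrives.

```python
from collections import defaultdict

# Hypothetical historical transactions: (merchant_category, was_fraud).
history = [
    ("electronics", True), ("electronics", True), ("electronics", False),
    ("groceries", False), ("groceries", False), ("groceries", False),
    ("travel", True), ("travel", False), ("travel", False),
]

def train(history):
    """Build a toy model: the historical fraud rate per merchant category."""
    counts = defaultdict(lambda: [0, 0])  # category -> [fraud_count, total]
    for category, was_fraud in history:
        counts[category][1] += 1
        if was_fraud:
            counts[category][0] += 1
    return {cat: fraud / total for cat, (fraud, total) in counts.items()}

def predict(model, category, threshold=0.5):
    """Flag a new transaction as risky if its category's fraud rate
    meets the threshold; unseen categories default to a rate of 0."""
    return model.get(category, 0.0) >= threshold

model = train(history)
```

The better (and better fed) the model, the more accurate the flags – which is exactly why real predictive work demands the statistical training discussed below, not just a scoring loop.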

Predictive analytics is grounded in statistical analysis, a well-established branch of applied mathematics. Using these tools therefore requires real skill and training: you need to understand how data can be used to create a model and how to evaluate that model’s accuracy and applicability. Without that training, a predictive tool can create models that appear to function but don’t actually make meaningful predictions. Extensive experience is a requirement before adopting a tool like SPSS on a wide scale, although tools like SPSS Modeler and Watson Analytics are making it easier for citizen data scientists to take advantage of predictive technologies through intuitive user interfaces and powerful AI. At the very least, you should consult with an experienced data scientist before deciding how to integrate these processes into your wider analytics environment.

This type of analysis benefits greatly from high volumes of data, making it a great fit with big data technologies. Additionally, the insights it creates are often extremely valuable when pushed back into the data warehouse (a customer retention score or a projection of future revenue per customer, for example). The models that predictive tools create can be applied to existing or arriving data as part of data processing, meaning the model can be improved over time without disrupting the process.
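A hedged sketch of that scoring step, in Python: the model here is a hypothetical stand-in (a couple of hand-picked rules rather than a trained model), but it shows how each arriving customer row gets a retention score attached so the enriched record can flow into the warehouse unchanged otherwise.

```python
def retention_score(customer):
    """Stand-in for a trained model: recent, frequent customers score higher.
    The field names and weights are illustrative assumptions."""
    score = 0.5
    if customer["orders_last_year"] >= 4:
        score += 0.3
    if customer["days_since_last_order"] > 180:
        score -= 0.4
    return round(min(max(score, 0.0), 1.0), 2)

def score_batch(customers):
    """Attach a retention_score to each arriving row, leaving the
    original attributes intact for loading into the warehouse."""
    return [{**c, "retention_score": retention_score(c)} for c in customers]

arriving = [
    {"id": 1, "orders_last_year": 6, "days_since_last_order": 20},
    {"id": 2, "orders_last_year": 1, "days_since_last_order": 300},
]
scored = score_batch(arriving)
```

Because the scoring function is separate from the pipeline that applies it, a retrained model can be swapped in later without disrupting the process – the property noted above.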

Conclusion: What Will Business Intelligence’s Future Be?

All of these things – the cloud, big data, predictive analytics, and data visualization – add to existing BI capabilities. Companies aren’t replacing their data warehouses; they are adding new tools to gain deeper and faster insights. This means new initiatives, new skills, and changes to existing business and technical processes are on the horizon.
BI is evolving, and what has always been true will remain so: businesses that embrace the newer techniques will see competitive advantages over those who don’t. If you’d like to start mapping your evolutionary path, get in touch with us. Our expert advisors can help you find the approach that’s right for you.
