MicroPact Blog

Gartner BPM Summit 2013 – Analytics Insights for the Business Process

A 21st century perspective


By: Sandeep Khare

April 3, 2013 | Analysts

Yesterday at the Gartner BPM Summit I attended a thought-provoking session on “Bringing Analytics Insights to the Business Processes.” Gartner Research Vice President Kurt Schlegel’s session focused on how analytics capabilities are evolving and impacting traditional business process management systems. A lot of ground was covered in the presentation; for this post, I have focused on my thoughts with respect to Kurt’s overview of traditional and emerging analytics models.

So what is the value addition of analytics? Is it access to more data, better analytic tools, or simply more reports? Per Kurt, the biggest value analytics adds is precision. In his introductory slide, Kurt emphatically stated,

“In the 20th century, business processes were standardized, making them more homogeneous for the sake of efficiency. In the 21st century, business processes will be personalized, making them more precise for the sake of efficacy.”

Today, analytics drives some of the biggest businesses, such as Amazon, Netflix, and Pandora, by allowing them to deliver personalized content to end users while simultaneously giving those users access to relevant data so they can make their own decisions.

Analytics is becoming a powerful medium to empower customers

Traditional Model

Historically, analytics has been synonymous with data warehousing. In this model, you extract the data from disparate systems, then transform and load that data into a centralized data warehouse. That data warehouse may feed smaller, more manageable data marts, which in turn integrate with reporting and analysis tools. Companies have to employ workers with strong skills in data analysis and business intelligence to meet complex reporting needs. The diagram below from Kurt’s presentation illustrates this model.
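To make the extract-transform-load (ETL) flow behind this model concrete, here is a minimal sketch in Python. The file names, tables, and cleanup rules are hypothetical and purely illustrative; they are not from Kurt’s presentation.

```python
import sqlite3
import pandas as pd

# Extract: pull raw records from two hypothetical source systems
crm = pd.read_csv("crm_export.csv")                                   # nightly CRM dump
orders = pd.read_sql("SELECT * FROM orders", sqlite3.connect("erp.db"))

# Transform: standardize keys so records from different systems line up
crm["customer_id"] = crm["customer_id"].astype(str).str.strip()
orders["customer_id"] = orders["customer_id"].astype(str).str.strip()
fact = orders.merge(crm[["customer_id", "region"]], on="customer_id", how="left")

# Load: write the conformed table into the central warehouse; downstream
# data marts and reports query this copy, not the source systems
warehouse = sqlite3.connect("warehouse.db")
fact.to_sql("fact_orders", warehouse, if_exists="replace", index=False)
```

Everything downstream queries the warehouse copy rather than the source systems, which is exactly the centralization Kurt calls out.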

Where are the limitations?

First, this type of system takes a long time (several months to years) to implement; second, it limits data access to a very small subset of people in the organization. Most end users therefore rely on this group for all of their reporting and analysis needs. I agree with Kurt’s assessment that this is an overly centralized information management architecture and does not give end users enough autonomy to make analytics successful.

Emerging Model

Rapidly growing business needs, competition, and the need to differentiate are driving analytics toward a much more agile model in which a variety of data sets can be brought together quickly. These solutions can be built rapidly and can include both structured and unstructured data. In this scenario it may not be necessary to create a centralized data warehouse; much of the content can stay in the application systems where it currently resides and be analyzed directly. New analytics frameworks will support more decentralized, autonomous operations. This model will complement the traditional data warehousing model, not replace it. The diagram below from Kurt’s presentation illustrates this decentralized analytical framework.
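As a rough illustration of this in-place, warehouse-free approach, the sketch below pulls data from the systems where it already lives and joins it on the fly. The file names and columns are my own hypothetical examples, not part of Kurt’s material.

```python
import sqlite3
import pandas as pd

# Query structured data directly from the operational application database
tickets = pd.read_sql("SELECT id, status, opened_at FROM tickets",
                      sqlite3.connect("helpdesk.db"))

# Pull semi-structured content (a JSON export from another system) as-is
feedback = pd.read_json("feedback_export.json")

# Join and analyze in place; no ETL pipeline or central warehouse is required
combined = tickets.merge(feedback, left_on="id", right_on="ticket_id", how="left")
print(combined.groupby("status")["satisfaction"].mean())
```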

Kurt used the analogy of a strongly regulated, centralized energy sector, where all power generation is done by big companies and then distributed across power grids, to explain how empowering your customers not only helps them but also adds value for the organization. That model does not leave customers with much flexibility, as they always have to rely on what these companies supply (much like a centralized data warehousing model). Fast-forward to the 21st century, where companies are encouraging customers to deploy solar panels on their rooftops or within their properties. This not only gives customers much more control over the power they consume, but it also lets them add value to the overall system by pushing the extra power they generate back into the grid. Similar benefits occur when analytics is made more decentralized and agile.

Several vendors have begun to fill this gap by developing tools and capabilities that make it easier for end users to analyze their data with only a little training, lowering the learning curve for analytics. As per Kurt, several new analytics tools allow (see the short sketch after this list):

  • rapid prototyping by quickly integrating data from multiple sources
  • unrestricted drilling of data
  • intuitive navigation allowing users to explore data with minimal training
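To picture what that kind of unrestricted drilling looks like in practice, here is a small sketch using a hypothetical sales file (the columns and values are assumptions of mine): a high-level summary is sliced progressively finer, with no new report requested from a central BI team.

```python
import pandas as pd

# Hypothetical sales data; columns: region, product, month, revenue
sales = pd.read_csv("sales.csv")

# Top-level view: revenue by region
print(sales.groupby("region")["revenue"].sum())

# Drill one level down: revenue by product within a chosen region
west = sales[sales["region"] == "West"]
print(west.groupby("product")["revenue"].sum())

# Drill again: monthly trend for a single product
print(west[west["product"] == "Widget"].groupby("month")["revenue"].sum())
```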

To make analytics more relevant and valuable, companies are employing advanced visualization and in-memory technologies; search-based technology (think Google); and big-data technologies, where data aggregation happens not only at the organizational level but across an entire industry, such as banking or insurance.

The future of analytics is changing rapidly, and companies must evaluate their long-term strategy carefully. The competitive advantage and performance improvements that analytics can generate, when implemented with a clear vision and the right approach, can be invaluable to an organization. Empowering end users is rewarding, but it must be done diligently: organizations should have strong data governance rules and policies in place before they make their data accessible to end users for analysis and reporting purposes.

About the Author

Sandeep Khare worked at MicroPact from 2003-2016, most recently in the Marketing department.