Automating business processes is an important way to boost productivity, not only by providing the means to increase throughput, but also by allowing staff to concentrate on more important parts of their jobs. Even in today’s digital businesses we have electronic equivalents of old-fashioned paper pushing, and many important tasks still take longer than they should, such as processing invoices, approving payments, and managing purchases.
Business process optimization isn’t new. It’s something we’ve been trying to do for decades, from the days of the first mainframes. Many of today’s digital workflows were defined back in the 1990s, with the arrival of client/server systems. Many of those processes are still embodied in aging Visual Basic or Delphi applications, ossifying work practices even as line-of-business applications have been modernized. It’s time for business process automation to make the same jump, using cloud-native tools and frameworks, and even taking advantage of machine learning.
An introduction to process mapping
Part of the problem facing many enterprises is that they think they know their business processes, but despite having defined inputs and outputs, most of those processes have diverged from their initial documentation. Much of that is due to the informal nature of process training and a reliance on implicit knowledge. It becomes a case of “No, that’s the way we do things here” instead of following the documentation. Manual business processes are organic: how you do something may depend on who you’re sitting next to in the office rather than on any formal approach.
Having tools that identify and optimize processes is an important foundation for any form of process automation, especially as the alternative is often a manual walkthrough. We need to be able to see how information and documents flow through a business in order to identify places where systems can be improved. Maybe there’s an unnecessary approval step between data going into line-of-business applications and being booked into a CRM tool, where it sits for several days.
Modern process mining tools take advantage of the fact that much of the data in our businesses is already labeled. It’s tied to database tables or sourced from the line-of-business applications we have chosen to use as systems of record. We can use these systems to identify the data associated with, say, a contract, and where it needs to be used, as well as who needs to use it.
With that data we can then identify the process flows associated with it, using performance indicators to highlight inefficiencies as well as places where we can automate manual processes, for example by surfacing approvals as adaptive cards in Microsoft Teams or in Outlook. Thus we can turn what might have been a labor-intensive or time-consuming task into a piece of “microwork” that can be handled without interrupting other tasks.
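To make that idea concrete, here is a minimal sketch of the kind of Adaptive Card payload an approval step might surface in Teams or Outlook. The card structure follows the public Adaptive Cards schema, but the invoice number, vendor, and the data fields attached to the buttons are invented for illustration, not taken from any Power Automate template.

```python
import json

# A minimal Adaptive Card for an invoice approval surfaced as "microwork".
# The card structure (type, body, actions) follows the public Adaptive Cards
# schema; the invoice number, vendor, and data payload are hypothetical.
approval_card = {
    "type": "AdaptiveCard",
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "version": "1.4",
    "body": [
        {"type": "TextBlock", "text": "Invoice INV-1042 awaiting approval", "weight": "Bolder"},
        {"type": "TextBlock", "text": "Vendor: Contoso Ltd. Amount: $4,250.00", "wrap": True},
    ],
    "actions": [
        {"type": "Action.Submit", "title": "Approve", "data": {"decision": "approve", "invoice": "INV-1042"}},
        {"type": "Action.Submit", "title": "Reject", "data": {"decision": "reject", "invoice": "INV-1042"}},
    ],
}

# The JSON text is what would be posted to Teams or embedded in an actionable message.
print(json.dumps(approval_card, indent=2))
```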
Process mining comes to Power Automate
Built on technology acquired with Minit, the new process mining tools in the Power Platform fill the gap between older technologies like BizTalk Server and the newer low-code tools of Power Automate, while providing a foundation for a future generation of capabilities based on machine learning. Microsoft recently announced that these tools would become generally available at the start of August 2023.
Microsoft’s approach to process mining uses event data extracted from your systems of record to build a model of the processes that use those systems and their data, showing how the data flows from system to system. You can then apply performance indicators to those processes, to determine the efficiency of different paths and identify what changes can be made to improve them.
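As a rough illustration of what that model looks like under the hood, the following sketch builds a simple directly-follows graph, counting which activity follows which within each case, from a handful of hypothetical event records. The field names and events are assumptions for the example, not the format Power Automate uses internally.

```python
from collections import Counter, defaultdict

# Hypothetical event records: one row per activity occurrence, keyed by case ID.
events = [
    {"case_id": "C1", "activity": "Receive invoice", "timestamp": "2023-07-03T09:00"},
    {"case_id": "C1", "activity": "Approve payment", "timestamp": "2023-07-03T11:30"},
    {"case_id": "C1", "activity": "Book in CRM", "timestamp": "2023-07-05T16:00"},
    {"case_id": "C2", "activity": "Receive invoice", "timestamp": "2023-07-04T10:15"},
    {"case_id": "C2", "activity": "Book in CRM", "timestamp": "2023-07-04T12:45"},
]

# Group events into ordered traces, one per case.
traces = defaultdict(list)
for event in sorted(events, key=lambda e: (e["case_id"], e["timestamp"])):
    traces[event["case_id"]].append(event["activity"])

# Count each directly-follows transition to form the edges of the process map.
edges = Counter()
for activities in traces.values():
    for source, target in zip(activities, activities[1:]):
        edges[(source, target)] += 1

for (source, target), count in edges.items():
    print(f"{source} -> {target}: {count}")
```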
To make use of Power Automate Process Mining, you’ll need access to your application log files. This can take the form of a connection to the app or an export in a common format like CSV. Once you have access to the log data you want to use in Power Automate, the process mining tool uses the familiar Power Query Editor to transform the data and add attributes that help identify the underlying activities, for example indicating when an event starts and ends. This labeling allows the tool to trace the steps of your business processes from beginning to end and show the different paths they can take.
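The sketch below shows the sort of transformation that labeling step performs, assuming an exported CSV log with hypothetical CaseID, Activity, StartTime, and EndTime columns: parse the timestamps and derive a duration attribute for each activity.

```python
import csv
import io
from datetime import datetime

# An inline stand-in for an exported application log; the column names
# (CaseID, Activity, StartTime, EndTime) are assumptions about what the
# export might contain, not a fixed Power Automate schema.
raw_log = io.StringIO(
    "CaseID,Activity,StartTime,EndTime\n"
    "C1,Receive invoice,2023-07-03 09:00,2023-07-03 09:05\n"
    "C1,Approve payment,2023-07-03 09:05,2023-07-03 11:30\n"
    "C2,Receive invoice,2023-07-04 10:15,2023-07-04 10:20\n"
)

# Parse the timestamps and compute a duration for each activity, the same
# kind of attribute labeling the Power Query Editor step is used for.
rows = []
for row in csv.DictReader(raw_log):
    start = datetime.fromisoformat(row["StartTime"])
    end = datetime.fromisoformat(row["EndTime"])
    rows.append({**row, "duration_minutes": (end - start).total_seconds() / 60})

for row in rows:
    print(row["CaseID"], row["Activity"], f'{row["duration_minutes"]:.0f} min')
```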
A desktop application helps with additional analysis, offering process editing features. At the heart of the mining output is a process map. You can see how many times a process was discovered in the source data, how often it ran, and how many times the same activity repeated during a run. Process timings show how long the whole process took, as well as the duration of each activity. Together these metrics help you prioritize processes and find places where they can be optimized. Some of this data is summarized as key performance indicators, giving an at-a-glance view of where you might find quick wins, for example by reducing the number of loops in a process.
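Continuing the same toy example, the following sketch computes a few of those summary figures, how many cases were found, how many distinct paths they took, and how many contained repeated activities, from a set of invented traces.

```python
from collections import Counter

# Hypothetical traces recovered from a mined process map: one ordered
# list of activities per case.
traces = {
    "C1": ["Receive invoice", "Approve payment", "Book in CRM"],
    "C2": ["Receive invoice", "Approve payment", "Approve payment", "Book in CRM"],
    "C3": ["Receive invoice", "Book in CRM"],
}

# How often each distinct path (variant) occurs across cases.
variants = Counter(tuple(trace) for trace in traces.values())

# How many cases repeat an activity, a rough stand-in for a "loops" KPI.
cases_with_repeats = sum(1 for trace in traces.values() if len(trace) != len(set(trace)))

print(f"Cases analyzed: {len(traces)}")
print(f"Distinct variants: {len(variants)}")
print(f"Cases with repeated activities: {cases_with_repeats}")
for variant, count in variants.most_common():
    print(" -> ".join(variant), f"({count} case(s))")
```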
Using process maps to build code
The chief benefit of this analysis is discovering where there might be bottlenecks. Once you’ve identified these activities, you can then see if the delays or repetitions are happening at a human or software level, and what kind of resource needs to be deployed to reduce delays.
KPIs and visualizations allow you to quickly see where changes could improve processes, whether that’s by adding automation or choosing a specific process path over alternatives. The output of a process mining exercise like this isn’t code, but a design document that you can use to improve your code.
Is a step slow because it’s using an inefficient API call? Armed with the KPI data, you’re now able to use other performance monitoring tools to investigate how a specific service operates. Traditional application monitoring tools may not have been able to identify issues, as the call may be infrequent or have little or no resource impact. It’s only when we see the business impact of design decisions that we can investigate alternative approaches.
Process mining is a powerful tool that not only reduces the risks involved with automating manual processes, but also provides insights into how enterprise applications are running and how they affect business operations.
An AI future for process mining?
One interesting aspect of Power Automate’s process mining tool is that it generates a directed graph from the data. At the same time, you’ve labeled the data used to generate that graph. It’s an approach that seems tailor-made for use with AI tools like the OpenAI Codex model used by the Power Platform to generate applications.
At Inspire 2023, Microsoft demonstrated a future version of these process mining tools that could automatically generate a process flow from discovered workflows, using Power Automate connectors and actions to suggest possible implementations, including the addition of human approvals where necessary.
Adding AI to process mining would allow us to go straight from process map to code. Because you’re working with labeled data and endpoint metadata and descriptions, the model will be grounded from the start, significantly reducing the risk of error, even when using generative AI models.
Having tooling like this to build on should give Microsoft an advantage, especially when technologies like the Microsoft Dataverse allow you to have a common process model for all your internal flows. There’s a grand vision that comes out of this, where processes are adaptive and self-optimize over time, allowing new features to emerge as more data is gathered. It’s going to be an interesting few years as these technologies mature and become widely available.